Computer technology trends for 2016

The PC market seems to be stabilizing in 2016; I expect it to shrink slightly. While mobile devices have been named as culprits for the fall in PC shipments, IDC said that other factors may be in play. It is still pretty hard to make any decent profit building PC hardware unless you are one of the biggest players, so Lenovo, HP, and Dell will keep increasing their collective dominance of the PC market as they did in 2015. I expect changes like spin-offs and maybe some mergers involving smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.

The tablet market is no longer a high-growth market: tablet sales have started to decline, and the decline continues in 2016 as owners hold onto their existing devices for more than three years. iPad sales are set to continue declining, and the iPad Air 3, to be released in the first half of 2016, does not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people turn to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads, Windows and Android tablets managed by the company). Combined volume shipments of PCs, tablets and smartphones are expected to increase only in the single digits.

All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over seven years starting in July 2016. The agreement affects around 10 percent of world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.


In 2015 storage was rocked to its foundations, and those innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic, foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds. The all-flash data centre idea has definitely taken off as an achievable vision: primary data is stored in flash, with the rest held in cheap and deep storage. Flash drives largely solve the disk drive latency access problem, so there is much less need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged industry appliances and NVMe fabrics). The hybrid cloud concept was adopted and supported by everybody. Flash started out in 2-bits/cell MLC form, which rapidly became standard, and TLC (3-bits/cell, or triple-level cell) started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being. The tape industry developed a 15TB LTO-7 format.

The use of SSDs will increase and their price will drop. SSDs were in more than 25% of new laptops sold in 2015, are expected to be in 31% of new consumer laptops in 2016, and in more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016, and in 2017 they are expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD on average would retail for $170 or less).
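The per-gigabyte figures translate directly into retail prices; a quick sanity check of the article's own numbers (illustrative only, the function name is mine):

```python
# Convert a per-gigabyte price in cents into an approximate retail price.
def ssd_retail_price(capacity_gb, cents_per_gb):
    """Return an approximate retail price in dollars."""
    return capacity_gb * cents_per_gb / 100

# 2016 estimate: 24 cents/GB for a 1 TB drive
print(ssd_retail_price(1000, 24))   # -> 240.0 dollars
# 2017 high-end estimate: 17 cents/GB for a 1 TB drive
print(ssd_retail_price(1000, 17))   # -> 170.0 dollars, matching "$170 or less"
```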

Hard disk sales will decrease, but the technology is not dead. Sales of hard disk drives have been declining for several years now (118 million units in the third quarter of 2015), but according to Seagate, hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology as they are the cheapest in terms of per-gigabyte cost. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.

Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises are also starting to seek alternatives to snapshots that are simpler and easier to manage, and that allow data and application recovery to a second before a data error or logical corruption occurred.

Local storage and the cloud finally make peace in 2016 as the decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020 according to IDC.


Open standards for data center development are now a reality thanks to advances in cloud technology, and Facebook’s Open Compute Project has served as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility, but some experts have begun to question this strategy, as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher IO requirements in the datacenter.

Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users, because PaaS players have the age-old imperative to find ways to tie customers to their platforms and aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.

Year 2016 will be harder for legacy IT providers than 2015. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing, and will make data and the processes around it self-service for all their employees. Design thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together or even be managed, for example getting whatever-as-a-service cloud systems to play nicely with existing legacy systems. Competent developers are therefore a scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT ops jobs, with varying success.

There are still too many old-fashioned companies that just can’t understand what digitalization will mean to their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies. For example, Amazon is a retailer both on the web and, increasingly, for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.

More of Windows 10 is coming in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect Windows 10 adoption in companies to start in 2016. Windows 10 is likely to be a success for the enterprise, but I expect that word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game better with it than with Windows 8, which was banned in China.

Windows is now delivered “as a service”, meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will also get an update: 2016 should include the release of Windows Server 2016, which brings updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.

Windows 10 will get some of the features promised but not delivered in 2015. Windows 10 was promised for PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience offered in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones, but in 2015 it was delivered only for PCs and some tablets. Mobile Windows 10 for smartphones is expected to finally arrive in 2016; the release of Microsoft’s new Windows 10 operating system may be the last roll of the dice for its struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. There are also plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft; when you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.

Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP, as “there are some people who don’t have an option to change.” Often the OS is running in automation and process control systems behind business- and mission-critical operations, in both the private sector and government enterprises. For example, the US Navy is using the obsolete Microsoft Windows XP to run critical tasks. It all comes down to money and resources, but if someone is obliged to keep something running on an obsolete system, it is completely the wrong approach to information security.


Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will move toward the mainstream as Sony, Samsung and Oculus bring consumer products to market in 2016. The whole virtual reality hype could be rebooted as early builds of the final Oculus Rift hardware start shipping to devs. Maybe HTC’s and Valve’s Vive VR headset will suffer in the next few months. Expect a banner year for virtual reality.

GPU and FPGA acceleration will be widely used in high performance computing. Both Intel and AMD have products with a CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL). Many mobile processors also have the CPU and GPU on the same chip. FPGAs are circuits that can be baked into a specific application but can also be reprogrammed later. There was a lot of interest in 2015 in using FPGAs to accelerate computations as the next step after GPUs, and I expect that interest will grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package, possibly available in early 2016.

Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which were academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: heavy (and growing) investment in the technology, which enables the identification of objects in still and video images, words in audio streams, and the like after an initial training phase, comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs, and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases at the end of 2015: Facebook released portions of its Torch software, Alphabet’s Google division open-sourced parts of its TensorFlow system, and IBM turned up the heat under competition in artificial intelligence by making SystemML freely available to share and modify through the Apache Software Foundation. So I expect that 2016 will be the year these are tried in practice, and that deep learning will be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind; I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT has a good free intro to AI and ML.
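The “initial training phase” mentioned above can be illustrated with a deliberately tiny sketch: a single perceptron (the ancestor of today’s deep networks) learning the logical AND function. This is a toy of my own, not how deep networks are trained in practice; those stack many layers and use gradient descent over huge datasets, but the loop of “predict, compare to target, nudge weights” is the same idea.

```python
# Toy perceptron: learn logical AND by repeatedly nudging two weights
# and a bias toward the correct answers.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # 0 when the prediction is right
            w[0] += lr * err * x1       # nudge each weight by the error
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Training data: the truth table for AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # -> [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions (AND, OR); the move to multi-layer “deep” networks is precisely what lifts that limitation.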

Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers — or humans behaving a little bit more like computers — can help. Computers excel at searching and combining vastly more data than a human so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: To many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.

The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology, smart watches and other smart devices respond to voice commands and interpret the data we produce: they learn from their users and generate appropriate responses in real time. Interest in the Internet of Things will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.

Connectivity and networking will be hot, and it is not just about IoT. CES will focus on how connectivity is proliferating in everything from cars to homes, realigning diverse markets. The interest will affect job markets: network jobs are hot, and salaries are expected to rise in 2016 as wireless network engineers, network admins, and network security pros can expect above-average pay gains.

Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact: today the majority of web servers are Linux boxes, including most of the world’s busiest sites. Linux also runs many parts of the Internet infrastructure that moves the bits from server to user. Linux will continue to rule the smartphone market as the core of Android, and new IoT solutions will most likely be built using Linux in many parts of the system.

Microsoft and Linux are not such enemies as they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies, but it looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.


Web browsers are going more and more 64-bit: Firefox started the 64-bit era on Windows, and Google is killing Chrome for 32-bit Linux. At the same time web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is showing its age and causing more problems than it solves, and will see native support removed from Firefox by the end of 2016. It was already removed from the Google Chrome browser with very little impact. The biggest issue was the lack of support for Microsoft’s Silverlight, which brought down several top streaming media sites, but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.

SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months to July 1, 2016.
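For sites and tools, the migration itself is mostly a matter of swapping the hash function; the hard part is certificate and client support. A minimal sketch of the difference using Python's standard `hashlib` (the message is arbitrary):

```python
import hashlib

# SHA-1 produces a 160-bit digest; SHA-256, its usual replacement in
# TLS certificates, produces 256 bits. Switching the constructor is
# the easy part of retiring SHA-1.
msg = b"hello world"
sha1 = hashlib.sha1(msg).hexdigest()
sha256 = hashlib.sha256(msg).hexdigest()
print(len(sha1) * 4, len(sha256) * 4)   # digest sizes in bits: 160 256
```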

Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you wish that Flash would finally be dead in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world. Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.

The trend of trying to use the same language and tools on both the user end and the server back end continues. Microsoft is pushing its .NET and Azure cloud platform tools. Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and server ends with node.js, React and many other JavaScript libraries. Apple is also trying to bend its Swift programming language, now used mainly to make iOS applications, to run on servers with project Perfect.

Java will still stick around, but its decline as a language will accelerate as new stuff isn’t being written in Java, even if it runs on the JVM. We will not see the new Java 9 in 2016, as Oracle has delayed its release by six months; The Register tells us that Java 9 is delayed until Thursday March 23rd, 2017, just after tea-time.

Containers will rule the world as Docker continues to develop, gain security features, and add various forms of governance. Until now Docker has been tire-kicking, used in production only by the early-adopter crowd, but that can change as vendors start to claim that they can do proper management of big data and container farms.

NoSQL databases will take hold as they can be called “highly scalable” or “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. Basically NoSQL can be seen as a key:value store, and this idea has also expanded to storage systems: we got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being.
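The key:value idea behind both NoSQL stores and the GET/PUT object drives above boils down to a very small interface. A toy in-memory sketch (my own illustration; real stores add persistence, replication and sharding):

```python
# Minimal key:value store exposing the same GET/PUT shape as the
# object-storage drives mentioned above. Purely in-memory.
class KeyValueStore:
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)

store = KeyValueStore()
store.put("user:42", {"name": "Alice"})
print(store.get("user:42"))        # -> {'name': 'Alice'}
print(store.get("user:99"))        # -> None (no schema, no error)
```

The point of the sketch: there is no schema and no query planner, which is exactly what makes the model easy to scale horizontally, and what you give up compared with an RDBMS.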

In the database world, Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). While the cost of Hadoop on a per-node basis is pretty inconsequential, the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also attract interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility: digital consumer confidence must be earned.
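The analytics style Hadoop and Spark are built around reduces to map and reduce steps over partitioned data. A toy word count in that style, pure single-machine Python of my own (the frameworks distribute exactly these two steps across a cluster):

```python
# MapReduce-style word count: map each data partition to local counts,
# then reduce by merging the partial counts. Hadoop/Spark run the map
# step on many nodes in parallel; here it's just a list comprehension.
from collections import Counter
from functools import reduce

partitions = [
    "big data needs analysis",
    "big data needs structure",
]

mapped = [Counter(p.split()) for p in partitions]     # map phase
totals = reduce(lambda a, b: a + b, mapped)           # reduce phase
print(totals["big"], totals["data"], totals["analysis"])  # -> 2 2 1
```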

IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and the general outside world, so in 2016 we will see even more service firms accidentally leaking critical information and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more; a joke doing the rounds of IT execs doing interviews is “if you’re a decent security bod, why do you need to look for a job?”

There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well supported as you like, but if a service like Akamai or Cloudflare were to die, you still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism, which is costly. This is hard to justify when their failure rate is so low, so the irony is that the reliability of the content delivery networks means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission critical. So they just over-specify where they can and cross their fingers (i.e. pay twice and get half the coverage for other vulnerabilities).

For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.

 

933 Comments

  1. Tomi Engdahl says:

    Compatibility before purity: Microsoft tweaks .NET Core again
    Open source .NET will add legacy APIs to make porting easier
    http://www.theregister.co.uk/2016/05/31/microsoft_tweaks_dot_net_core_again/

    Microsoft’s open source fork of the .NET platform, called .NET Core, will be modified for better compatibility with existing applications, says Program Manager Immo Landwerth in a recent post.

    When the company embarked on its .NET Core effort, it took a minimalist approach, stripping out legacy code in order to get the best performance and smallest footprint for cross-platform server applications.

    The consequence is that porting existing applications is hard, because so many of the existing .NET Framework APIs are missing.

    “We’ve decided to drastically simplify the porting effort by unifying the core APIs with other .NET platforms, specifically the .NET Framework and Mono/Xamarin,” says Landwerth.

    This will be achieved by providing “source and binary compatibility for applications that target the core Base Class Libraries (BCL) across all platforms. The Base Class Libraries are those that existed in mscorlib, System, System.Core, System.Data, and System.Xml and that are not tied to a particular application model and are not tied to a particular operating system implementation,” Landwerth writes.

    Reply
  2. Tomi Engdahl says:

    You deleted the customer. What now? Human error – deal with it
    To err is human, to double err is career limiting
    http://www.theregister.co.uk/2016/05/30/best_practices_tech_processes/

    Everyone I speak to about system security seems to panic about malware, cloud failures, system crashes and bad patches. But the biggest threat isn’t good or bad code, or systems that may or may not fail. It’s people. What we call Liveware errors range from the mundane to the catastrophic and they happen all the time at all levels of business.

    We have all had that pit-of-the-stomach feeling when we hit the wrong key or pull the wrong drive or cable. One of the more mundane examples I have experienced was a secretary trying to delete an old file but accidentally nuking the whole client folder.

    Catalogue of human error

    Unfortunately, human error scales up. I have seen very large companies lose hundreds of machines due to a stupid file-deletion default of “*” within a maintenance application.

    The root cause of failure is often human mistakes. Even when human interaction is not the direct cause, it usually plays some role in the failure. The reasons behind human failure are also, contrary to popular belief, rarely based on malice or retribution for perceived slights, but are much more likely to come down to common or garden-variety human screw-ups.

    Unfortunately, as IT becomes more demanding, IT staff and budgets are shrinking, leaving more work to be done by fewer people. This unrelenting pressure of continually fixing systems as quickly as possible can lead to mistakes.

    It can happen all too easily. One quick click of a button and you can be in a situation that is incredibly hard to recover from. Thank goodness for confirmation dialogs!

    Document your best practices – properly

    Failure to document procedures is in itself a completely avoidable human error. All organisations should have a set of up-to-date, fully documented procedures and processes that are available and easy to implement.

    Reply
  3. Tomi Engdahl says:

    Microsoft’s Universal Windows Platform? It’s an uphill battle – key partner
    We like ‘any device, any platform’, but not UWP, says component vendor
    http://www.theregister.co.uk/2016/05/31/microsofts_universal_windows_platform_uphill_battle_says_key_partner/

    Keeping pace with Microsoft’s ever-changing developer story has not been easy. Just ask Infragistics exec Jason Beres, Senior VP Development Tools.

    What about the Universal Windows Platform (UWP), introduced with Windows 10 with the promise that developers can write an application that runs everywhere Windows runs, subject to device constraints?

    “It’s an uphill battle for Microsoft,” Beres told me. “The reason to build a UWP app gets smaller and smaller. The Store in Windows 10 becomes almost an irrelevant icon that you remove. They don’t have adoption on phones. So the players in the space are iOS, Android and a hybrid experience.

    “If I’m building an Xbox application, a HoloLens application, UWP is actually pretty cool. But even on a desktop, WPF has higher performance, it has more features, it has a bigger ecosystem, it has more stability than UWP.”

    The odd thing is that although UWP remains prominent in Microsoft’s development platform, it is overshadowed by another strategy, which is to support any device on any platform in order to promote its cloud services, Azure and Office 365.

    Reply
  4. Tomi Engdahl says:

    Samsung: Don’t install Windows 10. REALLY
    Microsoft and Samsung celebrate Windows 10 year of driver FAIL
    http://www.theregister.co.uk/2016/05/31/windows_10_samsung_fail/

    Samsung is advising customers against succumbing to Microsoft’s nagging and installing Windows 10.

    The consumer electronics giant’s support staff have admitted drivers for its PCs still don’t work with Microsoft’s newest operating system and told customers they should simply not make the upgrade.

    That’s nearly a year after Microsoft released Windows 10 and with a month to go until its successor – Windows 10 Anniversary Update – lands.

    Samsung’s customers have complained repeatedly during the last 12 months of being either unable to install Microsoft’s operating system on their machines or Windows 10 not working properly with components if they do succeed.

    Reply
  5. Tomi Engdahl says:

    Intel’s new plan: A circle that starts in your hand and ends in the cloud
    Atom-powered home kit so ISPs can pipe VMs into your house
    http://www.theregister.co.uk/2016/05/31/intels_new_plan_a_circle_that_starts_in_your_hand_and_ends_in_the_cloud/

    As predicted by The Register, Intel has created an x86-powered reference platform for home gateways that makes the box you use for broadband services an Atom-powered target for virtual machines delivered by carriers.

    Announced today at Computex in Taipei, the new AnyWAN GRX750 is a system-on-a-chip that can serve as the basis for modems/routers and is ready to build into devices capable of connecting over DSL, fiber optics, G.fast or 4G/5G wireless. Chipzilla has also created a new Wi-Fi chipset, the XWAY WAV500, that it expects will often reside alongside the AnyWAN. The company said the combination can serve 100 Wi-Fi-connected devices at up to 1Gbps.

    The multi-connection cocktail alone will put the cat among the pigeons in the home gateway market, a field in which many players specialise in one carriage standard or another.

    “Being able to deploy on-premises applications is a far more efficient way than upgrading devices,” she told The Register. She also used the opportunity to call on carriers to embark on a “cloudification” of their infrastructure to reduce dependency on proprietary kit and instead just use servers for everything

    Making home gateways a target for NFV also plays into Intel’s new strategy, outlined today by Bryant and Client Computing Group corporate veep and general manager Navin Shenoy. The pair explained that Intel’s post-PC plan starts with helping device-makers to cook up cool kit – be it PCs, telematics modules for cars or drones packing an Edison board and Intel’s RealSense depth-viewing camera.

    That hardware should, whenever possible, feed data to, rely on or integrate with cloud services running Xeons inside servers… lots of servers. In Intel’s ideal world, that Xeon-powered back-endery is so much fun to consume that we all buy more devices, which means more servers, and before you can worry about the decline of the PC market, Chipzilla will have sold more stuff into cars and Internet of Things things than it ever dreamed of selling into PCs or smartphones.

    High-resolution or virtual video can run on any device, but Intel is betting that more grunt makes a difference and will mean people use PCs for 4K and virtual viewing.

    Reply
  6. Tomi Engdahl says:

    Intel Broadens Xeon Video Offerings
    E3-1500-v5 diversifies portfolio
    http://www.eetimes.com/document.asp?doc_id=1329790&

    Intel’s first real-time high-efficiency video codec (HEVC) transcoding processor—the E3-1500 version five (E3v5)—combines Xeon 14-nanometer complementary-metal-oxide-semiconductor (CMOS) performance with acceleration from its companion Iris Pro Graphics processing unit (GPU). Intel’s patented Quick-Sync Video hardware delivers real-time high-definition (HD) and ultra-HD (UHD, also called 4K) high-efficiency video codec services.

    As PC sales continue to diminish and Xeon datacenter and supercomputer sales continue to grow, Intel is diversifying its processor offerings into dedicated video transcoding. Today there is a gaping need for video transcoding, that is, adjusting the encoding and pixel size of real-time video streams to fit screens ranging from tiny smartphones to 1080p high-definition (HD) displays to 4K ultra-high-definition (UHD) flat panels.

    “According to the analysts, 80 percent of the Internet traffic will be video and of that there will be three times as many 4k screens by 2020,” said Jennifer Huffstetler, director of Datacenter Product Marketing, in a preview of the E3 last month. “That amounts to as much as 129 percent compound average growth rate [CAGR] for high-efficiency video encoding [HEVC].”

    Intel expects the E3v5 to be used by internet video service providers, graphics oriented data centers, workstations, virtualized environments (with up to seven users sharing each E3v5) and cloud-based video delivery systems. Each E3v5 with GPU can handle 18 advanced video communications channels simultaneously serving 1080p 30 frame-per-second video.

    Reply
  7. Tomi Engdahl says:

    Intel’s Broadwell-E gaming CPU is a stunner, offering 10 cores for a whopping $1,723
    Meet the next flagship chip for overclocking, gaming, and transcoding.
    http://www.pcworld.com/article/3076158/components/intels-broadwell-e-gaming-cpu-is-a-stunner-offering-10-cores-for-a-whopping-1723.html

    On paper, Intel’s massive new 10-core—yes, 10—Broadwell-E gaming chips top out at 3.8GHz. But buying one is like purchasing a lottery ticket to free overclocking—without voiding your warranty.

    The flagship feature of the new Broadwell-E line is something Intel calls Turbo Boost Max Technology 3.0, a technology that tests each individual core on the chip and figures out which one can be safely pushed beyond its normal limits. That’s on top of a 20- to 25-percent improvement over its previous gaming chip generation, Haswell-E, Intel claims.

    “It’s the biggest, baddest CPU we’ve ever done,” said Frank Soqui, the general manager of the enthusiast client platform division at Intel.

    Intel’s new message is “megatasking.” The Broadwell-E chip can handle scenarios where it is asked to, one, process 4K games at 60 frames per second; two, encode that gameplay; and three, stream it out live at 1080p resolution.

    The new Broadwell-E chips are drop-in replacements for the previous Haswell-E gaming chips, using the same LGA 2011-v3 socket and X99 chipset. Just make sure that the board vendor provides the appropriate BIOS update to take advantage of the new Turbo Boost Max Technology 3.0.

    Reply
  8. Tomi Engdahl says:

    Steve Baker / Quora:
    Why motion sickness will remain a serious problem for many users of VR products and create product liability issues for the industry — “I’ve been working with helmet mounted displays in military flight simulation for several decades …”
    https://www.quora.com/How-big-an-issue-is-the-nausea-problem-for-Virtual-Reality-products/answer/Steve-Baker-9?share=1

    Reply
  9. Tomi Engdahl says:

    Juli Clover / MacRumors:
    Samsung announces 512GB NVMe SSD weighing only 1 gram with read and write speeds of 1500MB/s and 900MB/s, available starting in June

    Samsung Announces 512GB NVMe SSD That’s Smaller Than a Stamp
    http://www.macrumors.com/2016/05/31/samsung-ultra-small-nvme-512gb-ssd/

    Reply
  10. Tomi Engdahl says:

    Jordan Novet / VentureBeat:
    Microsoft announces general availability of SQL Server 2016, with deep integration of the R programming language

    Microsoft’s SQL Server 2016 becomes available to everyone
    http://venturebeat.com/2016/06/01/microsofts-sql-server-2016-becomes-available-to-everyone/

    Microsoft today is announcing the general availability (GA) of its SQL Server database software. For more than a year now, Microsoft has been rolling out public previews and release candidates of the software, and now the final version is out. A month ago, Microsoft said SQL Server 2016 would hit GA on June 1, and that statement has proven to be accurate.

    This edition stands out from SQL Server 2014, SQL Server 2012, and earlier releases in a few ways, but probably the most significant is the deep integration of the R programming language that’s used for data science. This type of deep integration was made possible by Microsoft’s 2015 acquisition of R distribution vendor Revolution Analytics.
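
    The R integration mentioned above is exposed through SQL Server's documented `sp_execute_external_script` stored procedure. As a rough sketch (the table and column names below are invented for illustration, and this only composes the T-SQL batch rather than connecting to a server):

```python
# Hypothetical sketch: composing a SQL Server 2016 in-database R call.
# sp_execute_external_script is the documented entry point for the R
# integration; dbo.Sales and Amount below are made-up example names.

def r_mean_query(table: str, column: str) -> str:
    """Build a T-SQL batch that computes a column mean inside SQL Server's
    embedded R runtime and returns it as a one-row result set."""
    return f"""
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(mean_value = mean(InputDataSet[[1]]))',
    @input_data_1 = N'SELECT {column} FROM {table}';
""".strip()

print(r_mean_query("dbo.Sales", "Amount"))
```

    A batch like this would normally be sent over an ordinary database connection; the R script runs server-side, next to the data.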

    Reply
  11. Tomi Engdahl says:

    CoreOS launches Torus, a new open source distributed storage system
    http://techcrunch.com/2016/06/01/coreos-launches-torus-a-new-open-source-distributed-storage-system/

    CoreOS today announced the launch of Torus, its latest open source project. Just like CoreOS’s other projects, Torus is all about giving startups and enterprises access to the same kind of technologies that web-scale companies like Google already use internally. In the case of Torus, that’s distributed storage.

    The idea behind Torus is to give developers access to a reliable and scalable storage system for applications that have been deployed on containers using the Google-incubated Kubernetes container management service.

    Reply
  12. Tomi Engdahl says:

    Mike Murphy / Quartz:
    Facebook announces Deep Text, an AI engine to understand meaning and sentiment behind posts, so the platform’s content can be categorized, searched effectively — Today, Facebook announced Deep Text, an AI engine it’s building to understand the meaning and sentiment behind all of the text posted by users to Facebook.

    Facebook is using artificial intelligence to become a better search engine
    http://qz.com/696827/facebook-is-using-artificial-intelligence-to-become-a-better-search-engine/

    In a blog post, Facebook said that it was building the system to help it surface content that people may be interested in, and weed out spam.

    This might sound like a minor improvement, but it actually has the potential—in theory—to transform the social network most of us use every day into something else we use daily: a powerful search engine.

    “We want Deep Text to be used in categorizing content within Facebook to facilitate searching for it and also surfacing the right content to users,”

    Facebook already uses demographic information shared by users (whether directly or through their interactions with brands on the site), but right now, the majority of the text-based information Facebook has on its servers is unstructured, meaning Facebook doesn’t know users’ intent in posting, or even what users meant. Deep Text will help categorize and provide meaning for all that text, and could turn all that unstructured data into information it can use—and users can search.

    Based on neural networks, Deep Text is unlike other systems designed to understand written language. Facebook says it can understand the meaning of thousands of posts per second, in 20 languages, “with near-human accuracy.” The system tries to understand the semantic relationships and similarities between words, meaning it realizes that “brother” and “bro” are often used in similar situations.
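
    Facebook has not published Deep Text's internals, but the "brother"/"bro" behaviour described above is the classic property of word embeddings: words used in similar contexts get nearby vectors, so their cosine similarity is high. A toy sketch with hand-made vectors (not Facebook's actual model or data):

```python
# Toy illustration only: embeddings place words used in similar contexts
# near each other, so "brother" and "bro" score a high cosine similarity
# while an unrelated word scores low. These 3-d vectors are made up by hand.
import math

embeddings = {
    "brother": [0.9, 0.8, 0.1],
    "bro":     [0.85, 0.75, 0.2],
    "stock":   [0.1, 0.05, 0.95],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(embeddings["brother"], embeddings["bro"]))    # close to 1.0
print(cosine(embeddings["brother"], embeddings["stock"]))  # much lower
```

    A production system learns such vectors from billions of posts rather than hard-coding them, but the similarity computation is the same idea.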

    Deep Text is already powering some aspects of Facebook, the company says. For example, some chat bots on Facebook Messenger

    Facebook said that it plans to use the millions of Facebook pages users have created to build up more training data for Deep Text.

    Reply
  13. Tomi Engdahl says:

    Mary Jo Foley / ZDNet:
    Microsoft opens Windows Holographic, the platform behind HoloLens, for partners to create AR/VR devices, is working with Intel, AMD, HTC, Acer, and others

    Microsoft to open Windows Holographic to virtual reality vendors
    http://www.zdnet.com/article/microsoft-to-open-windows-holographic-to-virtual-reality-vendors/

    Microsoft is courting virtual-reality hardware makers with its Windows Holographic platform, hoping to grow the base of mixed-reality-capable devices.

    Sounds good. But what do those words really mean?

    Microsoft’s HoloLens goggles are an example of an augmented reality (AR) device that is centered around Windows Holographic.

    Microsoft officials said today at Computex 2016 that the company isn’t just limiting the Windows 10 variant tuned to work with HoloLens — a.k.a. the Windows Holographic platform — to the HoloLens. Beginning today, Microsoft is opening up the Windows Holographic platform to any companies that want to build devices that can handle “mixed reality.”

    Mixed reality is not necessarily synonymous with HoloLens

    Microsoft is looking to its partners to build a range of devices, including tethered, untethered, fully opaque and transparent displays.

    Microsoft is enabling ASUS and other OEM partners to build mixed reality devices that make use of its Windows Holographic platform.

    Reply
  14. Tomi Engdahl says:

    ARM-Based Server Processors Finally Hitting Stride
    http://www.eetimes.com/document.asp?doc_id=1329822&

    Vendors of ARM-based server-class microprocessors will begin to gain traction with their newest designs next year following years of ecosystem development, according to a new report by International Data Corp. (IDC).

    The long-forecast shift in fortunes for ARM-based server processors is expected to come after a period of muted growth for combined ARM- and X86-based server processors, which are projected to log a compound annual growth rate (CAGR) of 2.2% between 2015 and 2020, according to the report.

    “With an expanding system total available market, expanding workload base, and emerging competition, the next five years of the server-class microprocessor market will see more system- and workload-specific designs, moderation in pricing, and some modest change in market share,”

    According to IDC, ARM processor vendors such as Applied Micro and Cavium have, after years of processor designs that failed to gain traction in the data center, begun garnering notable design wins and partnerships from communications service providers and systems vendors representing a wide spectrum of end customers and workloads.

    Reply
  15. Tomi Engdahl says:

    The least stressful job in the US? Information security analyst, duh
    That’s not a typo, we’ve checked and checked again
    http://www.theregister.co.uk/2016/06/02/least_stressful_job_is_infosec_analyst/

    Everyone knows that being an infosec analyst is a cushy job – but did you know quite how much? Because according to job website CareerCast, it is literally the least stressful job in the country.

    The company measured 11 stress factors, including the amount of travel, deadlines, competitiveness, physical demands, risk to your life, and being in the public eye and concluded that the best of all possible worlds was in infosec.

    “The proliferation of sensitive content stored online, as well as the growing importance of cloud computing, is fueling the demand for more information security analysts. Job prospects and competitive pay make this new addition to the Jobs Rated report one of the best jobs for 2016,” CareerCast argues.

    In fact, CareerCast seems to have a thing for infosec analysts, putting it not only bottom of the stress league but also listing it as the third best job to have in the United States, with a median salary of $89,000 and a healthy 18 per cent growth outlook. It’s beaten only by statistician and data scientist.

    And at the other end of the scale? Newspaper reporter.

    That’s right, being a newspaper reporter is literally the worst job in America, according to CareerCast. And the ninth most stressful.

    Reply
  16. Tomi Engdahl says:

    Why Oracle will win its Java copyright case – and why you’ll be glad when it does
    Open source needs strong copyright; weak copyright only helps bullies
    http://www.theregister.co.uk/2016/06/02/google_oracle_comment/

    Oracle will ultimately prevail in its Java copyright lawsuit against Google. And if you’re a free software developer or supporter, you should be cheering them all the way to the wire.

    Google threw out so many diversions and red herrings that free and open-source software (FOSS) supporters were even cheering for a verdict that kicks away the legal basis for open source and free software. In the eyes of the “civil society” NGOs (non-governmental organizations) and compliant academics (many of whom are funded by Google), and backed by a chorus of bloggers and tech journalists who prefer a simplified, cartoon view of the world, the story was indeed simple. In their eyes, the good guys won, and that’s all there was to the case.

    So why is the jury’s broad application of fair use in reality bad news for open source? How did Google win last week? And why will Oracle ultimately prevail?

    Oracle will ultimately prevail over Google for a very simple reason: Google is guilty. Google copied 11,000 lines of someone else’s copyrighted code without a license to do so. It could have chosen some other code to copy; or it could have obtained a license; or it could have not copied anything and created every single line of Android code from scratch. All three were options that Google didn’t take. It’s really as simple as that.

    In a nutshell, free and open source software depends on simple, strong copyright law. Access to justice is also a factor: if you’re a small developer, you should be able to go to court to defend your license with the presumption in your favor. Contract integrity is also pretty important, but that’s a given; in a judicial system in which contracts mean nothing, there is little justice for anyone, in any context.

    Copyleft itself is founded on strong copyright, as Richard M Stallman has pointed out
    Challenges to the GNU GPL have been infrequent and unsuccessful

    But just as there was no reason for FOSS supporters to celebrate last week’s verdict, there’s no reason for FOSS supporters to be overly alarmed, either. Do not panic.

    The circumstances of the verdict are strange and narrow.

    In effect, Alsup was using the jury to mark his earlier homework, but advising them that there was only one acceptable answer: that Alsup was right.

    In this context, all the jury’s verdict means is: “We Really Don’t Care.” We don’t care that Google didn’t have a license, they were saying, and we don’t care how much it copied.

    Copyrightability of APIs is FUD.

    A great deal of FUD has been generated by Google (and its supporters, like the activist Professor Pamela Samuelson), over the issue of the “copyrightability” of APIs.

    Reply
  17. Tomi Engdahl says:

    Minecraft Tops 100 Million Sales
    https://games.slashdot.org/story/16/06/03/0215243/minecraft-tops-100-million-sales

    Mojang has announced today that its game ‘Minecraft’ has passed 100 million sales across all platforms, including PC, Mac, consoles and mobile. Nearly 53,000 copies of the game have been sold every single day around the world since the beginning of the year.

    https://mojang.com/2016/06/weve-sold-minecraft-many-many-times-look/

    Reply
  18. Tomi Engdahl says:

    A New Version of Rust Hits the Streets
    http://www.linuxjournal.com/content/new-version-rust-hits-streets-0

    Version 1.9 of the Rust programming language has been released. Rust is a new language with a small but enthusiastic community of developers.

    Rust is a systems programming language. It combines the low-level power of C or C++ with features that are more common in high-level languages, like Python, Ruby and Haskell. What’s more, it takes a modern approach to memory management.

    Older languages, such as C, place the burden of memory management on the programmer. This allows developers to create highly optimized code, but it also makes it possible to introduce serious bugs, especially in complex multithreaded applications. Furthermore, these bugs are not related to the business logic of the system. They’re related to the language.

    Rust borrows ideas from high-level languages, which manage memory for the programmer. This can prevent many of the bugs that are common in C code. Memory bugs can cause the software to crash and often open security holes that can be exploited.

    Reply
  19. Tomi Engdahl says:

    IDC, IDC, o crystal ball. Who’s storage’s biggest cheese of them all?
    Good news for some, and bad for hyperscaler folk
    http://www.theregister.co.uk/2016/06/03/idc_storage_market_statistics_q2/

    IDC’s number crunchers have crunched their quarterly enterprise storage numbers and found HPE has done very well, while the storage market has declined somewhat and ODM supply to hyperscalers has plunged downwards.

    The overall enterprise storage market was worth $8.2bn in the first 2016 quarter, down seven per cent on a year ago.

    HPE and EMC tied for top place
    Dell was third
    NetApp fourth
    Hitachi and IBM tied for fifth and sixth positions

    The largest supplier category was “Others”

    Reply
  21. Tomi Engdahl says:

    No Kinetic energy at DataDirect Networks: Ethernet drives snubbed
    No customer interest, we’re told
    http://www.theregister.co.uk/2016/03/10/kinetic_nogo_for_ddn/

    DataDirect Networks has no plans to support Kinetic disk drives in its WOS object storage products.

    Kinetic drives are directly accessed over Ethernet, each having their own network address, and they receive Get and Put commands to read and write to a key:value store on the disks. They have been mostly associated with Seagate, although Toshiba and Western Digital/HGST are members of the Kinetic Open Storage Project (KOSP).

    Exablox is supportive of Kinetic drives. Object storage suppliers CleverSafe (bought by IBM), NetApp (StorageGRID) and Scality are also members of the KOSP clan, but DDN is not.

    Michael King, DDN’s senior director for marketing strategy and operations, said that there was no customer interest; customers were not seeing any good benefits from having a WOS system using Kinetic drives.

    The claimed benefit of Kinetic is IO stack simplification: array controller software and the server’s storage IO stack are no longer needed, as applications access the drives directly.

    Yet, with object systems storing trillions of objects, there still needs to be management to look after object placement, protection and so forth.
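
    The access model described above, where each drive is its own network endpoint exposing Get/Put on a key:value store, can be sketched with a toy class. The real Kinetic protocol runs protobuf messages over TCP; this class is purely illustrative and the address is a placeholder:

```python
# Illustrative sketch only: a Kinetic drive behaves like a key:value store
# reachable at its own network address. The actual Kinetic Open Storage
# Project protocol speaks protobuf over TCP; this toy class just models
# the Get/Put interface the article describes, using an in-memory dict.
from typing import Optional

class ToyKineticDrive:
    def __init__(self, address: str):
        self.address = address      # each drive has its own network address
        self._store = {}            # stand-in for the on-disk key:value store

    def put(self, key: bytes, value: bytes) -> None:
        self._store[key] = value

    def get(self, key: bytes) -> Optional[bytes]:
        return self._store.get(key)

drive = ToyKineticDrive("192.0.2.10:8123")   # placeholder address
drive.put(b"object/0001", b"payload bytes")
print(drive.get(b"object/0001"))
```

    The simplification argument is that applications talk to drives in this style directly, with no array controller in between; the counter-argument in the article is that placement and protection logic still has to live somewhere.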

    Reply
  22. Tomi Engdahl says:

    Storage greybeard: DevOps, plagiarism and horrible wrongness
    There’s nothing new under the Sun
    http://www.theregister.co.uk/2016/06/03/devops_storage/

    Recently I’ve been spending time thinking about what DevOps really means to my teams and to me. A lot of reading has been done and a lot of pondering of the navel.

    The most important conclusion that I have come to is that the DevOps movement is nothing new; the second conclusion I have come to is that it can mean pretty much what you want it to, and hence there is no right way to do it, but there might well be horribly wrong ways to do it.

    As a virtual greybeard, I started my IT career in the mainframe world as a PL/1 programmer, but I also did some mainframe systems programming. As an application programmer, I was expected to do the deployment, support the deployment and be involved with the application from the cradle to undeath.

    As a systems programmer, we scripted and automated in a variety of languages; we extended and expanded functionality of system programs. User exits were/are an incredibly powerful tool for the systems programmer. VSAM and VTAM – the software-defined storage and networking of their time.

    We plagiarised and shared scripts – mostly internally but also, at times, scripts would make their way round the community via the contractor transmission method.

    Many DevOps engineers would look at how we worked and find it instantly familiar…

    I’ve boiled DevOps and the idea of the Site Reliability Engineering function down in my mind to the following:

    Fix Bad Stuff
    Stop Bad Stuff happening
    Do Good Stuff
    Make Good Stuff easier to do

    It turns out that my teams are already pretty much working in this way

    Reply
  24. Tomi Engdahl says:

    Don’t go chasing waterfalls, please stick… Hang on. They’re back
    Culture wars, generation shift… hipsters
    http://www.theregister.co.uk/2016/06/07/problems_for_agile/

    Since the publication of the Agile Manifesto, there’s been a steady acceptance that Agile is the way to go when it comes to software development. The old waterfall method was seen as something rather quaint and old-fashioned, the equivalent of hanging onto your vinyl LPs when the rest of the world was downloading onto their iPods.

    And yet, just as vinyl is making a comeback so we see that waterfall is still clinging on tenaciously. In fact, it’s not just clinging on but positively ruling the roost. According to Gartner’s IT Key Metrics Data, waterfall methods were employed on 56 per cent of development projects in 2015.

    So, given that it’s generally accepted that agile projects offer much scope for effective development, why are organisations still clinging to the old methods?

    Old school is cool

    “There is the organisational drift back to Waterfall where a company has moved to agile practices and not changed the culture.”
    The hold-up with agile is not due to technical issues – agile deployment remains the best option for most software projects – rather cultural obstacles need to be overcome.

    Middle management wall

    Translation: middle management, which is like permafrost – something that’s hard to penetrate. “If they don’t change, then we say that the agile antibodies aren’t strong.”
    “You get these guys in their 50s, with no social skills and they’re the ones who maintain the systems. The company can’t get rid of them as the kids haven’t got the right skills,”
    Typically what happens, our analyst explained, is that management will decide to “go agile”.
    “They’ll drop in some new age guru who will use agile to develop a web-based system at the front end. At some point, however, they have to speak to the backend and connect to the CICS, DB2 and COBOL applications and that’s where it all breaks down,”
    In such situations, it’s important that companies move gradually.

    Church of Agile

    One of the problems, however, is that there’s a tendency for there to be an almost evangelical belief in agile as a methodology and that this sets up conflict.
    He sees that agile has become accepted because it’s fashionable: “Agile is a hipster methodology – it’s when methodology becomes a religion that you get problems.”
    Adam stresses he absolutely agrees that agile development is the way forward but – and this is the big but – he points out that the methodology isn’t really that important. What is important, he says, is that there’s a strong leader – one with vision.
    It’s not only management that can hinder agile rollout. According to Wayne Harris, a product owner at Chroma Sports, agile doesn’t sit very easily with the Prince 2 project management methodology, for example. “Trying to run an agile project with Prince 2 project management is a nightmare,” he says.
    There are other methodologies out there, too, that agile must contend with: PMP and CAPM to name just two.

    There is no reason why new development projects need to revert to waterfall, but the business culture has to alter first: without this, companies will be stuck in the old ways.

    Reply
  25. Tomi Engdahl says:

    Citrix to unleash containerised NetScaler this month
    Microservices make a lot of traffic that needs taming, which can get expensive and in-locky
    http://www.theregister.co.uk/2016/06/07/citrix_to_unleash_containerised_netscaler_this_month/

    Citrix is mere weeks away from releasing the containerised version of its NetScaler application delivery controller it revealed last December.

    “NetScaler CPX” was shown off at the company’s Synergy conference last month, but NetScaler veep and general manager Ash Chowdappa today told The Register the software has snuck into a “we’ll sell it if you really must have it now” version and will be generally available by month’s end.

    Chowdappa said the swift release is attributable to strong demand: apparently folks footling with containers find they make a lot of East/West traffic as containers spawn wherever they can. NetScaler’s traffic-grooming features come in handy to stop LANs melting down as container counts climb.

    NetScaler CPX is a Docker container. Chowdappa explained that, ideally, when containers are created and/or orchestrated with the likes of Kubernetes it makes sense to fire up a CPX Container too, so that the whole collection of containers in a microservice can start to enjoy its light traffic-tickling touch.

    Reply
  26. Tomi Engdahl says:

    Why Britain banned mobile apps
    Interview with Ben Terrett, former design chief at the GDS.
    https://govinsider.asia/smart-gov/why-britain-banned-mobile-apps/

    “That sounds horrendous,” says Ben Terrett, the former head of design at the UK Government Digital Service.

    GovInsider has just told him about the Indonesian city with a target of 300 mobile apps built by government per year. As citizens increasingly use smartphones, officials believe this is the best way to reach them.

    “We banned apps at GDS, I just said no,” Terrett says. The UK GDS was the first government digital service in the world, and is held up as a global pioneer for its award-winning approach. As the founding head of design, Terrett is responsible for creating services that have been mimicked across the world.

    So why did the GDS ban apps? It wasn’t because they weren’t technically savvy enough to build them.

    Cost, he says. Apps are “very expensive to produce, and they’re very very expensive to maintain because you have to keep updating them when there are software changes,” Terrett says. “I would say if you times that by 300, you’re suddenly talking about a huge team of people and a ton of money to maintain that ecosystem”.

    How did the UK reach an increasingly mobile population? Responsive websites, he replies. “For government services that we were providing, the web is a far far better way… and still works on mobile.”

    Sites can adapt to any screen size, work on all devices, and are open to everyone to use regardless of their device. “If you believe in the open internet that will always win,” he says. And they’re much cheaper to maintain, he adds, because when an upgrade is required, only one platform needs recoding.

    Key to the GDS’ approach is designing for user needs, not organizational requirements, Terrett says. “That is how good digital services are designed and built these days. That is how everyone does it, whether that’s google or facebook or British Airways or whoever.”

    The problem is that public sector agencies tend not to design with citizens in mind. “Things are just designed to suit the very silos that the project sits in, and the user gets lost in there,” Terrett adds.

    For example, opening a restaurant might require multiple permits from different agencies. A good digital service should combine them all in one place.

    How does Britain measure digital success? It isn’t necessarily the popularity of a digital service, Terrett says. “It’d be nice if they like it, don’t get me wrong, but liking is not really a useful metric.” Instead his team looked to see if users have completed an online transaction, or stopped halfway through.
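
    That completion metric is easy to picture: count users who finished an online transaction versus those who abandoned it partway. A minimal sketch, with invented event names and user IDs:

```python
# Sketch of the completion metric described above: of the users who started
# an online transaction, what fraction completed it? Event names and user
# IDs here are made up for illustration.

def completion_rate(events):
    """events: iterable of (user_id, event) pairs, where event is
    'started' or 'completed'. Returns the fraction of starters who finished."""
    started = {u for u, e in events if e == "started"}
    completed = {u for u, e in events if e == "completed"}
    return len(completed & started) / len(started) if started else 0.0

log = [("u1", "started"), ("u1", "completed"),
       ("u2", "started"),                      # u2 dropped out halfway
       ("u3", "started"), ("u3", "completed")]
print(completion_rate(log))  # 2 of 3 starters completed
```

    A real service would segment this by step to find exactly where users drop out, which is the "stopped halfway through" signal the GDS team watched.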

    Agile then allows this team to quickly build prototypes in a few weeks that they can test out on volunteers and see if it’ll work. Once they’ve gathered feedback, they can quickly scale things up, he adds.

    This approach will save significant sums of money, Terrett says. “You’re not spending money on huge IT contracts or huge teams of people, so a team of 12 might be replacing a team of 100. And you’re not building features that no one wants and no one uses and you’re not wasting time duplicating.”

    The GDS believed that central controls were crucial for saving money and reducing duplication of service provision. “Some of it you just have to say: ‘Sorry it’s just got to be. I know you all had your own thing, but now we’re going to have one’.”

    Reply
  27. Tomi Engdahl says:

    Scott Stein / CNET:
    Lenovo partners with Movidius to use its Myriad 2 visual processing chip for various undisclosed VR projects, the first of which are slated for 2nd half of 2016

    Lenovo is set to reveal next-gen VR/AR hardware: here’s what we know so far
    Lenovo and Movidius are working on VR together, using chips used in DJI’s latest drones.
    http://www.cnet.com/au/news/lenovo-is-set-to-reveal-next-gen-vrar-hardware-heres-what-we-know-so-far/

    Lenovo looks like it’s set to debut new virtual reality hardware in a year already full of VR surprises, and it’s doing it with the help of a chipmaker named Movidius.

    A few days in advance of Lenovo’s Tech World event June 9 in San Francisco, chipmaker Movidius has announced via press release that its hardware will be inside a series of AR (augmented reality) and VR (virtual reality) products made by Lenovo. The first product in this partnership will debut in the second half of 2016. According to Movidius, this is a dedicated mobile VR device that’s not a phone at all.

    This device isn’t the Google Tango-powered phone that Lenovo has also been working on.

    Reply
  28. Tomi Engdahl says:

    All technologies eventually come to an end. The research firm Objective Analysis predicts that today’s planar 2D flash memory structures will fade away over the next five years, displaced by 3D flash architectures.

    The only way left to scale is to build cells vertically as well.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4542&via=n&datum=2016-06-08_10:46:55&mottagare=30929

    Reply
  29. Tomi Engdahl says:

    ReactOS is a free, community-driven, open-source, collaborative, Windows-compatible operating system.

    Imagine running your favorite Windows applications and drivers in an open-source environment you can trust. That’s ReactOS: not just an open but also a free operating system.

    Source: https://www.reactos.org/

    Reply
  30. Tomi Engdahl says:

    AMD Details Next-Gen APUs
    Bristol Ridge, Stony Ridge target wide range of market segments
    http://www.eetimes.com/document.asp?doc_id=1329876&

    AMD still believes in PCs and recently announced its seventh generation of A-series processors targeted for that market. Code-named Bristol Ridge and Stony Ridge, the processors are designed to boost productivity, enhance multimedia, and improve energy efficiency.

    “There’s a lot of angst in the market about what’s going on with the PC TAM; we see our competitors on both graphics and the PC side really pulling back from the PC market,” said Kevin Lensing, AMD’s corporate vice president and general manager of its client business unit. “But we love the PC because it’s a huge revenue fan…and a place where we have a history of innovating.”

    Bristol’s CPU performance is a 50% increase from AMD’s offering 2 years ago and shows a 2.4x improvement in performance per Watt.

    Reply
  31. Tomi Engdahl says:

    Talking Star Trek
    http://hackaday.com/2016/06/08/talking-star-trek/

    Speech generation and recognition have come a long way.

    Now speech recognition on phones is good enough that you might never use the keyboard unless you want privacy. Every time we ask Google or Siri a question and get an answer, it feels like we are living in Star Trek.

    [Smcameron] probably feels the same way. He’s been working on a Star Trek-inspired bridge simulator called “Space Nerds in Space” for some time. He decided to test out the current state of Linux speech support by adding speech commands and response to it.

    For speech output, he used pico2wave and espeak. There’s also Festival

    PocketSphinx for speech recognition

    If your system isn’t as powerful as a full Linux box, consider uSpeech for the Arduino. You might also check out Jasper.

    Speech Recognition and Natural Language Processing in Space Nerds In Space
    https://scaryreasoner.wordpress.com/2016/05/14/speech-recognition-and-natural-language-processing-in-space-nerds-in-space/

    Reply
  32. Tomi Engdahl says:

    An artificial intelligence processor developed under NASA’s auspices

    AI has suddenly become an important theme in the semiconductor industry. A former NASA chief surprised many this week by unveiling an architecture cooked up in secret over ten years, on the basis of which a 256-core processor chip has already been completed.

    The man in question is former NASA administrator Daniel Goldin, whose company KnuEdge was founded back in 2005.

    It is an open DSP processor platform whose interfaces are completely free to use. Artificial intelligence applications can be developed on these Hermosa chips, although using them will no doubt require buying fairly expensive processors. According to the company, the Hermosa, sold as an artificial intelligence processor, is about eight times more powerful than a comparable graphics processor.

    The same family includes KnuVerse, a company offering web-based voice recognition of a person, built on technology developed by Goldin’s team.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4561:nasan-suojissa-kehitettiin-keinoalyprosessori&catid=13&Itemid=101

    More: https://www.knupath.com/

    Reply
  33. Tomi Engdahl says:

    USB-C is the last physical connector

    New smartphones, laptops and peripherals increasingly handle both charging and data through a single USB-C connector. According to ABI Research, this rapidly spreading technology is also the last physical connector in consumer electronics; after it, connections will go wireless.

    Alongside USB-C, ABI Research expects Thunderbolt 3 to gain ground, but beyond these two no further major physical connectors will appear. Adoption will be fast: by 2020, 93 percent of new laptops are expected to include a USB Type-C connector.

    The new connector is also coming quickly to smartphones. ABI predicts that in three years nearly half of all new smartphones will use USB Type-C.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4573:usb-c-on-viimeinen-fyysinen-liitin&catid=13&Itemid=101

    Reply
  34. Tomi Engdahl says:

    IDC Cuts PC Shipment Forecast
    http://www.eetimes.com/document.asp?doc_id=1329881&

    Global PC shipments are now expected to decline more than previously predicted following a dismal first quarter, according to a revised forecast by market research firm International Data Corp. (IDC).

    Shipments are now expected to total about 256 million for the year, a decline of more than 7% compared with 2015, according to the revised forecast. First quarter shipments declined 12.5% from the same period of 2015, according to IDC (Framingham, Mass.).

    IDC blamed a host of factors for shipment constraints, including weak currencies, depressed commodity prices, political uncertainty and delayed projects.

    IDC expects the pace of shipment declines to progressively slow through 2017, followed by flat shipments in 2018.

    PC shipments, once the main driver of the semiconductor industry, have been in general decline for several years as consumers have warmed to new forms of mobile computing. However, growth rates for smartphones and tablets have also slowed in recent months.

    IDC expects some PC market drivers to accelerate, including the rapid ascent of Chromebooks in U.S. schools and enterprise upgrades to Windows 10.

    Reply
  35. Tomi Engdahl says:

    E3: PlayStation VR has Star Wars and Resident Evil 7
    http://www.bbc.com/news/technology-36524103

    Sony has unveiled a raft of new games that will be playable in virtual reality on the PS4.

    New Star Wars and Resident Evil games were among those revealed at the E3 video games trade show in Los Angeles.

    Virtual reality clips for Batman and Final Fantasy games were also shown off.

    The titles will require gamers to use the PlayStation VR headset, which will cost $399 (£280) when it is released on October 13.

    “It’s not for the faint of heart in terms of the type of game experiences they’re going for, it’s really that hardcore ethos,” games industry analyst Lewis Ward at IDC told the BBC.

    Reply
  36. Tomi Engdahl says:

    Juli Clover / MacRumors:
    Apple to disable plug-ins like Flash, Java, Silverlight, and Quicktime by default in Safari 10, to focus on HTML5 content

    Safari in macOS Sierra Deactivates Flash and Other Plug-ins By Default
    http://www.macrumors.com/2016/06/14/safari-macos-sierra-plugins-disabled-default/

    In Safari 10, set to ship with macOS Sierra, Apple plans to disable common plug-ins like Adobe Flash, Java, Silverlight, and QuickTime by default in an effort to focus on HTML5 content and improve the overall web browsing experience.

    As explained by Apple developer Ricky Mondello in a post on the WebKit blog, when a website offers both Flash and HTML5 content, Safari will always deliver the more modern HTML5 implementation. On a website that requires a plug-in like Adobe Flash to function, users can activate it with a click as can be done in Google’s Chrome browser.

    Next Steps for Legacy Plug-ins
    https://webkit.org/blog/6589/next-steps-for-legacy-plug-ins/

    Reply
  37. Tomi Engdahl says:

    Jordan Novet / VentureBeat:
    Google to discontinue its Swiffy tool for converting Flash files into HTML5 on July 1; pre-existing run times will continue to work

    Google is killing its Swiffy tool for converting Flash files into HTML5 on July 1
    http://venturebeat.com/2016/06/15/google-is-killing-its-swiffy-tool-for-converting-flash-files-into-html5-on-july-1/

    Google today announced that it will discontinue Swiffy, a tool that people can use to convert .SWF Adobe Flash files into HTML5, on July 1. The Swiffy Flash extension will also stop working.

    “We will continue to serve the Swiffy runtimes, so any file you convert before the sunset date will continue to play,” Danial Klimkin, representing Google’s AdWords application programming interface (API) team, wrote in a blog post.

    Swiffy first became available 5 years ago, initially as a Google Labs product. At the time, the motivation was to make Flash animations work on devices that didn’t have Adobe Flash installed.

    Microsoft, Apple, and other browser makers have taken steps to discourage the use of Flash across the Web in general and promote HTML5 instead. There are security, performance, and web development advantages that come with this shift.

    Klimkin directed people to use Adobe Animate and Google Web Designer if they want to convert Flash SWF files into HTML5. And for playing Flash SWF files, Klimkin encouraged people to try Mozilla’s open source Shumway Flash virtual machine.

    http://mozilla.github.io/shumway/

    Reply
  38. Tomi Engdahl says:

    Stephanie Condon / ZDNet:
    IBM, The Weather Company use machine learning to create hyper-local, short-term weather forecasts with a resolution of between 0.2-1.2 miles

    IBM, The Weather Company use machine learning to predict impact of weather
    http://www.zdnet.com/article/ibm-the-weather-company-use-machine-learning-to-predict-impact-of-weather/

    Their new, hyper-local predictive model, called Deep Thunder, will use historical weather data to train machine learning models.

    Reply
  39. Tomi Engdahl says:

    Adrian Kingsley-Hughes / ZDNet:
    Around 40% of iPads in use will not be supported by iOS 10, according to Localytics, whereas all but the first-gen iPad supported iOS 9

    iOS 10 will make 40 percent of all iPads obsolete
    http://www.zdnet.com/article/ios-10-will-make-40-percent-of-all-ipads-obsolete/

    iPad sales have been declining for over two years, but iOS 10 could finally be the catalyst needed to trigger a wave of upgrades. Or it could push the iPad into irrelevance.

    Here’s an interesting fact: every iPad that Apple has sold, with the exception of the iPad 1 (which first went on sale in April 2010, was discontinued a year later, and sold some 15 million units over that time), can run the latest iOS release.

    At yesterday’s WWDC 2016 keynote speech, that all changed.

    Apple announced that it was dropping support for three aging iPads: the iPad 2, the iPad 3, and the first-generation iPad mini.

    That means that come fall, when iOS 10 is released, millions of iPads will become obsolete.

    The iPad 2 is a very popular device (it was sold between March 2011 and March 2014), and is only one percentage point behind the iPad Air (18 percent share, compared to 17 percent).

    Now the question is – will people upgrade their now obsolete iPads, or decide that while the iPad was once a cool bit of kit, they can now live without it?

    If owners of obsolete iPads decide to upgrade, this could represent a huge wave of upgrades for Apple and a strengthening of sales. However, if people decide that the iPad isn’t for them anymore, it could mean a massive decline in the iPad’s share of the tablet market, and with it a waning of its importance.

    Reply
  40. Tomi Engdahl says:

    Larry Dignan / ZDNet:
    Cavium, chip maker for data centers, buys networking infrastructure provider QLogic, in cash-and-stock deal valued at about $1.36B

    Cavium buys QLogic in $1.36 billion data center processor deal
    http://www.zdnet.com/article/cavium-buys-qlogic-in-1-36-billion-data-center-processor-deal/

    The two companies aim to scale better and target large customers such as HPE, Dell, Cisco, IBM and others.

    Cavium, which makes processors for enterprise data centers, said it will buy QLogic, a provider of networking infrastructure, in a deal valued at about $1.36 billion.

    The purchase allows the combined companies to better target data center OEMs and offer a more complete stack.

    In a statement, Cavium noted that QLogic’s connectivity and storage hardware will complement its networking, compute and security gear. The portfolio is designed to better target enterprise, cloud, storage and telco markets. The combined company also said its customer base will be more diversified.

    The combined company will count HPE, Dell, Lenovo, Pure Storage, IBM, Oracle, EMC and NetApp as customers. The two companies have less than 10 percent revenue overlap in the customer base.

    Reply
  41. Tomi Engdahl says:

    jQuery 3.0 Final Released!
    https://blog.jquery.com/2016/06/09/jquery-3-0-final-released/

    What’s New in jQuery 3
    http://developer.telerik.com/featured/whats-new-in-jquery-3/

    It’s been ten years since jQuery started rocking the web and it has stuck around for very good reasons. jQuery offers its users an easy-to-use interface to interact with the DOM, perform Ajax requests, create animations, and much more. In addition, unlike the DOM API, jQuery implements the composite pattern. Because of that, you can call jQuery methods on a jQuery collection regardless of the amount of elements included in it (zero, one, or many).

    jQuery 3 will offer the possibility to iterate over the DOM elements of a jQuery collection using the for…of loop. This new iterator is part of the ECMAScript 2015 (a.k.a. ECMAScript 6) specification. It lets you loop over iterable objects (including Array, Map, Set, and so on).

    All modern browsers, including Internet Explorer 10 and above, support requestAnimationFrame. Behind the scenes, jQuery 3 will use this API when performing animations, with the goal of having smoother and less CPU-intensive animations.

    jQuery 3 also modifies the behavior of some of its features.

    In addition to the improvements described so far, jQuery also removes and deprecates a few of its features.

    Reply
  42. Tomi Engdahl says:

    Checked C
    http://research.microsoft.com/en-us/projects/checkedc/

    The Checked C research project is investigating how to extend the C programming language so that programmers can write more secure and reliable C programs. The project is developing an extension to C called Checked C that adds checking to C to detect or prevent common programming errors such as buffer overruns, out-of-bounds memory accesses, and incorrect type casts. The extension is designed to be used for existing system software written in C.

    Checked C is an extension of C that adds bounds checking to C. This repo contains the specification for the extension, test code, and samples
    https://github.com/Microsoft/checkedc

    Reply
  43. Tomi Engdahl says:

    IBM Watson Health Tackles Diabetes
    http://www.informationweek.com/big-data/big-data-analytics/ibm-watson-health-tackles-diabetes/d/d-id/1325917?

    IBM Watson Health is teaming with the American Diabetes Association to apply cognitive computing to the ADA’s 66 years worth of research and data. The results will be used to help entrepreneurs, developers, healthcare providers, and patients learn more about diabetes, prevention, complications, and care.

    Reply
  44. Tomi Engdahl says:

    World’s Tiniest Violin Uses Radar and Machine Learning
    http://hackaday.com/2016/06/15/worlds-tiniest-violin-using-radar-and-machine-learning/

    The folks at [Design I/O] have come up with a way for you to play the world’s tiniest violin by rubbing your fingers together and actually have it play a violin sound. For those who don’t know, when you want to express mock sympathy for someone’s complaints you can rub your thumb and index finger together and say “You hear that? It’s the world’s smallest violin and it’s playing just for you”, except that now they can actually hear the violin, while your gestures control the volume and playback.

    [Design I/O] combined a few technologies to accomplish this. The first is Google’s Project Soli, a tiny radar on a chip. Project Soli’s goal is to do away with physical controls by using a miniature radar for doing touchless gesture interactions.

    Project Soli’s radar is the input side for this other intriguing technology: the Wekinator, a free open source machine learning software intended for artists and musicians. The examples on their website paint an exciting picture. You give Wekinator inputs and outputs and then tell it to train its model.

    http://www.wekinator.org/
    http://www.wekinator.org/walkthrough/

    https://atap.google.com/soli/
    https://vimeo.com/155570863

    Reply
  45. Tomi Engdahl says:

    Apple Developer Conference: A More Open Siri, and Other Upgrades
    http://www.nytimes.com/2016/06/14/technology/apple-wwdc-highlights-siri-music.html

    When Steven P. Jobs ran Apple, the company’s devices were distinguished by their polished software and famous iTunes store. Today, critics and even loyal fans are taking shots at Apple’s buggy software and Apple Music, its new streaming music service.

    With its hardware sales now slowing, Apple is under pressure to fix its software and online services, which have become increasingly important to consumers. So at its annual conference for software developers on Monday, the iPhone maker tried to demonstrate that it was still a purveyor of high-quality software and services.

    Apple announced:

    ■ Improvements in the Apple Watch operating system.

    ■ Changes in the operating system for Apple TV, called tvOS.

    ■ A rebranding of its Mac operating system.

    ■ An expansion of Apple Pay.

    ■ Opening up Siri to developers.

    ■ Improvements in photos and maps.

    ■ Subscriptions through Apple News.

    ■ Opening up its messaging service to developers

    ■ A new interface for Apple Music.

    Reply
  46. Tomi Engdahl says:

    Intel’s miracle memory reaches the market this year

    The 3D XPoint memory jointly developed by Intel and Micron is one of the most anticipated storage technologies, promising for instance up to 1,000 times the performance of the NAND flash it competes against. Intel has revealed very few details about XPoint; the biggest open question is the price of the memory.

    According to the latest information, SSDs that use the technology will be sold under the Optane brand. A Taiwanese website has published a leaked Intel roadmap, according to which Optane will go on sale at the end of this year. First-generation Optane devices carry the code name Stony Beach; Carson Beach follows next year. Both connect over the PCIe bus.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4595:intelin-ihmemuisti-markkinoille-jo-tana-vuonna&catid=13&Itemid=101

    Reply
  47. Tomi Engdahl says:

    Google Launches AI, Machine Learning Research Center
    http://www.eetimes.com/document.asp?doc_id=1329933&

    Google is diving deeper into artificial intelligence, with the company opening a dedicated machine learning research center in its Zurich office, the search company announced on Thursday, June 16.

    The Google Research Europe center will focus on three areas: machine intelligence; natural language processing and understanding; and machine perception.

    “Google’s ongoing research in machine intelligence is what powers many of the products being used by hundreds of millions of people a day — from Translate to Photo Search to Smart Reply for Inbox,”

    More:
    http://www.informationweek.com/big-data/big-data-analytics/google-launches-ai-machine-learning-research-center-/d/d-id/1325942

    Reply
  48. Tomi Engdahl says:

    The Future Of Memory
    Experts at the table, part 1: DDR5 spec being defined; new SRAM under development.
    http://semiengineering.com/the-future-of-memory/

    SE: We’re seeing a number of new entrants in the memory market. What are the problems they’re trying to address, and is this good for chip design?

    Greenberg: The memory market is fracturing into High-Bandwidth Memory (HBM), HMC, and even flash on memory bus. DRAM has been around for many years. The others will be less predictable because they’re new.

    Minwell: The challenge is bandwidth. The existing memory interface technologies don’t give us the bandwidth that we need. Along with that, with additional power we’re having to go into stacking. That’s being driven by high-bandwidth memory. But there’s also a need to have embedded SRAM on chip in large enough quantities so there is low latency.

    Reply
  49. Tomi Engdahl says:

    CPU, GPU, or FPGA?
    http://semiengineering.com/cpu-gpu-or-fpga/

    Need a low-power device design? What type of processor should you choose?

    There are advantages to each type of compute engine. CPUs offer high capacity at low latency. GPUs have the highest per-pin bandwidth. And FPGAs are designed to be very general.

    But each also has its limitations. CPUs require more integration at advanced process nodes. GPUs are limited by the amount of memory that can be put on a chip.

    “FPGAs can attach to the same kind of memories as CPUs,” said Steven Woo, vice president of enterprise solutions technology and distinguished inventor at Rambus. “It’s a very flexible kind of chip. For a specific application or acceleration, they can provide improved performance and better [energy] efficiency.”

    Intel Corp.’s $16.7 billion acquisition of Altera, completed late last year, points to the flexible computing acceleration that FPGAs can offer. Microsoft employed FPGAs to improve the performance of its Bing search engine because of the balance between cost and power. But using FPGAs to design a low-power, high-performance device isn’t easy.

    “It’s harder and harder to get one-size-fits-all,” Woo said. “Some design teams start with an FPGA, then turn it into an ASIC to get a hardened version of the logic they put into an FPGA. They start with an FPGA to see if that market grows. That could justify the cost of developing an ASIC.”

    “It’s a cost-performance-power balance,” Woo said. “CPUs are really good mainstays, very flexible.” When it comes to the software programs running on them, “it doesn’t have to be vectorized code.”

    GPUs are much better suited to graphics. They are more targeted than general-purpose CPUs. And FPGAs straddle multiple markets.

    Reprogrammable and reconfigurable FPGAs can be outfitted for a variety of algorithms, “without going through the pain of designing an ASIC.”

    Programmability, but not everywhere
    FPGAs fall into a middle area between CPUs and GPUs. That makes them suitable for industrial, medical, and military devices, where they have thrived. But even there the lines are beginning to blur.

    “The choice is between low volume, high value,” he notes. “Off-the-shelf silicon is more general purpose than you can want or afford.”

    Rowen adds, “For many of these applications, there are any number of application-specific products, this cellphone app processor or that cellphone app processor.”

    So should designers choose a CPU, GPU, or FPGA? “The right answer, in many cases, is none of the above – it’s an ASSP,” Rowen said. “You need a hybrid or an aggregate chip.”

    The industry is accustomed to integration at the board level, according to Rowen. “Board-level integration is certainly a necessity in some cases,” he said. The downside of that choice is “relatively high cost, high power [consumption].”

    So, what will it be: CPU, GPU, FPGA, ASSP, ASIC? The best answer remains: It depends.

    Reply
