Computer trends 2018

IT spending seems to be growing again. Gartner forecasts worldwide IT spending will increase 4.5% this year to $3.68 trillion, driven by artificial intelligence, big data analytics, blockchain technology, and the IoT.

Digital transformations are fashionable. You won’t find an enterprise that isn’t leveraging some combination of cloud, analytics, artificial intelligence and machine learning to better serve customers or streamline operations. But here’s a hard truth about digital transformations: many are failing outright or are in danger of failing. Typical reasons for failure are not understanding what digital transformation actually is (different people understand it differently), lack of CEO sponsorship, talent deficiency, and resistance to change. A technology-first approach to digital transformation is usually a recipe for disaster. Trying to simply push through a technically unfeasible transformation idea is another way to fail.

The digital era requires businesses to move with speed, and that is causing IT organizations to rethink how they work. A lot of IT is moving off premises to SaaS providers and the public cloud. Research outfit 451’s standout finding was that 60 per cent of the surveyed enterprises say they will run the majority of their IT outside the confines of enterprise data centres by the end of 2019. From cost containment to hybrid strategies, CIOs are getting more creative in taking advantage of the latest offerings and the cloud’s economies of scale.

In 2018 there seems to be a growing software engineering talent shortage in both quantity and quality. For the past nine years, software engineers have been at the top of the hardest-to-fill jobs in the United States, and the same applies to many other countries, including Finland. Forrester projects that firms will pay 20% above market for quality engineering talent in 2018. Particularly in-demand roles are data scientists, high-end software developers and information security analysts. There is a real need for well-studied, experienced engineers with a formal and deep understanding of software engineering. Recruiting and retaining tech talent remains IT’s biggest challenge today. Most CIOs are migrating applications to public cloud services, offloading operations and maintenance of computing, storage and other capabilities so they can reallocate staff to focus on what’s strategic to their business.

The enterprise is no longer at the center of the IT universe. It seems that reports of the PC’s demise have been greatly exaggerated, and the long and painful decline in PC sales of the last half-decade has tailed off, at least momentarily. As the sales of smartphones and tablets have risen, consumers have not stopped using PCs, but merely replaced them less often. The FT reports that the PC is set to stage a comeback in 2018, after the rise of smartphones sent sales of desktop and laptop computers into decline in recent years. If that does not happen, then the PC market could return to growth in 2019. But the end result is that the PC is no longer seen as the biggest growth driver for chip makers. An extreme economic shift has chipmakers focused on hyperscale clouds.

Microservices are talked about a lot. Software built using microservices is easier to deliver and maintain than the big and brittle architectures of old, which were difficult to scale and might take years to build and deliver. Microservices are small and self-contained, and therefore easy to wrap up in a virtual machine or a container (though they don’t have to live in containers). Public cloud providers increasingly differentiate themselves through the features and services they provide. But it turns out that microservices are far from being a one-size-fits-all silver bullet for IT challenges.
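To make “small and self-contained” concrete, here is a minimal sketch of a microservice using only the Python standard library. The service name, port, and endpoint are illustrative assumptions, not taken from any particular product.

```python
# A minimal microservice sketch: one small service that owns its own tiny
# HTTP API, easy to wrap in a VM or container. Names here are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class OrderServiceHandler(BaseHTTPRequestHandler):
    """A self-contained service exposing a single /health endpoint."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"service": "orders", "status": "up"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

def make_server(port=0):
    """Bind the service; call .serve_forever() on the result to run it."""
    return HTTPServer(("127.0.0.1", port), OrderServiceHandler)
```

Each such service would run in its own process, VM, or container (e.g. `make_server(8080).serve_forever()`), talking to its peers only over the network.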

Containers will try to make a breakthrough again in 2018. Year 2017 was supposed to be the year of containers! It wasn’t? Oops. Maybe 2018 will be better. The still-immature tech has a bunch of growing up to do. The Linux Foundation’s Open Container Initiative (OCI) finally released two specifications that standardise how containers operate at a low level. The needle in 2018 will move towards containers running separately from VMs, or entirely in place of VMs. Kubernetes gains traction. It seems that containers are still at the point where the enterprise is waiting to embrace them.

Serverless will be talked about. Serverless computing is a cloud computing execution model in which the cloud provider dynamically manages the allocation of machine resources. Serverless architectures refer to applications that significantly depend on third-party services (known as Backend as a Service, or “BaaS”) or on custom code that’s run in ephemeral containers (Function as a Service, or “FaaS”), the best-known vendor host of which is currently AWS Lambda.
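The FaaS model can be illustrated with a handler following AWS Lambda’s Python convention: the platform invokes a plain function with an event and tears the ephemeral container down afterwards, so no server code lives in the application. The event shape below is an illustrative assumption.

```python
# A minimal FaaS sketch in the AWS Lambda Python handler style. The
# platform, not the application, manages all machine resources; the
# application is just this function. The event fields are made up.
import json

def handler(event, context):
    """Entry point the FaaS platform invokes for each request/event."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Locally you can exercise it with a plain call such as `handler({"name": "Ada"}, None)`; in production the provider supplies the event and context.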

Automation is what everybody with many computers wants. Infrastructure automation creates and destroys basic IT resources such as compute instances, storage, networking, DNS, and so forth. Security automation helps keep systems secure. IT bosses want to create self-driving private clouds. The journey to self-driving clouds needs to be gradual: the vision makes sense, but the task of getting from here to there can seem daunting. DevOps automation with customer control includes automatic installation and configuration, integration that brings together AWS and VMware, workflow migration controlled by users, self-service provisioning based on templates defined by users, advanced machine learning to automate processes, and automated upgrades.
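A toy sketch of what infrastructure automation means in practice: operations are written to be idempotent, so re-running the same “playbook” converges on the desired state instead of repeating work. The in-memory dict below stands in for a real cloud API, and all resource names are hypothetical.

```python
# Idempotent infrastructure automation, in miniature: "ensure" operations
# converge on a desired state, so running them twice changes nothing.
INVENTORY = {}  # resource name -> resource description (fake cloud API)

def ensure_instance(name, size="small"):
    """Create a compute instance only if it does not already exist."""
    if name in INVENTORY:
        return INVENTORY[name]  # already converged, nothing to do
    INVENTORY[name] = {"name": name, "size": size, "state": "running"}
    return INVENTORY[name]

def ensure_absent(name):
    """Destroy a resource; destroying it twice is harmless."""
    INVENTORY.pop(name, None)

# Running the same "playbook" twice leaves the same end state:
for _ in range(2):
    ensure_instance("web-1")
    ensure_instance("db-1", size="large")
    ensure_absent("old-batch-host")
```

Real tools (Terraform, Ansible, and the like) follow the same converge-on-desired-state principle against actual provider APIs.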

Linux is at the center of many cloud operations: Google and Facebook started building their own gear and loading it with their own software. Google has its own Linux called gLinux. Facebook’s networking uses the Linux-based FBOSS operating system. Even Microsoft has developed its own Linux for cloud operations. Software-defined networking (SDN) is a very fine idea.

The memory business boomed in 2017 for both NAND and DRAM. The drivers for DRAM are smartphones and servers. Solid-state drives (SSDs) and smartphones are fueling the demand for NAND. The NAND market is expected to cool in Q1 from the crazy year 2017, but it is still growing well because demand keeps increasing. Memory — particularly DRAM — was long considered a commodity business.

Lots of 3D NAND will go into solid-state drives in 2018. IDC forecasts strong growth for the solid-state drive (SSD) industry as it transitions to 3D NAND. SSD industry revenue is expected to reach $33.6 billion in 2021, growing at a CAGR of 14.8%. The sizes of memory chips increase as layers are added to 3D NAND. The traditional mechanical hard disk based on magnetic storage is in a hard place in this competition, as the speed of flash-based SSDs is so superior.
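As a quick back-of-the-envelope check on the IDC forecast, the $33.6 billion figure for 2021 and the 14.8% CAGR together imply a base-year revenue; the 2016 base year assumed below is a guess, since IDC’s baseline is not stated here.

```python
# Sanity-check the IDC SSD forecast: what base revenue is implied by
# $33.6B in 2021 at a 14.8% CAGR? (2016 base year is an assumption.)
def implied_base(final, cagr, years):
    """Invert compound growth: final = base * (1 + cagr) ** years."""
    return final / (1 + cagr) ** years

base_2016 = implied_base(33.6, 0.148, 5)
print(f"Implied 2016 SSD revenue: ${base_2016:.1f}B")  # roughly $16.9B
```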

There is a search for faster memory, because modern computers, especially data-center servers that skew heavily toward in-memory databases, data-intensive analytics, and increasingly machine-learning and deep-neural-network training, depend on large amounts of high-speed, high-capacity memory to keep the wheels turning. Memory speed has not increased as fast as capacity: the access bandwidth of DRAM-based memory has improved by a factor of 20x over the past two decades, while capacity increased 128x in the same period. For 2018, DRAM remains a near-universal choice when performance is the priority, but the search for a viable replacement goes on. Whether it’s STT-RAM, phase-change memory or resistive RAM, none of them can yet match the speed or endurance of DRAM.
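The 20x-versus-128x gap is easier to appreciate as annual growth rates, which a few lines of Python can derive:

```python
# Convert the two-decade DRAM growth factors (20x bandwidth, 128x
# capacity) into equivalent compound annual growth rates.
def annual_growth(factor, years):
    """Annual rate r such that (1 + r) ** years == factor."""
    return factor ** (1 / years) - 1

bw = annual_growth(20, 20)    # bandwidth: ~16% per year
cap = annual_growth(128, 20)  # capacity:  ~27% per year
print(f"bandwidth: {bw:.1%}/yr, capacity: {cap:.1%}/yr")
```

So capacity has compounded roughly 11 percentage points faster per year than bandwidth, which is why the bandwidth wall keeps getting taller.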



PCI Express 4.0 is ramping up. The PCI standards consortium PCI-SIG (Special Interest Group) has ratified and released the PCIe 4.0, Version 1 specification. Doubling PCIe 3.0’s 8 GT/s (~1 GB/s) of bandwidth per lane, PCIe 4.0 offers a transfer rate of 16 GT/s. The newest version of PCI Express will start appearing on motherboards soon. PCI-SIG has targeted Q2 2019 for releasing the finalized PCIe 5.0 specification, so PCIe 4.0 won’t be quite as long-lived as PCIe 3.0 has been. So we’ll see PCIe 4.0 in use this year and PCIe 5.0 in 2019.
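The per-lane figures quoted above follow from the line rate and the 128b/130b encoding that both PCIe 3.0 and 4.0 use; a short calculation reproduces the ~1 GB/s number and the doubling:

```python
# Per-lane PCIe throughput: line rate in GT/s, minus 128b/130b encoding
# overhead, divided by 8 bits per byte.
def lane_gbps(gt_per_s):
    """Usable GB/s per lane for PCIe 3.0/4.0 (128b/130b encoding)."""
    return gt_per_s * (128 / 130) / 8

pcie3 = lane_gbps(8)   # ~0.98 GB/s per lane
pcie4 = lane_gbps(16)  # ~1.97 GB/s per lane, exactly double
print(f"x16 link: PCIe 3.0 ~{16 * pcie3:.1f} GB/s, "
      f"PCIe 4.0 ~{16 * pcie4:.1f} GB/s")
```

That puts a full x16 PCIe 4.0 link at roughly 31.5 GB/s of raw per-direction bandwidth, before protocol overheads above the physical layer.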

USB Type-C is on the way to becoming the most common PC and peripheral interface. The USB-C connector has become commonplace faster than any earlier interface. USB-C is very common on smartphones, and the interface is also spreading on laptops. Sure, it will take some time before it is the most common. By 2021, almost five billion devices will use the C-type USB connector, IHS estimates.

It seems that the aftershocks of the Meltdown/Spectre processor vulnerabilities will be haunting us for quite a long time this year. It is now three weeks since The Register revealed the chip design flaws that Google later confirmed, and the world still awaits certainty about what it will take to get over the silicon slip-ups. The latest pieces of the farce have been Intel halting its Spectre/Meltdown CPU patches over unstable code, and Linux creator Linus Torvalds criticising Intel’s ‘garbage’ patches. Computer security will not be the same after all this has been sorted out.

What’s next with computing? IBM discusses AI, neural nets and quantum computing, and many can agree that those technologies will be important. Public cloud providers increasingly provide sophisticated flavours of data analysis, and increasingly machine learning (ML) and artificial intelligence (AI). Even central banks are using big data to help shape policy. Over the past few years, machine learning has evolved from an interesting new approach that allows computers to beat champions at chess and Go into one that is touted as a panacea for almost everything. 2018 will be the start of what could be a longstanding battle between chipmakers to determine who creates the hardware that artificial intelligence lives on.

ARM processor based PCs are coming. As Microsoft and Qualcomm jointly announced in early December that the first Windows 10 notebooks with ARM-based Snapdragon 835 processors will be officially launched in early 2018, there will be more and more PCs with the ARM processor architecture hitting the market. Digitimes Research expects that ARM-based models may dominate the lower-end PC market, but don’t hold your breath on this. It is rumoured that wireless LTE connectivity will be incorporated into all the entry-level Windows 10 notebooks with ARM processors, branded by Microsoft as “always-connected devices.” HP and Asustek have released some ARM-based notebooks with Windows 10 S.

The software industry’s talent shortage worsens – growth continues

PC market set to return to growth in 2018

PC market could return to growth in 2019

PC sales grow for the first time in five years

USB-C is spreading rapidly

PCI-SIG Finalizes and Releases PCIe 4.0, Version 1 Specification: 2x PCIe Bandwidth and More

Hot Chips 2017: We’ll See PCIe 4.0 This Year, PCIe 5.0 In 2019

Serverless Architectures

Outsourcing remains strategic in the digital era

8 hot IT hiring trends — and 8 going cold

EDA Challenges Machine Learning

The Battle of AI Processors Begins in 2018

How to create self-driving private clouds

ZeroStack Lays Out Vision for Five-Step Journey to Self-Driving Cloud

2017 – the year of containers! It wasn’t? Oops. Maybe next year

Hyperscaling The Data Center

Electronics trends for 2018

2018’s Software Engineering Talent Shortage — It’s quality, not just quantity

Microservices 101

How Central Banks Are Using Big Data to Help Shape Policy

Digitimes Research: ARM-based models may dominate lower-end PC market

Intel Halts Spectre, Meltdown CPU Patches Over Unstable Code

Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel’s ‘garbage’ patches

Meltdown/Spectre week three: World still knee-deep in something nasty

What’s Next With Computing? IBM discusses AI, neural nets and quantum computing.

The Week in Review: IoT

PCI Express 4.0 as Fast As Possible

Microsoft has developed its own Linux!

Microsoft Built Its Own Linux Because Everyone Else Did

Facebook has built its own switch. And it looks a lot like a server

Google has its own internal Linux

Is the writing on the wall for on-premises IT? This survey seems to say so

12 reasons why digital transformations fail

7 habits of highly effective digital transformations



  1. Tomi Engdahl says:

    Is implementing and managing Linux applications becoming a snap?

    Learn what “snaps” are and why they are making the development and installation of Linux applications so trouble-free.

  2. Tomi Engdahl says:

    Paul Alcorn / Tom’s Hardware:
    How AMD stays within its x86 IP cross-licensing agreements with Intel yet enables Chinese firm Hygon to design and sell AMD Zen-based x86 server chips in China — Chinese-designed “Dhyana” x86 processors based on AMD’s Zen microarchitecture are beginning to surface from Chinese chip producer Hygon.

    China Finds Zen: Begins Production Of x86 Processors Based On AMD’s IP

    Chinese-designed “Dhyana” x86 processors based on AMD’s Zen microarchitecture are beginning to surface from Chinese chip producer Hygon. The processors come as the fruit of AMD’s x86 IP licensing agreements with its China-based partners and break the decades-long stranglehold on x86 held by the triumvirate of Intel, AMD and VIA Technologies. Details are also emerging that outline how AMD has managed to stay within the boundaries of the x86 licensing agreements but still allow Chinese-controlled interests to design and sell processors based on the Zen design.

    AMD’s official statements indicate the company does not sell its final chip designs to its China-based partners. Instead, AMD allows them to design their own processors tailored for the Chinese server market. But the China-produced Hygon “Dhyana” processors are so similar to AMD’s EPYC processors that Linux kernel developers have listed vendor IDs and family series numbers as the only difference. In fact, Linux maintainers have simply ported over the EPYC support codes to the Dhyana processor and note that they have successfully run the same patches on AMD’s EPYC processors, implying there is little to no differentiation between the chips.

    The new chips are surfacing against the backdrop of the trade war between the US and China that could escalate quickly, likely reinforcing China’s long-held opinion that a lack of native processor production could be a strategic liability.

    That makes it even more surprising that AMD has managed to establish a franchise that allows Chinese processor vendors to develop and sell x86 processors in spite of US regulations and the licensing restrictions with Intel

    HMC owns the x86 IP and ends up producing the chips, which satisfies the AMD and Intel x86 cross-licensing agreements because the IP remains with a company owned primarily by AMD.

    HMC licenses the IP to Hygon, which designs the x86 chips and then sells the design back to HMC.

    HMC then employs a foundry to fab the end product (likely China Foundries or TSMC). Confusingly, HMC then transfers the chips back to Hygon (the same company that designed them), which then sells the Dhyana processors.

    And thus, AMD’s licensing of the x86 IP stays within the legal boundaries. According to the agreement, the final products can only be sold within China’s borders. That opens up a huge opportunity for AMD via royalties due to the exploding China data center market

  3. Tomi Engdahl says:

    Alibaba Group Holding Ltd. is in talks with BT Group Plc about a cloud services partnership as the Chinese internet giant challenges Amazon.com Inc.’s dominance in Europe.

    An agreement between Alibaba and the IT consulting unit of Britain’s former phone monopoly could be similar to Alibaba’s existing arrangement with Vodafone Group Plc in Germany, according to a person familiar with the matter, who asked not to be identified as the talks are private.

    Meet Surface Go: starting at $399 MSRP, it’s the smallest and most affordable Surface yet

  4. Tomi Engdahl says:

    Visualizing The World’s 20 Largest Tech Giants

    Large companies can be located all over the globe.

    For example, massive auto companies can be found practically anywhere on a map.

    While the banking, pharma, energy, and retail industries also have geographic spread as well, the same cannot be said for the rapidly-growing tech industry

    Of the 20 largest tech giants globally, a total of zero are located outside of the United States and China.

  5. Tomi Engdahl says:

    Patient Uninterrupted: 4 Things in Healthcare IT that Must be Monitored

    In recent years, medical healthcare infrastructure has gone through digitization, just as many other industries have. The IoT has brought an array of new connected medical devices that are revolutionizing the medical field. This digitization has led to medical devices and the traditional IT infrastructure becoming more and more intertwined. This subsequently means that medical IT has more points of failure than ever before.

  6. Tomi Engdahl says:

    Broadcom acquires CA Technologies for $18.9B in cash

    Broadcom, the massive semiconductor supplier you may remember from its failed attempt to acquire Qualcomm, today announced that it has reached a definitive agreement with CA Technologies, a major IT management software and solutions provider.

  7. Tomi Engdahl says:

    Getting Ready for Java 11

    With the latest version of your favorite programming language just around the corner, it’s time to get ahead of the curve and figure out if updating is worthwhile. In this article, we’ll take a look at what’s being added to Java that will have an immediate impact on how you write code and the overall performance of your applications.

  8. Tomi Engdahl says:

    Why self-regulation is better than legislative regulation

    We are moving toward a society controlled by algorithms, but very few of us actually understand how they work. This asymmetry of information is a recipe for disaster. Case in point: Recently in the U.K., an algorithmic failure put the lives of 450,000 woman at risk through a technical error that inhibited their ability to detect breast cancer.

    Unfortunately, this is not an anomaly, and if the tech industry doesn’t take the lead on imposing oversights to our algorithms, the government may create its own regulations — causing roadblocks to innovation.

  9. Tomi Engdahl says:

    Announcing updated Red Hat Developer Studio and Container Development Kit

    the release of Red Hat Container Development Kit (CDK) 3.5 and Red Hat Developer Studio 12. Whether you are developing traditional or cloud-based applications and microservices, you can run these tools on your Windows, macOS, or Red Hat Enterprise Linux laptop to streamline development

  10. Tomi Engdahl says:

    Facebook, Google and more unite to let you transfer data between apps

    The Data Transfer Project is a new team-up between tech giants to let you move your content, contacts, and more across apps. Founded by Facebook, Google, Twitter, and Microsoft, the DTP today revealed its plans for an open source data portability platform any online service can join. While many companies already let you download your information, that’s not very helpful if you can’t easily upload and use it elsewhere

  11. Tomi Engdahl says:

    Nintendo Sues Console ROM Sites For ‘Mass’ Copyright Infringement
    BY ERNESTO ON JULY 20, 2018

    Nintendo has filed a lawsuit against the alleged operator of two popular console ROM sites. The sites are among the most notorious online hubs for pirated games, according to Nintendo, and face millions of dollars in potential damages

  12. Tomi Engdahl says:

    Python creator Guido van Rossum sys.exit()s as language overlord

    ‘Benevolent dictator for life’ tired of the hate, leaves behind no successor or governance

    Guido van Rossum – who created the Python programming language in 1989, was jokingly styled as its “benevolent dictator for life”, and ushered it to global ubiquity – has stepped down, and won’t appoint a successor.

    “Now that PEP 572 is done, I don’t ever want to have to fight so hard for a PEP and find that so many people despise my decisions.”

    Van Rossum said: “I’m tired, and need a very long break.” Hence the decision “to remove myself entirely from the decision process.”

    He’s left behind no governing principles or a successor, but said a debate on those issues was coming

    Van Rossum’s achievements are hard to overstate: Python is among the most-used languages in the world. It’s advanced as an ideal beginners’ language, and has also been used in heavyweight enterprise apps.

    CodingDojo recently rated it the second-most-in-demand skill in job ads for developers. Stack Overflow’s 2018 developer survey ranked Python as the seventh-most popular “Programming, Scripting, and Markup Language”, ahead of C#, Ruby and PHP.

  13. Tomi Engdahl says:

    Linux containers, virtualization, and services: “Three’s a party” in modern IT

    Linux containers are often positioned as disruptive to traditional virtualization, frequently culminating in the question: Will containers kill virtualization? It’s a fair question, given the shared similarities in workload isolation, resource utilization, and so on, but the answer is a hard “no.” They’re complementary, each solving a unique challenge for the enterprise; that said, historically they don’t actually integrate or work well together. This means separate application stacks, separate developer workflows, and so on.

  14. Tomi Engdahl says:

    EU fines Asus, Denon & Marantz, Philips and Pioneer $130M for online price fixing

    The European Union’s antitrust authorities have issued a series of penalties, fining consumer electronics companies Asus, Denon & Marantz, Philips and Pioneer more than €110 million (~$130M) in four separate decisions for imposing fixed or minimum resale prices on their online retailers in breach of EU competition rules.

    It says the four companies engaged in so-called “fixed or minimum resale price maintenance (RPM)”

    Asus has been hit with the largest fine (€63.5M), followed by Philips (€29.8M). The other two fines were €10.1M for Pioneer, and €7.7M for Denon & Marantz.

    “The online commerce market is growing rapidly and is now worth over 500 billion euros in Europe every year. More than half of Europeans now shop online

  15. Tomi Engdahl says:

    Tencent is investing $150M a year in the $13B esports market and working with the NBA and Under Armour on revenue deals, as the gaming mania spreads in China

    Inside Tencent’s Gambit to Dominate a $13 Billion Esports Arena

  16. Tomi Engdahl says:

    Big tech warns of ‘Japan’s millennium bug’ ahead of Akihito’s abdication

    Emperor’s 2019 exit will be first era change of information age, and switchover could be as big as Y2K say industry figures

    On 30 April 2019, Emperor Akihito of Japan is due to abdicate. The decision was announced in December 2017 so as to ensure an orderly transition to Akihito’s son, Naruhito, but the coronation could cause concerns in an unlikely place: the technology sector.

    The Japanese calendar counts up from the coronation of a new emperor, using not the name of the emperor, but the name of the era they herald.

    For one thing, many systems have never had to deal with a switchover in era. For another, the official name of Naruhito’s era has yet to be announced, causing concern for calendar printers and international standards bodies.

    It’s why some are calling it “Japan’s millennium bug”.

    “The magnitude of this event on computing systems using the Japanese Calendar may be similar to the Y2K event with the Gregorian Calendar,”

  17. Tomi Engdahl says:

    Windows 10 will try not to reboot when you’re just grabbing a cup of coffee

    New system will try not to reboot when you’re expected to return to your computer soon.

    The next semi-annual update to Windows 10 will use machine learning models to make automatic rebooting for updates a bit less annoying. The models will attempt to predict when you’re likely to return to your PC and not update if you’re expected back soon.

  18. Tomi Engdahl says:

    Why moving all your workloads to the cloud is a bad idea

    In the third installment in this series on common pitfalls of moving to the cloud, learn how to figure out which applications you shouldn’t migrate.

  19. Tomi Engdahl says:

    Christopher Mims / Wall Street Journal:
    New study says that top companies like Amazon, Google, and Microsoft have higher productivity growth than others because of higher proprietary IT spending

    Why Do the Biggest Companies Keep Getting Bigger? It’s How They Spend on Tech

    The secret of success for Amazon, Google and Microsoft is how much they invest in their own technology

    Your suspicions are correct: The biggest companies in every field are pulling away from their peers faster than ever, sucking up the lion’s share of revenue, profits and productivity gains.

    Economists have proposed many possible explanations: top managers flocking to top firms, automation creating an imbalance in productivity, merger-and-acquisition mania, lack of antitrust regulation and more.

    But new data suggests that the secret of the success of the Amazons, Googles and Facebooks of the world—not to mention the Walmarts, CVSes and UPSes before them—is how much they invest in their own technology.

    There are different kinds of IT spending. For the first few decades of the PC revolution, most companies would buy off-the-shelf hardware and software. Then, with the advent of the cloud, they switched to services offered by the likes of Amazon, Google and Microsoft. Like the difference between a tailored suit and a bespoke one, these systems can be customized, but they aren’t custom.

    IT spending that goes into hiring developers and creating software owned and used exclusively by a firm is the key competitive advantage.

    Today’s big winners went all in

    Tech companies such as Google, Facebook, Amazon and Apple—as well as other giants including General Motors and Nissan in the automotive sector, and Pfizer and Roche in pharmaceuticals—built their own software and even their own hardware, inventing and perfecting their own processes instead of aligning their business model with some outside developer’s idea of it.

    The result is our modern economy, and the problem with such an economy is that income inequality between firms is similar to income inequality between individuals: A select few monopolize the gains, while many fall increasingly behind.

    This also has implications for wages—the rise in the wage gap since 1978 is almost entirely attributed to an increase at more-productive firms that occurred as pay at less-productive firms remained relatively static

    When new technologies were developed in the past, they would diffuse to other firms fast enough so that productivity rose across entire industries.

    Samuel Slater, the “father of America’s industrial revolution,” was able to more or less single-handedly bring England’s pioneering power-loom technology to the U.S. by apprenticing himself to an English weaver and memorizing the design of his looms and mills.

    20 years ago, firms could adopt Microsoft Office or Adobe’s desktop publishing software and instantly disrupt larger firms that were slower to adopt this new technology.

    But imagine instead of power looms, someone is trying to copy and reproduce Google’s cloud infrastructure itself.

    One explanation for how this came to be is that things have just gotten too complicated.

    While in the past it might have been possible to license, steal or copy someone else’s technology, these days that technology can’t be separated from the systems of which it’s a part.

    Just spending money on technology doesn’t cut it, however.

    This seemingly insurmountable competitive advantage that comes with big companies’ IT intensity may explain the present-day mania for mergers and acquisitions

    It may be difficult or impossible to obtain critical technologies any other way.

    biggest firms are becoming more productive across many countries—in both the U.S. and in Europe

    It’s not clear just how long this phenomenon will drive the biggest firms in each sector to grow faster than their competitors.

  20. Tomi Engdahl says:

    Australia’s Digital Transformation Stumbles Badly

    An Australian Senate committee published a 146-page report assessing the government’s progress toward its goal of becoming “one of the top three digital governments in the world…that other nations can look to for guidance and inspiration,” by 2025. Given what is in the report, other nations may want to look elsewhere for their inspiration.

    The report stated, for instance, that “it has become clear to the committee that digital transformation is a policy area beset by soaring rhetoric and vague aspirations by government, largely unconnected to the actual policy activities actually undertaken.”

    In addition, the Liberal-National coalition government’s execution of its digital services has been marked by a “litany of failures largely unprecedented in scale and degree,”

    Failures cited in the report include outages at the Australian Taxation Office, significant operational difficulties with the Department of Human Services’ Online Compliance Intervention program, and the cancellation of the Australian Apprenticeship Management System project

    several familiar causes of these digital woes, such as a lack of leadership vision, direction, and accountability; government personnel who lack the necessary expertise; and an overdependence on outsourcing and consultants; overly optimistic project objectives, costs, and schedules along with procurement strategies that do not reflect the myriad of risks involved

    does speak to many of the fundamental problems that Australia, as well as many other governments, are experiencing while trying to implement “digital transformation.” This is especially true when the goal is to attain the Estonian level of government digitalization, which is considered the gold standard.

    A fundamental premise of digitized government is that it will increase citizens’ trust in government by showing that it is fair and transparent in its decisions.

    Many Australians, especially the poor, now see their government using digital technology as an indiscriminate, uncaring, and illegal club to beat them with. The government’s planned use of facial recognition to determine if a welfare recipient should receive benefits will do nothing to change their minds.

  21. Tomi Engdahl says:

    Australians Say No Thanks to Electronic Health Records

    A political firestorm erupted last week over the Australian government’s move to create a shareable national electronic health record for all 24.7 million of its citizens by December of this year. Unless an individual opts out of having a My Health Record by 15 October 2018, the government will create one for them that will be kept for 30 years after the person dies, or for 130 years after a person’s birth if their death date is unknown.

    The government and many health care associations
    have been touting the benefits of the system.

    However, numerous privacy advocates, health practitioners, and even the former director of the Government’s Digital Transformation Agency contend that the medical benefits being claimed don’t stack up, and further assert there are significant security and privacy risks involved.

    Government ministers are struggling mightily to reassure Australians that their My Health Record will be secure and their privacy respected, but it is unclear how much credibility anyone puts into those promises.

    Even though most Australians will likely end up having a My Health Record by the end of the year, it doesn’t mean the government can declare victory by any means. Its e-health record system must quickly prove more beneficial and easier to use for health care practitioners and individuals than is currently the case, while avoiding any significant data breaches or privacy leaks.

  22. Tomi Engdahl says:

    Who’s Hiring? (Amazon, Walmart) Who’s Firing? (HP Inc., IBM)

    the world, with Amazon, Walmart, and PayPal all making big plans to expand their engineering workforces. In e-commerce, AI researchers are in huge demand, and e-commerce isn’t alone: there’s love for AI engineers in many fields. You’ll also notice that Canada seems to be getting more of these new engineering jobs than Silicon Valley, at least according to recent announcements.

    HP Inc., the personal computer and printer side of the former Hewlett-Packard Co., in June indicated that it would be cutting 5,000 jobs in 2018 and 2019; about 1,000 to 2,000 more than previously discussed.

    Tesla in June announced plans to cut 3,000

    Broadcom, based in San Jose, Calif., in June indicated that it has cut 1,100 employees, and more cuts may be coming

    IBM in May slashed workers across its Watson Health division; the layoffs mostly targeted those who came into IBM through acquisitions.

    Qualcomm in June gave notice that it would be laying off 241 people

    Western Digital in July announced plans to cut 79 workers from San Jose operations involved in hard disk drives.

    Intel in June filed plans to lay off 65 workers in Silicon Valley.

  23. Tomi Engdahl says:

    Lidl software disaster another example of Germany’s digital failure

    It was to be a great digital leap for Germany’s biggest discount grocer. Instead, after seven years and €500 million, Lidl’s new SAP-based inventory management system, eLWIS, is dead on arrival. Now everybody’s asking why.

    It was to be a grand, transformative project for the company, the biggest in grocery chain Lidl’s history. And success appeared certain. Lidl and German software giant SAP are leaders in their respective fields. About a thousand staff and hundreds of consultants implemented a new company-wide system for inventory control for the discount grocery chain, which has close to €100 billion in annual revenue.

    But by May 2017, Lidl’s head of IT, Alexander Sonnenmoser, had left, and in July this year, eLWIS was pronounced dead on arrival. Lidl would need to revert to its old inventory system. “We are practically starting from scratch,”

    All this after spending an estimated €500 million on eLWIS.

    Apparently one of the biggest problems was a “but this is how we always do it” mentality at Lidl. Changing the software necessitated reassessing almost every process at the company, insiders say. But Lidl’s management was not prepared to do that.

    Unlike many of their competitors, Lidl bases its inventory management system on purchase prices. The standard SAP for Retail software uses retail prices. Lidl didn’t want to change, so the software had to be adapted.

    Performance fell, costs rose. Altering existing software is like changing a prefab house, IT experts say — you can put the kitchen cupboards in a different place, but when you start moving the walls, there’s no stability.

    “The strategic goals as originally defined were not possible at an acceptable expense,”

    In-house software systems are often cobbled together over decades and outdated. Not using standard software already makes “the adaptation of new technologies and transformation difficult,”

    Communication may have also been a problem.

    Who’s actually to blame?

    The cancellation of the project at Lidl is doubly annoying to SAP: They’ve lost some big business, and all the rumors are casting doubt on their own work.

    One of Lidl’s biggest challenges now, apart from keeping its old inventory management system afloat, will be retaining all the IT experts it hired. Software engineers are highly in-demand all over Europe, and talented professionals may not want to keep fiddling with 30-year-old in-house code.

  24. Tomi Engdahl says:

    Do Businesses Really Need to Hire CS Majors?

    A new article in CIO magazine argues that when it comes to computer science, “few of us really need much of any of it.” Slashdot reader itwbennett offers this summary:
    At the heart of the matter is the fact that most businesses don’t really need programmers to be deep thinkers. For them, it’s “just as worthwhile to hire someone from a physics lab who just used Python to massage some data streams from an instrument. They can learn the shallow details just as readily as the CS genius,” according to the article.

    CIO’s anonymous author promises an incomplete list of “why we may be better off ignoring CS majors.” Some of the highlights:

    Theory distracts and confuses. “Many computer scientists are mathematicians at heart and the theorem-obsessed mindset permeates the discipline.”

    Academic languages are rarely used. “…the academy breeds snobbery and a love for arcane solutions.”

    Many CS professors are mathematicians, not programmers. “One of the dirty secrets about most computer science departments is that most of the professors can’t program computers. Their real job is giving lectures and wrangling grants….”

    Many required subjects are rarely used. “…it’s too bad few of us use many data structures any more.”

    Institutions breed arrogance. “…the very nature of academic degrees are designed to give graduates the ability to argue one’s superiority with authority.”

    Many modern skills are ignored. “If you want to understand Node.js, React, game design or cloud computation, you’ll find very little of it in the average curriculum… It’s very common for computer science departments to produce deep thinkers who understand some of the fundamental challenges without any shallow knowledge of the details that dominate the average employee’s day.”

    “It’s not that CS degrees are bad,” the article concludes. “It’s just that they’re not going to speak to the problems that most of us need to solve.”

  25. Tomi Engdahl says:

    ‘The Problem With Programming and How To Fix It’

    Jonathan Edwards has been programming since 1969 (starting on a PDP-11/20). “Programming today,” he writes, “is exactly what you’d expect to get by paying an isolated subculture of nerdy young men to entertain themselves for fifty years. You get a cross between Dungeons & Dragons and Rubik’s Cube, elaborated a thousand-fold.”

    theodp summarizes the rest:
    To be a ‘full stack’ developer, Edwards laments, one must master the content of something like a hundred thousand pages of documentation. “Isn’t the solution to design technology that doesn’t require a PhD…?” he asks. “What of the #CSForAll movement? I have mixed feelings. The name itself betrays confusion — what we really want is #ProgrammingForAll. Computer science is not a prerequisite for most programming, and may in fact be more of a barrier to many. The confusion of computer science with programming is actually part of the problem, which seems invisible to this movement.”

  26. Tomi Engdahl says:

    Windows 10 Leak Exposes Microsoft’s New Monthly Charge

    Ever since its creation, Microsoft has described Windows 10 “as a service”. The fear has always been that this meant Microsoft would start charging users a monthly fee to maintain the operating system, and now a new leak has confirmed this is exactly what will happen…

    In a new report, CNet’s well-connected Microsoft specialist Mary Jo Foley reports that the company will soon launch ‘Microsoft Managed Desktop’, which will charge a monthly fee to configure computers running Windows 10 and keep them running smoothly as new updates are released.

  27. Tomi Engdahl says:

    Thomas Claburn / The Register:
    Intel says it earned $1B in 2017 from sales of Xeon processors running AI workloads in data centers and provides an update for its data center chip roadmap — Optane DC persistent memory ships to Google, Xeon roadmap, and more revealed — At Intel’s Data-Centric Innovation Summit today in Santa Clara …

    Intel: Yeah, yeah, 10nm. It’s on the todo list. Now, let’s talk about AI…
    Optane DC persistent memory ships to Google, Xeon roadmap, and more revealed

  28. Tomi Engdahl says:

    The age of hard drives is over as Samsung cranks out consumer QLC SSDs
    Chaebol to tease desktop units with up to 4TB later this year

    Samsung has started mass production of the world’s first QLC (quad-level cell) consumer SSD.

    QLC is 4 bits/cell flash technology and the next step in cell bit capacity from the current TLC (3 bits/cell). Foundries belonging to Intel/Micron and WD/Toshiba have started producing QLC chips and, in some cases, SSDs already. For example, Micron has a 7.68TB QLC SSD called the 5210 ION. SK Hynix has not yet confirmed any QLC technology developments.

    Samsung announced a QLC SSD development earlier this month. It has followed that up by saying it is now mass-producing a 4TB QLC consumer SSD.

    Samsung Electronics’ Jaesoo Han, exec veep for memory sales and marketing, proclaimed: “Samsung’s new 4-bit SATA SSD will herald a massive move to terabyte-SSDs for consumers. As we expand our lineup across consumer segments and to the enterprise, 4-bit terabyte-SSD products will rapidly spread throughout the entire market.”

    That may well be the case, but this “product” has no name, no price, no availability, and no detailed IOPS and endurance information, making the announcement possibly a tad premature.

    With this QLC tech, Samsung said it will “introduce several 4-bit consumer SSDs later this year with 1TB, 2TB, and 4TB capacities” in the 2.5-inch form factor. We look forward to getting more details then.
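    The TLC-to-QLC step mentioned above is a straight bits-per-cell increase. A minimal sketch of the arithmetic (the cell-type table is standard flash terminology, not anything from Samsung’s announcement):

```python
# Relative density gain from adding one bit per flash cell.
# Each cell must distinguish 2**bits voltage levels: capacity per cell
# grows linearly with bits, while the number of levels (and hence the
# error-rate/endurance pressure) grows exponentially.
CELL_TYPES = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}

for name, bits in CELL_TYPES.items():
    print(f"{name}: {bits} bits/cell, {2**bits} voltage levels")

# QLC stores 4/3 of the data of TLC in the same silicon area, which is
# roughly why the same die budget that yields 3TB-class TLC drives can
# yield 4TB-class QLC drives.
tlc_to_qlc_gain = CELL_TYPES["QLC"] / CELL_TYPES["TLC"]
print(f"TLC -> QLC capacity gain: {tlc_to_qlc_gain:.2f}x")
```

    The exponential growth in voltage levels is the flip side of the density gain, and is why QLC endurance figures are the detail everyone is waiting for.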

  29. Tomi Engdahl says:

    From developer to IT leader: 3 keys to make the leap

    Transitioning from developer to IT leader? Lean on three skills that are already in your personal code

  30. Tomi Engdahl says:

    New interconnect technologies tackle higher data storage demands

    New connector and cable technologies deliver higher data rates, higher signal integrity, and higher density for next-generation data centers

    Our civilization depends on data. Almost all technology and devices connected to the internet of things depend on storing data in the cloud. The demand for digital storage keeps climbing: every year, data storage requirements increase by 40%. As the need for reliable and powerful data storage centers grows, so does the technology. Recently, new technologies have been developed to handle the expected 40 zettabytes of information that will need to be stored by 2020.

    Several factors are contributing to the growing need for data storage. Cloud storage has become widespread in recent years with more and more people storing information online rather than on their devices. Artificial intelligence (AI) is another contributor as it requires massive amounts of memory to operate. Deep-learning and machine-learning computing systems also require extensive storage to operate.
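    The 40% annual growth figure compounds quickly. A quick sketch (the starting-year baseline is an illustrative assumption, chosen so the curve lands near the article’s 40 ZB by 2020; only the 40% rate comes from the text):

```python
# Compound growth of storage demand at 40% per year.
def project(start_zb, rate, years):
    """Capacity after `years` of compound growth at `rate`."""
    return start_zb * (1 + rate) ** years

baseline_2015 = 7.5  # ZB in 2015, assumed for illustration
for year in range(2015, 2021):
    zb = project(baseline_2015, 0.40, year - 2015)
    print(f"{year}: {zb:.1f} ZB")
```

    At 40% per year, capacity roughly doubles every two years, which is why interconnect and media technologies are being redesigned rather than merely scaled.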

  31. Tomi Engdahl says:

    Nvidia unveils Turing architecture and GPUs with dedicated ray-tracing hardware

    Nvidia has unveiled its new Turing architecture along with details of the first GPUs to use it. Turing includes dedicated “RT Core” hardware designed to drive ray tracing, a complex technique that can deliver extremely realistic lighting effects but has been prohibitively resource-intensive to render in real time. Nvidia calls the new Turing-based Quadro RTX the “world’s first ray-tracing GPU” and claims it’s the biggest leap since the company introduced CUDA in 2006.

  32. Tomi Engdahl says:

    NVIDIA Reveals Next-Gen Turing GPU Architecture: NVIDIA Doubles-Down on Ray Tracing, GDDR6, & More

    At NVIDIA’s SIGGRAPH 2018 keynote presentation, company CEO Jensen Huang formally unveiled the company’s much-awaited (and much-rumored) Turing GPU architecture.

  33. Tomi Engdahl says:


    The traditional data center, a telecommunications facility that houses bare-metal IT equipment used to deliver various services, has been changing rapidly for the last 15 years. The change started with virtualization technologies, which made it possible for IT services – HTTP(S), FTP, email, anti-spam, anti-virus, application hosting and many other services – to be delivered from virtualized computing instances rather than from bare-metal servers. The adoption of virtualization techniques helped administrators create hardware-agnostic and more consistent technology frameworks, which can be scaled on demand without dependence on vendor-locked hardware.

  34. Tomi Engdahl says:

    The Most Epic Demo in Computer History Is Now an Opera

    If there’s such a thing as a Big Bang moment for modern computing, it happened on December 9, 1968. On that day, in an underground convention center in the heart of San Francisco, Doug Engelbart gave The Mother Of All Demos, introducing the world to an astonishing slew of technologies including word processing, video conferencing, windows, links, and the humble mouse. Over the course of the 90-minute demonstration, Engelbart laid the foundation for computing for decades to come.

  35. Tomi Engdahl says:

    Back To School Classroom Tech
    12 ways that technology is enhancing education and the classroom.

  36. Tomi Engdahl says:

    Nvidia’s new Turing architecture is all about real-time ray tracing and AI

    In recent days, word about Nvidia’s new Turing architecture started leaking out of the Santa Clara-based company’s headquarters. So it didn’t come as a major surprise that the company today announced during its Siggraph keynote the launch of this new architecture and three new pro-oriented workstation graphics cards in its Quadro family.

  37. Tomi Engdahl says:

    Nvidia’s Turing Makes Graphics Dreams Come True

    Nvidia’s new Turing architecture promises speeds that would make real time ray tracing possible.

    The history of real-time graphics used in video games and interactive media is a history of compromises.

    The goal of graphics vendors has been to create images as realistic as possible within a frame time (nominally 1/30th of a second). But when it comes to truly realistic images, the gold standard has been ray tracing — where computers model the flight of light rays within a scene, bouncing off surfaces and picking up surface color and texturing along the way.

    For a high-resolution, complex scene, with many rays per surface point, each frame can take hours to fully render. This is how movie studios render their computer generated images (CGI) for digital special effects.

    Most of that rendering takes place in server farms filled with Intel Xeon processors today. At this year’s Siggraph (the annual technical conference on graphics technology), Nvidia may have changed all that. The company introduced its new Turing architecture, which includes a ray-tracing engine that speeds up ray tracing to six times the performance of its existing Pascal GPUs. Nvidia rates Turing at 10 giga-light-rays/second. With that speed, real-time ray tracing is possible.
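    The core operation being accelerated — firing a ray and testing where it hits geometry — can be sketched at its smallest scale as a ray–sphere intersection test. This is the textbook quadratic-formula version, with made-up scene values for illustration; it says nothing about how Nvidia’s RT Cores implement it in hardware:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit,
    or None if the ray misses (standard quadratic-formula test)."""
    # Vector from sphere center to ray origin
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                     # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None         # nearest hit in front of origin

# A ray from the origin straight down +z toward a unit sphere at z=5
# strikes the near surface one radius early, at distance 4:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

    A full ray tracer runs this kind of test millions of times per frame against every object a ray might touch, which is why a 6x hardware speedup changes what is feasible in real time.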

  38. Tomi Engdahl says:

    Intel in the Cloud Post-Moore’s Law
    X86 giant expands its bag of tech tricks

    The decline of Moore’s Law affects the entire semiconductor industry, but perhaps no company more viscerally than the one that Gordon Moore co-founded. Here at its headquarters, Intel projected an upbeat image at a data center event, and it showed how profoundly the company is changing with the times.

    The x86 giant is increasingly relying on a basketful of technologies to deliver performance increases that it used to get with a turn of the crank at its fabs. And it is offering its customers a fat cookbook of systems and silicon recipes in place of its old formula of the next big CPU.

    These days, what’s most interesting at Intel is its work in memories, machine learning — and some of its rock star engineers like Jim Keller. All three were on vivid display at the event.

    Details of the next big processors — Cascade Lake, Cooper Lake, and Ice Lake — were part of the event. But part of that news was their added AI features, and they stood alongside Optane DIMMs now shipping to Google Cloud and plans for smart networking cards.

    Wall Street analysts were hungry for information on Intel’s much-delayed 10-nm process.

  39. Tomi Engdahl says:

    Disk will eat itself: Flash price crash just around the over-supplied block
    Cheaper SSDs could accelerate disk cannibalisation leading to Seagate downturn

    Wells Fargo senior analyst Aaron Rakers, using IDC and DRAMeXchange data, estimated total NAND flash pricing stands today at ~$0.30/GB. Rakers noted Objective Analysis expects a 45 per cent per annum growth in NAND flash capacity shipped.

    Some 70 per cent of total industry flash is 3D NAND, with the remainder the older 2D or planar NAND. Handy believed this manufacturing capacity could be migrated to making DRAM instead – and warned this could result in DRAM capacity over-supply in the future.

    Deep StorageNet’s chief scientist Howard Marks, also speaking at the summit, suggested that a 5x differentiation in $/GB between enterprise SSDs and HDDs should be considered the crossover point to move toward SSD cannibalization.

    Rakers noted enterprise SSDs are at a ~3-4x $/GB premium relative to mission-critical HDDs. Meanwhile, enterprise SSDs stand at ~15-17x $/GB premium relative to nearline/high-cap enterprise HDDs.

    Recently announced QLC (4bits/cell) SSDs from Intel and Micron are aimed at taking share from nearline disk drives for read-intensive applications.

    If Handy’s chip price correction prediction is accurate, SSD pricing will go down, too. He said he sees a trend for NAND prices to fall to roughly 25 per cent of their current prices. If SSD prices go down in lock-step, then we are looking at a 75 per cent cost reduction in their prices.
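    The premium ratios Rakers cites can be checked against Marks’ 5x crossover rule with simple arithmetic. The absolute $/GB figures below are illustrative assumptions; only the ~$0.30/GB NAND price and the ~15–17x nearline premium come from the article:

```python
# Does a 75% NAND price drop push enterprise SSDs under the 5x $/GB
# premium that Howard Marks calls the disk-cannibalisation crossover?
CROSSOVER = 5.0

ssd_per_gb   = 0.30               # ~$0.30/GB NAND, per DRAMeXchange
nearline_hdd = ssd_per_gb / 16    # implied by the ~15-17x premium

def premium(ssd, hdd):
    """$/GB premium of SSD over HDD."""
    return ssd / hdd

before = premium(ssd_per_gb, nearline_hdd)
after  = premium(ssd_per_gb * 0.25, nearline_hdd)  # 75% price cut

print(f"premium before: {before:.1f}x")  # 16.0x: well above crossover
print(f"premium after:  {after:.1f}x")   # 4.0x: under the 5x threshold
```

    In other words, if the predicted 75 per cent NAND price fall flows through to SSDs while nearline HDD prices stay put, the nearline premium drops from ~16x to ~4x, below Marks’ crossover, which is the mechanism behind the “disk will eat itself” headline.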

