Computer trends 2019

Here are some ICT trends for the year 2019, picked from various sources (with links to the sources) and edited by me:

General: From AI to Moore’s Law, the entire industry is deep in the throes of massive changes. The future will be characterized by smart devices delivering increasingly insightful digital services everywhere. While CPUs continue to evolve, performance is no longer limited to a single processor type or process geometry.

Business: There seems to be clear evidence from this research that businesses are adopting and looking to capitalise on the benefits of Big Data, the Internet of Things and sensor technology for their mobile workforces.

Open source: 2019 Will Be the Year of Open Source in software and even in hardware. We saw more activity in open source than ever before in 2018. And the momentum isn’t likely to slow down in 2019.

Web is mobile: According to statistics from FICORA and Ofcom, the PC has lost its place as the primary device and platform for web browsing. Almost half of users now browse the web with a smartphone, which places requirements on all online services, from shops to news sites.

Multiple devices: As the number of different IT devices continues to grow, there are more and more devices in use at the same time. Situations and tasks that use multiple devices together have become commonplace. We need to think about how user interfaces could better support multi-device sharing.

Artificial intelligence: It seems that the AI market is ramping up everywhere. The AI term creates hope for some, fear for others, and confusion for all. Artificial intelligence (AI) is what the Internet of Things was two years ago: overhyped and not very well understood. The obvious shift is the infusion of AI (and its subcategories, machine learning and deep learning) into different markets. It seems that you don’t need to be an artificial intelligence wizard anymore to use some AI; at its simplest, an implementation can be picked up from GitHub without really understanding how it works. AI still has trust issues for many. There are also views that today’s hot artificial intelligence is the same bubble that last burst in the 1990s, because at present artificial intelligence and humans form a poor cyborg. You need to separate AI hype from reality, because it seems to be a miraculous thing that almost nobody really understands.
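
To illustrate how low that barrier has become, here is a minimal sketch of using a ready-made, pre-trained image classifier instead of building anything yourself. It assumes Python with the torch, torchvision and Pillow packages installed and a local photo named cat.jpg; the model choice is just an example.

```python
# Minimal sketch: image classification with an off-the-shelf pre-trained model.
# Assumes torch, torchvision and Pillow are installed and cat.jpg exists locally.
import torch
from PIL import Image
from torchvision import models, transforms

# Download a ResNet-18 trained on ImageNet - no training or AI expertise needed.
model = models.resnet18(pretrained=True)
model.eval()

# Standard ImageNet preprocessing for the input photo.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("cat.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    scores = model(batch)

print("Predicted ImageNet class index:", int(scores.argmax()))
```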

AI chips: While GPUs are well-positioned in machine learning, data type flexibility and power efficiency are making FPGAs increasingly attractive. Today, selling custom chips for artificial intelligence is still a small business. Intel, the largest manufacturer of computer processors, has appraised the current market at $2.5 billion, one half of one percent of the estimated value of the 2018 global semiconductor market. At a press event at the 2019 Consumer Electronics Show, Intel announced the Nervana Neural Network Processor (NNP-I), an AI chip for inference-based workloads that fits into a GPU-like form factor. Google and NXP advance artificial intelligence with the Edge TPU.

AI-driven development: AI-driven development looks at tools, technologies and best practices for embedding AI into applications and using AI to create AI-powered tools for the development process.

Huge data: It seems that It’s All About The Data. Data creation, management and processing have always been a winning business formula. It takes lots of data to train AI systems, and IoT systems generate a lot of data. Data scientists now have increasing amounts of data to prepare, analyze and group, and from which to draw conclusions. The entire tech industry has changed in several fundamental ways over the past year due to the massive growth in data. Many data science tasks will be automated. Hardware and software are no longer the starting points for technology design; it is now about data processing, flow and throughput.
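
To make the “prepare, analyze and group” step concrete, here is a small sketch in Python using pandas; the device names and temperature values are invented purely for illustration.

```python
# Toy sketch of a data-preparation and grouping workflow with pandas.
# The data is invented; a real pipeline would read from files, databases or streams.
import pandas as pd

readings = pd.DataFrame({
    "device":      ["a", "a", "b", "b", "b", "c"],
    "temperature": [21.5, 22.1, 35.0, None, 34.2, 19.8],
})

# Prepare: drop incomplete rows.
clean = readings.dropna()

# Analyze and group: summarize per device and draw a simple conclusion.
summary = clean.groupby("device")["temperature"].agg(["mean", "max"])
print(summary)
print("Devices running hot:", list(summary[summary["mean"] > 30].index))
```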

Digital twins: A digital twin is a digital representation that mirrors a real-life object, process or system. Digital twins can also be linked to create twins of larger systems, such as a power plant or a city. The idea of a digital twin is not new, but it has become hot now that AI and IoT have been added to the mix.
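
A toy sketch of the idea in Python is shown below; the PumpTwin class, its sensor fields and the maintenance threshold are all invented for illustration. The point is that the twin mirrors incoming sensor readings and can be queried without touching the physical asset.

```python
# Toy digital-twin sketch: a software object mirroring a physical pump.
# The fields and threshold are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    pump_id: str
    rpm: float = 0.0
    bearing_temp_c: float = 20.0
    history: list = field(default_factory=list)

    def update(self, reading: dict) -> None:
        """Mirror the latest sensor reading from the real pump."""
        self.rpm = reading.get("rpm", self.rpm)
        self.bearing_temp_c = reading.get("bearing_temp_c", self.bearing_temp_c)
        self.history.append(reading)

    def needs_maintenance(self) -> bool:
        """Ask the twin a question instead of the physical asset."""
        return self.bearing_temp_c > 80.0

twin = PumpTwin("pump-17")
twin.update({"rpm": 1450, "bearing_temp_c": 85.2})
print(twin.pump_id, "maintenance needed:", twin.needs_maintenance())
```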

Edge computing: Edge computing is a topology where information processing and content collection and delivery are placed closer to the sources of the information, with the idea that keeping traffic local will reduce latency. Currently, much of the focus of this technology is a result of the need for IoT systems to deliver disconnected or distributed capabilities into the embedded IoT world.
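
A hypothetical Python sketch of the “keep traffic local” idea: the edge node reacts to raw sensor samples immediately and forwards only a compact summary upstream. The sensor, actuator and upstream functions are stand-ins for real integrations.

```python
# Hypothetical edge-node sketch: process locally, send only summaries upstream.
import random
import statistics

def read_sensor() -> float:
    # Stand-in for a real local sensor read.
    return 20.0 + random.random() * 5.0

def act_locally(value: float) -> None:
    # Latency-critical reaction happens at the edge, not in the cloud.
    if value > 24.0:
        print("edge: turning fan on")

def send_upstream(summary: dict) -> None:
    # Stand-in for an MQTT/HTTP call to a central service.
    print("upstream:", summary)

samples = []
for _ in range(100):
    v = read_sensor()
    act_locally(v)
    samples.append(v)

# Only one small message leaves the site instead of 100 raw samples.
send_upstream({"count": len(samples),
               "mean": round(statistics.mean(samples), 2),
               "max": round(max(samples), 2)})
```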

Power consumption: Globally, ICT today consumes 8% of all electricity, and that consumption doubles every year. I think we need new semiconductor technologies and perhaps also more optimized software that does more but consumes less power.

Memories: DRAM market growth stops in 2019. GDDR6 and HBM2 impact system design. There is disparity between the different types of DRAM, from GDDR to HBM.

Faster storage: Apacer has a CFexpress card that supports PCIe and the new NVMe 1.3 protocol and transfers data at a rate of two gigabytes per second.

Heterogeneous architectures: The need for increased computing power requires new multi-processor architectures (hybrid processors). Heterogeneous design is changing the starting point for chip design, so that integration, rather than the processor core, is now the real challenge. Many Arm processors already use a hybrid architecture. Intel has unveiled a new Foveros architecture to answer the challenge from Arm processors.

Immersive technologies: Users can interact with the world with immersive technologies such as augmented reality (AR), mixed reality (MR) and virtual reality (VR). AR brings new possibilities. A smart space is a physical or digital environment in which humans and technology-enabled systems interact in increasingly open, connected, coordinated and intelligent ecosystems.

Open hardware: Can RISC-V – the Linux of microprocessors – start an open hardware renaissance? RISC-V is an open-source processor instruction set that can be used under the same principles as Linux code. RISC-V is now being firmly linked to Linux, as the Linux Foundation and the RISC-V Foundation have agreed to work together to promote open-source development and RISC-V deployment. For the first time, the Arm architecture will have a serious challenger in millions, even billions of embedded devices. Companies like SiFive, NVIDIA and WD plan to release products with RISC-V in them. This year RISC-V does not yet compete with traditional CPUs on PCs. The MIPS hardware architecture is also opening up.
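
Because the instruction set is openly specified, anyone can build tooling for it. As a small illustration, here is a Python sketch that decodes the fields of a single 32-bit RISC-V R-type instruction, following the published RV32I base encoding.

```python
# Decode the fields of a 32-bit RISC-V R-type instruction (RV32I layout):
# funct7[31:25] rs2[24:20] rs1[19:15] funct3[14:12] rd[11:7] opcode[6:0]
def decode_rtype(insn: int) -> dict:
    return {
        "opcode": insn & 0x7F,
        "rd":     (insn >> 7)  & 0x1F,
        "funct3": (insn >> 12) & 0x07,
        "rs1":    (insn >> 15) & 0x1F,
        "rs2":    (insn >> 20) & 0x1F,
        "funct7": (insn >> 25) & 0x7F,
    }

# 0x002081B3 encodes "add x3, x1, x2" (opcode 0x33, funct3 0, funct7 0).
print(decode_rtype(0x002081B3))
```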

Containers: Is Kubernetes the new application server? If you thought there was a lot of chatter about Kubernetes in 2018, you ain’t seen nothing yet.
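
For readers who want to poke at a cluster programmatically, a minimal sketch with the official Kubernetes Python client might look roughly like this; it assumes the kubernetes package is installed and a local kubeconfig points at a reachable cluster.

```python
# Minimal sketch: list running pods with the official Kubernetes Python client.
# Assumes `pip install kubernetes` and a valid kubeconfig for an existing cluster.
from kubernetes import client, config

config.load_kube_config()          # use the local kubeconfig credentials
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```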

Software robotics: Software robotics (robotic process automation) is becoming widely available. Robot Framework will play an important role in this.

Intel processors: Intel announces faster processors patched for Meltdown and Spectre, and new Intel architectures and technologies that target expanded market opportunities. Intel demonstrated 10nm-based PCs, data center and networking systems, the next-generation ‘Sunny Cove’ architecture with AI and crypto acceleration, and 3D logic chip packaging technology. The article “5 Observations From Intel’s Event” says that mysterious locations, codenames and process delays are at the top of the list. Intel’s Foveros packaging technology, first used in ‘Lakefield’, is aimed at making smaller chips.

AMD processors: Ryzen mobile processors will begin showing up in ultrathin and gaming laptops by the end of the first quarter. AMD is starting to use 7nm technology: the Radeon VII GPU will be available and is promised to be 27% to 62% faster, and third-generation Ryzen desktop processors and second-generation EPYC server processors will become available later this year. AMD is challenging Intel in Chromebooks with A-Series CPUs and launching Ryzen Mobile 3000-series chips with second-generation Ryzen Mobile parts.

ARM processors: Taking aim at Intel, Qualcomm launches a chip for business PCs. The Snapdragon 8cx series is Qualcomm’s first chip specifically designed for computers. Qualcomm’s pitch is that laptops using its chips will go days without needing to be plugged in and will always be connected to the internet via cellular networks. The Snapdragon 8cx is also the world’s first 7-nanometer PC processor platform and promises superior performance for laptops. Intel’s position in laptops is very strong, and Qualcomm has a big hill to climb if it really wants to challenge Intel on the PC side. Huawei rolls out the 7nm Arm server CPU Kunpeng 920, which is said to outperform ThunderX2 and Ampere by 25%. Rumors are circulating that Apple will obsolete its x86-based computers in favor of successors powered by its own SoCs.

NVIDIA: The RTX 2060 GPU was introduced. GeForce RTX™ graphics cards are powered by the Turing GPU architecture and the all-new RTX platform. This promises up to 6X the performance of previous-generation graphics cards and brings the power of real-time ray tracing and AI to your favorite games. GeForce RTX 20 Series GPUs are also coming to gaming laptops.

Microsoft hardware: Microsoft reportedly working on Xbox and Windows webcams for 2019.

Windows security: Microsoft officially announces ‘Windows Sandbox’ for running applications in isolation. The coming ‘Windows Sandbox’ feature is a lightweight virtual machine that allows users to run potentially suspicious software in isolation. It could debut in Windows 10 19H1.

Storage: NVMe hits a tipping point. A show dedicated to NVM Express (NVMe) next month solidifies an industry-wide sentiment that the host controller interface and storage protocol hit a tipping point in the last year. The majority of new products are expected to come out with NVMe. There is already a relatively young NVM Express over Fabrics (NVMe-oF) specification, and even some hard disk enclosures use NVMe.

Fibre channel: Broadcom Nudges Fibre Channel to 64G using 64G optical modules (just starting to sample) and PCIe Gen 4 connections that are not yet generally available on x86 servers.

Faster PCIe: PCIe 4.0 is ready. The PCI-SIG organization has completed the new 4.0 version of the PCIe bus, and the technology is now expected to be deployed in devices. It is possible to get the full PCIe 4.0 speed with both copper and fiber. It seems that this year PCIe 4.0 comes into wider use in x86 servers.
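
The headline difference is easy to work out: PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with 128b/130b encoding, so usable bandwidth per lane roughly doubles. A quick back-of-the-envelope calculation in Python:

```python
# Back-of-the-envelope PCIe per-lane and x16 bandwidth (128b/130b encoding).
def lane_gbytes_per_s(gigatransfers_per_s: float) -> float:
    usable_bits = gigatransfers_per_s * 128 / 130   # encoding overhead
    return usable_bits / 8                           # bits -> bytes

for gen, gt in (("PCIe 3.0", 8.0), ("PCIe 4.0", 16.0)):
    per_lane = lane_gbytes_per_s(gt)
    print(f"{gen}: {per_lane:.2f} GB/s per lane, {per_lane * 16:.1f} GB/s for x16")
# PCIe 3.0: ~0.98 GB/s per lane (~15.8 GB/s x16); PCIe 4.0: ~1.97 GB/s (~31.5 GB/s x16)
```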

FPGA: FPGAs graduate to first-tier status because they are better for certain types of computation than CPUs or GPUs. Even in machine learning, where GPUs are well-positioned, data type flexibility and power efficiency are making FPGAs increasingly attractive.

Enterprise software: Legacy enterprise applications and software systems have a reputation for being clunky, expensive and almost impossible to keep up to date. Rethink your enterprise software systems and consider whether cloud-based options like SaaS may better serve your needs. Office 365 is massively successful, and AWS services now run the backend of thousands of major companies. As internet connections and speeds improve, the cloud becomes more and more viable: centralizing computer hardware is more cost effective and reduces hardware and staffing overhead for companies.

Windows 10: Microsoft is building a Chromium-powered web browser that will replace Edge on Windows 10, which means it could be preparing to ditch the EdgeHTML layout engine of its unloved Edge browser in favour of Chromium. Windows Subsystem for Linux (WSL) is being improved. Microsoft’s new Windows 10 reserves about 7 GB of disk space for updates, apps and more, to ensure that critical OS functions always have space.

Light Windows: Microsoft is working on Windows Lite, a super lightweight, instant on, always connected OS that runs only PWAs and UWP apps, to challenge Chrome OS. Microsoft’s ‘Centaurus’ device is yet another potential piece of its Chromebook-compete strategy.

Coding for Windows: Microsoft has released a public preview of Visual Studio 2019 for Windows and Mac. Microsoft is open sourcing its most popular Windows UX frameworks (WPF, Windows Forms and WinUI) via GitHub and says the first preview of .NET Core 3.0 is now available.

Quantum computing: Quantum computing is a type of nonclassical computing that is based on the quantum state of subatomic particles that represent information as elements denoted as quantum bits or “qubits.” Quantum computers are an exponentially scalable and highly parallel computing model. They can work well on some specific tasks suitable for them, but are not suitable for most generic computing tasks we are used to.
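
A tiny NumPy sketch (a classical toy simulation, not a real quantum computer) makes two points concrete: a Hadamard gate puts a single qubit into an equal superposition of 0 and 1, and simulating n qubits classically requires 2^n complex amplitudes.

```python
# Toy state-vector illustration of qubits with NumPy (not a real quantum computer).
import numpy as np

ket0 = np.array([1.0, 0.0])                      # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate

superposition = H @ ket0
print("measurement probabilities:", np.abs(superposition) ** 2)  # [0.5 0.5]

# The classical memory needed to simulate n qubits grows as 2**n amplitudes.
for n in (10, 20, 30):
    print(f"{n} qubits -> {2**n:,} complex amplitudes")
```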

Blockchain: Blockchain is a type of distributed ledger, an expanding chronologically ordered list of cryptographically signed, irrevocable transactional records shared by all participants in a network. It can work with untrusted parties without the need for a centralized party (i.e., a bank). Businesses should begin evaluating the technology to see whether it fits their business or not. You need to separate blockchain hype from reality, because it seems to be a potentially miraculous thing where almost nobody knows exactly what it is or what it is good for. Check this related Dilbert comic.
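
The core data structure is simple enough to sketch in a few lines of Python: each block stores the hash of the previous block, so tampering with any earlier record breaks every hash after it. Real blockchains add digital signatures, consensus and peer-to-peer networking on top of this idea.

```python
# Minimal hash-chained ledger sketch; real blockchains add signatures,
# consensus and peer-to-peer networking on top of this idea.
import hashlib
import json

def make_block(data: dict, prev_hash: str) -> dict:
    block = {"data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

chain = [make_block({"tx": "genesis"}, prev_hash="0" * 64)]
chain.append(make_block({"tx": "Alice pays Bob 5"}, chain[-1]["hash"]))
chain.append(make_block({"tx": "Bob pays Carol 2"}, chain[-1]["hash"]))

# Verification: every block must reference the hash of the block before it.
ok = all(chain[i]["prev_hash"] == chain[i - 1]["hash"] for i in range(1, len(chain)))
print("chain valid:", ok)
```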

Related predictions and trends articles:

Gartner Top 10 Strategic Technology Trends for 2019

Virtual reality implementation: observations and predictions

5 IT job trends to watch in 2019 – because success starts with talent

Digital transformation reality check: 10 trends

These are the 15 best US tech companies to work for in 2019, according to Glassdoor

Kubernetes in 2019: 6 developments to expect

What to expect from CES 2019

613 Comments

  1. Tomi Engdahl says:

    Antitrust, billion dollar fines, privacy violations and numerous federal investigations. Facebook, Google, Apple and Amazon are feeling the pressure from a wave of government inquiries.

    https://m.youtube.com/watch?v=jyxS2bvQGxc

  2. Tomi Engdahl says:

    AMD smashes Cinebench world record with dual EPYC 7742 processors – AMD kicks some serious ass with EPYC 7742: 128C/256T destroys the Cinebench world record

    Read more: https://www.tweaktown.com/news/67586/amd-smashes-cinebench-world-record-dual-epyc-7742-processors/index.html

  3. Tomi Engdahl says:

    ORACLE AUTONOMOUS LINUX: WORLD’S FIRST AUTONOMOUS OPERATING SYSTEM ANNOUNCED
    https://www.firstpost.com/tech/news-analysis/oracle-autonomous-linux-worlds-first-autonomous-operating-system-announced-7355191.html

    Oracle Autonomous Linux and Oracle OS Management Services are included with Oracle Premier Support at no extra charge.

  4. Tomi Engdahl says:

    Richard Stallman and the Fall of the Clueless Nerd
    https://www.wired.com/story/richard-stallman-and-the-fall-of-the-clueless-nerd/

    The controversial pioneer of free software resigned from MIT over his remarks on Jeffrey Epstein and Marvin Minsky. Stallman won’t be the last.

  5. Tomi Engdahl says:

    Computer scientist Richard Stallman, who defended Jeffrey Epstein, resigns from MIT CSAIL and the Free Software Foundation
    https://tcrn.ch/30jWPGx

  6. Tomi Engdahl says:

    Oracle Autonomous Linux: World’s first autonomous operating system announced
    https://www.firstpost.com/tech/news-analysis/oracle-autonomous-linux-worlds-first-autonomous-operating-system-announced-7355191.html

    Oracle has announced its free autonomous operating system — Oracle Autonomous Linux — which provisions itself, scales itself, tunes itself and patches itself while running.

    Additionally, Oracle promises “literally instantaneous migration” to the new operating system. The OS is free for Oracle Cloud Infrastructure customers.

    Besides the Autonomous Linux OS, Oracle also announced Oracle OS Management Service, which will basically help control systems whether they run Autonomous Linux, Linux, or Windows.

  7. Tomi Engdahl says:

    The Chef situation also highlights changes in how software is developed and the challenges those changes present. Much modern software includes multiple open source components. Even commercial software often relies on “libraries” of code created or maintained by outsiders. This helps developers work faster by not having to recreate common features and components. But if the maintainers of those components delete or break them—or stop maintaining them—everyone who relies on that software is affected.

    https://www.wired.com/story/developer-deletes-code-protest-ice/?fbclid=IwAR251NsoAOJCyjQ_GrYtbpSdiV7BNjYDapGX4aht6YzvtSsQPQ4_g5y_wpg&fbclid=IwAR3sctuuSiUmbTH6pvsWsztjk1COfpWq_zx1PD-NnAL7zXh25iIa0z3LQfY&mbid=social_fb&utm_brand=wired&utm_campaign=wired&utm_medium=social&utm_social-type=owned&utm_source=facebook

  8. Tomi Engdahl says:

    Async.h – asynchronous, stackless subroutines in C
    https://higherlogics.blogspot.com/2019/09/asynch-asynchronous-stackless.html?m=1

    The async/await idiom is becoming increasingly popular. The first widely used language to include it was C#, and it has now spread into JavaScript and Rust. Now C/C++ programmers don’t have to feel left out, because async.h is a header-only library that brings async/await to C!

    (Discussion on HN – http://bit.ly/30jr4S1)

  9. Tomi Engdahl says:

    Dr. Ian Cutress / AnandTech:
    AMD says it is delaying release of its Ryzen 9 3950X until November due to high demand and the time needed to ensure sufficient stock availability — In a shock email late on Friday, AMD has released a statement to clarify the situation it is in with the manufacturing of its latest Ryzen processors.

    https://www.anandtech.com/show/14895/amd-next-gen-threadripper-and-ryzen-9-3950x-coming-november
