IoT trends for 2017

According to Intel, IoT is expected to be a multi-trillion-dollar market, with 50 billion devices creating 44 zettabytes (or 44 trillion gigabytes) of data annually by 2020. But that widely cited figure of 50 billion IoT devices in 2020 is clearly not correct: the forecast of 50 billion devices by 2020 is outdated. In 2017 we should be talking about more sensible numbers. The current count is somewhere between Gartner’s estimate of 6.4 billion (which doesn’t include smartphones, tablets, and computers), International Data Corporation’s estimate of 9 billion (which also excludes those devices), and IHS’s estimate of 17.6 billion (with all such devices included). Both Ericsson and Evans have lowered their expectations from 50 billion for 2020: Evans, who is now CTO of Stringify, says he expects to see 30 billion connected devices by then, while Ericsson figures on 28 billion by 2021.

Connectivity and security will be key features for Internet of Things processors in 2017. Microcontroller (MCU) makers will continue to target their products at the Internet of Things (IoT) in 2017, putting more focus on battery life, connectivity of various types, and greater security. The new architectures are almost sure to spawn a multitude of IoT MCUs in 2017 from manufacturers who adopt ARM’s core designs.

ARM will be big. Last year, ARM’s partners shipped 15 billion chips based on its architectures. The trend toward IoT processors will go well beyond ARM licensees. Intel rolled out the Intel Atom E3900 Series for IoT applications. And do not forget MIPS and RISC-V.

FPGA manufacturers are pushing their products to the IoT market. They promise that FPGAs solve challenges at the core of IoT implementation: making IoT devices power efficient, handling incompatible interfaces, and providing a processing growth path to handle the inevitable increase in device performance requirements.

The energy harvesting field will become interesting in 2017 as the technology is more broadly adopted. Energy harvesting is becoming the way forward to supplement battery power or remove the need for batteries altogether. Researchers are generally eyeing energy harvesting to power ultra-low-power devices, wearable technology, and other things that don’t need a lot of power or don’t come in a battery-friendly form factor.
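Whether a harvester can replace a battery comes down to a simple power budget: the node’s average draw over its duty cycle versus what the harvester supplies. A back-of-the-envelope sketch, with all figures being illustrative assumptions rather than measured values:

```python
# Power budget for a duty-cycled sensor node.
# All numbers below are illustrative assumptions, not measured values.

sleep_current_a = 2e-6      # 2 uA in deep sleep
active_current_a = 10e-3    # 10 mA while sampling and transmitting
active_time_s = 0.1         # 100 ms awake per cycle
period_s = 60.0             # one measurement per minute

duty = active_time_s / period_s
avg_current_a = duty * active_current_a + (1 - duty) * sleep_current_a

# Assume a small indoor photovoltaic cell supplying ~50 uW at 3 V.
harvested_current_a = 50e-6 / 3.0

print(f"average draw:       {avg_current_a * 1e6:.1f} uA")
print(f"harvester supplies: {harvested_current_a * 1e6:.1f} uA")
print("self-powered" if harvested_current_a > avg_current_a
      else "needs battery top-up")
```

With these particular numbers the harvester falls just short, which illustrates why the duty cycle (not the peak transmit current) usually decides whether a design can go battery-free.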

 

Low power wide area (LPWA) networks (also known as NarrowBand IoT) will be hot in 2017. There is hope that LPWA nets will act as a catalyst, changing the nature of the embedded and machine-to-machine markets, as NB-IoT focuses specifically on indoor coverage, low cost, long battery life, and enabling a large number of connected devices. The markets will become a kind of do-it-yourselfer’s paradise of modules and services, blurring the lines between vendors, users and partners. At the same time, for years to come the LPWA market will remain fragmented, and it is already in a race to the bottom (Sigfox is said to be promising costs approaching $1 per node per year). Competing technologies include Sigfox, LoRa Alliance, LTE Cat 1, LTE Cat M1 (eMTC), LTE Cat NB1 (NB-IoT) and other sub-gigahertz options almost too numerous to enumerate.

We are starting to see a battle between different IoT technologies, and in the next few years we will see which are winners and which technologies will be lost in the fight. Sigfox and LoRa are currently off to a good start, but telecom operators with mobile-network NB-IoT will try to enter the race heavily in 2017. Vendors are prepping Cat M1 and NB1 for 2017: the Cat M1 standard delivers up to 380 kbits/second over a 1.4 MHz channel, while Cat NB1 handles up to 40 kbits/s over 200 kHz channels. Vendors hope the 7-billion-unit installed base of cellular M2M modules expands. It’s too early to tell which technologies will be mainstream and which niche. It could be that cellular NB-IoT is too late: it may fail in the short term yet win in the long term, while the industry struggles to make any money from it. At $2 a year, 20 billion devices would contribute around 4% of current global mobile subscription revenues.
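The 4% revenue claim above is easy to sanity-check. The ~$1 trillion figure for global mobile subscription revenues is my own rough assumption, broadly in line with industry estimates for the mid-2010s:

```python
# Sanity check of the NB-IoT revenue claim: $2/year per device,
# 20 billion devices, against assumed ~$1 trillion global mobile
# subscription revenues (an assumption, not a sourced figure).

devices = 20e9
revenue_per_device_per_year = 2.0       # dollars
global_mobile_revenue = 1.0e12          # dollars per year (assumed)

iot_revenue = devices * revenue_per_device_per_year   # $40 billion
share = iot_revenue / global_mobile_revenue

print(f"IoT connectivity revenue: ${iot_revenue / 1e9:.0f}B "
      f"= {share:.0%} of mobile subscription revenues")
```

The result lands on the article’s ~4%, which shows why operators worry that NB-IoT volumes won’t translate into meaningful revenue.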

New versions of communication standards will come into use in 2017, for example Bluetooth 5, which adds more speed and IoT functionality. In 2017 we will see an increase in the number of devices with the new Bluetooth 5 standard.

Industrial IoT will gain traction in 2017. Industrial applications ultimately have greater transformative potential than consumer products, offering users real return on investment (ROI) rather than just enhanced convenience or “cool factor”. The industrial sector is conservative and has been slow to embrace the industrial IoT (IIoT), but it seems that companies are getting interested now. During the past year there has been considerable progress in removing many of the barriers to IIoT adoption. A worldwide implementation of the IIoT is many years away, of course. The issues of standards and interoperability will most likely remain unresolved for several years to come, but progress is being made. The Industrial Internet Consortium released a framework to support development of standards and best practices for IIoT security.

The IIoT market is certainly poised to grow. A Genpact research study, for instance, indicates that more than 80% of large companies believe the IIoT will be essential to their future success. A recent market analysis by Industry ARC projects that the value of the IIoT market will reach more than $120 billion by 2021. Research firm Markets and Markets is even more optimistic, pegging IIoT growth at a CAGR of 8% to more than $150 billion by 2020. And the benefits will follow: by GE’s estimate, the IIoT will stimulate an increase in global GDP of $10 to $15 trillion over the next 20 years.

Systems integrators are seeking a quick way to enter the industrial Internet of Things (IIoT) market, so expect to see many plug-and-play IoT sensor systems unveiled. There were many releases in 2016, and expect to see more in 2017. Expect device, connectivity and cloud service to be marketed as one package.

IoT analytics will be talked about a lot in 2017. Many companies will promise to turn Big Data insights into bigger solutions. For industrial customers, Big Data analytics is promised to drive operational efficiencies, cut costs, boost production, and improve worker productivity. There are many IIoT analytics solution and platform suppliers already on the market, and a growing number of companies are now addressing industrial analytics use.

In 2016 it was all about getting IoT devices connected to the cloud. In 2017 we will see increased talk about fog computing, a new IoT trend pushed by Cisco and many other companies. As the Internet of Things (IoT) evolves, decentralized, distributed-intelligence concepts such as “fog computing” are taking hold to address the need for lower latencies, improved security, lower power consumption, and higher reliability. The basic premise of fog computing is classic decentralization: some processing and storage functions are better performed locally instead of sending data all the way from the sensor to the cloud and back again to an actuator. This demands smarter sensors and new wireless sensor network architectures. Groups such as the OpenFog Consortium have formed to define how it should best be done. You might start to want to be able to run the same code in the cloud and on your IoT device.
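The fog idea of "process locally, send only what matters" can be sketched in a few lines. This is a minimal illustration, not any particular vendor's API; the function names and the alarm threshold are hypothetical:

```python
# Minimal fog-computing sketch: the same aggregation function can run on
# an edge gateway or in the cloud. The edge node reduces a window of raw
# readings to a compact summary and only uploads that.
# Names and the alarm threshold are hypothetical, for illustration only.

def summarize(samples):
    """Reduce a window of raw sensor readings to a compact summary."""
    return {
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

def edge_node(samples, alarm_threshold=80.0):
    """Fog approach: process locally, upload only the summary and alarms."""
    summary = summarize(samples)
    summary["alarm"] = summary["max"] > alarm_threshold
    return summary   # a few bytes upstream instead of the full window

raw_window = [21.5, 22.0, 85.3, 21.8]     # made-up temperature readings
print(edge_node(raw_window))              # only this summary is sent
```

Because `summarize` has no edge-specific dependencies, the identical code can run in the cloud on raw uploads, which is exactly the "same code in cloud and device" property mentioned above.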

 

The situation in IoT security in 2016 was already bad (see “Hacking the IoT: As Bad As I Feared It’d Be”), and there is little to indicate that it will get any better in 2017. A veritable army of Internet-connected equipment has been compromised of late, due to vulnerabilities in its hardware, software or both: “smart” TVs, set-top boxes and PVRs, along with IP cameras, routers, DSL, fiber and cable modems, printers and standalone print servers, NASs, cellular hot spots, and probably plenty of other gear. The IoT world at the moment is full of vulnerable devices, and it will take years to get them replaced with more secure ones. Those vulnerable devices can be used to mount huge DDoS attacks against Internet services. The October 21, 2016 cyberattacks on Dyn brought to light how easily many IoT devices can be compromised. I expect that kind of incident to happen more often in 2017, as DDoS botnets are pretty easy to build with tools available online. There’s no question that everyone in the chain, from manufacturers to retailers to consumers, has to do a better job securing connected devices. When it comes to IoT, more security is needed.

 

1,759 Comments

  1. Tomi Engdahl says:

    Five reasons why system integrators are critical assets for the IIoT
    http://www.controleng.com/single-article/five-reasons-why-system-integrators-are-critical-assets-for-the-iiot/3c4ebb561a2f5f7730cd74be638c0caf.html

    System integrators can help expand the Industrial Internet of Things (IIoT) for companies because they have a deep understanding of many systems and can work as a go-between for many departments as an independent and powerful voice.

    1. The IoT is an ecosystem project. IoT is a complex organism with many parts—sensors, gateways, the platform, analytics engine and enterprise/business apps. No single vendor can offer a complete end-to-end solution that can cover all of this.

    2. Connectivity across departments. Currently, the information technology (IT) and operations technology (OT) departments work in silos with hardly any information being shared, much less in real time.

    3. Start small and scale. The complexity of the IoT project often means that the end user prefers to start small and then scale it up. They usually start with a proof-of-concept or a small goal oriented problem and then expand across departments.

    4. Domain knowledge. Typically, the IoT solution providers have in-depth knowledge about the technology and its facets, but not the implementation. They may have some understanding of the problems and issues, but they have little deep knowledge of the applications and the business.

    5. Trust factor. IoT is uncharted territory for most enterprises, and they are hesitant to entrust a new set of vendors with an untried technology for achieving their critical goals. This is why integrators are useful: they already have an established and trusted relationship with enterprises.

    As the scope and demand for the IIoT increases, the role of integrators in bridging the gap between the solution providers and their targeted market segments will be increasingly important. In the age of collaborative manufacturing, the hardware and software vendors with the most robust integrator network will be the ones to lead the IIoT game.

    Reply
  2. Tomi Engdahl says:

    An integrated network for Industrie 4.0
    System integration via the cloud makes networking at the production level easy and secure by vertically integrating management and systems as well as providing a security function for Industrie 4.0
    http://www.controleng.com/single-article/an-integrated-network-for-industrie-40/9c30f6893fdf7090d973a80aefd33958.html

    The implementation of smart factory technologies and platforms, such as Industrie 4.0, is attracting a great deal of attention and requires integration and optimization of the information technology (IT) systems used at the management, production, and field levels. However, integrating field networks with higher levels using IT can pose problems for the network’s speed and capacity.

    Integration

    Seamless message protocol (SLMP) is defined as a mechanism for integrating and seamlessly connecting different types of field networks. This protocol enables connection from a higher-level system to field devices without considering the differences.

    Ethernet is adopted as the lower-level communication layer and a token passing method is its higher-level communication control method. In the token passing method, data transmission rights (tokens) are relayed around the network between stations following a designated route. Only those stations holding the data transmission right can transmit data.

    Currently, tokens are circulated around a statically determined route, but technically, it also is possible to change this route dynamically at random intervals. In the future, this will enable route switching depending on the product to be manufactured (Figure 2).
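    The token-passing scheme described above can be sketched as a toy simulation. The station names and payloads here are made up for illustration; this is not SLMP itself, just the core rule that only the token holder transmits:

```python
# Toy simulation of token passing on a statically determined route:
# the token is relayed station to station, and only the station that
# currently holds it may transmit. Station names are hypothetical.

route = ["PLC", "Drive", "HMI", "Sensor"]   # fixed token route
pending = {"Drive": "speed setpoint", "Sensor": "temperature reading"}

def run_cycle(route, pending):
    """Pass the token once around the route; holders with data transmit."""
    log = []
    for station in route:                   # token handed to next station
        if station in pending:
            log.append(f"{station} transmits: {pending.pop(station)}")
        else:
            log.append(f"{station} passes token")
    return log

for line in run_cycle(route, pending):
    print(line)
```

    Changing the `route` list between cycles corresponds to the dynamic route switching the article anticipates for future product-dependent manufacturing.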

    Simultaneous troubleshooting also is important in network configuration. It must be possible to easily find the location of trouble on the network. Management tools for network event history and a network diagnostic tool can help users find the cause quickly.

    Reply
  3. Tomi Engdahl says:

    Time-sensitive networking’s benefits for the IIoT and manufacturing
    http://www.controleng.com/single-article/time-sensitive-networkings-benefits-for-the-iiot-and-manufacturing/4ac1787f8e6de89ad0df89ba3934721e.html

    Technology Update: Time-sensitive networking (TSN) is moving from the idea stage to deterministic networking and the result of widespread adoption will lead to the Industrial Internet of Things (IIoT).

    Time-sensitive networking (TSN) is finally moving from the idea stage to the main stage of deterministic networking. The IEEE TSN working group has completed the core set of standards required to implement TSN, the industry has developed the first products to support the technology, and simulations and demos are taking place. Widespread adoption of these technologies will enable the full-blown Industrial Internet of Things (IIoT) revolution that has been talked about.

    Full TSN implementation will take place over several phases. Because the switch to TSN requires a phased approach, companies won’t be able to just retrofit the technology into legacy systems. While companies won’t be able to immediately replace the existing machines, they must change infrastructures in a way that allows machines to communicate with each other more effectively. Many manufacturers have seen the benefits of standardized Ethernet within operations, and with TSN disparate networks aren’t needed to support time-critical and best-effort Ethernet traffic.

    Standard Ethernet

    The promise of TSN is twofold. First, it’s based on standard Ethernet. The traffic found on standard Ethernet, such as video and HTML, can share the physical network with high-priority deterministic Ethernet, such as motion control. This is important because those industrial products that need deterministic services are now part of the network, requiring attention to latency and jitter. With TSN, all devices that are connected to the network can be part of a validated architecture, rather than being siloed.

    Because it is built on standard Ethernet, TSN isn’t bogged down by having to run at set speeds at all times. Instead, precise scheduling is used to speed up or slow down and to prioritize the delivery of whatever packet of information needs to be delivered. It also offers minimal jitter, even as the network accommodates more devices. Instead of treating every packet the same, it can receive and interpret all data at once, calculate the maximum amount of time that can be expended before transmission, and disseminate all the information where it needs to go, seamlessly.
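    The prioritization idea in the paragraph above can be illustrated with strict priority queuing: time-critical frames are always dequeued before best-effort ones sharing the same wire. To be clear, this sketch is not the actual IEEE 802.1Qbv time-aware shaper, which uses time-gated transmission windows; it only shows the traffic-class ordering, and the class names are taken from the article's examples:

```python
# Strict-priority queuing sketch: motion control beats video beats HTML
# on a shared link. Illustration only, not the IEEE 802.1Qbv time-aware
# shaper (which uses scheduled gate windows rather than a simple heap).
import heapq

PRIORITY = {"motion-control": 0, "video": 1, "html": 2}  # lower = first

queue = []
arrivals = [
    ("html", "status page"),
    ("motion-control", "axis command"),
    ("video", "camera frame"),
]
for seq, (traffic_class, payload) in enumerate(arrivals):
    # seq breaks ties so equal-priority frames keep arrival order (FIFO)
    heapq.heappush(queue, (PRIORITY[traffic_class], seq, traffic_class, payload))

while queue:
    _, _, traffic_class, payload = heapq.heappop(queue)
    print(f"send {traffic_class}: {payload}")
```

    The point of the sketch is the one made in the text: deterministic and best-effort traffic can share one physical network as long as something enforces which packet goes first.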

    This technology is essential because as more devices come onto a network, the need for that central “hub” to direct all the trains—and ensure they come in on time—becomes more important.

    One of the most important concepts of Industrie 4.0 is the need for standard technologies that all vendors can operate.

    The first step in this line of disruption will be the continued adoption of OPC Unified Architecture (OPC-UA). Once OPC-UA integrates functionality into one framework, it should carry TSN with it.

    Reply
  4. Tomi Engdahl says:

    IoT Security Costs are Manageable
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1332412&

    Internet of Things device security has become more critical than ever, as the risks now outweigh the opportunities when it comes to potential threats to an individual or even an entire government.

    Reply
  5. Tomi Engdahl says:

    LED lighting design considerations for smart cities
    https://www.edn.com/electronics-blogs/led-diva/4458839/LED-lighting-design-considerations-for-smart-cities

    “Smart City” refers to the integration of communications and physical assets into a cohesive network to facilitate a safer, more livable, and more energy-efficient environment for the people who live there.

    In a Smart City, real-time information is provided to administrators from a city-wide deployment of sensors and monitors. Depending on the design, the collected data can enable any number of capabilities, such as monitoring weather and air quality, adjusting traffic signals to relieve congestion, adjusting mass-transit schedules to meet changing demand, and more rapid deployment of emergency responders.

    Kansas City has realized one of the more well-known examples of the Smart City concept in a two-mile corridor. The implementation uses sensors integrated into outdoor street lighting to generate a real-time picture of traffic patterns, street car transit, and even open parking spaces.

    Reply
  6. Tomi Engdahl says:

    First Widely Available SoC Bundle with a Dev Kit for AVS Has Launched
    https://developer.amazon.com/blogs/alexa/post/d1789e3e-6441-45d8-8281-cbccc774d092/first-avs-dev-kit-with-production-ready-silicon-and-client-application

    New Production-Ready Offering with NXP ARM-based SoC is Primed for Commercial Developers

    Development kits for AVS are designed for developers to evaluate the full potential of integrating Alexa into a variety of connected devices, and to give them a jumpstart in prototyping. The ultimate goal is to get product ideas into production faster and more cost-effectively. This is why we’re excited to highlight the first broadly available dev kit for AVS with a production-ready SoC, voice input processor, and device application built with the AVS Device SDK. The Synaptics AudioSmart 2-Mic Development Kit for AVS with NXP SoC

    Reply
  7. Tomi Engdahl says:

    Researchers propose an open ‘internet of water’ tracking use, quality and costs
    https://techcrunch.com/2017/10/13/researchers-propose-an-open-internet-of-water-tracking-use-quality-and-costs/?ncid=rss&utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&sr_share=facebook

    Where did the water coming out of your tap come from? How is it filtered and purified? How much does it cost the city and state per gallon to deliver? How can they improve that? These and other questions come naturally as fresh water becomes more and more valuable a resource — and we need a shared, open ‘internet of water’ to answer them, say researchers from Duke University and the Aspen Institute.

    “Our water world is data rich, but information poor,” explained Martin Doyle, of Duke’s Nicholas Institute. “If water data were shared openly and then integrated in a common digital platform, there would be game-changing opportunities ranging from private citizens’ ability to gauge the quality of local water to public officials’ ability to warn populations of water-borne public health hazards.”

    Internet of Water Could Revolutionize Water Management
    https://nicholasinstitute.duke.edu/articles/internet-water-could-revolutionize-water-management

    To realize the dormant value of the data, say some producers and users, would require making them widely shareable in standardized digital formats, thereby allowing their real-time aggregation for a host of purposes beyond those that spurred their original collection. They believe that opening the data and investing in water data infrastructure would set in motion a wave of innovation, leading to more sustainable management of our water resources. They envision creation of an Internet of Water.

    The Need for an Internet of Water

    In the United States, water management is hindered by decision makers’ inability to answer three fundamental questions about our water systems in a timely way: How much water is there? What is its quality? How is it used (withdrawn, consumed or returned for different purposes)?

    “It’s not that the data aren’t being collected,”

    First, water is undervalued—and water data even more so. Moving water from its source, treating it, and delivering it to faucets has a cost. Similarly, collecting data, “cleaning” or standardizing them, and delivering them to an end user has a cost. But unlike water utilities, most public agencies know neither the full cost of their data infrastructure nor the water and cost savings of putting the data to timely use. This blind spot has discouraged public agencies from further investing in data infrastructure.

    Second, there’s a need to make existing public water data more accessible. The data’s value in decision making is diminished if the data are hard to share across platforms.

    Third, the appropriate architecture for an Internet of Water is a federation of data producers, hubs, and users—entities often isolated from one another. Initially, some overarching governance structure is needed to intentionally connect data hubs and to help coordinate adoption of shared metadata and data standards to ensure that data hubs can talk to one another.

    Within the proposed framework, data relevant to sustainable water management would be shared by communities with specific roles and responsibilities.

    “With finite water resources and growing demand for them, we need open and accessible data to help us navigate tradeoffs,”

    Reply
