Computing at the Edge of IoT – Google Developers – Medium
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.

Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest cost hardware powered by MCUs.

Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.


  1. Tomi Engdahl says:

    Key drivers and benefits of edge computing for smart manufacturing

    Edge computing means faster response times, increased reliability and security. Five edge computing advantages are highlighted.

    A lot has been said about how the Internet of Things (IoT) is revolutionizing the manufacturing world. Many studies have predicted that more than 50 billion devices will be connected by 2020. It is also expected that over 1.44 billion data points will be collected per plant per day. This data will be aggregated, sanitized, processed, and used for critical business decisions.

    This means unprecedented demand and expectations for connectivity, computational power, and speed of service, in short, quality of service. Can we afford any latency in critical operations? That question is the biggest driver for edge computing: more processing power closer to the data source, the “Thing” in IoT.

    Edge computing and drivers

    Rather than a conventional central control system, a distributed control architecture is gaining popularity as an alternative: a lightweight version of the data center in which control functions are placed closer to the devices.

    Edge computing means data processing power at the edge of the network, closer to the source of data. With edge computing, each device, whether it is a sensor, a robotic arm, an HVAC unit, a connected car, or any other intelligent device, collects data, performs data processing that would traditionally be handled by the cloud, and packages the results for further processing and analysis.
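
    As a rough illustration of that pattern (not from the article), the TypeScript sketch below shows an edge node that samples a local sensor, makes a simple threshold decision locally, and forwards only noteworthy readings to the cloud. The sensor function, endpoint URL, and threshold are hypothetical placeholders.

    // Minimal edge-node sketch (TypeScript). All names below are hypothetical.
    // Assumes a runtime with a global fetch (e.g. Node 18+).
    const cloudEndpoint = "https://example.com/ingest"; // assumed cloud ingestion API
    const alarmThreshold = 80;                          // e.g. a limit pushed down from the cloud

    function readTemperature(): number {
      // Stand-in for a real sensor driver.
      return 20 + Math.random() * 70;
    }

    async function sampleAndForward(): Promise<void> {
      const value = readTemperature();
      // Local decision making: only noteworthy readings leave the device.
      if (value > alarmThreshold) {
        await fetch(cloudEndpoint, {
          method: "POST",
          headers: { "content-type": "application/json" },
          body: JSON.stringify({ sensor: "temp-1", value, ts: Date.now() }),
        });
      }
    }

    setInterval(sampleAndForward, 5000); // sample every five seconds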

    IDC research has predicted that within three years, 45% of IoT-created data will be stored, processed, analyzed, and acted upon close to, or at the edge of, the network, and that over 6 billion devices will be connected to edge computing solutions. Inherent challenges of cloud infrastructure, such as network latency, the cost of network bandwidth and data storage, security, and compliance issues, are minimized by an edge computing infrastructure; these are the key drivers of edge technology.

  2. Tomi Engdahl says:

    Defining the value of edge computing
    Explore the three important factors of deploying an edge computing solution.

    With the growth in connected devices and machines, industrial enterprises are realizing the need for an efficient way to manage large amounts of data, which in turn escalates the importance of edge computing. As industrial technology becomes more complex and devices become more powerful, edge computing is emerging as a valuable solution for harnessing all this computing power for business value. Today, edge devices go beyond basic automation, enabling industrial enterprises to perform an expanding array of advanced computing and analytical tasks.

    When evaluating these technologies, companies should look at these three critical components to help ensure a successful, “cloud-like” edge computing deployment:

    1. The elimination of production downtime
    2. The ability to analyze, act on, and protect data in real time
    3. The simplification of operations

    With the rise of the IIoT, industrial enterprises are blurring the line that separates the enterprise data center and business systems (IT) from production automation systems (OT) and their respective networks. The converged “hybrid OT professional” has a unique combination of skills to bridge the gap separating the IT and OT worlds, thus reducing the burden on IT teams throughout their organization’s IIoT transformation.

  3. Tomi Engdahl says:

    How To Secure The Network Edge
    The risk of breaches is growing, and so is the potential damage.

    Microcontrollers, sensors, and other devices that live at the edge of the Internet must be protected against cyberattacks and intrusions just as much as the chips in data centers, network routers, and PCs. But securing those edge devices presents a swath of unique challenges, including the cost and availability of technology resources, as well as varying levels of motivation to solve these problems by both vendors and end users.

    But securing the edge takes on new urgency as safety issues enter the picture. Assisted and autonomous driving essentially transform cars into Internet edge devices, where real-time responsiveness is required for accident avoidance and cloud-based connectivity is needed for such things as traffic and weather alerts. Likewise, embedded systems are being used to monitor and control critical infrastructure, and that data is being read by external monitors or devices at the edge of the network that are directly connected to those systems.

    All of this raises the stakes for security. So how exactly do these issues get solved, and by whom?

    “That’s a tricky question,” observed Robert Bates, chief safety officer for the Embedded Software Division at Mentor, a Siemens Business. “In some sense, those kinds of smart devices can be as secure as anything else connected to the network. But theory and reality are two different things.”

    “The same problems exist across industry,” said Bates. “Industry buys something, and they just kind of want to forget about it. If they’re not updating these devices themselves, or they’re not thinking about updating them, they’re going to be exposed—even if their security was top-notch at the point of the link. That’s one problem.

  4. Tomi Engdahl says:

    In-cabinet thermal management in edge computing environments

    Providers of cable management systems offer resources, systems for edge computing applications.

    Edge challenges

    The company then provides the other side of that scenario: “As worthy as these benefits may be, IT will face new challenges and tasks in edge computing implementation.”

    In a similar vein, CPI’s August 2017 blog post reminds us, “As the Internet of Things (IoT) continues to evolve and edge computing—which pushes applications, data, and computing services away from centralized data centers—becomes more common, managing assets and white space remotely becomes more challenging. A comprehensive and reliable solution that works together to simplify operation, costs and labor, as well as allows for network expansion, is key.”

    Rittal adds, “Edge computing, by definition, exposes hardware to an environment that may be challenging—in footprint, ambient temperatures, contaminants, particulates, vibration or accessibility. Solutions abound for each of these concerns: micro data centers, NEMA-appropriate enclosures, thermal management and filtration systems, and shock-absorbing designs.”

  5. Tomi Engdahl says:

    Woe Canada: Rather than rise from the ashes, IBM-built C$1bn Phoenix payroll system is going down in flames

    Canucks to pull plug on ill-fated mismanaged govt IT project

    Canada is about ready to pull the plug on its IBM-built error-plagued Phoenix payroll system that has cost the nation nearly CAN$1bn ($790m).

    Launched in 2016, Phoenix was an IBM implementation of the Oracle PeopleSoft platform that was supposed to handle payroll for 46 Canadian government agencies and departments.

    Unfortunately for the Great White North, the system was almost immediately beset with problems. Nearly two years later, officials have had to expand their payroll support staff from 550 heads to more than 1,500 to cope with the cockups, and some CAN$460m has been spent on support and fixes.

    Now, America’s Hat says it’s time to cut bait and move on. The administration plans to begin a two-year CAN$16m project to design a new system to replace Phoenix.

    “Over the last year and a half, the government has hired several hundred people to rebuild capacity that was lost due to the previously flawed business plan,” the budget report said of Phoenix.

    Systems nominal

    IBM is taking the news in stride, and insisted it has held up its end of the bargain.

    “As the government has repeatedly acknowledged, IBM is fulfilling its obligations on the Phoenix contract, and the software is functioning as intended,” a spokesperson told El Reg on Thursday.

    Regardless of who is at fault, the situation is a bad look for all parties involved. For IBM, the ordeal is another major government project failure its name is attached to at a time when Big Blue was just righting its financial ship.

    The Canadian government, meanwhile, said this week that on top of the nearly CAN$900m support costs, it will have to cough up about CAN$5.5m in charges to smooth out tax headaches caused by botched payments to employees and additional costs to support the legal fallout.

  6. Tomi Engdahl says:

    IoT Security Concerns Push Vendors to the Edge

    NUREMBERG, Germany — Doing more processing at the edge to avoid sending sensitive data to the cloud emerged as a common theme among vendors at the Embedded World conference here last week. Whether this is a result of the forthcoming GDPR (General Data Protection Regulation) rules coming into force across the European Union on May 25, or whether it is simply a lack of sufficient security in current devices, is difficult to tell.

  7. Tomi Engdahl says:

    Netflix could pwn 2020s IT security – they need only reach out and take
    Workload isolation is niche, but they’re rather good at it

    The container is doomed, killed by serverless. Containers are killing Virtual Machines (VM). Nobody uses bare metal servers. Oh, and tape is dead. These, and other clichés, are available for a limited time, printed on a coffee mug of your choice alongside a complimentary moon-on-a-stick for $24.99.

    Snark aside, what does the future of containers really look like?

    Recently, Red Hat’s CEO casually mentioned that containers still don’t power most of the workloads run by enterprises. Some people have seized on this data point to proclaim the death of the container. Some champion the “death” of containers because they believe serverless is the future. Others believe in the immutable glory of virtual machines and wish for the end of this upstart workload encapsulation mechanism.

  8. Tomi Engdahl says:

    AI Core – Artificial Intelligence On The Edge

    UP Bridge the Gap – a brand of AAEON Europe – is proud to launch AI Core: the first embedded ultra-compact Artificial Intelligence processing cards for edge computing.

    AI Core is a mini-PCIe module powered by Intel® Movidius™ Myriad™ 2 technology. This low-power module enhances industrial IoT edge devices with hardware accelerated deep learning and enhanced machine vision functionality. AAEON Technology is one of the first IPC manufacturers to address the growing need for Artificial Intelligence on the edge with dedicated hardware.

    Most of the available IoT solutions are focused on connecting edge devices to the cloud, and these deployments face challenges related to latency, network bandwidth, reliability, and security. Experts in this field agree that not all tasks and decision-making processes can be addressed in cloud-only models. AI Core addresses these cloud limitations by bringing AI performance and hardware acceleration not “at” but “ON” the edge of the Internet of Things.

  9. Tomi Engdahl says:

    Energy Requirements And Challenges For IoT Autonomous Intelligence At The Edge

    Today’s computational landscape is vast and power hungry. Can it be sustainable?

  10. Tomi Engdahl says:

    Intelligence At The Edge Is Transforming Our World

    Machine learning already plays a part in everyday life, but efficient inference will keep it moving forward.

  11. Tomi Engdahl says:

    Exponentials At The Edge

    The revolution that started in mobile phones will continue in other devices, but much faster.

  12. Tomi Engdahl says:

    Tom Krazit / GeekWire:
    Cloudflare launches Cloudflare Workers, an edge computing service for developers using its network, charging devs $0.50 for every 1M tasks used by their apps — Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize …

    Cloudflare to open an edge computing service for developers using its network

    Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize on the proximity benefits provided by edge computing.

    Cloudflare Workers was first introduced last September, and Cloudflare is expected to announce Tuesday that it is now generally available for developers to check out. The new service runs on hardware that Cloudflare has installed in more than 125 data centers around the world to power its anti-DDoS (distributed denial of service) attack service, and it allows developers to write JavaScript applications through the Service Worker API that will run much closer to their users than might otherwise be possible with standard cloud services.
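
    To make that programming model concrete, here is a minimal sketch of a Worker in the Service Worker style the article describes, written in TypeScript and assuming Cloudflare’s Workers type definitions; the response text and header are placeholders, not code from the article.

    // Minimal Cloudflare Worker sketch: respond to HTTP requests at the edge.
    addEventListener("fetch", (event: FetchEvent) => {
      event.respondWith(handleRequest(event.request));
    });

    async function handleRequest(request: Request): Promise<Response> {
      // This code runs in the Cloudflare data center closest to the user,
      // rather than in a single centralized cloud region.
      return new Response("Hello from the edge!", {
        headers: { "content-type": "text/plain" },
      });
    }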

    “For quite some time, we have understood that there is real power in deploying applications that ran incredibly close to where users are on the internet,”

    About 1,000 users have been playing with Cloudflare Workers since the company opened the service up to a broader beta program in January, following the September announcement. “I’ve been surprised by how dramatically different all of the applications people have built are; it doesn’t feel like there is a bound to them yet,” Prince said.

    The benefits of edge computing are just starting to make their way into the world, although lots of folks have been talking about it for a while. It’s a recognition of the fact that as connected devices spread throughout the world, it quickly makes more sense to execute a lot of the code running those devices as physically close to them as possible, as waiting for instructions from a remote cloud data center won’t always cut it for real-time IoT devices.

  13. Tomi Engdahl says:

    Tom Warren / The Verge:
    Microsoft unveils cloud gaming division led by Microsoft vet Kareem Choudhry who says the company wants content available across all devices, hints at streaming

    Microsoft’s new gaming cloud division readies for a future beyond Xbox
    Cloud services seen as the future of games

  14. Tomi Engdahl says:

    The FPGA manufacturer Xilinx today introduced its new vision and, at the same time, a new product category it calls ACAP, the Adaptive Compute Acceleration Platform.

    “ACAP computing capabilities go far beyond the capabilities of traditional FPGAs. It’s a genuinely new product category that can be adapted to fit different applications and workloads at the device level,” Peng said at a press conference.

    These are bold words. With an ACAP device, functions can be changed dynamically at run time. The change takes milliseconds, after which the new application-specific computation achieves much higher performance per watt than a general-purpose processor or a graphics processor.

    According to Peng, ACAP is ideally suited to new big data and artificial intelligence applications. These include video processing, database processing, data compression, searches, calculation of AI models, machine vision, and many of the network acceleration functions.

    The first ACAP family is called Everest and is implemented in TSMC’s 7-nanometer process. The first chips are expected to tape out this year. Everest circuits will differ radically from what Xilinx and Altera have done so far.


  15. Tomi Engdahl says:

    9 hidden risks of telecommuting policies

    As the boundaries of the enterprise shift, IT’s ability to support and protect remote work environments must shift correspondingly. Here’s how to develop a comprehensive telecommuting policy to mitigate potential liabilities.

    How I Learned to Stop Worrying and Love Telecommuting

    CareGroup CIO John Halamka takes an in-depth look at the policies and technologies necessary for supporting flexible work arrangements.

  16. Tomi Engdahl says:

    Microsoft’s new gaming cloud division readies for a future beyond Xbox
    Cloud services seen as the future of games

    Microsoft shipped its first video game in 1981, appropriately named Microsoft Adventure. It was an MS-DOS game that booted directly from a floppy disk, and set the stage for Microsoft’s adventures in gaming. A lot has changed over the past 37 years, and when you think of Microsoft’s efforts in gaming these days you’ll immediately think of Xbox. It’s fair to say a lot is about to change over the next few decades too, and Microsoft is getting ready. Today, the software giant is unveiling a new gaming cloud division that’s ready for a future where consoles and gaming itself are very different to today.

  17. Tomi Engdahl says:

    Xilinx to bust ACAP in the dome of data centres all over with uber FPGA
    That’s an Adaptive Compute Acceleration Platform btw

    Xilinx is developing a monstrous FPGA that can be dynamically changed at the hardware level.

    The biz’s “Everest” project is the development of what Xilinx termed an Adaptive Compute Acceleration Platform (ACAP), an integrated multi-core heterogeneous design that goes way beyond your bog-standard FPGA, apparently. It is being built with TSMC’s 7nm process technology and tapes out later this year.

    Xilinx Unveils Revolutionary Adaptable Computing Product Category


    An ACAP has – at its core – a new generation of FPGA fabric with distributed memory and hardware-programmable DSP blocks, a multicore SoC, and one or more software programmable, yet hardware adaptable, compute engines, all connected through a network on chip (NoC). An ACAP also has highly integrated programmable I/O functionality, ranging from integrated hardware programmable memory controllers, advanced SerDes technology and leading edge RF-ADC/DACs, to integrated High Bandwidth Memory (HBM) depending on the device variant.

    Software developers will be able to target ACAP-based systems using tools like C/C++, OpenCL and Python. An ACAP can also be programmable at the RTL level using FPGA tools.

    “This is what the future of computing looks like,” says Patrick Moorhead, founder, Moor Insights & Strategy. “We are talking about the ability to do genomic sequencing in a matter of a couple of minutes, versus a couple of days. We are talking about data centers being able to program their servers to change workloads depending upon compute demands, like video transcoding during the day and then image recognition at night. This is significant.”

    ACAP has been under development for four years at an accumulated R&D investment of over one billion dollars (USD). There are currently more than 1,500 hardware and software engineers at Xilinx designing “ACAP and Everest.” Software tools have been delivered to key customers. “Everest” will tape out in 2018 with customer shipments in 2019.

  18. Tomi Engdahl says:

    No future-oriented IT megatrends without real time capable fiber optics

    The digital transformation is casting its shadow ahead, and the world of applications is being turned upside down. The purpose and size of a project are taking a back seat; users, with their expectations of a modern, digitized world, are the focus of attention. Edge computing and the new 5G mobile network are on everyone’s lips. They are prerequisites for real-time applications and low latency. But are these pure hype topics, or concrete solutions that make future-oriented applications possible in the first place?

    Experts agree: without a modern, real-time-capable infrastructure, neither the Internet of Things (IoT), nor autonomous driving, nor smart cities can be realized. The fact is that data volumes around the globe are exploding. Real-time applications need to be processed within seconds, and in the area where the data is created and used.

    In the case of autonomous cars, for example, all data processing is carried out directly in the vehicle. Reactions must occur within milliseconds, for example to prevent accidents. The necessary real-time data processing is only possible with 5G mobile communications and edge computing. Building this infrastructure is no magic trick: network nodes for edge computing can be integrated into the nearest street light, an advertising pillar, or close to a mobile radio cell, even in the middle of such a cell. Data processing at its best.

    A dream of the future? Not at all. At the 2018 Winter Olympics in South Korea, for example, visitors and athletes at the venues were already able to test a 5G installation and immerse themselves in the world of new applications. At Mobile World Congress 2018 in Barcelona, 5G was also one of the central themes. Commercial projects based on 5G are already planned for 2018 in the EU. The German Federal Ministry of Transport and Digital Infrastructure estimates that by 2020, this rapid mobile communications technology will be available everywhere.

  19. Tomi Engdahl says:

    How to get started with edge computing

    Implementing edge devices in a system is powerful and cost-effective; the devices are easy to use and install, and they optimize data collection and reliability. Find your way to the edge.

    Edge computing is designed to enhance the Industrial Internet of Things (IIoT) and provides many potential advantages for users. Edge computing speeds up data flow and extends knowledge of what is happening on a network. It also improves data reliability by making device data the one source of truth, and there is less latency. If there are local human-machine interfaces (HMIs), there is still local access and control even if network connectivity is lost, which helps prevent data loss. Edge devices are more powerful, easier to use, and less expensive than before, making it very affordable to put powerful computers at the edge of a network.

    Getting started with edge computing

    With all the edge products on the market, there are a lot of choices for a company to make.

    Think about the entire system and how the edge devices are going to fit into the larger architecture. Find the devices that work best for the system and the company’s overall goals.

    Ask specific questions about the devices. How can they be maintained and upgraded? Can the data be moved to a central location? Can the devices be used for other functions at the edge?

    The architecture should allow plug-and-play functionality. Individual components should be replaceable without affecting the whole system. Older architecture requiring configurations in multiple places inhibits the ability to make future changes.

    Many edge devices work well with message queuing telemetry transport (MQTT), which is the perfect messaging protocol for the IIoT. MQTT was designed about 20 years ago for the industrial space. In recent years, it has become more popular because of its low bandwidth requirements and publish/subscribe model.

    MQTT reports by exception and communicates data only when there’s a change. It also makes data available for applications such as supervisory control and data acquisition (SCADA), enterprise resource planning (ERP), information technology (IT), business intelligence, and more. MQTT provides high availability and scalability.
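
    As a small illustration of report-by-exception publishing, the sketch below uses the open-source mqtt.js client in TypeScript; the broker address, topic, and sensor read are made-up placeholders, not details from the article.

    import mqtt from "mqtt";

    // Connect to a plant-floor broker (hypothetical address).
    const client = mqtt.connect("mqtt://broker.plant.local:1883");

    let lastValue: number | undefined;

    function readValvePosition(): number {
      // Stand-in for a real device read.
      return Math.round(Math.random() * 100);
    }

    function publishByException(topic: string, value: number): void {
      // Report by exception: publish only when the value actually changes.
      if (value !== lastValue) {
        lastValue = value;
        client.publish(topic, JSON.stringify({ value, ts: Date.now() }), {
          qos: 1,       // at-least-once delivery
          retain: true, // new subscribers immediately receive the last known value
        });
      }
    }

    client.on("connect", () => {
      // Poll the local device once a second; quiet periods generate no traffic.
      setInterval(() => publishByException("plant/line1/valve42/position", readValvePosition()), 1000);
    });

    Because unchanged values are never re-sent, bandwidth stays low, while SCADA, ERP, or analytics applications can simply subscribe to the topics they care about.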

    Results with edge computing

    Edge computing is expanding along with the IIoT because it provides numerous benefits. For example, an oil and gas pipeline used traditional polling, which usually takes 30-45 minutes to hear back from all the remote locations. If operators pressed a button to open a valve, they’d have to wait 15 minutes to get confirmation the valve had opened. After installing edge devices and MQTT, the process now takes less than 15 seconds.

  20. Tomi Engdahl says:

    Fog computing for industrial automation

    How to develop a secure, distributed automation architecture in a data-driven world: Two examples and five advantages of fog computing are highlighted.

    The manufacturing industry is experiencing substantial benefits as industrial operators use the Industrial Internet of Things (IIoT) to automate systems, deploy sensors to measure, monitor, and analyze data, improve efficiencies, and increase revenue opportunities for manufacturing operations. Using eight pillars of a fog computing architecture can help.

    The amount of data from these newly-connected plants can be measured in the petabytes (1 million gigabytes): Millions of streaming, connected sensors on industrial control systems (ICSs), dozens of autonomous drones, industrial robots, video surveillance cameras covering plants, and so on.

    Traditional information technology (IT) approaches to operational technology (OT) environments cannot keep up with the necessary volume, latency, mobility, reliability, security, privacy, and network bandwidth challenges in controlled, supplier-connected, or rugged operational environments. It’s time for a new architectural approach to allow IIoT to reach its potential with fog computing.

    Defining fog computing

    Fog computing is designed for data-dense, high-performance, high-stakes computing environments. Fog is an emerging, distributed architecture that bridges the continuum between the cloud and connected devices and that does not require persistent cloud connectivity in the field and factory. Fog works by selectively moving compute, storage, communication, control, and decision making closer to IoT sensors and actuators, where the data is generated and used. It augments, rather than replaces, investments in the cloud to enable efficient, cost-effective, secure, and constructive use of the IIoT in manufacturing environments.

    Fog is sometimes referred to as edge computing, but there are key differences. Fog is a superset of edge functionality. The fog architecture pools the resources and data sources between devices residing at the edge in north-south (cloud-to-sensor), east-west (function-to-function or peer-to-peer) hierarchies working with the cloud for maximum efficiency. Edge computing tends to be limited to a small number of north-south layers often associated with simple protocol gateway functions.

    Fog nodes are foundational elements of the fog architecture. A fog node is any device that provides computational, networking, storage, and acceleration elements of the fog architecture. Examples include industrial controllers, switches, routers, embedded servers, sophisticated gateways, programmable logic controllers (PLCs), and intelligent IoT endpoints such as video surveillance cameras.

