Computing at the Edge of IoT – Google Developers – Medium

https://medium.com/google-developers/computing-at-the-edge-of-iot-140a888007b
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.

Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest cost hardware powered by MCUs.

Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.

169 Comments

  1. Tomi Engdahl says:

    Designing For The Edge
    https://semiengineering.com/adding-intelligence-into-the-edge/

    Growth in data is fueling many more options, but so far it’s not clear which of them will win.

    Chip and system architectures are beginning to change as the tech industry comes to grips with the need to process more data locally for latency, safety, and privacy/security reasons.

    The emergence of the intelligent edge is an effort to take raw data from endpoints, extract the data that requires immediate action, and forward other data to various local, regional or commercial clouds. The basic idea is to prioritize what data goes where, what data should be discarded altogether, and what data needs to be analyzed on a broader scale using more sophisticated tools than are available locally.
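
    The triage described above can be sketched in a few lines. Below is a minimal sketch; the thresholds, the stand-in sensor read, and the print-based forwarding are illustrative assumptions rather than anything specified in the article, and a real deployment would publish upstream over MQTT or HTTPS instead.

    ```python
    import json
    import random
    import time

    # Hypothetical thresholds, chosen only for illustration.
    ACT_NOW_THRESHOLD = 90.0   # readings above this need an immediate local response
    FORWARD_THRESHOLD = 75.0   # readings above this are worth sending upstream

    def read_sensor():
        """Stand-in for a real sensor read; returns a temperature-like value."""
        return random.uniform(60.0, 100.0)

    def actuate_locally(value):
        """Immediate action taken at the edge, e.g. shutting a valve."""
        print(f"local action triggered at {value:.1f}")

    def triage(value):
        """Decide what happens to one reading: act on it, forward it, or discard it."""
        if value >= ACT_NOW_THRESHOLD:
            actuate_locally(value)
            return "acted"
        if value >= FORWARD_THRESHOLD:
            # A real device would publish this to a local, regional or commercial cloud.
            payload = json.dumps({"ts": time.time(), "value": value})
            print(f"forwarding upstream: {payload}")
            return "forwarded"
        return "discarded"   # routine data never leaves the device

    if __name__ == "__main__":
        for _ in range(10):
            triage(read_sensor())
    ```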

    Reply
  2. Tomi Engdahl says:

    InferX X1 Coprocessor Takes on Inference at the Edge
    https://www.electronicdesign.com/iot/inferx-x1-coprocessor-takes-inference-edge?sfvc4enews=42&cl=article_2_b&utm_rid=CPG05000002750211&utm_campaign=24783&utm_medium=email&elq2=95a7ec9ea8624b4bbf55d3899aff98ff

    The InferX X1 edge inference coprocessor developed by Flex Logix delivers artificial-intelligence inference to IoT applications.

    Flex Logix’s nnMax machine-learning (ML) inference engine technology, originally developed for embedded FPGAs (eFPGAs), will now be available in the InferX X1 coprocessor.

    Reply
  3. Tomi Engdahl says:

    Preparing For War On The Edge
    https://semiengineering.com/preparing-for-war-on-the-edge/

    The tech world is planning for an onslaught of data, but it’s far from clear who will own it.

    War clouds are gathering over the edge of the network.

    The rush by the reigning giants of data—IBM, Amazon, Facebook, Alibaba, Baidu, Microsoft and Apple—to control the cloud by building mammoth hyperscale data centers is being met with uncertainty at the edge of the network. In fact, just the emergence of the edge could mean that all bets are off when it comes to data dominance.

    It’s not that the big cloud operations are in danger of losing business. But they may be in danger of losing control over some or all of the most valuable data, and that could have significant repercussions on a global level. So what will they or others do about these changes? At this point, it’s not clear how all of this will play out.

    The edge, which even a year ago was largely ignored by investors and chipmakers, has suddenly become the next big thing. Companies are putting labels on it, defining its boundaries, and weaving marketing messages about how all of the pieces will fit together.

    Reply
  4. Tomi Engdahl says:

    Rushing To The Edge
    https://semiengineering.com/rushing-to-the-edge/

    Why the next big thing is unnerving tech giants and pushing design in new directions.

    Virtually every major tech company has an “edge” marketing presentation these days, and some even have products they are calling edge devices. But the reality is that today no one is quite sure how to define the edge or what it will become, and any attempts to pigeon-hole it are premature.

    What is becoming clear is the edge is not simply an extension of the Internet of Things. It is the result of the IoT, a way of managing and utilizing the vast and exploding amount of data produced by sensors from connected devices everywhere. But it also is its own distinct segment, even though it hasn’t been fully articulated.

    Reply
  5. Tomi Engdahl says:

    AI, IoT, 5G, and Edge Computing Shape Thermal Design
    https://www.eeweb.com/profile/haileymck/articles/ai-iot-5g-and-edge-computing-shape-thermal-design

    In the coming year, new thermal design priorities will largely be driven by technologies such as artificial intelligence (AI), the internet of things (IoT), 5G, and edge computing, recent research from Future Facilities, which makes thermal design software, found. “Recent advancements in technology over the past few years have resulted in unprecedented changes in the way engineers view their designs,” said Chris Aldham, product manager at Future Facilities. “The introduction of AI, 5G, edge computing, and the internet of things all have major implications for how — and where — electronics need to operate, and that, in turn, means a whole host of new considerations from a thermal perspective.”

    In a digital roundtable event, thermal designers, engineers, and experts from a variety of organizations — including Facebook, HP Enterprise, QuantaCool, Engineered Fluids, CommScope, Vertiv, 6SigmaET, and Binghamton University — gathered to compare notes. The group identified a handful of thermal design priorities, including:

    The need for hybrid cooling to cope with new IoT environments
    Remote monitoring of cooling systems in edge-computing devices
    More accurate monitoring and simulation of energy use in data centers
    Thermal cooling solutions for 5G base stations and new AI hardware
    Tools that can accurately simulate these new technologies and environments

    Reply
  6. Tomi Engdahl says:

    Mary Jo Foley / ZDNet:
    Microsoft announces Azure SQL Database Edge, which will run on compute-constrained devices, and new AI, mixed reality, and IoT capabilities powered by Azure — A couple days ahead of Build 2019, Microsoft is showcasing some new AI, MR, IoT and blockchain capabilities powered by Azure …

    Microsoft adds more AI, mixed-reality, IoT services to its Azure line-up
    https://www.zdnet.com/article/microsoft-adds-more-ai-mixed-reality-iot-services-to-its-azure-line-up/

    A couple days ahead of Build 2019, Microsoft is showcasing some new AI, MR, IoT and blockchain capabilities powered by Azure which it will be showing off at its developer confab.

    Reply
  7. Tomi Engdahl says:

    Benchmarking Edge Computing
    https://medium.com/@aallan/benchmarking-edge-computing-ce3f13942245

    Comparing Google, Intel, and NVIDIA accelerator hardware
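
    For context, benchmarks like the one above typically time many repeated invocations of the same model on each accelerator and report an average and a tail latency. Below is a minimal, hardware-agnostic sketch of that timing loop; the NumPy matrix multiply is only a stand-in workload and would be replaced by the actual model invocation for the device under test.

    ```python
    import time
    import numpy as np

    def benchmark(fn, warmup=5, runs=100):
        """Time repeated calls to fn() and report mean and p95 latency in milliseconds."""
        for _ in range(warmup):        # warm-up runs are excluded from the statistics
            fn()
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            fn()
            samples.append((time.perf_counter() - start) * 1000.0)
        samples.sort()
        mean = sum(samples) / len(samples)
        p95 = samples[int(0.95 * len(samples)) - 1]
        return mean, p95

    # Stand-in "inference" workload; substitute the accelerator's model invocation here.
    a = np.random.rand(512, 512).astype(np.float32)
    b = np.random.rand(512, 512).astype(np.float32)
    mean_ms, p95_ms = benchmark(lambda: a @ b)
    print(f"mean {mean_ms:.2f} ms, p95 {p95_ms:.2f} ms")
    ```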

    Reply
  8. Tomi Engdahl says:

    Google Is Bringing AI to the Edge for Everyone in 2019
    https://www.designnews.com/electronics-test/google-bringing-ai-edge-everyone-2019/1500467160761?ADTRK=UBM&elq_mid=8598&elq_cid=876648

    The 2019 Google I/O developer conference brought a wave of artificial intelligence announcements as the company touted a shift toward edge-based AI and a new focus on improved privacy.

    Reply
  9. Tomi Engdahl says:

    Why Machine Learning Is Important to Embedded
    https://www.designnews.com/electronics-test/why-machine-learning-important-embedded/3392145060759?ADTRK=UBM&elq_mid=8615&elq_cid=876648

    Machine learning is opening up new features and applications that will forever change how users expect their systems to behave.

    Machine learning for embedded systems has been gaining a lot of momentum over the past several years. For embedded developers, machine learning was something that data scientists were concerned with and something that lived up in the cloud, far from the resource-constrained microcontrollers that embedded developers work with on a daily basis.

    Almost overnight, however, machine learning is finding its way to microcontrollers and edge devices. To some developers, this may seem baffling or at least intriguing. But why is machine learning so important to embedded developers now? Let’s explore a few possibilities.

    Reply
  10. Tomi Engdahl says:

    Bottlenecks For Edge Processors
    https://semiengineering.com/bottlenecks-for-edge-processors/

    New processors will be blazing fast, but that doesn’t guarantee improvements in system speed.

    New processor architectures are being developed that can provide two to three orders of magnitude improvement in performance. The question now is whether the performance in systems will be anything close to the processor benchmarks.

    Most of these processors do one thing very well. They handle specific data types and can accelerate the multiply-accumulate functions for algorithms by distributing the processing across multiple processing elements in a chip. In effect, they are parallelizing operations while also pruning algorithms and adjusting the output to whatever precision level is necessary, and they are storing and retrieving bits from memory in multiple directions rather than just left-to-right.
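
    To make the multiply-accumulate point concrete, here is a minimal sketch of the reduced-precision MAC that such accelerators replicate across many processing elements. The int8 format, the quantization scales, and the vector length are illustrative assumptions, not details taken from the article.

    ```python
    import numpy as np

    def quantize(x, scale):
        """Map float values to int8 using a simple symmetric scale."""
        return np.clip(np.round(x / scale), -127, 127).astype(np.int8)

    def int8_mac(weights_q, activations_q):
        """Multiply-accumulate in int8 with a 32-bit accumulator; this is the core
        operation an edge inference accelerator parallelizes across its array."""
        return np.sum(weights_q.astype(np.int32) * activations_q.astype(np.int32))

    w = np.random.randn(256).astype(np.float32)
    x = np.random.randn(256).astype(np.float32)
    w_scale, x_scale = 0.03, 0.03          # illustrative quantization scales

    acc = int8_mac(quantize(w, w_scale), quantize(x, x_scale))
    approx = acc * w_scale * x_scale       # rescale the accumulator back to float
    print(f"float dot product {np.dot(w, x):.3f}, int8 approximation {approx:.3f}")
    ```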

    There are several inherent challenges with these architectures, though. First, moving data through a chip is not particularly energy-efficient if you aren’t also reducing the amount of data that needs to be processed along the way.

    Data centers have been wrestling with this problem for decades, and hyperscale clouds have added an element of heterogeneity to the mix. The cloud essentially load-balances processing and uses high-speed optical interconnects to ship data around at the speed of light. But as this kind of operation moves closer to the data, such as in a car or an edge cloud, the ability to load balance is much more limited. There is finite real estate in an edge cloud, and far less in an autonomous vehicle.

    The second challenge is that it’s not obvious how devices will be connected throughout the edge, and that can provide its own bottlenecks. Designing a system with a data pipe running at 100Gbps or 400Gbps looks very good on paper, but the speeds are only as fast as the slowest component on the network. Anyone with a gigabit Internet connection knows the flow of data is only as fast as the server on the other end.

    Third, the economics of any compute operation that isn’t a hyperscale cloud data center are very fuzzy. Because these new processor architectures are highly tuned for certain algorithms or data types, they are unlikely to achieve the economies of scale that have made this kind of processing possible in the first place.

    Reply
  11. Tomi Engdahl says:

    The distributed intelligence triple-whammy: 5G, AI and tinyML
    https://www.audioanalytic.com/distributed-intelligence-triple-whammy-5g-ai-tinyml/

    Back in September 2018, Forbes contributor Prakash Sangam heralded two major tech trends, AI and 5G, and made the case, in autonomous driving, for distributed intelligence: AI running at the intelligent edge AND in the intelligent cloud. His point is to keep the critical ‘sensing’ that must act immediately in the car, while putting the processing-intensive functions in the cloud. 5G ends up being the connective glue between both intelligent systems, offering low latency, high bandwidth, and high data transfer speeds.

    Reply
  12. Tomi Engdahl says:

    Hands-On with the SmartEdge Agile
    Out-of-the-box artificial intelligence at the edge
    https://blog.hackster.io/hands-on-with-the-smartedge-agile-b7b7f02b5d4b

    Reply
  13. Tomi Engdahl says:

    Rethinking Enclosures To Support IoT & The Edge
    https://www.eetimes.com/document.asp?doc_id=1334654

    A watchword of business intelligence nowadays, the Internet of Things (IoT) 4.0 offers nothing less than the digitization of your company’s entire operations. IoT data-collection devices are already ubiquitous in the manufacturing world, and are projected to reach over 5 billion by 2020.

    The Internet of Things 4.0 utilizes sensors to automatically gather your data at distributed points, transmitting critical business intelligence including analytics and diagnostics. All this data and decision-making is rapidly becoming decentralized, making businesses more agile and able to make decisions remotely. It all translates to more efficient processes, improved security, and lower operational costs.

    However, to handle all this big data and to successfully implement IoT 4.0, industrial and IT businesses will need extremely short latency times for effective M2M communication. That’s where Edge computing delivers the speed you need.

    The Edge is nothing new–IT professionals and plant managers have always deployed computers in distributed locations and uncontrolled environments. Edge computing relegates data control from your central network to the periphery (or ‘edge’) of the internet where your smart devices, wireless sensors and other Internet-enabled devices are located. Using localized computing, Edge deployments store all this big data to reduce your cloud dependence.

    For example, Edge sites manage your smart grid, i.e. your energy distribution, grid services, and components at a local level, providing your high-speed internet and television. In fact, Edge data centers support 75% of all local internet usage. Municipal utilities including water and power also utilize the Edge, while electronic tolling is another common example (E-ZPass, or automated license-plate readers).

    Improving Latency, Business Intelligence
    While the Internet of Things enables you to centralize control over the systems that run your plant processes or IT infrastructure, it needs to approach real-time speeds. Naturally, with all its smart devices and sensors, IoT 4.0 is already starting to heavily depend on Edge deployments. Edge computing supports the IoT world by enabling more efficient communication and bringing the network closer to the data to reduce latency.

    In many business areas, the datacenter has been replaced with a cloud datacenter which in turn has been replaced with a ‘fog’ datacenter. The fog offers specific cloud services for data storage, while some information is also collected/sent at the local level as with Edge computing.

    Not every IT staffer or factory manager has the free choice of where to build, the power to modify servers, or the staff to design a datacenter from scratch. Consequently, Edge deployments are located on oil rigs, in hazardous plant areas, on cruise ships–just about anywhere.

    As important as the Edge is, even more so is its implementation. The Edge and the IoT are a closed loop supporting each other. However, deploying edge computing for IoT devices can be a complex task. Just as with traditional IT footprints, physical Edge deployments require their own diverse considerations, precautions, and equipment.

    Prioritize for your specific Edge deployment. For example, PUE (Power Usage Effectiveness), the ratio of total facility power to the power delivered to IT equipment, is a very important performance metric for both traditional and Edge datacenters.
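
    For reference, PUE is simply total facility power divided by the power delivered to the IT equipment; a quick sketch with made-up numbers:

    ```python
    def pue(total_facility_kw, it_equipment_kw):
        """Power Usage Effectiveness: total facility power over IT power.
        1.0 is the theoretical ideal; lower is better."""
        return total_facility_kw / it_equipment_kw

    # Hypothetical edge site: 100 kW of IT load plus 20 kW of cooling and other overhead.
    print(f"PUE = {pue(120.0, 100.0):.2f}")   # -> PUE = 1.20
    ```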

    A comprehensive assessment of current and future IoT infrastructure can save long-term headaches and ensure success. Areas to assess include:

    • Safety and Security
    • Protection
    • Scalability
    • Climate Control

    In addition, the ecosystem of cloud providers, mobile network companies and microprocessor companies can be overwhelming for many industrial managers trying to understand the integration of IoT. Working with infrastructure manufacturers, like Rittal, that have global relationships with industrial, IT and IoT companies can provide insight and direction to finding the right solutions providers for your company’s needs.

    Reply
  14. Tomi Engdahl says:

    The Need for Tiered Security at the Edge
    https://www.securityweek.com/need-tiered-security-edge

    Enabling the networks of tomorrow requires organizations to radically reimagine the security tools they have in place today

    One of the most disruptive results of digital transformation for many organizations has been the rapid emergence of the edge, which in many ways is what has been replacing the traditionally static network perimeter. The advent and support of an edge-based networking model enables organizations to more dynamically expand their networks, embrace mobile users, IoT, and end-user devices, build flexible and responsive WAN and cloud connections, and enable distributed processing.

    Reply
  15. Tomi Engdahl says:

    Big 5G stakeholders roll eyes over edge computing hype: Light Reading
    https://www.cablinginstall.com/wireless-5g/article/16470376/big-5g-stakeholders-roll-eyes-over-edge-computing-hype-light-reading?cmpid=&utm_source=enl&utm_medium=email&utm_campaign=cim_data_center_newsletter&utm_content=2019-05-13&eid=289644432&bid=2441562

    Reporting from Light Reading’s Big 5G Event, held this month in Denver (May 6-8), Mike Dano notes that “top players in the mobile networking and data center industries are voicing serious concerns” about edge computing.

    Jim Poole, VP of ecosystem business development at data center giant Equinix, said that mobile operators would need to completely revise their network designs away from voice services to get edge computing to work in a 5G world. “This whole thing needs to be changed, rearchitected,” he said. “5G is an extraordinarily daunting change.”

    The topic of edge computing has generated a significant amount of hype, and many in the space do agree it could play a key role in the ultimate development of 5G technology. But some top players in the mobile networking and data center industries are voicing serious concerns about edge computing in the near and even the medium term.

    “Spend enough time in the telecom and technology industries and it becomes clear that the hype of many new technologies usually precedes the reality by 5-10 years. We believe that is the case with micro edge data centers,” wrote Raul Martynek, the CEO of data center provider DataBank, in a lengthy post on LinkedIn.

    Data Center Firms, Mobile Operators Pour Cold Water on Edge Computing
    https://www.lightreading.com/mobile/mec-(mobile-edge-computing)/data-center-firms-mobile-operators-pour-cold-water-on-edge-computing/d/d-id/751350

    Edge computing proponents argue that the mostly centralized nature of the Internet today won’t support the snappy, real-time services that 5G providers hope to offer, like autonomous vehicles and streaming virtual reality. Such services require almost immediate connections between computing services and users, and an edge computing design would enable that instant connection by physically locating data centers geographically next to the users that need them. Such a design — dispersed computing instead of consolidated in one location — potentially could eliminate the tens or even hundreds of milliseconds it takes for a user’s request to travel across a network to a computer that can answer it.

    At least, that’s the idea.

    DataBank’s Martynek argued that, at least so far, there’s very little need for hundreds or thousands of mini computing locations spread out all over the country. Specifically, he noted that there are already several data centers physically located in most major metro markets in the US.

    “The incremental improvement from going from one data center location to 5 micro data center locations only improves your round trip latency by less than 1-2ms,” he wrote.

    As a result, he argued, “deploying in tens-hundreds-thousands of micro-data centers would only improve latency by 1ms or less, and in some cases introduce latency depending on where the peering occurs.” Those comments essentially represent a dig at the likes of EdgeMicro and Vapor IO that are hoping to build out mini data centers in dozens of cities across the US.
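
    Those figures are easy to sanity-check with a propagation-only model: light in fiber covers roughly 200 kilometers per millisecond (about 5 microseconds per kilometer one way), so shortening the path by a few hundred kilometers saves only a few milliseconds of round trip. The sketch below is a simplification that ignores queuing, peering, and server processing time, which usually dominate.

    ```python
    # Rough propagation-only model of round-trip latency over fiber.
    # ~5 microseconds per kilometer one way is a common rule of thumb
    # (light in glass travels at roughly 200,000 km/s); real paths add
    # queuing, peering, and server delay on top of this.
    US_PER_KM = 5.0

    def rtt_ms(distance_km):
        """Round-trip propagation delay in milliseconds for a one-way distance."""
        return 2 * distance_km * US_PER_KM / 1000.0

    for km in (10, 100, 500, 1500):
        print(f"{km:>5} km -> {rtt_ms(km):.2f} ms round trip")
    ```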

    Similarly, Equinix’s Poole noted that edge computing is already available in a basic form today, considering that Equinix operates roughly 200 data centers around the world.

    However, most speakers agreed that, eventually, 5G would help spark more demand for edge computing services. “Does 5G need edge computing? I’d say the answer is yes. Does edge computing need 5G? The answer is no,” Equinix’s Poole said.

    “I think edge computing is one of the two or three things that make 5G different,” Gedeon said.

    But Equinix’s Poole argued that wireless networks need to essentially be redesigned in order to fully take advantage of the edge computing opportunity. Instead of routing all traffic through a handful of on-ramps, mobile operators will need to instead create ways for applications to immediately access local mobile users — and to interoperate.

    Reply
  16. Tomi Engdahl says:

    Is ADAS The Edge?
    https://semiengineering.com/is-adas-the-edge/

    Uncertainty about where processing will occur is causing confusion over definitions.

    Debate is brewing over whether ADAS applications fall on the edge, or if they are better viewed squarely within the context of the automotive camp.

    There is more to this discussion than just semantics. The edge represents a huge greenfield opportunity for electronics of all sorts, and companies from the mobile market and from the cloud are both rushing to stake their claim. At this point there is no single instruction-set architecture that owns this space, and it’s not clear exactly how this market will be carved up. There are end devices that can do pre-processing, and various levels of servers and local clouds before data is processed in the large cloud operations of companies such as Google, Amazon and Microsoft.

    Reply
