Computing at the Edge of IoT – Google Developers – Medium

https://medium.com/google-developers/computing-at-the-edge-of-iot-140a888007b
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.

Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest cost hardware powered by MCUs.

Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.

187 Comments

  1. Tomi Engdahl says:

    Designing For The Edge
    https://semiengineering.com/adding-intelligence-into-the-edge/

    Growth in data is fueling many more options, but so far it’s not clear which of them will win.

    Chip and system architectures are beginning to change as the tech industry comes to grips with the need to process more data locally for latency, safety, and privacy/security reasons.

    The emergence of the intelligent edge is an effort to take raw data from endpoints, extract the data that requires immediate action, and forward other data to various local, regional or commercial clouds. The basic idea is to prioritize what data goes where, what data should be discarded altogether, and what data needs to be analyzed on a broader scale using more sophisticated tools than are available locally.
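
    To make the triage idea concrete, here is a minimal Python sketch of an edge triage loop; the thresholds, field names, and the act_locally/forward_to_cloud stubs are illustrative assumptions, not anything from the article:

    # Hypothetical edge triage loop: handle urgent readings locally, batch the
    # useful rest for the cloud, and discard samples that carry no information.
    from statistics import mean

    URGENT_TEMP_C = 90.0   # assumed alarm threshold
    BATCH_SIZE = 3         # samples aggregated before an upload

    def act_locally(sample):
        print("ALARM, immediate local action:", sample)

    def forward_to_cloud(summary):
        print("forwarding summary for deeper analysis:", summary)

    def triage(samples):
        batch = []
        for s in samples:
            if s["temp_c"] >= URGENT_TEMP_C:
                act_locally(s)                      # needs immediate action at the edge
            elif abs(s["temp_c"] - 25.0) < 0.1:
                continue                            # near-baseline reading: discard
            else:
                batch.append(s["temp_c"])           # keep for broader analysis
            if len(batch) >= BATCH_SIZE:
                forward_to_cloud({"mean_temp_c": round(mean(batch), 2), "n": len(batch)})
                batch.clear()

    triage([{"temp_c": t} for t in (25.0, 26.4, 91.2, 24.9, 30.5, 28.1)])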

  2. Tomi Engdahl says:

    InferX X1 Coprocessor Takes on Inference at the Edge
    https://www.electronicdesign.com/iot/inferx-x1-coprocessor-takes-inference-edge?sfvc4enews=42&cl=article_2_b&utm_rid=CPG05000002750211&utm_campaign=24783&utm_medium=email&elq2=95a7ec9ea8624b4bbf55d3899aff98ff

    The InferX X1 edge inference coprocessor developed by Flex Logix delivers artificial-intelligence inference to IoT applications.

    Flex Logix’s nnMax machine-learning (ML) inference engine technology, originally developed for embedded FPGAs (eFPGAs), will now be available in the InferX X1 coprocessor.

  3. Tomi Engdahl says:

    Preparing For War On The Edge
    https://semiengineering.com/preparing-for-war-on-the-edge/

    The tech world is planning for an onslaught of data, but it’s far from clear who will own it.

    War clouds are gathering over the edge of the network.

    The rush by the reigning giants of data—IBM, Amazon, Facebook, Alibaba, Baidu, Microsoft and Apple—to control the cloud by building mammoth hyperscale data centers is being met with uncertainty at the edge of the network. In fact, just the emergence of the edge could mean that all bets are off when it comes to data dominance.

    It’s not that the big cloud operations are in danger of losing business. But they may be in danger of losing control over some or all of the most valuable data, and that could have significant repercussions on a global level. So what will they or others do about these changes? At this point, it’s not clear how all of this will play out.

    The edge, which even a year ago was largely ignored by investors and chipmakers, has suddenly become the next big thing. Companies are putting labels on it, defining its boundaries, and weaving marketing messages about how all of the pieces will fit together.

  4. Tomi Engdahl says:

    Rushing To The Edge
    https://semiengineering.com/rushing-to-the-edge/

    Why the next big thing is unnerving tech giants and pushing design in new directions.

    Virtually every major tech company has an “edge” marketing presentation these days, and some even have products they are calling edge devices. But the reality is that today no one is quite sure how to define the edge or what it will become, and any attempts to pigeon-hole it are premature.

    What is becoming clear is the edge is not simply an extension of the Internet of Things. It is the result of the IoT, a way of managing and utilizing the vast and exploding amount of data produced by sensors from connected devices everywhere. But it also is its own distinct segment, even though it hasn’t been fully articulated.

  5. Tomi Engdahl says:

    AI, IoT, 5G, and Edge Computing Shape Thermal Design
    https://www.eeweb.com/profile/haileymck/articles/ai-iot-5g-and-edge-computing-shape-thermal-design

    In the coming year, new thermal design priorities will largely be driven by technologies such as artificial intelligence (AI), the internet of things (IoT), 5G, and edge computing, recent research from Future Facilities, which makes thermal design software, found. “Recent advancements in technology over the past few years have resulted in unprecedented changes in the way engineers view their designs,” said Chris Aldham, product manager at Future Facilities. “The introduction of AI, 5G, edge computing, and the internet of things all have major implications for how — and where — electronics need to operate, and that, in turn, means a whole host of new considerations from a thermal perspective.”

    In a digital roundtable event, thermal designers, engineers, and experts from a variety of organizations — including Facebook, HP Enterprise, QuantaCool, Engineered Fluids, CommScope, Vertiv, 6SigmaET, and Binghamton University — gathered to compare notes. The group identified a handful of thermal design priorities, including:

    The need for hybrid cooling to cope with new IoT environments
    Remote monitoring of cooling systems in edge-computing devices
    More accurate monitoring and simulation of energy use in data centers
    Thermal cooling solutions for 5G base stations and new AI hardware
    Tools that can accurately simulate these new technologies and environments

  6. Tomi Engdahl says:

    Mary Jo Foley / ZDNet:
    Microsoft announces Azure SQL Database Edge, which will run on compute-constrained devices, and new AI, mixed reality, and IoT capabilities powered by Azure — A couple days ahead of Build 2019, Microsoft is showcasing some new AI, MR, IoT and blockchain capabilities powered by Azure …

    Microsoft adds more AI, mixed-reality, IoT services to its Azure line-up
    https://www.zdnet.com/article/microsoft-adds-more-ai-mixed-reality-iot-services-to-its-azure-line-up/

    A couple days ahead of Build 2019, Microsoft is showcasing some new AI, MR, IoT and blockchain capabilities powered by Azure which it will be showing off at its developer confab.

  7. Tomi Engdahl says:

    Benchmarking Edge Computing
    https://medium.com/@aallan/benchmarking-edge-computing-ce3f13942245

    Comparing Google, Intel, and NVIDIA accelerator hardware

  8. Tomi Engdahl says:

    Google Is Bringing AI to the Edge for Everyone in 2019
    https://www.designnews.com/electronics-test/google-bringing-ai-edge-everyone-2019/1500467160761?ADTRK=UBM&elq_mid=8598&elq_cid=876648

    The 2019 Google I/O developer conference brought a wave of artificial intelligence announcements as the company touted a shift toward edge-based AI and a new focus on improved privacy.

  9. Tomi Engdahl says:

    Why Machine Learning Is Important to Embedded
    https://www.designnews.com/electronics-test/why-machine-learning-important-embedded/3392145060759?ADTRK=UBM&elq_mid=8615&elq_cid=876648

    Machine learning is opening up new features and applications that will forever change how users expect their systems to behave.

    Machine learning for embedded systems has been gaining a lot of momentum over the past several years. For embedded developers, machine learning was something that data scientists were concerned with and something that lived up in the cloud, far from the resource-constrained microcontrollers that embedded developers work with on a daily basis.

    Seemingly almost overnight, however, machine learning is finding its way to microcontrollers and edge devices. To some developers, this may seem baffling or at least intriguing. But why is machine learning so important to embedded developers now? Let’s explore a few possibilities.

  10. Tomi Engdahl says:

    Google Is Bringing AI to the Edge for Everyone in 2019
    https://www.designnews.com/electronics-test/google-bringing-ai-edge-everyone-2019/1500467160761?ADTRK=UBM&elq_mid=8615&elq_cid=876648

    The 2019 Google I/O developer conference brought a wave of artificial intelligence announcements as the company touted a shift toward edge-based AI and a new focus on improved privacy.

  11. Tomi Engdahl says:

    Bottlenecks For Edge Processors
    https://semiengineering.com/bottlenecks-for-edge-processors/

    New processors will be blazing fast, but that doesn’t guarantee improvements in system speed.

    New processor architectures are being developed that can provide two to three orders of magnitude improvement in performance. The question now is whether the performance in systems will be anything close to the processor benchmarks.

    Most of these processors do one thing very well. They handle specific data types and can accelerate the multiply-accumulate functions for algorithms by distributing the processing across multiple processing elements in a chip. In effect, they are parallelizing operations while also pruning algorithms and adjusting the output to whatever precision level is necessary, and they are storing and retrieving bits from memory in multiple directions rather than just left-to-right.
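
    The multiply-accumulate and reduced-precision ideas above can be sketched in a few lines of Python; this is a toy numpy illustration under assumed int8 quantization, not a model of any particular chip:

    # Toy illustration only (not any specific vendor's design): an int8-quantized
    # dot product whose multiply-accumulate work is split across several
    # "processing elements", each accumulating a partial sum in wider precision.
    import numpy as np

    def quantize_int8(x):
        scale = np.max(np.abs(x)) / 127.0 or 1.0
        return np.round(x / scale).astype(np.int8), scale

    def mac_tiled(weights, activations, n_elements=4):
        wq, w_scale = quantize_int8(weights)
        aq, a_scale = quantize_int8(activations)
        partials = [np.dot(w.astype(np.int32), a.astype(np.int32))
                    for w, a in zip(np.array_split(wq, n_elements),
                                    np.array_split(aq, n_elements))]
        return sum(partials) * w_scale * a_scale   # rescale back to real units

    w = np.random.randn(1024).astype(np.float32)
    a = np.random.randn(1024).astype(np.float32)
    print(mac_tiled(w, a), float(np.dot(w, a)))    # quantized vs. full-precision result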

    There are several inherent challenges with these architectures, though. First, moving data through a chip is not particularly energy-efficient if you aren’t also reducing the amount of data that needs to be processed along the way.

    Data centers have been wrestling with this problem for decades, and hyperscale clouds have added an element of heterogeneity to the mix. The cloud essentially load-balances processing and uses high-speed optical interconnects to ship data around at the speed of light. But as this kind of operation moves closer to the data, such as in a car or an edge cloud, the ability to load balance is much more limited. There is finite real estate in an edge cloud, and far less in an autonomous vehicle.

    The second challenge is that it’s not obvious how devices will be connected throughout the edge, and that can provide its own bottlenecks. Designing a system with a data pipe running at 100Gbps or 400Gbps looks very good on paper, but the speeds are only as fast as the slowest component on the network. Anyone with a gigabit Internet connection knows the flow of data is only as fast as the server on the other end.
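
    A trivial back-of-the-envelope way to state the "slowest component" point, with made-up link speeds:

    # End-to-end throughput is bounded by the slowest hop, however fast the
    # headline data pipe is (all figures below are made up for illustration).
    hops_gbps = {"device uplink": 1.0, "edge switch": 100.0,
                 "400G backbone": 400.0, "far-end server": 0.5}
    bottleneck = min(hops_gbps, key=hops_gbps.get)
    print(f"effective throughput ~{hops_gbps[bottleneck]} Gbps, set by the {bottleneck}")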

    Third, the economics of any compute operation that isn’t a hyperscale cloud data center are very fuzzy. Because these new processor architectures are highly tuned for certain algorithms or data types, they are unlikely to achieve the economies of scale that have made this kind of processing possible in the first place.

  12. Tomi Engdahl says:

    The distributed intelligence triple-whammy: 5G, AI and tinyML
    https://www.audioanalytic.com/distributed-intelligence-triple-whammy-5g-ai-tinyml/

    Back in September 2018, Forbes contributor Prakash Sangam heralded two major tech trends, AI and 5G, and made the case, in autonomous driving, for distributed intelligence: AI running at the intelligent edge AND in the intelligent cloud. His point is to put the critical ‘sensing’ that must act immediately in the car, while keeping processing-intensive functions in the cloud. 5G ends up being the connective glue between both intelligent systems, offering low latency, high bandwidth and high data-transfer speeds.

  13. Tomi Engdahl says:

    Hands-On with the SmartEdge Agile
    Out-of-the-box artificial intelligence at the edge
    https://blog.hackster.io/hands-on-with-the-smartedge-agile-b7b7f02b5d4b

  14. Tomi Engdahl says:

    Bottlenecks For Edge Processors
    https://semiengineering.com/bottlenecks-for-edge-processors/

    New processors will be blazing fast, but that doesn’t guarantee improvements in system speed.

  15. Tomi Engdahl says:

    Rethinking Enclosures To Support IoT & The Edge
    https://www.eetimes.com/document.asp?doc_id=1334654

    A watchword of business intelligence nowadays, the Internet of Things (IoT) 4.0 offers nothing less than the digitization of your company’s entire operations. IoT data-collection devices are already ubiquitous in the manufacturing world, and are projected to reach over 5 billion by 2020.

    The Internet of Things 4.0 utilizes sensors to automatically gather your data at distributed points, transmitting critical business intelligence including analytics and diagnostics. All this data and decision-making is rapidly becoming decentralized, making businesses more agile and able to make decisions remotely. It all translates to more efficient processes, improved security, and lower operational costs.

    However, to handle all this big data and to successfully implement IoT 4.0, industrial and IT businesses will need extremely short latency times for effective M2M communication. That’s where Edge computing delivers the speed you need.

    The Edge is nothing new: IT professionals and plant managers have always deployed computers in distributed locations and uncontrolled environments. Edge computing moves data control from your central network to the periphery (or ‘edge’) of the internet, where your smart devices, wireless sensors and other Internet-enabled devices are located. Using localized computing, Edge deployments store all this big data to reduce your cloud dependence.

    For example, Edge sites manage your smart grid, i.e. your energy distribution, grid services, and components at a local level, providing your high-speed internet and television. In fact, Edge data centers support 75% of all local internet usage. Municipal utilities including water and power also utilize the Edge, while electronic tolling is another common example (E-ZPass, or automated license-plate readers).

    Improving Latency, Business Intelligence
    While the Internet of Things enables you to centralize control over the systems that run your plant processes or IT infrastructure, it needs to approach real-time speeds. Naturally, with all its smart devices and sensors, IoT 4.0 is already starting to heavily depend on Edge deployments. Edge computing supports the IoT world by enabling more efficient communication and bringing the network closer to the data to reduce latency.

    In many business areas, the datacenter has been replaced with a cloud datacenter which in turn has been replaced with a ‘fog’ datacenter. The fog offers specific cloud services for data storage, while some information is also collected/sent at the local level as with Edge computing.

    Not every IT staffer or factory manager has the free choice of where to build, the power to modify servers, or the staff to design a datacenter from scratch. Consequently, Edge deployments are located on oil rigs, in hazardous plant areas, on cruise ships, just about anywhere.

    As important as the Edge is, its implementation matters even more. The Edge and the IoT are a closed loop supporting each other. However, deploying edge computing for IoT devices can be a complex task. Just as with traditional IT footprints, physical Edge deployments require their own diverse considerations, precautions, and equipment.

    Prioritize metrics for your specific Edge deployment. For example, PUE (Power Usage Effectiveness) is a very important performance metric for both traditional and Edge datacenters.
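
    For reference, PUE is total facility energy divided by the energy delivered to IT equipment, so 1.0 is the ideal; a quick worked example in Python with illustrative numbers:

    # PUE = total facility energy / IT equipment energy.
    it_load_kw = 8.0             # servers, storage, and networking at a small edge site
    overhead_kw = 3.2            # cooling, UPS losses, lighting, etc.
    pue = (it_load_kw + overhead_kw) / it_load_kw
    print(f"PUE = {pue:.2f}")    # -> PUE = 1.40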

    A comprehensive assessment of current and future IoT infrastructure can save long-term headaches and ensure success. Key areas to assess include:

    • Safety and Security
    • Protection
    • Scalability
    • Climate Control

    In addition, the ecosystem of cloud providers, mobile network companies and microprocessor companies can be overwhelming for many industrial managers trying to understand the integration of IoT. Working with infrastructure manufacturers, like Rittal, that have global relationships with industrial, IT and IoT companies can provide insight and direction in finding the right solutions providers for your company’s needs.

  16. Tomi Engdahl says:

    The Need for Tiered Security at the Edge
    https://www.securityweek.com/need-tiered-security-edge

    Enabling the networks of tomorrow requires organizations to radically reimagine the security tools they have in place today.

    One of the most disruptive results of digital transformation for many organizations has been the rapid emergence of the edge, which in many ways is what has been replacing the traditionally static network perimeter. The advent and support of an edge-based networking model enables organizations to more dynamically expand their networks, embrace mobile users, IoT, and end-user devices, build flexible and responsive WAN and cloud connections, and enable distributed processing.

  17. Tomi Engdahl says:

    Big 5G stakeholders roll eyes over edge computing hype: Light Reading
    https://www.cablinginstall.com/wireless-5g/article/16470376/big-5g-stakeholders-roll-eyes-over-edge-computing-hype-light-reading?cmpid=&utm_source=enl&utm_medium=email&utm_campaign=cim_data_center_newsletter&utm_content=2019-05-13&eid=289644432&bid=2441562

    Reporting from Light Reading’s Big 5G Event, held this month in Denver (May 6-8), Mike Dano notes that “top players in the mobile networking and data center industries are voicing serious concerns” about edge computing.

    Jim Poole, VP of ecosystem business development at data center giant Equinix, said that mobile operators would need to completely revise their network designs away from voice services to get edge computing to work in a 5G world. “This whole thing needs to be changed, rearchitected,” he said. “5G is an extraordinarily daunting change.”

    The topic of edge computing has generated a significant amount of hype, and many in the space do agree it could play a key role in the ultimate development of 5G technology. But some top players in the mobile networking and data center industries are voicing serious concerns about edge computing in the near and even the medium term.

    “Spend enough time in the telecom and technology industries and it becomes clear that the hype of many new technologies usually precedes the reality by 5-10 years. We believe that is the case with micro edge data centers,” wrote Raul Martynek, the CEO of data center provider DataBank, in a lengthy post on LinkedIn.

    Data Center Firms, Mobile Operators Pour Cold Water on Edge Computing
    https://www.lightreading.com/mobile/mec-(mobile-edge-computing)/data-center-firms-mobile-operators-pour-cold-water-on-edge-computing/d/d-id/751350

    Edge computing proponents argue that the mostly centralized nature of the Internet today won’t support the snappy, real-time services that 5G providers hope to offer, like autonomous vehicles and streaming virtual reality. Such services require almost immediate connections between computing services and users, and an edge computing design would enable that instant connection by physically locating data centers geographically next to the users that need them. Such a design — dispersed computing instead of consolidated in one location — potentially could eliminate the tens or even hundreds of milliseconds it takes for a user’s request to travel across a network to a computer that can answer it.

    At least, that’s the idea.

    DataBank’s Martynek argued that, at least so far, there’s very little need for hundreds or thousands of mini computing locations spread out all over the country. Specifically, he noted that there are already several data centers physically located in most major metro markets in the US.

    “The incremental improvement from going from one data center location to 5 micro data center locations only improves your round trip latency by less than 1-2ms,” he wrote.

    As a result, he argued, “deploying in tens-hundreds-thousands of micro-data centers would only improve latency by 1ms or less, and in some cases introduce latency depending on where the peering occurs.” Those comments essentially represent a dig at the likes of EdgeMicro and Vapor IO that are hoping to build out mini data centers in dozens of cities across the US.
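
    Martynek’s figures are easy to sanity-check: light in fiber covers roughly 200 km per millisecond one way, so moving a workload from a regional data center to a nearby micro site shaves only a millisecond or two of propagation delay. A rough Python estimate with hypothetical distances:

    # Round-trip propagation delay over fiber, ignoring serialization, queuing,
    # peering detours, and server processing time.
    FIBER_KM_PER_MS = 200.0   # approximate one-way speed of light in fiber

    def rtt_ms(distance_km):
        return 2 * distance_km / FIBER_KM_PER_MS

    for label, km in (("regional data center", 150), ("metro micro data center", 15)):
        print(f"{label:24s} ~{rtt_ms(km):.2f} ms round trip")
    # Cutting the distance from ~150 km to ~15 km saves only about 1.35 ms.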

    Similarly, Equinix’s Poole noted that edge computing is already available in a basic form today, considering that Equinix operates roughly 200 data centers around the world.

    However, most speakers agreed that, eventually, 5G would help spark more demand for edge computing services. “Does 5G need edge computing? I’d say the answer is yes. Does edge computing need 5G? The answer is no,” Equinix’s Poole said.

    “I think edge computing is one of the two or three things that make 5G different,” Gedeon said.

    But Equinix’s Poole argued that wireless networks need to essentially be redesigned in order to fully take advantage of the edge computing opportunity. Instead of routing all traffic through a handful of on-ramps, mobile operators will need to instead create ways for applications to immediately access local mobile users — and to interoperate.

  18. Tomi Engdahl says:

    Is ADAS The Edge?
    https://semiengineering.com/is-adas-the-edge/

    Uncertainty about where processing will occur is causing confusion over definitions.

    Debate is brewing over whether ADAS applications fall on the edge, or if they are better viewed squarely within the context of the automotive camp.

    There is more to this discussion than just semantics. The edge represents a huge greenfield opportunity for electronics of all sorts, and companies from the mobile market and from the cloud are both rushing to stake their claim. At this point there is no single instruction-set architecture that owns this space, and it’s not clear exactly how this market will be carved up. There are end devices that can do pre-processing, and various levels of servers and local clouds before data is processed in the large cloud operations of companies such as Google, Amazon and Microsoft.

  19. Tomi Engdahl says:

    Mike Wheatley / SiliconANGLE:
    Nvidia unveils its first AI platform for edge devices, Nvidia EGX, compatible with AWS and Azure IoT offerings; launch partners include Cisco, HPE, and Lenovo

    Nvidia announces its first AI platform for edge devices
    https://siliconangle.com/2019/05/27/nvidia-announces-first-ai-platform-edge-devices/

    Nvidia Corp. is bringing artificial intelligence to the edge of the network with the launch early Monday of its new Nvidia EGX platform that can perceive, understand and act on data in real time without sending it to the cloud or a data center first.

    Delivering AI to edge devices such as smartphones, sensors and factory machines is the next step in the technology’s evolutionary progress. The earliest AI algorithms were so complex that they could be processed only on powerful machines running in cloud data centers, and that means sending lots of information across the network. But this is undesirable because it requires lots of bandwidth and results in higher latencies, which makes “real-time” AI something less than that.

    What companies really want is AI to be performed where the data itself is created, be it at manufacturing facilities, retail stores or warehouses.

  20. Tomi Engdahl says:

    A hundred-euro board brings AI to the network edge
    http://etn.fi/index.php?option=com_content&view=article&id=9512&via=n&datum=2019-05-24_14:54:51&mottagare=30929

    Nvidia is known for its powerful graphics processors, but over the past couple of years it has begun turning into an AI company. Its goal is to bring machine learning, using the same models, to all kinds of different devices, including the network edge with the new Jetson Nano board.

    The new Jetson Nano board is a good example of this. It is intended for 5-10 watt devices that need about half a teraFLOPS of performance for running neural networks. So Nvidia is not yet talking about plain microcontrollers in the context of AI.

  21. Tomi Engdahl says:

    Opto 22 explains edge programmable industrial controllers for IIoT
    https://www.cablinginstall.com/standards/article/16479652/opto-22-explains-edge-programmable-industrial-controllers-for-iiot?cmpid=&utm_source=enl&utm_medium=email&utm_campaign=cim_data_center_newsletter&utm_content=2019-05-20&eid=289644432&bid=2447828

    New white paper shows how an edge programmable industrial controller can simplify, reduce costs, and increase security for Industrial Internet of Things (IIoT) and other data-intensive applications.

  22. Tomi Engdahl says:

    Edge AI Going Beyond Voice and Vision
    https://www.eetimes.com/document.asp?doc_id=1334753

    Widespread public awareness of systems such as Amazon Alexa and camera-enabled autonomous cars has made voice and vision almost automatically come to mind when discussing the role of AI in edge-device design. But AI technology is applicable well beyond voice and vision interpretation, supporting the implementation of complex system behaviors that can be intractable using conventional algorithmic development. The trick is to move the AI as close as possible to the edge.

    The two signature AI applications of voice and vision happen to also illustrate the two architectural alternatives for designing AI into an embedded system. In the case of voice, both of AI’s major tasks, learning and inferencing, are handled in the cloud, where substantial processing power is available. This allows the edge device to get along with much less processing capability. It spends most of its limited capacity capturing and forwarding data to the cloud and implementing any commands coming back. This approach has the advantage of allowing a relatively inexpensive edge device design but suffers from the high bandwidth demands and latency effects of substantial WAN communications activity.

    Vision systems, on the other hand, provide considerable local processing power to make real-time inferences and response decisions from live data.

    The two major benefits that stem from local processing have prompted a surge in development of AI processing options for the edge that seek to lower the cost of local AI inferencing.
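
    The two architectural alternatives described above can be contrasted with a small, self-contained Python sketch; the models and the simulated delays are placeholders, not measurements:

    import time

    class CloudModel:                      # the "voice" pattern: inference in the cloud
        def infer(self, x):
            time.sleep(0.080)              # simulated WAN round trip (made-up 80 ms)
            return "cloud decision"

    class LocalModel:                      # the "vision" pattern: inference on the device
        def infer(self, x):
            time.sleep(0.002)              # simulated on-device inference (made-up 2 ms)
            return "local decision"

    def timed(model, x):
        t0 = time.perf_counter()
        result = model.infer(x)
        return result, round((time.perf_counter() - t0) * 1000, 1)

    print(timed(CloudModel(), "audio clip"))     # cheap device, but bandwidth + latency cost
    print(timed(LocalModel(), "camera frame"))   # more local silicon, real-time response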

  23. Tomi Engdahl says:

    Hardware Helping Move AI to the Edge
    https://www.eetimes.com/document.asp?doc_id=1334741

    While artificial intelligence and machine learning computation is often performed at large scale in datacentres, the latest processing devices are enabling a trend towards embedding AI/ML capability into IoT devices at the edge of the network. AI on the edge can respond quickly, without waiting for a response from the cloud. There’s no need for an expensive data upload, or for costly compute cycles in the cloud, if the inference can be done locally. Some applications also benefit from reduced concerns about privacy.

    Outside of the two most common applications, speech processing and image recognition, machine learning can of course be applied to data from pretty much any type of sensor.

    NXP showed a module for exactly this purpose at the Microsoft Build developer conference earlier this month. The 30x40mm board combines a handful of sensors with a powerful microcontroller – the i.MX RT1060C, a high-performance Cortex-M device which runs at a very fast 600MHz – plus connectivity capability.

  24. Tomi Engdahl says:

    Smart AI Assistants are the Real Enabler for Edge AI
    https://www.eeweb.com/profile/yairs/articles/smart-ai-assistants-are-the-real-enabler-for-edge-ai

    AI at the edge will reduce overwhelming volumes of data to useful and relevant information on which we can act.

    A recent McKinsey report projects that by 2025, the CAGR for silicon containing AI functionality will be 5× that for non-AI silicon. Sure, AI is starting small, but that’s a pretty fast ramp. McKinsey also shows that the bulk of the opportunity is in AI inference and that the fastest growth area is on the edge. Stop and think about that. AI will be growing very fast in a lot of edge designs over the next six-plus years; that can’t be just for novelty value.

    Artificial-intelligence hardware: New opportunities for semiconductor companies
    https://www.mckinsey.com/industries/semiconductors/our-insights/artificial-intelligence-hardware-new-opportunities-for-semiconductor-companies

    Our analysis revealed three important findings about value creation:

    AI could allow semiconductor companies to capture 40 to 50 percent of total value from the technology stack, representing the best opportunity they’ve had in decades.
    Storage will experience the highest growth, but semiconductor companies will capture most value in compute, memory, and networking.
    To avoid mistakes that limited value capture in the past, semiconductor companies must undertake a new value-creation strategy that focuses on enabling customized, end-to-end solutions for specific industries, or “microverticals.”

    The AI technology stack will open many opportunities for semiconductor companies

  25. Tomi Engdahl says:

    Hardware Helping Move AI to the Edge
    https://www.eetimes.com/document.asp?doc_id=1334741

    Outside of the two most common applications, speech processing and image recognition, machine learning can of course be applied to data from pretty much any type of sensor.

  26. Tomi Engdahl says:

    The word edge in this context means literal geographic distribution. Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn’t mean the cloud will disappear. It means the cloud is coming to you.

    https://pentestmag.com/how-to-effectively-combat-emerging-supply-chain-vulnerabilities/

    https://www.theverge.com/circuitbreaker/2018/5/7/17327584/edge-computing-cloud-google-microsoft-apple-amazon

  27. Tomi Engdahl says:

    According to Arpit Joshipura, The Linux Foundation’s general manager of networking, edge computing will overtake cloud computing by 2025.

    Linux Foundation exec believes edge computing will be more important than cloud computing
    https://www.zdnet.com/article/linux-foundation-executive-believes-edge-computing-will-be-more-important-than-cloud-computing/?ftag=COS-05-10aaa0h&utm_campaign=trueAnthem%3A+Trending+Content&utm_content=5d88f33cbf0aaa0001986f38&utm_medium=trueAnthem&utm_source=facebook

  28. Tomi Engdahl says:

    Satya Nadella looks to the future with edge computing
    https://tcrn.ch/33hxQWu

    Speaking today at the Microsoft Government Leaders Summit in Washington DC, Microsoft CEO Satya Nadella made the case for edge computing, even while pushing the Azure cloud as what he called “the world’s computer.”

    While Amazon, Google and other competitors may have something to say about that, marketing hype aside, many companies are still in the midst of transitioning to the cloud. Nadella says the future of computing could actually be at the edge where computing is done locally before data is then transferred to the cloud for AI and machine learning purposes. What goes around, comes around.

  29. Tomi Engdahl says:

    Microsoft’s new backpack-friendly Azure device weighs under 10 pounds, runs on batteries, and meets the MIL-STD-810G ruggedization standard.

    Microsoft Unveils Battery-Powered Version of Azure That Fits in a Backpack
    https://www.petri.com/microsoft-unveils-battery-powered-version-of-azure-that-fits-in-a-backpack

  30. Tomi Engdahl says:

    Running EdgeX on a Raspberry Pi
    This project will walk you through installing and running the essential EdgeX Foundry microservices on a Raspberry Pi.
    https://www.hackster.io/mhall119/running-edgex-on-a-raspberry-pi-d35dd5

  31. Tomi Engdahl says:

    How will global cloud platforms offer the 1ms latency needed for 5G? AWS Wavelength promises to do it by extending your VPCs into Wavelength Zones, where you can run local EC2 instances and EBS volumes at the edge.

    Announcing AWS Wavelength for delivering ultra-low latency applications for 5G
    https://aws.amazon.com/about-aws/whats-new/2019/12/announcing-aws-wavelength-delivering-ultra-low-latency-applications-5g/

    AWS Wavelength embeds AWS compute and storage services at the edge of telecommunications providers’ 5G networks, and provides seamless access to the breadth of AWS services in the region. AWS Wavelength enables you to build applications that serve mobile end-users and devices with single-digit millisecond latencies over 5G networks, like game and live video streaming, machine learning inference at the edge, and augmented and virtual reality.

    AWS Wavelength brings AWS services to the edge of the 5G network, minimizing the network hops and latency to connect to an application from a 5G device. Wavelength delivers a consistent developer experience across multiple 5G networks around the world
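
    A rough boto3 sketch of the workflow this implies for developers: extend an existing VPC with a subnet in a Wavelength Zone and launch an EC2 instance there. The zone name, CIDR, AMI, and IDs below are placeholders, and the carrier gateway and routing setup are omitted, so treat this as an assumption-laden outline rather than a verified recipe.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Add a subnet to an existing VPC inside a Wavelength Zone (placeholder values).
    subnet = ec2.create_subnet(
        VpcId="vpc-0123456789abcdef0",
        CidrBlock="10.0.100.0/24",
        AvailabilityZone="us-east-1-wl1-bos-wlz-1",
    )

    # Launch an instance in that subnet; 5G traffic reaches it through the carrier
    # network once a carrier gateway and route table entry are in place (not shown).
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
        SubnetId=subnet["Subnet"]["SubnetId"],
    )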

  32. Tomi Engdahl says:

    https://www.equinix.fi/resources/analyst-reports/edge-computing-frontiers-gartner/?ls=Email&lsd=19q4_cross-vertical_digital-edge+expansion_analyst-reports/edge-computing-frontiers-gartner/_emea-programs_Equinix-run_promo-email_promo-email_fi-en_EMEA_awareness&utm_campaign=fi-en_promo-email_promo-email_NN_emea-programs_awareness&utm_source=promo-email&utm_medium=promo-email&utm_content=digital-edge+expansion_&mkt_tok=eyJpIjoiTXpneU9USTNNRFEzTnpJdyIsInQiOiJnTkZRQ3lXemgyTTFITWh3ejNvT1BDN1BQeVh4Y3J3SHlPSGM5SFwvS3g1TVp4YWw4RWxrNnphWGJXK1h0RFwvUDEwb3BDb3grN3VUUFwveHZnK3NoYkw1azRMOHVcL1wvcXRmaGxIZzY3VHM3cW1mUSt0a0tYcEpkVnlRTDJtN295ZVZkIn0%3D

    Infrastructure and Operations (I&O) leaders who manage cloud and edge infrastructure should develop a multiyear edge computing strategy that encompasses the diversity of use cases needed by their digital organizations. According to an analyst report from Gartner, the edge computing use-case landscape is broad and early deployments are highly customized.

    Gartner highlights key topics:

    Edge computing complements cloud computing for digital business use cases, solving latency, bandwidth, autonomy and privacy requirements.
    Edge computing use-case categories are defined by digital business interactions between people, things and businesses.
    More than 90% of enterprises start with a single unique use case for edge computing; over time, a typical enterprise will have many.

    “By year-end 2023, more than 50% of large enterprises will have at least six edge computing use cases deployed for IoT or immersive experiences, versus less than 1% in 2019.”

  33. Tomi Engdahl says:

    The Rough Edge of Rugged Computing
    https://www.sealevel.com/2019/09/23/the-rough-edge-of-rugged-computing/

    Rough around the edges. Cutting edge. On the ragged edge.
    You’d think that an edge device by nature would be able to withstand intense environments.

    However, not all edge computing devices are created equal. And companies are quickly realizing that what works in the lab for initial testing and software development doesn’t necessarily work in the field.

  34. Tomi Engdahl says:

    Smart factory controllers bring security and connectivity
    Powerful edge controllers are replacing PCs on the factory floor and going where PCs can’t, providing a variety of “apps” for every task.
    https://www.controleng.com/articles/smart-factory-controllers-bring-security-and-connectivity/?oly_enc_id=0462E3054934E2U

    What is edge computing again?

    Edge computing refers to the trend of increasing processing power and storage in devices that reside close to where real-time data is generated by sensors, equipment, and users. For control systems, edge controllers bring general-purpose computing power, connectivity, data processing, and storage to the process level in a compact, industrial form factor, along with input/output (I/O), real-time control and visualization options.

    On-board OPC and more

    Integration of multi-vendor programmable logic controllers (PLCs) or aggregation of data from heterogeneous devices might be handled using a dedicated OPC server. It could be hosted on anything from a consumer-grade laptop on a shelf, to a rack-mounted server, to a virtual machine (VM) in an information technology (IT)-managed infrastructure. Regardless, this dependence on PCs incurs additional licensing costs and management overhead.

    IT management complexity, in particular, is a thorn for factory automation. Every new PC requires configuration, user access controls, antivirus, driver and operating system updates, and so on, which invite potential disruptions to production due to maintenance or unexpected downtime. Each of these components may introduce more costs in the form of licensing and long-term support agreements. System ownership also can become contentious when maintenance procedures don’t integrate well with any particular group’s operations.

    Unlike legacy PLCs, edge controllers can provide a complete connectivity solution, including acting as OPC or message queuing telemetry transport (MQTT) servers. Unlike PC-based solutions, edge controllers require little IT involvement, because they’re built for industrial environments and are secure out of the box.
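
    As a small illustration of the MQTT role mentioned above, here is a generic paho-mqtt publisher in Python; the broker address, topic, and readings are placeholders, and nothing here is specific to any particular edge controller:

    # Publish a process value from an edge device to an MQTT broker.
    import json, time
    import paho.mqtt.client as mqtt

    # paho-mqtt 1.x constructor; version 2.x additionally takes a CallbackAPIVersion.
    client = mqtt.Client(client_id="edge-controller-01")
    client.connect("broker.example.local", 1883)   # placeholder broker address
    client.loop_start()

    for _ in range(3):
        reading = {"tag": "line3/temperature", "value": 72.4, "ts": time.time()}
        client.publish("plant/line3/temperature", json.dumps(reading), qos=1)
        time.sleep(5)

    client.loop_stop()
    client.disconnect()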

