What is edge computing? | Opensource.com

https://opensource.com/article/17/9/what-edge-computing?sc_cid=7016000000127ECAAY

Edge computing is poised to boost the next generation of IoT technology into the mainstream. Here’s how it works with the cloud to benefit business operations in all industries.

There is much speculation about edge replacing cloud, and in some cases, it may do so. However, in many situations, the two have a symbiotic relationship. 

The distributed nature of edge computing (when built right) means that along with reducing latency, it also improves resiliency, reduces networking load, and makes systems easier to scale.

Processing of data starts at its source, which reduces latency and makes systems more reliable, because networks always add latency and are never 100 percent reliable. Once initial processing at the edge is complete, only the data that needs further analysis or requires other services is sent to the cloud. This reduces networking requirements.
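As a minimal sketch of this pattern, an edge node might aggregate sensor readings locally and forward only the outliers upstream. The threshold and field names below are illustrative, not from any specific platform:

```python
# Edge-side filtering sketch: summarize sensor readings locally and
# forward only anomalous readings to the cloud for deeper analysis.
# The threshold value and summary fields are hypothetical examples.

def process_at_edge(readings, threshold=75.0):
    """Return (local_summary, to_cloud): aggregate locally, forward outliers."""
    local_summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else 0.0,
    }
    # Only readings that need further analysis cross the network.
    to_cloud = [r for r in readings if r > threshold]
    return local_summary, to_cloud

summary, forwarded = process_at_edge([70.1, 71.5, 90.2, 69.8])
# 4 readings processed locally; only 1 forwarded upstream
```

The networking win comes from the last line: the cloud sees one reading instead of four, while the summary statistics stay available at the edge.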

Over the next few years, we will see an explosion in this technology.

For example, services such as AWS Lambda may be overhauled to run functions at the edge location nearest to the request’s point of origin; we have already seen the first signs of this with AWS Lambda@Edge. With systems such as AWS’s Lambda@Edge and Greengrass and Microsoft’s Azure IoT Edge, edge computing has become a key factor driving the adoption of technologies such as IoT.

We will also see the maturing of emerging edge technologies such as blockchain and fog computing. 

As edge matures, cloud computing will grow along with it.

12 Comments

  1. Tomi Engdahl says:

    This brings us to current edge solutions, of which there are many. Whether purely distributed systems such as blockchain and peer-to-peer or mixed systems such as AWS’s Lambda@Edge, Greengrass, and Microsoft Azure IoT Edge, edge computing has become a key factor driving the adoption of technologies such as IoT.

    Reply
  2. Tomi Engdahl says:

    5 Ways To Navigate Fog IoT
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1332488&

    As fog computing moves toward broad deployment, I offer some tips to those designing, connecting to and using fog-based systems.

    Fog envelops the Internet of Things from end nodes to edge networks. Fog computing is a “horizontal, system level architecture that distributes computing, storage, control and networking functions closer to the users along the Cloud-to-Thing continuum,” as defined by the OpenFog Consortium, the group leading the development of an open, interoperable architecture for it.

    Fog computing is rapidly gaining momentum as the architecture that bridges the current gap in IoT, 5G and embedded AI systems. As the fog rolls out from its conception to a deployment phase, there is plenty of room for miscues, overlap and U-turns. So, here are five tips to help you find your way through it all.

    #1 Recognize where fog techniques are needed.

    Fog is about SCALE: Security, Cognition, Agility, Latency, Efficiency. When designing or building a system or app, watch for the warning signs that indicate it would perform sub-optimally if implemented in a traditional cloud-based system.

    Fog is best known for slashing latency times, but it also helps reduce network bandwidth requirements by moving computation to or near an IoT end node. It also provides additional security robustness for data transfers, and it can improve cognition, reliability and geographic focus through local processing.

    #2 Span software across fog nodes North-South and East-West.

    Applications can be partitioned to run on multiple fog nodes in a network. This partitioning can be North-South where different parts of an application run hierarchically, up to and including the cloud at times. It can also be East–West for load balancing or fault tolerance between peer-level fog nodes. Partitioning can be adjusted as needed on millisecond time scales.
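A toy sketch of the East-West case: work is spread round-robin across peer-level fog nodes, and when one peer fails its load is absorbed by the healthy ones. Node names and the dispatch policy here are illustrative assumptions, not part of the OpenFog architecture:

```python
# East-West partitioning sketch: round-robin dispatch across peer fog
# nodes, skipping failed peers so the rest absorb the load.
# Node names and the failure model are hypothetical.

import itertools

class FogCluster:
    def __init__(self, nodes):
        self._nodes = list(nodes)
        self.healthy = set(nodes)
        self._cycle = itertools.cycle(self._nodes)

    def dispatch(self, task):
        # Try at most one full pass over the peer ring.
        for _ in range(len(self._nodes)):
            node = next(self._cycle)
            if node in self.healthy:
                return (node, task)
        raise RuntimeError("no healthy fog nodes")

cluster = FogCluster(["fog-a", "fog-b", "fog-c"])
cluster.healthy.discard("fog-b")  # simulate a peer failure
assignments = [cluster.dispatch(t)[0] for t in range(4)]
# tasks land only on fog-a and fog-c
```

In a real deployment the repartitioning would be driven by health checks and would need to happen on the millisecond time scales the text mentions; the sketch only shows the load-balancing shape.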

    #3 Understand the pillars of the fog.

    OpenFog has identified eight pillars: Security, Scalability, Openness, Autonomy, RAS (Reliability, Availability, Serviceability), Agility, Hierarchy and Programmability. Each of these can be studied in depth in the OpenFog Reference Architecture.

    #4 Make fog software modular, linked by standard APIs.

    Software is the key to the performance, versatility and trustworthiness of fog implementations. Make it manageable and interoperable by carefully partitioning it into functional blocks.

    #5 Make each installation very easy.

    Global IoT applications will require the installation of millions of fog nodes over the next several years. Ensure the fog node hardware drops right in with simple mechanical and electrical connections for most scenarios.

    Pay attention to fog node aesthetics, power requirements, cable management, and so on to minimize environmental impact. Fog node provisioning and commissioning tests should be 100% automated nearly 100% of the time. Use pilot programs to optimize design of fog equipment for installation and maintenance.

    Reply
  3. Tomi Engdahl says:

    IoT brings industry’s giants and IT’s gorillas into collision and competition

    The industrial internet puts the giants of industry and the gorillas of IT into competition with one another, but it also offers opportunities for cooperation. GE Digital, which favors edge computing, finds it far more efficient to process and analyze data as close to the measuring points as possible than to send huge volumes of data back and forth between devices and the cloud.

    Industrial giant General Electric is strengthening its industrial internet Predix platform with machine learning and edge computing. The latter means that data is processed and analyzed close to the measurement point, so the results of predictive analysis are available in real time, with fewer round trips of data over the network.

    The goal of edge computing is to analyze data in real time, optimize network traffic, and generate cost savings. Among the industrial giants, GE’s software subsidiary GE Digital has come furthest in developing edge computing and machine learning, Network World writes.

    And as befits a supplier to industrial companies, GE’s Predix edge computing focuses on the needs of equipment maintenance and on anticipating machine downtime.

    GE’s Predix Edge platform, launched last year, combines OT and IT systems to bring data collected from different sources, such as orders, production, and warehouses, into cloud-based ERP and supply chain systems.

    Source: http://www.tivi.fi/CIO/iot-tarjoaa-kimppaa-ja-kilpailua-teollisuuden-kaskeloteille-ja-it-n-gorilloille-6684708

    More:
    GE adds edge analytics, AI capabilities to its industrial IoT suite
    GE is making a bid to influence industrial IoT by adding new features to its Predix platform-as-a-service offering.
    https://www.networkworld.com/article/3234641/internet-of-things/ge-adds-edge-analytics-ai-to-predix-industrial-iot-suite.html

    Reply
  4. Tomi Engdahl says:

    Gartner Report – Maverick Research – The Edge Will Eat the Cloud
    https://www.equinix.nl/digital-edge/analyst-reports/gartner-the-edge-will-eat-the-cloud/?ls=Email&lsd=17q4_enterprise_digital-edge-2017_analyst-reports/gartner-the-edge-will-eat-the-cloud__promo-email-initial_nl-en&utm_campaign=digital-edge-2017&utm_source=&utm_medium=promo-email&mkt_tok=eyJpIjoiTVdVMlpHWmhNV1ZrTkdSbCIsInQiOiJRV0I5c1wvUmZVUVNpNHBkaE0wZDVSQUJzaXpTbVVTODF1bkhaRWVoXC9KS0ZPeCtlWjlLNjFZSzJzSzNsUnhNWjd0eWlPaTA2QWZ3UGZBVFVcL1lWWWNiNVlmbGZhOEFSK201NmttTVNoVkliZjUwZlFCZDZJajVLcVJoQlBKckROSSJ9

    The growth of the Internet of Things and the upcoming trend toward more immersive and interactive user interfaces will flip the center of gravity of data production and computing away from central data centers and out to the edge. This research contradicts prevailing views on the future of cloud computing, the topology of computing architectures and the nature of applications as we move toward digital business. Instead of continued growth of mega data centers, compute and storage will move toward the edge, due to the Internet of Things and new user/machine interfaces.

    Invest in architecture and technologies that help address the increased criticality and user experience that digital business demands.

    Build strategies for future applications that factor in latency, location, federation and autonomy, as well as determining placement at the edge, cloud and in between.

    Reply
  5. Tomi Engdahl says:

    C’mon, edgelords: The APIs are ours to command – do we do good or evil?
    Edge computing is awesome and scary
    https://www.theregister.co.uk/2017/10/31/doing_good_with_iot/

    Edge computing is the pendulum swinging away from the idea of big, centralised servers back to distributed systems. It’s the idea that instead of centralising all of our workloads in big clouds we bring the computing closer to the devices requesting that compute power.

    The idea is that edge computing solves whole new classes of workloads for which the latency involved with cloud is just too high. The power of the edge is greater than the sum of its parts.

    The driverless car is held up as a good use case for edge computing: not just the vehicle as the device, but the vehicles festooned with devices. These cars will scan their surroundings and communicate with one another as well as incalculable other machines that will emit beacons of various types.

    While driverless cars will have a limited capacity to analyse problems in real time, there are real-world limits to how much compute power we can practicably and efficiently cram into them.

    If all cars in a given area had the ability to stream some or all of their data to a physically proximate server farm then that server farm could greatly enhance the decision-making capabilities of those vehicles.

    A driverless car isn’t going to do much better than a human; it can’t see around corners all that much better than we can. The driverless car around the corner, however, can see what’s going on in its vicinity. And multiple cars around multiple corners provide enough data to know what’s what, what’s where and start making predictions about the vectors of all the moving pieces. Maybe we even throw in some extra sensors on lampposts and the like to make life easier.

    Cloud data centres can be tens or even hundreds of milliseconds away. At 60 km/h, 100 ms is 1.67 metres. That’s more than enough to kill someone. The speed of light is unforgiving that way. Place a local number cruncher in there and your 100 ms round trip becomes 5-10 ms.
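    A quick back-of-envelope check of those numbers: how far a vehicle travels during one round trip to a remote cloud versus a nearby edge node.

```python
# Distance a vehicle covers while waiting for a network round trip.
# 60 km/h = 16.67 m/s, so 100 ms of latency is about 1.67 m of travel.

def distance_m(speed_kmh, latency_ms):
    return speed_kmh / 3.6 * latency_ms / 1000.0

cloud_gap = distance_m(60, 100)  # remote cloud: ~1.67 m travelled blind
edge_gap = distance_m(60, 10)    # local edge node: ~0.17 m
```

    The tenfold latency reduction translates directly into a tenfold reduction in how far the car moves before a decision can arrive.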

    Of course, edge computing isn’t just about cars. There’s a strong Big Brother camp, too. They’ve been popping up at conferences promising to track patients at hospitals, children in schools, and prisoners in jails.

    Enter developers

    Cloud is becoming the way of doing business. When you disregard the sysadmin-facing Infrastructure as a Service (IaaS) and the user-facing Software as a Service (SaaS) portions of cloud computing today, what you are left with is Platform as a Service (PaaS). PaaS provides a pre-canned environment for developers to code in with no need for sysadmins. Alongside PaaS we have proprietary public cloud services ranging from serverless to BDCA tools like machine learning, artificial intelligence and so forth.

    Today’s modern applications work something like this: a script instantiates a series of PaaS environments. Application code is injected into these environments, creating a series of microservices. These microservices will listen for input data and then either farm that data directly out to a BDCA tool and then store the results or store the data and then run it through BDCA tools. Or both. The results are then made available to humans or machines to act on.
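    The pattern above can be sketched in a few lines. Everything here is a hypothetical stand-in: a real deployment would instantiate the environments through a PaaS SDK and call a managed analytics service rather than a local function.

```python
# Toy illustration of the microservice-plus-BDCA-tool pattern: a service
# receives input, hands it to an analysis tool, stores the result for
# humans or machines to act on. All names are hypothetical.

results_store = []

def bdca_tool(value):
    # Stand-in for a managed service such as computer vision or
    # voice recognition; here it just labels a score.
    return {"input": value, "label": "anomaly" if value > 0.9 else "normal"}

def microservice(event):
    result = bdca_tool(event)
    results_store.append(result)  # persist the result downstream
    return result

microservice(0.95)  # labelled "anomaly"
microservice(0.2)   # labelled "normal"
```

    The point of the pattern is that the microservice owns none of the heavy analysis; it only routes data to the tool and stores what comes back.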

    These BDCA tools are essentially filling a similar role to code libraries. Except instead of providing a simple function to convert XML to JSON they provide voice recognition, computer vision as a service, or join together a dozen different enterprise identity services to provide Single Sign On (SSO).

    We are already standing on the edge

    The edge is already in our workplaces and our homes. I mentioned vehicles and drones, but we also have Google’s Nest and even Amazon’s Alexa as early manifestations. With Nest, various Internet of Things devices report back to a central device. This device does some local decision making where real-time decisions matter and it farms the rest of the number crunching out to Google’s cloud.

    The API presented to us is the voice interface. The latency-sensitive portion of that API may consist of nothing more than “Hello, Alexa”.

    I have become

    Some argue that edge computing is the true beginning of The Singularity. Machines already know things we’ll never understand. These individuals view the edge as a missing intermediate link in distributed machine learning. One that bridges the gap between low-powered, real-time, decision-making capability and the big number crunching capacity that centralised batch-job style clouds can offer.

    The truth is, we don’t know what the edge will become, because we are the ones who will make that choice. The edge could enable machines to make our societies more efficient and capable than we can even imagine today.

    Reply
  6. Tomi Engdahl says:

    AT&T Previews Edge Compute Plans
    Server/storage pods target low latency apps
    https://www.eetimes.com/document.asp?doc_id=1332542

    AT&T Labs has been quietly defining its concept of edge computers and is now slowly edging toward deploying them. Long term, the work has broad implications for the future design of both cloud and mobile systems.

    AT&T defines edge compute as groups of servers and storage systems placed on the periphery of its network to deliver low latency services. It foresees a wide variety of such systems that vary in size and location depending on the application and demand.

    “Edge compute is the next step in getting more out of our network, and we are busy putting together an edge computing architecture,” said Alicia Abella, a senior executive at AT&T Labs, in a keynote at the Fog World Congress here.

    “We want to deploy edge compute nodes in mobile data centers, in buildings, at customers’ locations and in our central offices. Where it is…depends on where there is demand, where we have spectrum, we are developing methods for optimizing the locations,” she said.

    The edge systems serve many uses. They aim to help AT&T reduce the volume of data it must carry to and from its core network. They also will enable higher quality for existing services and hopefully open the doors to new services as well.

    One clear application is running video analytics for surveillance cameras. Such edge systems might use GPUs, FPGAs or other accelerators and be located in cities.

    A more challenging use is handling jobs for automated vehicles because it would require a significant investment in roadside infrastructure but have an uncertain return-on-investment. Interestingly, AT&T now has 12 million smart cars on its network, a number growing by a million per quarter, she said.

    Reply
  7. Tomi Engdahl says:

    A data center in a container when the cloud is too far away

    Automated production requires latency of less than 1 millisecond between IT systems. According to Mikko Aho, Sales Manager at Rittal, the solution for many would be a preconfigured, container-built data center.

    Since automated production generates a huge amount of data that must be processed with minimal delay, the speed of today’s public clouds is not always enough.

    Thus, more and more industrial companies have introduced so-called edge data centers. As the name suggests, these are located at the edge of the production facilities or even directly inside the factory buildings.

    - Data centers placed close by can process the data stream required in automated production with low enough latency that the data can be used in production and logistics management, Aho summarizes the benefits of this edge model.

    The most effective edge data center solution is a preconfigured, ready-made containerized unit. Next to the production plant, a data center in its own dedicated space can be erected, containing not only the servers but everything else as well: the racks and other components, the power supply, and the cooling.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=7096&via=n&datum=2017-11-02_16:29:46&mottagare=30929

    Reply
  8. Tomi Engdahl says:

    ZeroTier Edge: Open Source Enterprise VPN, SD-WAN
    https://www.indiegogo.com/projects/zerotier-edge-open-source-enterprise-vpn-sd-wan#/

    An enterprise grade VPN, SD-WAN, and network virtualization device powered by open source software

    Reply
  9. Tomi Engdahl says:

    Edge computing moves the open cloud beyond the data center
    https://opensource.com/article/17/11/openstack-edge-computing?sc_cid=7016000000127ECAAY

    Edge computing, like public cloud at scale, requires a convenient, powerful cloud software stack that can be deployed in a unified, efficient and sustainable way. Open source is leading the way.

    The biggest new opportunity: distributed cloud infrastructure.

    Today, almost every company in every industry sector needs near-instant access to data and compute resources to be successful. Edge computing pushes applications, data and computing power services away from centralized data centers to the logical extremes of a network, close to users, devices and sensors. It enables companies to put the right data in the right place at the right time, supporting fast and secure access. The result is an improved user experience and, oftentimes, a valuable strategic advantage. The decision to implement an edge computing architecture is typically driven by the need for location optimization, security, and most of all, speed.

    Reply
  10. Tomi Engdahl says:

    Edge servers ease network congestion
    https://www.edn.com/design/systems-design/4459044/Edge-servers-ease-network-congestion?utm_content=buffere3462&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

    Many upcoming networked applications demand massive bandwidths and real-time communication in small form-factor edge servers with dedicated interfaces. COM Express Type 7 server-on-module boards are appropriate platforms for designing such dedicated micro servers for the edge.

    Public and private network operators need to provide an appropriate infrastructure for 1 GbE (Gigabit Ethernet) enabled devices. As more and more devices get connected, they need to eliminate oversubscription ratios in 1 GbE switched networks.

    There are also many high-performance applications demanding increased speed. Application areas include but are not limited to:
    Access edges to broadcasting infrastructures
    Service provider datacenters for video and audio streaming as well as SaaS
    Local carrier-grade infrastructures for the mobile edge
    Metropolitan and larger private networks
    Cloud and edge servers on enterprise level
    Storage attached networks (SANs) for Big Data storage
    Intelligent switching technologies and smart NAS devices
    Fog servers in Industry 4.0 applications
    Edge nodes for wireless smart sensor networks
    Collaborative deep learning computers

    Reply
  11. Tomi Engdahl says:

    Edge-computing is the new black – Distence News
    https://www.distence.fi/en/edge-computing-new-black-distence-news/

    Legacy assets present an enormous opportunity

    Any company producing equipment for the industrial sector has a legacy, or installed base. In the industrial sector the life of an industrial asset can be anything from 10 to 40 years. During that period, depending on the asset, the Total Cost of Ownership (TCO) excluding the initial investment adds up to 70-90%. This includes energy, spare parts, maintenance, etc. It does not take into account additional secondary costs, such as the cost to sales from downtime, failures caused to other assets, and other ripple effects. This is why the installed base of industrial assets is regarded by many as the biggest opportunity for the Industrial IoT. The potential savings are enormous, and so are the revenue opportunities.

    The Distence solution is designed to be a flexible, focused and robust approach to taking those legacy assets quickly and efficiently under control, and it is trusted by global players in the energy business. Our goal is to help our customers get closer to their customers, and to increase transparency, optimization and control of installed assets. Edge computing is here and ready to serve the business and optimize that TCO.

    Reply
  12. Tomi Engdahl says:

    IoT Edge Gateways Emerging as Enterprise Connectivity Option
    https://www.designnews.com/automation-motion-control/iot-edge-gateways-emerging-enterprise-connectivity-option/64826465457861?ADTRK=UBM&elq_mid=2183&elq_cid=876648

    Edge gateways are using the MQTT protocol as a way to enable secure data flow between edge devices and the cloud, and create a new class of IIoT connectivity.

    Industrial IoT edge gateways represent an emerging product category and key technology for connecting both legacy controllers and edge devices to the Internet of Things (IoT). Along with integrating a variety of protocols for networking, they provide resources for managing data storage and analytics, along with enabling secure data flow between edge devices and cloud services.

    New Option for IIoT Applications

    The idea behind edge gateways is to provide a way to use data already created by operational technology and translate data from existing control applications into IoT-friendly formats that can be easily accessed via the cloud. One argument is that, because data is often aggregated and displayed for operators on a Human Machine Interface (HMI), the same data can be presented to other users and provides a powerful additional capability for IoT applications.

    By building MQTT support into the HMI, data can be organized into topics and presented to upstream IT applications in a flexible, modular and efficient way. According to the Maple Systems website, “The MQTT broker is responsible for maintaining client connections and sending/receiving messages. Client devices, edge gateways, and IT applications (or publishers/subscribers in MQTT language), are freed up to focus on producing and consuming data.”

    Since MQTT is a lightweight protocol that can easily support up to 1,000 connected clients, the ability to publish data as topics provides an effective method for monitoring a specific machine or industrial process.
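    The selective subscription this enables rests on MQTT topic filters, whose wildcard rules come from the MQTT specification: `+` matches exactly one topic level, `#` matches all remaining levels. A minimal matcher, with made-up plant topic names:

```python
# MQTT topic filter matching, per the wildcard rules in the MQTT spec:
# '+' matches one topic level, '#' matches any remaining levels.
# The plant/line/machine topic names below are hypothetical examples.

def topic_matches(topic_filter, topic):
    f_parts, t_parts = topic_filter.split("/"), topic.split("/")
    for i, fp in enumerate(f_parts):
        if fp == "#":
            return True          # multi-level wildcard (must be last)
        if i >= len(t_parts):
            return False         # filter is longer than the topic
        if fp != "+" and fp != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

# Subscribe to every temperature reading on any machine of line 1:
a = topic_matches("plant/line1/+/temperature", "plant/line1/press3/temperature")  # True
b = topic_matches("plant/#", "plant/line2/mill1/vibration")                       # True
c = topic_matches("plant/line1/+/temperature", "plant/line2/press1/temperature")  # False
```

    This is why an upstream IT application can watch one machine, one line, or the whole plant just by choosing its subscription filter; the broker does the routing.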

    The architectural vision of the Hilscher products is to leverage techniques and protocols that are already available for “crossing the edge” including, for example, HTTP and web services. But to address the needs of automation networking, the hard real-time benefits of protocols like PROFINET and EtherNet/IP will offer additional networking options.

    The approach is to use OPC UA as a technology that can exist on both sides of the divide between IT and OT, along with use of the MQTT protocol for transmitting data over long distances for SCADA implementations. The thinking is that both MQTT and OPC UA will become de facto standards for IIoT applications, and that AMQP has the potential to support data management functions needed for MES and ERP connectivity.

    With the Industry 4.0 initiative in Germany making progress, along with the work of the Industrial Internet Consortium, the Avnu Alliance, OPC UA, and the IEEE 802.1 Time Sensitive Networking standards groups, there is an unprecedented effort to unify industrial connectivity standards. But edge gateways, on the other hand, provide a way that companies can immediately utilize data from edge devices to implement solutions that don’t require these other technologies to move ahead with IoT applications.

    Use of MQTT, and potentially AMQP, provides connectivity options for linking to cloud services, and new options including OPC UA make implementing these systems more realizable.

    Reply
