What is edge computing? | Opensource.com


Edge computing is poised to boost the next generation of IoT technology into the mainstream. Here's how it works with the cloud to benefit business operations in all industries.

There is much speculation about edge replacing cloud, and in some cases, it may do so. However, in many situations, the two have a symbiotic relationship. 

The distributed nature of edge computing (when built right) means that along with reducing latency, it can also improve resiliency, reduce networking load, and make systems easier to scale.

Processing of data starts at its source, which reduces latency and makes systems more reliable, because networks always add latency and are never 100 percent reliable. Once initial processing at the edge is complete, only the data that needs further analysis or other services is sent to the cloud, which reduces networking requirements.
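A minimal Python sketch of this filter-at-the-source pattern; the anomaly threshold and the summary-plus-outliers shape are illustrative assumptions, not any specific platform's API:

```python
# Filter-at-the-source edge processing: summarize locally, forward only
# the readings that need deeper analysis. Threshold is an assumption.
ANOMALY_THRESHOLD = 75.0  # e.g. degrees Celsius

def process_at_edge(readings):
    """Return a local summary and the subset worth sending to the cloud."""
    summary = {"count": len(readings), "mean": sum(readings) / len(readings)}
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return summary, anomalies

readings = [21.3, 22.1, 80.2, 21.9]
summary, to_cloud = process_at_edge(readings)
# The network now carries one anomaly plus a small summary, not four raw readings.
```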

Over the next few years, we will see an explosion in this technology.

For example, services such as AWS Lambda may be overhauled to run functions at the edge location nearest to the request's origination point; we have already seen the first signs of this with AWS Lambda@Edge. Systems such as Lambda@Edge, AWS Greengrass, and Microsoft Azure IoT Edge are already making edge computing a key driver of IoT adoption.

We will also see the maturing of emerging edge technologies such as blockchain and fog computing. 

As edge matures, cloud computing will grow along with it.


  1. Tomi Engdahl says:

    This brings us to current edge solutions, of which there are many. Whether purely distributed systems such as blockchain and peer-to-peer or mixed systems such as AWS’s Lambda@Edge, Greengrass, and Microsoft Azure IoT Edge, edge computing has become a key factor driving the adoption of technologies such as IoT.

  2. Tomi Engdahl says:

    5 Ways To Navigate Fog IoT

    As fog computing moves toward broad deployment, I offer some tips to those designing, connecting to and using fog-based systems.

    Fog envelops the Internet of Things from end nodes to edge networks. Fog computing is a “horizontal, system level architecture that distributes computing, storage, control and networking functions closer to the users along the Cloud-to-Thing continuum,” as defined by the OpenFog Consortium, the group leading the development of an open, interoperable architecture for it.

    Fog computing is rapidly gaining momentum as the architecture that bridges the current gap in IoT, 5G and embedded AI systems. As the fog rolls out from its conception to a deployment phase, there is plenty of room for miscues, overlap and U-turns. So, here are five tips to help you find your way through it all.

    #1 Recognize where fog techniques are needed.

    Fog is about SCALE–Security, Cognition, Agility, Latency, Efficiency. When designing or building a system or app, take heed of certain warning signs which indicate sub-optimal functionality if implemented in traditional cloud-based systems.

    Fog is best known for slashing latency times, but it also helps reduce network bandwidth requirements by moving computation to or near an IoT end node. It also provides additional security robustness for data transfers, and it can improve cognition, reliability and geographic focus through local processing.

    #2 Span software across fog nodes North-South and East-West.

    Applications can be partitioned to run on multiple fog nodes in a network. This partitioning can be North-South where different parts of an application run hierarchically, up to and including the cloud at times. It can also be East–West for load balancing or fault tolerance between peer-level fog nodes. Partitioning can be adjusted as needed on millisecond time scales.
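    As a toy illustration of East-West partitioning, one could dispatch work to the least-loaded peer fog node; the node names and load figures below are made up:

```python
# Hypothetical East-West dispatch: send work to the least-loaded peer
# fog node. Node names and load values are illustrative only.
def pick_east_west_node(peer_loads):
    """Return the peer fog node with the lowest current load."""
    return min(peer_loads, key=peer_loads.get)

peers = {"fog-a": 0.82, "fog-b": 0.35, "fog-c": 0.61}
target = pick_east_west_node(peers)  # "fog-b"
```

    In a real deployment the load figures would come from node telemetry, and the choice could be re-evaluated on millisecond time scales as the text describes.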

    #3 Understand the pillars of the fog.

    OpenFog has identified eight pillars: Security, Scalability, Openness, Autonomy, RAS (Reliability, Availability, Serviceability), Agility, Hierarchy and Programmability. Each of these can be studied in depth in the OpenFog Reference Architecture.

    #4 Make fog software modular, linked by standard APIs.

    Software is the key to the performance, versatility and trustworthiness of fog implementations. Make it manageable and interoperable by carefully partitioning it into functional blocks.

    #5 Make each installation very easy.

    Global IoT applications will require the installation of millions of fog nodes over the next several years. Ensure the fog node hardware drops right in with simple mechanical and electrical connections for most scenarios.

    Pay attention to fog node aesthetics, power requirements, cable management, and so on to minimize environmental impact. Fog node provisioning and commissioning tests should be 100% automated nearly 100% of the time. Use pilot programs to optimize design of fog equipment for installation and maintenance.

  3. Tomi Engdahl says:

    IoT offers both cooperation and competition for the sperm whales of industry and the gorillas of IT

    The industrial internet puts the giants of industry and of IT into competition with each other, but it also offers opportunities for cooperation. GE Digital, which favors edge computing, believes it is far more efficient to process and analyze data as close as possible to the measuring points than to send huge volumes of data back and forth between devices and the cloud.

    Industrial giant General Electric is strengthening its Predix industrial-internet platform with machine learning and edge computing. The latter means that data is processed and analyzed close to the measurement point, so the results of predictive analysis arrive in real time with fewer and smaller data transfers.

    The goal of edge computing is to analyze data in real time, optimize network traffic and generate cost savings. Among industrial giants, GE's software subsidiary GE Digital has taken edge computing and machine learning the furthest, Network World writes.

    As suits an industrial company, GE Predix edge computing focuses on the needs of equipment maintenance and on anticipating machine downtime.

    Predix Edge, the platform GE launched last year, combines OT and IT systems so that data collected from different sources, such as orders, production and warehouses, flows into cloud-based ERP and supply-chain systems.

    Source: http://www.tivi.fi/CIO/iot-tarjoaa-kimppaa-ja-kilpailua-teollisuuden-kaskeloteille-ja-it-n-gorilloille-6684708

    GE adds edge analytics, AI capabilities to its industrial IoT suite
    GE is making a bid to influence industrial IoT by adding new features to its Predix platform-as-a-service offering.

  4. Tomi Engdahl says:

    Gartner Report – Maverick Research – The Edge Will Eat the Cloud

    The growth of the Internet of Things and the upcoming trend toward more immersive and interactive user interfaces will flip the center of gravity of data production and computing away from central data centers and out to the edge. This research contradicts prevailing views on the future of cloud computing, the topology of computing architectures and the nature of applications as we move toward digital business. Instead of continued growth of mega data centers, compute and storage will move toward the edge, due to the Internet of Things and new user/machine interfaces.

    Invest in architecture and technologies that help address the increased criticality and user experience that digital business demands.

    Build strategies for future applications that factor in latency, location, federation and autonomy, as well as determining placement at the edge, cloud and in between.

  5. Tomi Engdahl says:

    C’mon, edgelords: The APIs are ours to command – do we do good or evil?
    Edge computing is awesome and scary

    Edge computing is the pendulum swinging away from the idea of big, centralised servers back to distributed systems. It’s the idea that instead of centralising all of our workloads in big clouds we bring the computing closer to the devices requesting that compute power.

    The idea is that edge computing solves whole new classes of workloads for which the latency involved with cloud is just too high. The power of the edge is greater than the sum of its parts.

    The driverless car is held up as a good use case for edge computing: not just the vehicle as the device, but the vehicles festooned with devices. These cars will scan their surroundings and communicate with one another as well as incalculable other machines that will emit beacons of various types.

    While driverless cars will have a limited capacity to analyse problems in real time, there are real-world limits to how much compute power we can practicably and efficiently cram into them.

    If all cars in a given area had the ability to stream some or all of their data to a physically proximate server farm then that server farm could greatly enhance the decision-making capabilities of those vehicles.

    A driverless car isn’t going to do much better than a human; it can’t see around corners all that much better than we can. The driverless car around the corner, however, can see what’s going on in its vicinity. And multiple cars around multiple corners provide enough data to know what’s what, what’s where and start making predictions about the vectors of all the moving pieces. Maybe we even throw in some extra sensors on lampposts and the like to make life easier.

    Cloud data centres can be tens or even hundreds of milliseconds away. At 60 km/h, 100 ms is 1.67 metres. That's more than enough to kill someone. The speed of light is unforgiving that way. Place a local number cruncher nearby and your 100 ms round trip becomes 5-10 ms.
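    The arithmetic above can be checked in a few lines, using the speeds and latencies from the text:

```python
# Distance a vehicle travels during one network round trip.
def distance_during_latency(speed_kmh, latency_ms):
    """Metres travelled at speed_kmh during latency_ms."""
    return (speed_kmh / 3.6) * (latency_ms / 1000.0)

cloud_gap = distance_during_latency(60, 100)  # ~1.67 m, as in the article
edge_gap = distance_during_latency(60, 10)    # ~0.17 m with a local node
```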

    Of course, edge computing isn’t just about cars. There’s a strong Big Brother camp, too. They’ve been popping up at conferences promising to track patients at hospitals, children in schools, and prisoners in jails.

    Enter developers

    Cloud is becoming the way of doing business. When you disregard the sysadmin-facing Infrastructure as a Service (IaaS) and the user-facing Software as a Service (SaaS) portions of cloud computing today, what you are left with is Platform as a Service (PaaS). PaaS provides a pre-canned environment for developers to code in with no need for sysadmins. Alongside PaaS we have proprietary public cloud services ranging from serverless to BDCA tools like machine learning, artificial intelligence and so forth.

    Today’s modern applications work something like this: a script instantiates a series of PaaS environments. Application code is injected into these environments, creating a series of microservices. These microservices will listen for input data and then either farm that data directly out to a BDCA tool and then store the results or store the data and then run it through BDCA tools. Or both. The results are then made available to humans or machines to act on.
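    A toy sketch of that flow, with a trivial placeholder standing in for a real BDCA service (the function names here are made up):

```python
# A microservice receives input, runs it through a stand-in "BDCA" tool,
# and stores the result for humans or machines to act on. The tool is a
# placeholder, not a real cloud API.
results_store = []

def bdca_tool(text):
    """Placeholder for a service such as speech or vision recognition."""
    return {"words": len(text.split())}

def microservice_handler(payload):
    result = bdca_tool(payload)
    results_store.append(result)  # store results for later consumers
    return result

microservice_handler("edge computing reduces latency")
```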

    These BDCA tools are essentially filling a similar role to code libraries. Except instead of providing a simple function to convert XML to JSON they provide voice recognition, computer vision as a service, or join together a dozen different enterprise identity services to provide Single Sign On (SSO).

    We are already standing on the edge

    The edge is already in our workplaces and our homes. I mentioned vehicles and drones, but we also have Google’s Nest and even Amazon’s Alexa as early manifestations. With Nest, various Internet of Things devices report back to a central device. This device does some local decision making where real-time decisions matter and it farms the rest of the number crunching out to Google’s cloud.

    The API presented to us is the voice interface. The latency-sensitive portion of that API may consist of nothing more than "Hello, Alexa".

    I have become

    Some argue that edge computing is the true beginning of The Singularity. Machines already know things we’ll never understand. These individuals view the edge as a missing intermediate link in distributed machine learning. One that bridges the gap between low-powered, real-time, decision-making capability and the big number crunching capacity that centralised batch-job style clouds can offer.

    The truth is, we don't know what the edge will become, because we are the ones who will make that choice. The edge could enable machines to make our societies more efficient and capable than we can even imagine today.

  6. Tomi Engdahl says:

    AT&T Previews Edge Compute Plans
    Server/storage pods target low latency apps

    AT&T Labs has been quietly defining its concept of edge computers and is now slowly edging toward deploying them. Long term, the work has broad implications for the future design of both cloud and mobile systems.

    AT&T defines edge compute as groups of servers and storage systems placed on the periphery of its network to deliver low latency services. It foresees a wide variety of such systems that vary in size and location depending on the application and demand.

    “Edge compute is the next step in getting more out of our network, and we are busy putting together an edge computing architecture,” said Alicia Abella, a senior executive at AT&T Labs, in a keynote at the Fog World Congress here.

    “We want to deploy edge compute nodes in mobile data centers, in buildings, at customers’ locations and in our central offices. Where it is…depends on where there is demand, where we have spectrum, we are developing methods for optimizing the locations,” she said.

    The edge systems serve many uses. They aim to help AT&T reduce the volume of data it must carry to and from its core network. They also will enable higher quality for existing services and hopefully open the doors to new services as well.

    One clear application is running video analytics for surveillance cameras. Such edge systems might use GPUs, FPGAs or other accelerators and be located in cities.

    A more challenging use is handling jobs for automated vehicles because it would require a significant investment in roadside infrastructure but have an uncertain return-on-investment. Interestingly, AT&T now has 12 million smart cars on its network, a number growing by a million per quarter, she said.

  7. Tomi Engdahl says:

    A data center in a container when the cloud is too far away

    Automated production requires a latency of less than 1 millisecond between IT systems. According to Mikko Aho, Sales Manager at Rittal, a preconfigured, containerized data center would be the solution for many.

    Since automated production generates a huge amount of data, and since that data must be processed with very low delay, today's general-purpose clouds are not always fast enough.

    Thus, more and more industrial companies have introduced so-called edge data centers. As the name suggests, these are located at the edge of the production facilities or even directly inside the factory buildings.

    - Data centers placed close by can process the data stream required in automated production with low enough latency that the data can be used to manage production and logistics, Aho summarizes the benefits of the edge model.

    The most effective solution for an edge data center is a preconfigured, ready-made container unit. A data center housed in its own dedicated space can be erected beside the production plant, containing not only the servers but everything else as well: the racks and other components, the power supply and the cooling.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=7096&via=n&datum=2017-11-02_16:29:46&mottagare=30929

  8. Tomi Engdahl says:

    ZeroTier Edge: Open Source Enterprise VPN, SD-WAN

    An enterprise grade VPN, SD-WAN, and network virtualization device powered by open source software

  9. Tomi Engdahl says:

    Edge computing moves the open cloud beyond the data center

    Edge computing, like public cloud at scale, requires a convenient, powerful cloud software stack that can be deployed in a unified, efficient and sustainable way. Open source is leading the way.

    the biggest new opportunity: distributed cloud infrastructure.

    Today, almost every company in every industry sector needs near-instant access to data and compute resources to be successful. Edge computing pushes applications, data and computing power services away from centralized data centers to the logical extremes of a network, close to users, devices and sensors. It enables companies to put the right data in the right place at the right time, supporting fast and secure access. The result is an improved user experience and, oftentimes, a valuable strategic advantage. The decision to implement an edge computing architecture is typically driven by the need for location optimization, security, and most of all, speed.

  10. Tomi Engdahl says:

    Home> Systems-design Design Center > How To Article
    Edge servers ease network congestion

    Many upcoming networked applications demand massive bandwidths and real-time communication in small form-factor edge servers with dedicated interfaces. COM Express Type 7 server-on-module boards are appropriate platforms for designing such dedicated micro servers for the edges.
    Public and private network operators need to provide an appropriate infrastructure for 1 GbE (Gigabit Ethernet) enabled devices. As more and more devices get connected, they need to eliminate oversubscription ratios in 1 GbE switched networks.

    There are also many high-performance applications demanding increased speed. Application areas include but are not limited to:
    Access edges to broadcasting infrastructures
    Service provider datacenters for video and audio streaming as well as SaaS
    Local carrier-grade infrastructures for the mobile edge
    Metropolitan and larger private networks
    Cloud and edge servers on enterprise level
    Storage attached networks (SANs) for Big Data storage
    Intelligent switching technologies and smart NAS devices
    Fog servers in Industry 4.0 applications
    Edge nodes for wireless smart sensor networks
    Collaborative deep learning computers

  11. Tomi Engdahl says:

    Edge-computing is the new black – Distence News

    Legacy assets present an enormous opportunity

    Any company producing equipment for the industrial sector has a legacy, or installed base. In the industrial sector the life of an industrial asset can be anything from 10 to 40 years. During that period, depending on the asset, the Total Cost of Ownership (TCO) excluding the initial investment adds up to 70-90%. This includes energy, spare parts, maintenance and so on. It does not take into account additional secondary costs, such as lost sales from downtime, failures caused to other assets and other ripple effects. This is why the installed base of industrial assets is regarded by many as the biggest opportunity for the Industrial IoT. The potential savings are enormous, and so are the revenue opportunities.

    The Distence solution is designed to be a flexible, focused and robust approach to taking those legacy assets quickly and efficiently under control, trusted by global players in the energy business. Our goal is to help our customers get closer to their customers, and to increase transparency, optimization and control of installed assets. Edge computing is here and ready to serve the business and optimize that TCO.

  12. Tomi Engdahl says:

    IoT Edge Gateways Emerging as Enterprise Connectivity Option

    Edge gateways are using the MQTT protocol as a way to enable secure data flow between edge devices and the cloud, and create a new class of IIoT connectivity.

    Industrial IoT edge gateways represent an emerging product category and key technology for connecting both legacy controllers and edge devices to the Internet of Things (IoT). Along with integrating a variety of protocols for networking, they provide resources for managing data storage and analytics, along with enabling secure data flow between edge devices and cloud services.

    New Option for IIoT Applications

    The idea behind edge gateways is to provide a way to use data already created by operational technology and translate data from existing control applications into IoT-friendly formats that can be easily accessed via the cloud. One argument is that, because data is often aggregated and displayed for operators on a Human Machine Interface (HMI), the same data can be presented to other users and provides a powerful additional capability for IoT applications.

    By building MQTT support into the HMI, data can be organized into topics and presented to upstream IT applications in a flexible, modular and efficient way. According to the Maple Systems website, “The MQTT broker is responsible for maintaining client connections and sending/receiving messages. Client devices, edge gateways, and IT applications (or publishers/subscribers in MQTT language), are freed up to focus on producing and consuming data.”

    Because MQTT is a lightweight protocol and up to 1,000 connected clients can be easily supported, the ability to publish data as topics provides an effective method for monitoring a specific machine or industrial process.
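    A sketch of that topic organization in Python; the plant/machine/sensor naming convention here is an illustrative assumption, and a real deployment would hand these pairs to an MQTT client library:

```python
# Map sensor readings onto MQTT (topic, payload) pairs. The topic
# hierarchy used here is an assumed convention, not a standard.
import json

def to_mqtt_messages(plant, machine, readings):
    """Return one (topic, JSON payload) pair per sensor reading."""
    return [
        (f"{plant}/{machine}/{sensor}", json.dumps({"value": value}))
        for sensor, value in readings.items()
    ]

msgs = to_mqtt_messages("plant1", "press7", {"temp": 68.5, "rpm": 1200})
# e.g. first pair: ("plant1/press7/temp", '{"value": 68.5}')
```

    Subscribers interested in one machine or one sensor can then filter by topic rather than parsing a monolithic data stream.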

    The architectural vision of the Hilscher products is to leverage techniques and protocols that are already available for “crossing the edge” including, for example, HTTP and web services. But to address the needs of automation networking, the hard real-time benefits of protocols like PROFINET and EtherNet/IP will offer additional networking options.

    The approach is to use OPC UA as a technology that can exist on both sides of the divide between IT and OT, along with use of the MQTT protocol for transmitting data over long distances for SCADA implementations. The thinking is that both MQTT and OPC UA will become de facto standards for IIoT applications, and that AMQP has the potential to support data management functions needed for MES and ERP connectivity.

    With the Industry 4.0 initiative in Germany making progress, along with the work of the Industrial Internet Consortium, the Avnu Alliance, OPC UA, and the IEEE 802.1 Time Sensitive Networking standards groups, there is an unprecedented effort to unify industrial connectivity standards. But edge gateways, on the other hand, provide a way that companies can immediately utilize data from edge devices to implement solutions that don’t require these other technologies to move ahead with IoT applications.

    Use of MQTT, and potentially AMQP, provides connectivity options for linking to cloud services, and new options including OPC UA make implementing these systems more realizable.

  13. Tomi Engdahl says:

    Artificial Intelligence and Deep Learning at the Edge

    Increasingly, embedded systems are required to perform ‘AI and DL on the Edge’; i.e., the edge of the Internet where sensors and actuators interface with the real world.

    Things are progressing apace with regard to artificial intelligence (AI), artificial neural networks (ANNs), and deep learning (DL). Some hot-off-the-press news is that CEVA has just unveiled its NeuPro family of processors for AI/DL at the edge

    Like all of CEVA’s hardware offerings, NeuPro processors are presented in the form of intellectual property (IP) that designers can deploy on FPGAs or integrate into their System-on-Chip (SoC) devices.

    We start by defining our ANN architecture and capturing it using an appropriate system like Caffe or Google’s TensorFlow. Next, we “train” our network using hundreds of thousands or millions of images. At this stage we need a lot of accuracy, which means we’re typically working with 32-bit floating-point values.

    The next step is to convert our 32-bit floating-point network into a 16-bit or 8-bit fixed-point equivalent that is suitable for deployment in an FPGA or on an SoC (fixed-point representations are used to boost performance while lowering power consumption).
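    The 32-bit-to-8-bit conversion step can be sketched with symmetric quantization; real deployment toolchains are considerably more sophisticated than this:

```python
# Symmetric 8-bit quantization of float32 weights: a single scale factor
# maps the largest magnitude onto 127. A minimal sketch only.
import numpy as np

def quantize_int8(weights):
    """Return int8 weights and the scale needed to recover floats."""
    scale = np.abs(weights).max() / 127.0
    return np.round(weights / scale).astype(np.int8), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.0, 1.0], dtype=np.float32)
q, scale = quantize_int8(w)      # integers in [-127, 127]
w_approx = dequantize(q, scale)  # close to w, small rounding error
```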

  14. Tomi Engdahl says:

    Intel’s CES 2018 event in 15 minutes

    Intel presents Neuromorphic Computing at CES 2018

  15. Tomi Engdahl says:

    2018 CES: Neuromorphic Computing Mimics the Human Brain

    Intel’s neuromorphic computing team is creating a chip that mimics the way the human brain observes, learns and understands. The company’s prototype chip “Loihi” is the most recent step in that direction. Neuromorphic computing has the potential to change the future.

  16. Tomi Engdahl says:

    CES 2018: How NVIDIA Is Shrinking Computing Hardware

    Gaming computers, data centers, supercomputers, autonomous driving… new products presented by NVIDIA CEO Jensen Huang during the CES press conference bring more power in smaller spaces and increased reliability to self-driving cars. January 2018.

  17. Tomi Engdahl says:

    CES 2018 Keynote

    Huawei Consumer Business Group CEO Richard Yu takes the stage at #CES2018 to launch the #HUAWEIMate10Pro in the U.S. market, the world’s first smartphone with the Kirin AI processor. It’s time for an intelligent experience.

  18. Tomi Engdahl says:

    Intel’s latest chip is designed for computing at the edge

    As we develop increasingly sophisticated technologies like self-driving cars and industrial internet of things sensors, it’s going to require that we move computing to the edge. Essentially this means that instead of sending data to the cloud for processing, it needs to be done right on the device itself because even a little bit of latency is too much.

    Intel announced a new chip today, called the Intel Xeon D-2100 processor, to help customers who want to move computing to the edge.

  19. Tomi Engdahl says:

    Computing at the Edge of IoT

    Dave Smith
    Android+Embedded. Developer Advocate, IoT @ Google.
    Feb 9
    Over the past year, we’ve had some great conversations with developers about building IoT devices with the Android Things platform. A common question that comes up is whether the platform is suitable for the Internet of Things (IoT) given that the hardware is much more powerful than the microcontrollers typically found in the space today. To answer that question, let’s examine how hardware choice and use case requirements factor into different IoT system architectures.

    “Computer programmer’s single microchip” by Brian Kostiuk on Unsplash
    You already lost me, what’s a microcontroller?
    Microcontrollers (MCUs) are simple, programmable, and fully integrated systems typically used for embedded control. In addition to a processor (CPU), they generally include all of the memory and peripheral interfaces necessary on a single chip. This simplicity and integration means that MCUs are relatively inexpensive and generally consume very little power. Many popular hardware development platforms, such as Arduino, are built on top of MCUs.

    MCUs generally do not have the resources (such as a Memory Management Unit) to run a higher-level operating system like Linux or Android. Nor can they interface with high-speed peripherals like high-resolution cameras and displays. However, because the application code runs much closer “to the metal”, MCUs are very effective in real-time applications where timing is critical. Production MCUs often run some flavor of a real-time operating system (RTOS) to ensure tasks run in the exact amount of time required to ensure precise measurement and control.

    All of these characteristics start to define applications where MCUs are a perfect fit…and where they aren’t.

    The race to the cloud
    Systems focused primarily (or entirely) on MCU hardware are based on what I’ll call the cloud first architecture. In this architecture, every edge device is connected directly to the internet (usually through WiFi), then provisioned and managed through a cloud service such as Google’s Cloud IoT Core. Inexpensive MCU platforms with built-in WiFi stacks, like the popular ESP8266, make designing systems like this very attractive.

    In these systems, complex data analysis and decision making tasks are handled in the cloud back-end, while the device nodes perform data collection tasks or respond to remote control commands.

    Overall, this is a nice balance. Hardware is inexpensive to replace and can run on small batteries for multiple years, and heavy compute resources are provided by cloud services that are easy to scale up

    MCU hardware and a cloud-first architecture perform well in applications where bandwidth and network latency are less of a concern. Data payloads are small and can be uploaded to the cloud in batches. Here are some examples:

    Distributed sensor-based data collection
    Mobile asset monitoring and tracking
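    The batching idea behind those examples can be sketched as follows; the batch size and the upload stub are assumptions:

```python
# Cloud-first node: buffer small readings locally and upload them in
# batches to cut per-message network overhead. Upload is a stand-in.
BATCH_SIZE = 8
uploaded_batches = []
buffer = []

def collect(reading):
    """Buffer a reading; flush a full batch to the cloud back-end."""
    buffer.append(reading)
    if len(buffer) >= BATCH_SIZE:
        uploaded_batches.append(list(buffer))  # stand-in for an HTTPS POST
        buffer.clear()

for i in range(20):
    collect({"seq": i, "temp": 20 + i * 0.1})
# 20 readings -> 2 uploads of 8 each; 4 readings remain buffered locally
```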

    Living on the edge
    There is a trend in IoT systems moving towards edge computing, enabled by the smartphone economy driving down the cost (and power consumption) of more capable hardware.

    Consider the Google Home device: while the Google Assistant functionality is cloud-driven, the hotword detection happens locally on the device.

    This is also true in industrial automation systems where latency and reliability are critical. In these systems, one or more intermediate gateway devices act as an interface between local edge devices (which may be MCU-powered) and any cloud services.

    If we remove the need to connect each device to a cloud service, we can substitute a more power-efficient local network transport

    The advancements in both artificial intelligence (AI) and machine learning (ML) promise to have big effects on IoT systems in the coming years. The ability for ML algorithms to find patterns and make predictions based on data collected by devices will quickly become a necessary component to the success

    It all comes down to evaluating what your system’s needs truly are, and selecting the right tools for the job.

  20. Tomi Engdahl says:

    New Class of Embedded Design Emerges to Support Virtualized Fog Servers

    Real time has taken on a new dimension with the advent of Industry 4.0. It is no longer enough for controls to communicate with sensors and actuators. Today, real-time communication is also required between industrial plants and machines as well as their incoming and outgoing systems, a demand being met by real-time-capable virtualized fog servers with redundant design for high availability.

  21. Tomi Engdahl says:

    Tom Krazit / GeekWire:
    Cloudflare launches Cloudflare Workers, an edge computing service for developers using its network, charging devs $0.50 for every 1M tasks used by their apps — Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize …

    Cloudflare to open an edge computing service for developers using its network

    Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize on the proximity benefits provided by edge computing.

    Cloudflare Workers was first introduced last September, and Cloudflare is expected to announce Tuesday that it is now generally available for developers to check out. The new service runs on hardware that Cloudflare has installed in more than 125 data centers around the world to power its anti-DDoS (distributed denial of service) attack service, and it allows developers to write JavaScript applications through the Service Worker API that will run much closer to their users than might otherwise be possible with standard cloud services.

    “For quite some time, we have understood that there is real power in deploying applications that ran incredibly close to where users are on the internet,”

    About 1,000 users have been playing with Cloudflare Workers since the company opened the service up to a broader beta program in January following the September announcement. “I’ve been surprised by how dramatically different all of the applications people have built are; it doesn’t feel like there is a bound to them yet,” Prince said.

    The benefits of edge computing are just starting to make their way into the world, although lots of folks have been talking about it for a while. It’s a recognition of the fact that as connected devices spread throughout the world, it quickly makes more sense to execute a lot of the code running those devices as physically close to them as possible, as waiting for instructions from a remote cloud data center won’t always cut it for real-time IoT devices.

