Computing at the Edge of IoT – Google Developers – Medium

https://medium.com/google-developers/computing-at-the-edge-of-iot-140a888007b
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.

Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest cost hardware powered by MCUs.

Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.

310 Comments

  1. Tomi Engdahl says:

    Tech Talk: Data-Driven Design
    https://semiengineering.com/tech-talk-data-driven-design/

    How more data is shifting memory architectures.

    Reply
  2. Tomi Engdahl says:

    Defining Edge Memory Requirements
    https://semiengineering.com/defining-edge-memory-requirements/

    Edge compute covers a wide range of applications. Understanding bandwidth and capacity needs is critical.

    Defining edge computing memory requirements is a growing problem for chipmakers vying for a piece of this market, because it varies by platform, by application, and even by use case.

    Edge computing plays a role in artificial intelligence, automotive, IoT, data centers, and wearables, and each has significantly different memory requirements. So it’s important to have memory requirements nailed down early in the design process, along with the processing units and the power, performance, and area tradeoffs.

    “In the IoT space, ‘edge’ to companies like Cisco is much different than ‘edge’ to companies like NXP,” observed Ron Lowman, strategic marketing manager for IoT at Synopsys. “They have completely different definitions and the scale of the type of processing required looks much different. There are definitely different thoughts out there on what edge is. The hottest trend right now is AI and everything that’s not data center is considered edge because they’re doing edge inference, where optimizations will take place for that.”

    Reply
  3. Tomi Engdahl says:

    Tech Talk: Connected Intelligence
    A look at the slowdown in Moore’s Law, and what comes next.
    https://semiengineering.com/tech-talk-connected-intelligence/

    Gary Patton, CTO at GlobalFoundries, talks about computing at the edge, the slowdown in scaling, and why new materials and packaging approaches will be essential in the future.

    Reply
  4. Tomi Engdahl says:

    Digital twin AIs designed to learn at the edge
    https://www.controleng.com/single-article/digital-twin-ais-designed-to-learn-at-the-edge/8e321e6ad54fd763ddcf50f39c8deb19.html?OCVALIDATE=

    Artificial intelligence (AI) startup SWIM is aiming to democratize both AI and digital twin technologies by placing them at the edge, removing the need for large-scale number-crunching and making them affordable.

    With Pure Storage and NVIDIA recently launching their artificial intelligence (AI) supercomputer, it is easy to believe enterprise-grade AI is solely about throwing massive number-crunching ability at Big Data sets and seeing what patterns emerge. But while these technologies notionally are aimed at all types of business, the cost of optimized AI hardware that can be slotted into a data center may be too high for many organizations.

    At the other end of the scale are technologies such as IBM’s Watson and Watson Assistant—which can be deployed as cloud services—and of course numerous suite-based AI tools currently offered by many companies. However, for many Internet of Things (IoT) and connected-device deployments, neither data center nor cloud options are realistic, which is why many AI systems are moving elsewhere, fast.

    For time-critical processing—such as when an autonomous vehicle needs to avoid a collision—the edge environment and the distributed core are where the real number crunching needs to take place. This is why companies such as Microsoft and Dell have announced new IoT strategies focused principally on the edge and/or the distributed core. The ability to add AI at the edge is an increasingly important element in the IoT, avoiding the need to transfer large amounts of data to supercomputers or the cloud and back again to IoT networks.

    Startup SWIM.AI aims to “turn any edge device into a data scientist.”

    The company’s AI edge product, EDX, is designed to autonomously build digital twins directly from streaming data in the edge environment. The system is built for the emerging IoT world in which real-world devices are not just interconnected, but also offer digital representations of themselves, which can be automatically created from, and continually updated by, data from their real-world siblings.

    Digital twins are digital representations of a real-world object, entity, or system, and are created either purely in data or as 3-D representations of their physical counterparts.

    SWIM’s EDX system is designed to enable digital twins to analyze, learn, and predict their future states from their own real-world data. In this way, systems can use their own behavior to train accurate behavioral models via deep neural networks.
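    To make that idea concrete, here is a minimal sketch, assuming nothing about SWIM’s actual EDX implementation: a twin object that fits a small auto-regressive model to its real-world sibling’s streaming readings, entirely on the edge device, and reports residuals that can drive local alerts.

        # Illustrative digital-twin sketch (not SWIM's EDX API): learn an
        # auto-regressive model of the real-world sibling from streaming data.
        import numpy as np

        class DigitalTwin:
            def __init__(self, order=4, lr=0.01):
                self.w = np.zeros(order)        # learned model weights
                self.history = np.zeros(order)  # last `order` observations
                self.lr = lr                    # learning rate for online updates

            def predict(self):
                """Predict the next sensor value from recent history."""
                return float(self.w @ self.history)

            def observe(self, value):
                """Ingest one streamed reading and update the model (LMS rule)."""
                error = value - self.predict()
                self.w += self.lr * error * self.history   # online gradient step
                self.history = np.roll(self.history, 1)
                self.history[0] = value
                return error                               # residual, useful for alerts

        # Usage: feed readings as they arrive at the edge; large residuals can
        # trigger local alerts without any round trip to the cloud.
        twin = DigitalTwin()
        for t in range(200):
            reading = np.sin(t / 10.0) + np.random.normal(0, 0.05)
            residual = twin.observe(reading)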

    Gartner views digital twins as one of the top strategic enterprise trends in 2018. However, a key challenge is how enterprises can implement the technology, given their investments in legacy assets.

    SWIM believes limited skill sets in streaming analytics, coupled with an often poor understanding of the assets that generate data within complex IoT systems, make deploying digital twins too complex for some. Meanwhile, the prohibitive cost of some digital twin infrastructures puts other organizations off.

    Reply
  5. Tomi Engdahl says:

    Edge Devices are Hot for IoT
    https://www.mentor.com/products/mechanical/resources/overview/edge-devices-are-hot-for-iot-d7527fdf-fff6-4ccb-9811-fe242a8f77d4?uuid=d7527fdf-fff6-4ccb-9811-fe242a8f77d4&contactid=1&PC=L&c=2018_06_21_mad_eedge_q2_issue_2

    Dell has continuously improved its IoT Edge Gateway models; the model 5100 is a fan-less, convection-cooled design built to operate reliably in extreme temperatures and harsh industrial or enterprise environments while it helps connect endpoints.

    Reply
  6. Tomi Engdahl says:

    Addressing ‘Memory Wall’ is Key to Edge-Based AI
    https://www.eetimes.com/document.asp?doc_id=1333456

    Addressing the “memory wall” and pushing for a new architectural solution enabling highly efficient performance computing for rapidly growing artificial intelligence (AI) applications are key areas of focus for Leti, the French technology research institute of CEA Tech.

    Speaking to EE Times at Leti’s annual innovation conference here, Leti CEO Emmanuel Sabonnadière said there needs to be a highly integrated and holistic approach to moving AI from software and the cloud into an embedded chip at the edge.

    “We really need something at the edge, with a different architecture that is more than just CMOS, but is structurally integrated into the system, and enable autonomy from the cloud — for example for autonomous vehicles, you need independence of the cloud as much as possible,” Sabonnadière said.

    Reply
  7. Tomi Engdahl says:

    With its Snowball Edge, AWS now lets you run EC2 on your factory floor
    https://techcrunch.com/2018/07/17/with-its-snowball-edge-aws-now-lets-you-run-ec2-on-your-factory-floor/?sr_share=facebook&utm_source=tcfbpage

    AWS’s Snowball Edge devices aren’t new, but they are getting a new feature today that’ll make them infinitely more interesting than before. Until now, you could use the device to move lots of data and perform some computing tasks on them, courtesy of the AWS Greengrass service and Lambda that run on the device. But AWS is stepping it up and you can now run a local version of EC2, the canonical AWS compute service, right on a Snowball Edge.

    Reply
  8. Tomi Engdahl says:

    It’s worth noting that this was also the original idea behind OpenStack (though setting that up is far more complicated than ordering a Snowball Edge) and that Microsoft, with Azure Stack and its various edge computing services, offers similar capabilities.

    https://techcrunch.com/2018/07/17/with-its-snowball-edge-aws-now-lets-you-run-ec2-on-your-factory-floor/?sr_share=facebook&utm_source=tcfbpage

    Reply
  9. Tomi Engdahl says:

    Google is making a fast specialized TPU chip for edge devices and a suite of services to support it
    https://techcrunch.com/2018/07/25/google-is-making-a-fast-specialized-tpu-chip-for-edge-devices-and-a-suite-of-services-to-support-it/?sr_share=facebook&utm_source=tcfbpage

    In a pretty substantial move into trying to own the entire AI stack, Google today announced that it will be rolling out a version of its Tensor Processing Unit — a custom chip optimized for its machine learning framework TensorFlow — optimized for inference in edge devices.

    Reply
  10. Tomi Engdahl says:

    Announcing the New AIY Edge TPU Boards
    Custom ASIC for accelerated machine learning on the edge
    https://blog.hackster.io/announcing-the-new-aiy-edge-tpu-boards-98f510231591

    Earlier this morning, during his keynote at the Google Next conference in San Francisco, Injong Rhee, the VP of IoT, Google Cloud, announced two new AIY Project boards—the AIY Projects Edge TPU Dev Board, and the Edge TPU Accelerator—both based around Google’s new purpose-built Edge TPU.

    Reply
  11. Tomi Engdahl says:

    Bringing intelligence to the edge with Cloud IoT
    https://www.blog.google/products/google-cloud/bringing-intelligence-to-the-edge-with-cloud-iot/

    But just as opportunities increase with IoT, so does data. IDC estimates that the total amount of data generated from connected devices will exceed 40 trillion gigabytes by 2025. This is where advanced data analytics and AI systems can help, to extract insights from all that data quickly and easily.

    There are also many benefits to be gained from intelligent, real-time decision-making at the point where these devices connect to the network—what’s known as the “edge.” Manufacturing companies can detect anomalies in high-velocity assembly lines in real time. Retailers can receive alerts as soon as a shelved item is out of stock. Automotive companies can increase safety through intelligent technologies like collision avoidance, traffic routing, and eyes-off-the-road detection systems.

    But real-time decision-making in IoT systems is still challenging due to cost, form factor limitations, latency, power consumption, and other considerations. We want to change that.

    Reply
  12. Tomi Engdahl says:

    MicroZed Chronicles: The Ultra96 and Machine Learning
    https://blog.hackster.io/microzed-chronicles-the-ultra96-and-machine-learning-3b8684a82059

    A short time ago we looked at the Ultra96 board. One of the great use cases for the Ultra96 is implementing machine learning at the edge.

    Reply
  13. Tomi Engdahl says:

    Announcing the New AIY Edge TPU Boards
    https://blog.hackster.io/announcing-the-new-aiy-edge-tpu-boards-98f510231591

    Custom ASIC for accelerated machine learning on the edge

    Reply
  14. Tomi Engdahl says:

    Deep Learning at the Edge on an Arm Cortex-Powered Camera Board
    https://blog.hackster.io/deep-learning-at-the-edge-on-an-arm-cortex-powered-camera-board-3ca16eb60ef7

    It’s no secret that I’m an advocate of edge-based computing, and after a number of years where cloud computing has definitely been in ascendency, the swing back towards the edge is now well underway. Driven, not by the Internet of Things as you might perhaps expect, but by the movement of machine learning out of the cloud.

    Reply
  15. Tomi Engdahl says:

    Create Intelligence at the Edge with the Ultra96 Board
    https://blog.hackster.io/create-intelligence-at-the-edge-with-the-ultra96-board-446cd153fa85

    What intelligent applications could you create with the power of programmable logic?

    Reply
  16. Tomi Engdahl says:

    Google unveils tiny new AI chips for on-device machine learning
    https://www.theverge.com/2018/7/26/17616140/google-edge-tpu-on-device-ai-machine-learning-devkit

    The hardware is designed for enterprise applications, like automating quality control checks in a factory.

    Reply
  17. Tomi Engdahl says:

    DWDM Optical Modules Take It to the Edge
    https://www.lightwaveonline.com/articles/2018/04/dwdm-optical-modules-take-it-to-the-edge.html?cmpid=enl_lightwave_lightwave_enabling_technologies_2018-08-02&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2193929

    The need for low latency and quality of service is driving cloud traffic ever closer to the edge of the network. In response, cloud providers are moving toward a new distributed data center architecture of multiple edge data centers rather than a single mega-data center in a geographic market. This distributed data center model requires an orders-of-magnitude increase in optical connectivity among the edge data centers to ensure reliable and robust service quality for the end users.

    As a result, the industry is clamoring for low-cost and high-bandwidth transceivers between network elements. The advent of pluggable 100G Ethernet DWDM modules in QSFP28 form factor holds the promise of superior performance, tremendous cost savings, and scalability.

    Moving data to the edge

    According to Cisco, global IP traffic will increase nearly threefold over the next 5 years, and will have increased 127-fold from 2005 to 2021. In addition, almost half a billion (429 million) mobile devices and connections were added in 2016. Smartphones accounted for most of that growth, followed by machine-to-machine (M2M) modules. As these devices continue to multiply, the need to bring the data center closer to the sources, devices, and networks all producing data is driving the shift to the network’s edge.

    With 5G on the horizon, bandwidth will continue to be a major challenge. Cisco predicts that although 5G will only be 0.2% of connections (25 million) by 2021, it will generate 4.7 times more traffic than the average 4G connection.

    The exponential increase in point-to-point connections and the growing bandwidth demands of cloud service providers (CSPs) have driven demand for low-cost 100G optical communications. However, in contrast to a more traditional data center model (where all the data center facilities reside in a single campus), many CSPs have converged on distributed regional architectures to be able to scale sufficiently and provide cloud services with high availability and service quality. Pushing data center resources to the network’s edge and thereby closer to the consumer and enterprise customers reduces latency, improves application responsiveness, and enhances the overall end-user experience.

    Reply
  18. Tomi Engdahl says:

    Energy At The Edge
    How much energy will billions of complex devices require?
    https://semiengineering.com/energy-at-the-edge/

    Ever since the first mention of the IoT, everyone assumed there would be billions of highly efficient battery-powered devices that drew milliwatts of energy. As it turns out, we are about to head down a rather different path.

    The enormous amount of data that will be gathered by sensors everywhere cannot possibly be sent to the cloud for processing. The existing infrastructure cannot handle it, and there are doubts that even 5G using millimeter-wave technology would suffice. This realization, which has become a hot topic of discussion across the electronics industry in the past few months, has broad implications.

    Rather than a collection of dumb, simple, mass-produced sensors, edge devices will have to be much more sophisticated and do far more processing than previously thought. They will need to assess which data should be relayed to data centers, which data should be stored locally, and which data can be thrown away.

    These are complex transactions by any metric. Data types vary greatly. Vision data is different from voice data, which is different again from data about mechanical vibration or near-field scans from an industrial or commercial operation. Understanding how data can be used, and what is useful within that data, requires sophisticated collection, partitioning and purging, which is the kind of stuff that today is being done by very powerful computers.
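    As a concrete, purely illustrative sketch of that relay/store/discard decision, an edge device might triage each reading against its recent history; the thresholds and policy below are assumptions, not a standard.

        # Illustrative triage of sensor readings at the edge.
        from collections import deque
        from statistics import mean, pstdev

        RECENT = deque(maxlen=100)   # rolling window of recent readings

        def triage(reading, relay_sigma=3.0, store_sigma=1.0):
            """Return 'relay', 'store', or 'discard' for one sensor reading."""
            RECENT.append(reading)
            if len(RECENT) < 10:
                return "store"                       # not enough context yet
            mu, sigma = mean(RECENT), pstdev(RECENT) or 1e-9
            deviation = abs(reading - mu) / sigma
            if deviation >= relay_sigma:
                return "relay"                       # anomalous: send to the data center
            if deviation >= store_sigma:
                return "store"                       # mildly interesting: keep locally
            return "discard"                         # routine: drop to save bandwidth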

    How this affects electricity usage on a global scale remains to be seen.

    Reply
  19. Tomi Engdahl says:

    More Processing Everywhere
    https://semiengineering.com/more-processing-everywhere/

    Arm’s CEO contends that a rise in data will fuel massive growth opportunities around AI and IoT, but there are significant challenges in making it all work properly.

    Reply
  20. Tomi Engdahl says:

    Pace Quickens As Machine Learning Moves To The Edge
    https://semiengineering.com/pace-quickens-as-machine-learning-moves-to-the-edge/

    More powerful edge devices means everyday AI applications, like social robots, are becoming feasible.

    Reply
  21. Tomi Engdahl says:

    More Performance At The Edge
    https://semiengineering.com/more-performance-at-the-edge/

    Scaling is about to take on a whole different look, and not just from shrinking features.

    Shrinking features has been a relatively inexpensive way to improve performance and, at least for the past few decades, to lower power. While device scaling will continue all the way to 3nm and maybe even further, it will happen at a slower pace. Alongside of that scaling, though, there are different approaches on tap to ratchet up performance even with chips developed at older nodes.

    This is particularly important for edge devices, which will be called on to do pre-processing of an explosion of data. Performance improvements there will come from a combination of more precise design, less accurate processing for some applications, and better layout using a multitude of general-purpose and specialized processors. There also will be different packaging options available, which will help with physical layouts to shorten the distance between processors and both memory and I/O. And there will be improvements in memory to move data back and forth faster using less power.
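    As one small example of the “less accurate processing” lever, quantizing 32-bit weights to 8-bit integers trades a little accuracy for smaller, faster, lower-power arithmetic at the edge; the numbers here are illustrative only.

        # Minimal per-tensor int8 quantization sketch.
        import numpy as np

        def quantize_int8(w):
            """Map float weights onto int8 with a single per-tensor scale factor."""
            scale = np.max(np.abs(w)) / 127.0
            q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
            return q, scale

        def dequantize(q, scale):
            return q.astype(np.float32) * scale

        weights = np.random.randn(1000).astype(np.float32)
        q, scale = quantize_int8(weights)
        error = np.abs(weights - dequantize(q, scale)).mean()
        print(f"mean absolute quantization error: {error:.5f}")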

    Reply
  22. Tomi Engdahl says:

    All edge data centers require these 3 things
    https://www.cablinginstall.com/articles/pt/2018/06/all-edge-data-centers-require-these-3-things.html?cmpid=enl_cim_cim_data_center_newsletter_2018-06-19&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2164352

    “large, centralized data centers house the racks, servers and other hardware needed to support cloud services, content delivery networks, customized enterprise workloads and other functionality. With the emergence of 5G and the latency reductions needed to support a range of applications like mobile gaming, industrial automation and autonomous driving, for instance, there’s a concurrent move to take that centralized compute power and distribute it to edge data centers.”

    “If you think about what you need to build edge computing infrastructure, you need three things: A way to house the equipment and cool it. You need real estate and ideally real estate that’s colocated with the wireless network infrastructure…and the third thing you need is fiber in order to interconnect to other sites, backhaul networks and peering sites.”

    Edge data centers need three things: Equipment housing, real estate and fiber
    https://www.rcrwireless.com/20180619/network-infrastructure/edge-data-centers-real-estate-fiber-tag17

    Vapor IO working with Crown Castle to deploy edge data centers

    Reply
  23. Tomi Engdahl says:

    Local data center can serve a local cloud
    https://www.controleng.com/single-article/local-data-center-can-serve-a-local-cloud/f7259530e3e8d24726cfd00361e10b96.html?OCVALIDATE=

    Technology Update: Smart data management may include keeping a data center on location as part of a cybersecurity strategy, for manufacturers, aviation, defense, and other applications. An example shows how.

    For data-intensive industries such as manufacturing, aviation, defense, energy, and healthcare, debate continues about application of cloud computing; smart data management may include an on-site component to augment or replace massive off-site data center storage. No approach works for all situations but taking the cloud from the sky and adding local storage can be an option.

    “For hospitals, manufacturers, and many other industries, there’s a big struggle right now as to the right mix between the cloud and internal management of data,” said Bob Venero, CEO and founder of Future Tech Enterprise Inc. “It’s about ensuring the availability of data. If the connection to the cloud goes down, they still need to be able to work. The other challenge is figuring out which data sets are classified in which area. It’s all a delicate balance across many industries.”

    Smart hybrid data management

    Future Tech Enterprise Inc. is a proponent of iFortress, a modular, hermetically sealed, flexible data center design. Combining on-premise and cloud solutions on different hardware platforms can be a favorable combination for many applications.

    Organizations should analyze the potential risks and benefits of cloud use; there’s often a good case for non-sensitive data to be stored in the cloud with the goal of reducing IT costs and driving efficiency.

    The convenience-related benefits of cloud options can be incorporated into on-premise locations.

    An internal cloud network was constructed in collaboration with a major security company to “provide services and record operations. Everything syncs back to a main data center with information never crossing into the public cloud,” Venero said.

    Other possibilities include investing in advanced hardware options with superior computing power for data-intensive industries such as healthcare. Incorporating hardware-agnostic software is another option that helps reduce costs and provides flexibility.

    Reply
  24. Tomi Engdahl says:

    AI Flood Drives Chips to the Edge
    Deep learning spawns a silicon tsunami
    https://www.eetimes.com/document.asp?doc_id=1333413

    It’s easy to list semiconductor companies working on some form of artificial intelligence — pretty much all of them are. The broad potential for machine learning is drawing nearly every chip vendor to explore the still-emerging technology, especially in inference processing at the edge of the network.

    “It seems like every week, I run into a new company in this space, sometimes someone in China that I’ve never heard of,” said David Kanter, a microprocessor analyst at Real World Technologies.

    Deep neural networks are essentially a new way of computing. Instead of writing a program to run on a processor that spits out data, you stream data through an algorithmic model that filters out results in what’s called inference processing.
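    A bare-bones illustration of that point: inference is just streaming data through a fixed model. The toy two-layer network and its random weights below stand in for a model trained offline.

        # Streaming sensor frames through a tiny fixed network (inference only).
        import numpy as np

        rng = np.random.default_rng(0)
        W1, b1 = rng.standard_normal((16, 8)), np.zeros(16)   # layer 1 (trained offline)
        W2, b2 = rng.standard_normal((3, 16)), np.zeros(3)    # layer 2 (trained offline)

        def infer(x):
            """Stream one 8-value sensor frame through the model, return class scores."""
            h = np.maximum(W1 @ x + b1, 0.0)      # ReLU hidden layer
            return W2 @ h + b2                    # raw scores for 3 classes

        for frame in rng.standard_normal((5, 8)):  # pretend these arrive from a sensor
            scores = infer(frame)
            print(int(np.argmax(scores)))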

    Reply
  25. Tomi Engdahl says:

    Digital twin AIs designed to learn at the edge
    https://www.controleng.com/single-article/digital-twin-ais-designed-to-learn-at-the-edge/8e321e6ad54fd763ddcf50f39c8deb19.html?OCVALIDATE=

    Artificial intelligence (AI) startup SWIM is aiming to democratize both AI and digital twin technologies by placing them at the edge, removing the need for large-scale number-crunching and making them affordable.

    Twin management

    Gartner views digital twins as one of the top strategic enterprise trends in 2018. However, a key challenge is how enterprises can implement the technology, given their investments in legacy assets.

    SWIM believes limited skill sets in streaming analytics, coupled with an often poor understanding of the assets that generate data within complex IoT systems, make deploying digital twins too complex for some. Meanwhile, the prohibitive cost of some digital twin infrastructures puts other organizations off.

    “Digital twins need to be created based on detailed understanding of how the assets they represent perform, and they need to be paired with their real-world siblings to be useful to stakeholders on the front line,” said SWIM in a statement. “Who will operate and manage digital twins? Where will the supporting infrastructure run? How can digital twins be married with enterprise resource planning (ERP) and other applications, and how can the technology be made useful for agile business decisions?”

    The company claims SWIM EDX addresses these challenges by enabling any organization with lots of data to create digital twins that learn from the real world continuously, and to do so easily, affordably, and automatically.

    Reply
  26. Tomi Engdahl says:

    Classic Shell, reborn.
    https://github.com/Open-Shell/Open-Shell-Menu

    Features

    Classic style Start Menu for Windows 7, 8, 8.1, 10
    Toolbar for Windows Explorer
    Classic copy UI (Windows 7 only)
    Show file size in Explorer status bar
    Title bar and status bar for Internet Explorer

    Reply
  27. Tomi Engdahl says:

    Managing IoT: A problem and solution for data center and IT managers
    https://www.cablinginstall.com/articles/print/volume-26/issue-7/features/technology/managing-iot-a-problem-and-solution-for-data-center-and-it-managers.html?cmpid=enl_cim_cim_data_center_newsletter_2018-08-07&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2196737

    The Internet of Things is one of several challenges facing network administrators, but it’s also one of the solutions to those challenges.

    You’ve probably seen some of the projections regarding the growth in the Internet of Things (IoT) in the coming years. Cisco projects there will be 23 billion devices connected to Internet Protocol (IP) networks by 2021. Gartner says 20.8 billion by 2020, while IDC puts the 2020 number at 28.1 billion.

    While there’s some discrepancy in the numbers, there’s little debate that IoT is growing fast. Whether it is enabling smart homes, smart factories, or smart cities, this growth is being driven by IoT’s potential to improve efficiency, productivity and availability.

    But IoT applications also can generate huge volumes of data that must be transmitted, processed and stored, creating data management challenges information technology (IT) professionals must prepare to address. One of the ways they can address them is by applying IoT technology to improve the management of data centers and edge sites.

    IoT in the data center

    According to the Cisco Visual Networking Index, global IP traffic will grow from 1.2 zettabytes in 2016 to 3.3 zettabytes by 2021. While that represents a near tripling of data in just five years, not all of that data will originate or end up in a traditional data center. A large percentage of IoT data, for example, will be generated, processed and stored at the network edge. Only a fraction will need to be transmitted to a central data center for archiving and deep learning.

    Yet, the data center is also an extremely complex and diverse environment that has left much of that operating data stranded within devices due to the variety of protocols in use and the lack of a system-level control layer.

    Using an IoT strategy provides a framework for capturing and using this data to enhance reliability and efficiency as well as enable automation. For example, system-level controls, such as those available for thermal management, enable machine-to-machine communication and coordination across units to optimize performance across the facility. They also support continuous monitoring to enhance availability.

    Management gateways designed specifically for the data center are now available to enable true, real-time, integrated monitoring, access and control across IT and facilities systems. These flexible gateways aggregate and normalize the incoming data and provide a local control point necessary for the latency requirements of some of the edge archetypes. The gateways consolidate data from multiple devices using different protocols to support centralized data center management.
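    As an illustration of that aggregation and normalization step (the protocols, field names, and scaling below are assumptions, not any specific gateway’s API), such a gateway might map heterogeneous readings into one common record format:

        # Sketch: normalize readings from devices that speak different protocols.
        from datetime import datetime, timezone

        def from_modbus(unit_id, register, raw):
            return {"device": f"modbus-{unit_id}", "metric": f"reg{register}",
                    "value": raw * 0.1,                       # assumed scaling
                    "ts": datetime.now(timezone.utc).isoformat()}

        def from_snmp(oid, value, host):
            return {"device": host, "metric": oid, "value": float(value),
                    "ts": datetime.now(timezone.utc).isoformat()}

        def from_bacnet(device, point, value):
            return {"device": device, "metric": point, "value": float(value),
                    "ts": datetime.now(timezone.utc).isoformat()}

        # The gateway consolidates everything into one stream for central management.
        normalized = [
            from_modbus(3, 40001, 215),
            from_snmp("1.3.6.1.2.1.1.3.0", 86120, "pdu-rack12"),
            from_bacnet("crac-07", "supply-air-temp", 18.4),
        ]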

    IoT on the edge

    This has led to the recognition of four edge archetypes that can guide decisions regarding edge infrastructure, particularly at the local level. These four archetypes are described here, with a rough classification sketch after the list.

    Data intensive—encompasses use cases where the amount of data is so large that layers of storage and computing are required between the endpoint and the cloud to reduce bandwidth costs or latency.
    Human-latency sensitive—includes applications where latency negatively impacts the experience of humans using a technology or service.
    Machine-to-machine latency sensitive—similar to the human-latency sensitive archetype except the tolerance for latency in machines is even less than it is for humans because of the speed at which machines process data.
    Life critical—applications that impact human health or safety and so have very low latency and very high availability requirements.
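    Read as a rough decision rule (with invented thresholds that are not part of the archetype definitions), the archetypes might be applied like this:

        # Purely illustrative mapping of a use case onto the four archetypes.
        def classify_edge_archetype(gb_per_day, latency_ms, involves_safety):
            if involves_safety:
                return "life critical"
            if latency_ms < 10:
                return "machine-to-machine latency sensitive"
            if latency_ms < 100:
                return "human-latency sensitive"
            if gb_per_day > 1000:
                return "data intensive"
            return "no edge archetype needed"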

    Reply
  28. Tomi Engdahl says:

    What edge computing means for the future of the data center
    https://www.cablinginstall.com/articles/pt/2018/07/what-edge-computing-means-for-the-future-of-the-data-center.html?cmpid=enl_cim_cim_data_center_newsletter_2018-08-07&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2196737

    In a new article for Computer Business Review, Raghavan Srinivasan, senior director of enterprise data solutions at Seagate Technology, predicts that “the large traditional data center has been the mainstay of computing and connectivity networks for more than half a century – and essentially all processing of transactions have been carried out in a centralized core – but mobility, technological advancements and economic demand mean that businesses will increasingly add edge elements to this essential core.”

    Edge Computing and the Future of the Data Center
    https://www.cbronline.com/opinion/edge-computing-future-data-center

    The rise of edge computing could see the advent of a rapidly growing array of smaller data centers built closer to population centers, says Seagate Technology’s Raghavan Srinivasan

    In almost every respect, the world is getting faster. We expect customer service problems to be resolved right away, we want our goods to arrive the day after we order them and have become used to communicating with anyone, anywhere, at any time.

    For enterprises, this trend is reflected in the increased demand for real-time data processing. Look at the trends that will power the next generation of business innovation: AI, the internet of things and 5G are driving a surge in data production that, according to a Gartner study, could see more than 7.5 billion connected devices in use in enterprises by 2020.

    This shift may power next-generation technologies from connected cars and smart drones to manufacturing and intelligent retail. More data will need to be analyzed in real time – according to the DataAge 2025 study commissioned by Seagate, by 2025 almost 20 percent of data created will be real-time in nature – rather than be sent to the core of the network for processing. This means enterprises will build on their central cloud computing architecture and develop the ability to process – and, equally importantly, securely store – more data at the edge.

    A New Network from the Old

    Micro data centers could be deployed at the base of telecom towers and other important points in the existing wireless network. As a result, there could be far more data centers around, but most of them will look nothing like the warehouse-sized facilities of today.

    Reply
  29. Tomi Engdahl says:

    The evolution of cellular air conditioning
    https://www.edn.com/electronics-blogs/5g-waves/4460930/The-evolution-of-cellular-air-conditioning

    Mobile networks have an interesting cost driver lurking behind the scenes: air conditioning at remote sites. From the introduction of cellular in the early 1980s through the deployment of 3G service in the 2000s, antennas were mounted on towers or buildings and the signals were carried through lossy metal coaxial cables to a small weatherproof cabinet. The cabinet, located near the base of the tower, contained power-hungry radio equipment and amplifiers. The equipment consumed a lot of power and generated a lot of heat. The cabinets needed air conditioning to avoid equipment damage, and cooling could account for 20 to 30 percent of the annual cost of operating the tower.

    The 3G standard brought many technology advances, some of which alleviated the air conditioning expense. An architectural change was responsible for the largest shift in energy usage and heat production; 3G radio equipment was split into two parts.

    Baseband processing (which converts a digital bitstream from the network into a baseband radio signal) was separated from up-conversion and amplification (where the baseband radio signal becomes a higher-power RF radio signal). Up-conversion and amplification components were packaged into a remote radio head (RRH) and mounted on the cell tower near the antenna. The proximity to the antenna meant that far less power was required to overcome cable losses, and thus the amplifiers no longer had to be actively cooled.

    Low-loss fiber connectivity to the remote radio head allowed distances of up to 6 miles between the radio head and baseband unit (BBU). This enabled massive consolidation, moving the bulk of baseband processing into a regional office, often dubbed a “baseband hotel” because it housed multiple BBUs. The co-location ushered in a whole host of additional optimizations including lower-latency coordination between cell sites, reduced intra-site interference, more reliable user handoffs, and improved coverage via coordinated multi-point (CoMP) transmissions.
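    A quick back-of-the-envelope check (assuming the usual roughly two-thirds of light speed for propagation in silica fiber) shows why a 6-mile fronthaul run is workable:

        # One-way propagation delay over a 6-mile fronthaul fiber link.
        distance_m = 6 * 1609.34            # 6 miles in metres
        speed_in_fiber = 2.0e8              # ~0.67 c, typical for silica fiber
        delay_us = distance_m / speed_in_fiber * 1e6
        print(f"one-way fronthaul propagation delay: {delay_us:.0f} microseconds")
        # -> about 48 microseconds, small compared with radio frame timing budgets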

    History repeats itself
    Many technology transitions are cyclic. Consider the shift from centralized mainframe computing to independent PCs with local file storage. This trend has gone full circle, leading to the recent mass centralization of compute resources in public clouds and many mourning the demise of the PC.

    I expect a similar cycle in cellular, as network function virtualization (NFV) and software defined networking (SDN) dramatically change the way networks are built. 5G operators can leapfrog some of the tribulations of this cycle by learning from the last decade of public cloud evolution.

    Amazon, Microsoft, and Google centralized massive amounts of compute and networking into mega data centers, but customers quickly found that application response time suffered. Cloud providers adapted with a hybrid architecture that pushes latency-sensitive operations to the edge of the network, while keeping many non-latency sensitive functions in the core.

    Evolution of the baseband hotel
    In a 5G network, NFV separates software from hardware. Services that once ran on proprietary hardware, like routing, load balancing, firewalls, video caching, and transcoding can be deployed on standard servers, and these workloads can be placed anywhere in the network.

    5G also introduces network slicing functionality to maximize the utility of these capabilities. In network slicing, an operator can provision specific sub-interfaces at the air interface level, and map these to specific network function chains. This allows providers to deliver highly differentiated services, similar to the way a local area network can offer quality of service for different traffic flows.

    One of the most interesting new deployment models that extends from network slicing is multi-access edge computing. Mobile edge computing (MEC) is an architectural approach where the 5G carrier moves specific services much closer to the edge of the network, similar to the way Amazon provides Lambda processing at the edge of its cloud.

    The result is much lower latency, which helps meet requirements for next-gen applications, such as the 15 ms motion-to-photon response target needed to minimize user discomfort in augmented reality and virtual reality applications. MEC can also reduce core loading by caching data such as video at the edge.
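    A rough latency-budget sketch, using assumed distances and an assumed 8 ms of processing time, illustrates why that 15 ms motion-to-photon target pushes compute toward the edge:

        # Compare fiber round-trip plus processing for a distant vs. an edge site.
        def round_trip_ms(distance_km, processing_ms):
            fiber_speed_km_per_ms = 200.0           # ~2/3 c in fiber
            return 2 * distance_km / fiber_speed_km_per_ms + processing_ms

        print(f"regional cloud (1200 km away): {round_trip_ms(1200, 8):.1f} ms")
        print(f"MEC site (15 km away):         {round_trip_ms(15, 8):.1f} ms")
        # Even before queuing and radio delays, the distant data center eats most
        # of the 15 ms budget in propagation alone; the MEC site leaves headroom.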

    Reply
  30. Tomi Engdahl says:

    Optimizing 5G With AI At The Edge
    https://semiengineering.com/optimizing-5g-with-ai-at-the-edge/

    5G is necessary to deal with the increasing amount of data being generated, but successful rollout of mmWave calls for new techniques.

    For example, AI techniques are essential to the successful rollout of 5G wireless communications. 5G is the developing standard for ultra-fast, ultra-high-bandwidth, low-latency wireless communications systems and networks whose capabilities and performance will leapfrog that of existing technologies.

    5G-level performance isn’t a luxury; it’s a capability the world critically needs because of the exploding deployment of wirelessly connected devices. A crushing amount of data is poised to overwhelm existing systems, and the amount of data that must be accessed, transmitted, stored and processed is growing fast.

    5G needed for the upcoming data explosion
    Every minute, by some estimates, users around the world send 18 million text messages and 187 million emails, watch 4.3 million YouTube videos and make 3.7 million Google search queries. In manufacturing, analysts predict the number of connected devices will double between 2017 and 2020. Overall, by 2021 internet traffic will amount to 3.3 zettabytes per year, with Wi-Fi and mobile devices accounting for 63% of that traffic (a zettabyte is 12 orders of magnitude larger than a gigabyte, or 10²¹ bytes).

    The new 5G networks are needed to handle all of this data. The new networks will roll out in phases, with initial implementations leveraging the existing 4G LTE and unlicensed access infrastructure already in place. However, while these initial Phase 1 systems will support sub-6 GHz applications and peak data rates above 10 Gbps, things really begin to get interesting in Phase 2.

    In Phase 2, millimeter-wave (mmWave) systems will be deployed enabling applications requiring ultra-low latency, high security, and very high cell edge data rates. (The “edge” refers to the point where a device connects to a network. If a device can do more data processing and storage at the edge – that is, without having to send data back and forth across a network to the cloud or to a data center – then it can respond more quickly and space on the network will be freed up.)

    Reply
  31. Tomi Engdahl says:

    Startup’s Funds Fuel Edge Networks
    Vapor IO plans 100 data centers in the U.S.
    https://www.eetimes.com/document.asp?doc_id=1333663

    A startup snagged a large Series C round to build dozens of medium-sized data centers for edge networks, at least some with its own novel gear. The news underscores work on a new class of distributed networks for carriers and web giants.

    Vapor IO suggested that it won more than $100 million in financing, although it declined to reveal an exact amount. It aims to have more than 18 sites for 150-kilowatt data centers under construction by the end of the year and more than 100 by the end of 2020.

    The 25-person Vapor IO operates two of five planned 150-kW data centers in Chicago. It is hiring to expand to three metro regions this year and 20 by the end of 2020. The startup aims to use software-defined networking and APIs to give carriers, cloud-computing providers, content distribution networks, and others a unified view of the distributed networks that it maintains for them.

    “Since we were founded in 2015, we have done nothing but attempt to perfect a design that’s the equivalent of the electrical grid for the digital economy,” said Cole Crawford, founder and chief executive of Vapor IO.

    Crawford is best known as the former executive director of the Open Compute Project, a non-profit set up by Facebook to promote open hardware designs for the world’s largest data centers.

    Edge networks use much smaller versions of the multi-megawatt data centers that companies such as Facebook run. They aim to deliver response times of a few milliseconds — an order of magnitude lower than today’s data centers thanks to being in the same city as their users and sitting on fiber-optic rings near large cellular base stations. The facilities hope to enable emerging apps ranging from AR/VR to network slicing and helping robocars navigate.

    Carriers say that the edge networks could sometimes be as small as a coat closet that pairs a few servers with a cellular base station. Base station giant Nokia recently rolled out a family of compact servers for such deployments.

    Crawford sees his centers also playing a role as peering locations where carriers, content owners, and internet points of presence meet.

    Vapor IO started out with a unique doughnut-shaped design for a group of servers called a Vapor Chamber, replacing traditional 19-inch racks. Rather than forcing cool air laterally through a building, it uses one large fan in the center of a cylindrical design into which groups of servers fit like wedges of cheese.

    The design enables a 135-kW server cluster to be deployed in a single day. Vapor also has a design that packs servers into the rough equivalent of a shipping container, an approach popular in the early days of so-called hyper-scale data centers.

    In addition, Vapor designed its own boards for managing large server deployments. One monitors 72 sensors to track temperature, air pressure, vibration, and air quality. Another puts a programmable-logic controller on a mezzanine board to control server functions over a powerline network.

    The startup supports APIs that let users build programs that run on top of its control systems. “We’ve open-sourced code, turning register lookups into HTTP interfaces,” he said.
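    A minimal sketch of that “register lookups become HTTP interfaces” idea, using only the Python standard library; this is illustrative and not Vapor IO’s actual open-sourced API.

        # Expose a stand-in register read as a simple HTTP endpoint.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        def read_register(sensor_id):
            """Stand-in for a real bus/register read on the rack controller."""
            fake_registers = {"temp0": 27.5, "airflow1": 0.82}
            return fake_registers.get(sensor_id)

        class RegisterHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # e.g. GET /registers/temp0
                sensor_id = self.path.rstrip("/").split("/")[-1]
                value = read_register(sensor_id)
                status = 200 if value is not None else 404
                body = json.dumps({"sensor": sensor_id, "value": value}).encode()
                self.send_response(status)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("0.0.0.0", 8080), RegisterHandler).serve_forever()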

    Reply
  32. Tomi Engdahl says:

    The Edge Will Eat The Cloud
    https://blogs.gartner.com/thomas_bittman/2017/03/06/the-edge-will-eat-the-cloud/

    Today, cloud computing is eating enterprise data centers, as more and more workloads are born in the cloud, and some are transforming and moving to the cloud. But there’s another trend that will shift workloads and data and processing and business value significantly away from the cloud. The edge will eat the cloud. And this is perhaps as important as the cloud computing trend ever was.

    Several overlapping trends are colliding: (1) cloud computing, centralizing IT for massive economies of scale and agile provisioning, volatility and growth, (2) the Internet of Things (IoT), where things are becoming connected and sending reams of data, (3) machine learning, taking all of that data and improving processing and predictions, (4) Augmented and Mixed Reality (along with Virtual Reality), where people can interact with other people and things both in physical and virtual worlds, and (5) Digital Business and the Digital World, where connections between things and people are pushing us to more and more real-time interactions and decisions.

    The agility of cloud computing is great – but it simply isn’t enough. Massive centralization, economies of scale, self-service and full automation get us most of the way there – but they don’t overcome physics – the weight of data, the speed of light.

    It’s one thing to have local processing next to a number of fixed devices, monitoring pipeline pressure, etc. – it’s another when those things are in flight, moving constantly, and perhaps even appearing – digitally – at a location. Sure – software is a thing, too, and software agents could be sent somewhere on the edge to deal with an event, provide information, or interact with people and things locally.

    The edge will need some serious muscle.

    Cloud computing providers are really good at managing their standardized, centralized, massively-scaled data centers and control software. But the technologies for the edge will be completely different, much more dynamic, much more evolutionary and competitive. Cloud providers are trying to reach out now and subsume control of the edge before the edge takes off.

    Reply
  33. Tomi Engdahl says:

    5 Tips for Building Fog Networks
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1333798

    Fog computing is ready for deployment now for engineers aware of the basic concepts and resources for designing them.

    Fog computing was conceived as a way to enable applications in high-throughput, high-compute ecosystems that require real-time processing. In the world of the Internet of Things, fog is already supporting deployments on a global scale.

    The OpenFog Consortium defines fog computing as: “A horizontal, system-level architecture that distributes computing, storage, control and networking functions closer to the users along a Cloud-to-Thing continuum.”

    Select applications carefully

    In IoT, for example, applications are sometimes difficult to predict or pin down. It is often helpful to define the application space of a network or network element in terms of a three-layer taxonomy of the vertical market served, the use cases in those verticals and the specific applications within those use cases.

    Fog nodes are fundamental processing elements that enable high-compute operations in close proximity to end nodes. Fog nodes may serve multiple verticals, use cases or applications in an efficient, combined network. Once these high-level requirements are well understood, the design of the network and its elements can commence.

    Partition workloads

    Fog offers unique partitioning options for complex software-based systems using a deep hierarchy of processing, networking and storage resources from the cloud to IoT endpoints. With several layers of fog nodes potentially in a deployment, finding a balance of the optimal layer in the hierarchy to host functions is an interesting challenge.
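    One simple way to picture that partitioning problem is a placement rule that assigns each workload to the lowest fog layer that can meet its latency bound and still has capacity; the layer names, capacities and numbers below are assumptions, not an OpenFog prescription.

        # Place workloads at the closest fog layer that satisfies their constraints.
        LAYERS = [  # ordered from closest to the things outward to the cloud
            {"name": "edge node",    "latency_ms": 2,   "cpu_free": 4},
            {"name": "local fog",    "latency_ms": 10,  "cpu_free": 32},
            {"name": "regional fog", "latency_ms": 40,  "cpu_free": 256},
            {"name": "cloud",        "latency_ms": 120, "cpu_free": 10_000},
        ]

        def place(workload):
            for layer in LAYERS:
                if (layer["latency_ms"] <= workload["max_latency_ms"]
                        and layer["cpu_free"] >= workload["cpu_needed"]):
                    layer["cpu_free"] -= workload["cpu_needed"]
                    return layer["name"]
            return "unschedulable"

        print(place({"name": "motor control loop", "max_latency_ms": 5,  "cpu_needed": 1}))
        print(place({"name": "video analytics",    "max_latency_ms": 50, "cpu_needed": 64}))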

    Security is pervasive

    Entire books and conferences are devoted to the security of IoT networks. Using fog techniques can both complicate and enhance the security properties of IoT networks. More intelligent nodes, perhaps in a multi-layer hierarchy, require more crypto processors and keys, more complex security policies and more attention to potential threats.

    Orchestration and management

    It is going to take considerable effort to install, configure, download and commission large networks of fog nodes. Once operational, these networks require constant monitoring and frequent updating.

    Complex orchestration processes manage fog-based resources and assign workloads to them. They also manage their quick reconfiguration in case of overloads or failures.

    Measure lifecycle costs of ownership

    You can’t decide to deploy fog networks based solely on their initial cost. While purchase and installation costs will be significant, other costs need to be factored into the equation, such as ongoing maintenance, periodic hardware and software updates, energy consumption, and decommissioning costs. Be careful with this step, given that many fog nodes will have a 10- to 20-year life in the field.

    Reply
  34. Tomi Engdahl says:

    Low-Cost, Low-Power AI on the Edge
    https://www.eeweb.com/profile/max-maxfield/articles/low-cost-low-power-ai-on-the-edge

    Expanded features in the Lattice sensAI stack are designed to speed time to market for developers of flexible machine-learning inferencing in consumer and industrial IoT applications

    Reply
  35. Tomi Engdahl says:

    https://www.uusiteknologia.fi/2018/10/16/edgesta-tulee-ensi-vuoden-ykkostermi/

    The technology world loves trends and brand-new terms. In 2009 the talk was constantly about cloud services, in 2012 about mobile, and a couple of years ago the IoT barrage began. Next year’s buzzword looks set to be edge computing, or “Edge,” and system vendors are already preparing for it at full speed.

    Reply
  36. Tomi Engdahl says:

    Canonical announced a partnership with Eurotech to help organizations advance in the IoT realm. In connection with this partnership, Canonical “has published a Snap for the Eclipse Kura project—the popular, open-source Java-based IoT edge framework. Having Kura available as a Snap—the universal Linux application packaging format—will enable a wider availability of Linux users across multiple distributions to take advantage of the framework and ensure it is supported on more hardware. Snap support will also extend on Eurotech’s commercially supported version; the Everywhere Software Framework (ESF).”
    Source: https://www.linuxjournal.com/content/canonical-announces-partnership-eurotech-big-four-end-support-tls-10-and-11-sony-using

    Canonical collaborates with Eurotech on edge computing solutions
    https://snapcraft.io/blog/canonical-collaborates-with-eurotech-on-edge-computing-solutions

    Coinciding with IoT World Solutions Congress in Barcelona this week, Canonical is pleased to announce a dual-pronged technological partnership with Eurotech to help organisations advance their internet of things enablement. Eurotech is a long-time leader in embedded computing hardware and also provides software solutions that help enterprises deliver their IoT projects, either end to end or through individual building blocks.

    As part of the partnership, Canonical has published a Snap for the Eclipse Kura project – the popular, open-source Java-based IoT edge framework. Having Kura available as a Snap – the universal Linux application packaging format – will enable a wider range of Linux users across multiple distributions to take advantage of the framework and ensure it is supported on more hardware. Snap support will also extend to Eurotech’s commercially supported version, the Everywhere Software Framework (ESF). By installing Kura as a Snap on a device, users will benefit from automatic updates that ensure they are always working from the latest version, with the reassurance of a secure, confined environment.

    Reply
  37. Tomi Engdahl says:

    Edge Computing Emerges as Megatrend in Automation
    https://www.designnews.com/automation-motion-control/edge-computing-emerges-megatrend-automation/27888481159634?ADTRK=UBM&elq_mid=6072&elq_cid=876648

    Edge computing offers a range of benefits for automation control applications. It’s an emerging megatrend for controlling, collecting, and analyzing data at the network’s edge.

    Edge computing technology is quickly becoming a megatrend in industrial control, offering a wide range of benefits for factory automation applications. While the major cloud suppliers are expanding, new communications hardware and software technology are beginning to provide new solutions compared to the previous offerings used in factory automation.

    “The most important benefit [compared to existing solutions] will be interoperability—from the device level to the cloud,” John Kowal, director of business development for B&R Industrial Automation, told Design News. “So it’s very important that communications be standards-based, as you see with OPC UA TSN. ‘Flavors’ of Ethernet including ‘flavors’ of TSN should not be considered as providing interoperable edge communications, although they will function perfectly well in a closed system. Interoperability is one of the primary differences between previous solutions. OPC UA TSN is critical to connecting the edge device to everything else.”

    Emerging Technology Solutions

    Kowal added that, in legacy installations, gateways will be necessary to translate data from proprietary systems—ideally using OPC UA over standard Ethernet to the cloud. An edge computing device can also provide this gateway translation capability. “One of the benefits of Edge technology is its ability to perform analytics and optimization locally, and therefore achieve faster response for more dynamic applications, such as adjusting line speeds and product accumulation to balance the line. You do not expect this capability of a gateway,” Kowal added.

    Sari Germanos of B&R added that these comments about edge computing apply equally to the cloud. “With edge, you are using fog instead of cloud with a gateway. Edge controllers need things like redundancy and backup, while cloud services do that for you automatically,” Germanos said. He also noted that cloud computing generally makes data readily accessible from anywhere in the world, while the choice of serious cloud providers for industrial production applications is limited. Edge controllers are likely to have more local features and functions, though the responsibility for tasks like maintenance and backup falls on the user.

    Reply
  38. Tomi Engdahl says:

    AI on a MEMS Device Brings Neuromorphic Computing to the Edge
    https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/artificial-intelligence-on-a-mems-device-brings-neuromorphic-computing-to-the-edge

    In order to achieve the edge computing that people talk about in a host of applications including 5G networks and the Internet of Things (IoT), you need to pack a lot of processing power into comparatively small devices.

    The way forward for that idea will be to leverage artificial intelligence (AI) computing techniques—for so-called AI at the edge.

    Sylvestre explained that it’s hard to modify the inner workings of a MEMS device, but it’s not necessary in reservoir computing, which is why they used this approach to do AI in MEMS.

    “Our work shows that it’s possible to use the non-linear resources in MEMS to do AI,” said Sylvestre. “It’s a novel way to build “artificially smart” devices that can be really small and efficient.”
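    For readers unfamiliar with reservoir computing, here is a tiny echo state network, its classic software form; the numpy reservoir stands in for the MEMS device’s own non-linear dynamics, and only the linear readout is trained.

        # Minimal echo state network: fixed random reservoir + trained linear readout.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 100                                          # reservoir size
        W_in = rng.uniform(-0.5, 0.5, (N, 1))            # fixed input weights
        W = rng.standard_normal((N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius < 1

        def run_reservoir(u):
            """Drive the reservoir with input sequence u, collect its states."""
            x = np.zeros(N)
            states = []
            for ut in u:
                x = np.tanh(W @ x + W_in[:, 0] * ut)     # non-linear state update
                states.append(x.copy())
            return np.array(states)

        # Task: predict the next sample of a sine wave one step ahead.
        t = np.arange(0, 60, 0.1)
        u, target = np.sin(t[:-1]), np.sin(t[1:])
        X = run_reservoir(u)
        W_out = np.linalg.lstsq(X, target, rcond=None)[0]   # only the readout is trained
        prediction = X @ W_out
        print("readout RMSE:", float(np.sqrt(np.mean((prediction - target) ** 2))))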

    Reply
  39. Tomi Engdahl says:

    Edge computing: the cyber security risks you must consider
    https://www.zdnet.com/article/edge-computing-the-cyber-security-risks-you-must-consider/

    Edge computing could be an innovative new way to collect data, but it also opens up a world of additional security headaches.

    Edge computing is based around the idea that, to cope with the vast amounts of data generated by IoT sensors and environmental monitors, computing and network infrastructure will need a rethink: a lot of that data will need to be analysed and processed at the edge of the network, rather than transported to a remote centralised data centre.

    With processing being done close to where data is generated, such architectures will be able to deliver better performance and efficiency, and ultimately allow companies to reduce their operational expenses.

    But like the IoT, the supposed benefits of edge computing also come with additional risks: adding more data-generating devices to your network in more locations — particularly those that are physically remote or aren’t well monitored — can lead to additional cyber security headaches.

    “Security at the edge remains a huge challenge, primarily because there are highly diverse use cases for IoT, and most IoT devices don’t have traditional IT hardware protocols. So the security configuration and software updates which are often needed through the lifecycle of the device may not be present,” says Barika Pace, research director at analyst firm Gartner.

    Reply
  40. Tomi Engdahl says:

    Edge Computing Emerges as Megatrend in Automation
    https://www.designnews.com/automation-motion-control/edge-computing-emerges-megatrend-automation/27888481159634?ADTRK=UBM&elq_mid=6086&elq_cid=876648

    Edge computing offers a range of benefits for automation control applications. It’s an emerging megatrend for controlling, collecting, and analyzing data at the network’s edge.

    Edge computing technology is quickly becoming a megatrend in industrial control, offering a wide range of benefits for factory automation applications. While the major cloud suppliers continue to expand, new communications hardware and software technologies are beginning to provide alternatives to the offerings previously used in factory automation.

    Reply
  41. Tomi Engdahl says:

    Fog Computing Could Make Smart City Applications More Reliable
    https://innovate.ieee.org/innovation-spotlight/fog-computing-anomaly-detection-smart-city/#utm_source=facebook&utm_medium=social&utm_campaign=innovation&utm_content=Fog%20Computing%20Anomaly%20Detection?LT=CMH_WB_2018_LM_XIS_Paid_Social

    Smart cities powered by connected sensors promise to transform everything from public transportation, to public health monitoring and energy grids. But the Internet of Things (IoT) will also require new ways to ensure timely and stable data flow among millions of connected devices – particularly for applications performing critical functions.

    New research out of Ghent University suggests anomaly detection through fog computing may be the answer to helping ensure the reliability of delay-sensitive and data-intensive applications that are driving growth in the IoT and making cities smarter.

    However, scalable, low-latency anomaly detection has become more feasible with 5G networks—due to their significantly greater data transmission capacity—as well as with new paradigms like Software-defined Networking (SDN), Network Function Virtualization (NFV) and edge computing.

    “Current anomaly detection approaches for IoT only focus on the centralized cloud management aspects,” said José Santos, a researcher at Ghent University’s imec IDLab.

    Reply
  42. Tomi Engdahl says:

    Edge computing: the cyber security risks you must consider
    https://www.zdnet.com/article/edge-computing-the-cyber-security-risks-you-must-consider/

    “Security at the edge remains a huge challenge, primarily because there are highly diverse use cases for IoT, and most IoT devices don’t have traditional IT hardware protocols. So the security configuration and software updates which are often needed through the lifecycle of the device may not be present,” says Barika Pace, research director at analyst firm Gartner.

    “This is why when we talk about security in edge computing, tracking the threat landscape becomes more challenging,” she adds.

    Reply
  43. Tomi Engdahl says:

    Fog Computing Could Make Smart City Applications More Reliable
    https://innovate.ieee.org/innovation-spotlight/fog-computing-anomaly-detection-smart-city/#utm_source=facebook&utm_medium=social&utm_campaign=innovation&utm_content=Fog%20Computing%20Anomaly%20Detection?LT=CMH_WB_2018_LM_XIS_Paid_Social

    Traditional anomaly detection approaches for identifying IoT problems are significantly impacted by latency, and therefore, not suitable for the real-time communication requirements of many IoT applications. However, scalable, low-latency anomaly detection has become more feasible with 5G networks—due to their significantly greater data transmission capacity—as well as with new paradigms like Software-defined Networking (SDN), Network Function Virtualization (NFV) and edge computing.

    “Current anomaly detection approaches for IoT only focus on the centralized cloud management aspects,” said José Santos, a researcher at Ghent University’s imec IDLab. “We propose a distributed fog computing-based anomaly detection approach that goes beyond the current state-of-the-art by considering not only cloud requirements but also low power wide area network (LPWAN) constraints.”
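    The article does not spell out the detection algorithm itself, but the general idea of pushing anomaly detection down to fog nodes can be illustrated with a very small streaming detector: keep an exponentially weighted estimate of a sensor's mean and variance locally, and escalate only readings that deviate sharply. The following Python sketch is a generic illustration under that assumption, not the method proposed by the Ghent researchers.

```python
import math

class StreamingAnomalyDetector:
    """Tiny rolling z-score detector intended to run on a fog/edge node,
    so only anomalous events (not raw streams) need to reach the cloud."""

    def __init__(self, alpha=0.05, threshold=4.0, warmup=5):
        self.alpha = alpha          # smoothing factor for mean/variance
        self.threshold = threshold  # z-score above which a reading is flagged
        self.warmup = warmup        # samples to observe before flagging anything
        self.mean = 0.0
        self.var = 0.0
        self.count = 0

    def update(self, x):
        if self.count == 0:
            self.mean, self.count = x, 1
            return False
        # Score against the state *before* this sample is folded in.
        std = math.sqrt(self.var) if self.var > 0 else None
        is_anomaly = (self.count >= self.warmup and std is not None
                      and abs(x - self.mean) / std > self.threshold)
        # Exponentially weighted update of mean and variance.
        diff = x - self.mean
        incr = self.alpha * diff
        self.mean += incr
        self.var = (1 - self.alpha) * (self.var + diff * incr)
        self.count += 1
        return is_anomaly

detector = StreamingAnomalyDetector()
readings = [20.1, 20.8, 19.5, 20.3, 19.9, 20.6, 19.7, 31.4, 20.2]  # toy stream
for r in readings:
    if detector.update(r):
        print("anomaly detected:", r)   # only this event leaves the fog node
```

    Because the detector holds only a handful of floats per sensor, it fits comfortably on a fog node or LPWAN gateway, and only flagged events need to traverse the constrained uplink.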

    Reply
  44. Tomi Engdahl says:

    Research Program Creating IoT Platform for Edge AI
    https://www.eetimes.com/document.asp?doc_id=1333941

    A joint project funded by European and South Korean research programs aims to develop an IoT platform empowering emerging artificial intelligence (AI) applications with on-demand computing at the edge of networks.

    As part of the project, research institute Leti announced that its sensiNact IoT middleware will be the core of the platform being built in the three-year program.

    The joint project, called DECENTER, is focused on integrating the IoT, AI, the cloud, edge, fog computing and smart contracts, together with a secure blockchain. It aims to develop a platform facilitating an ecosystem in which computing and IoT resources (processing, memory, storage, connectivity, sensing, actuating) are orchestrated in multi-cloud, federated environments. In this “fog computing platform,” it envisages that all providers can share resources and be rewarded through the automatic execution of smart contracts logged and monitored via blockchain-based technologies.

    The federated fog platform intends to address processing of the ever-increasing amount of data continuously gathered from a myriad of heterogeneous IoT sensors and appliances.
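    The article describes the mechanism only at a high level. As a rough mental model (explicitly not the DECENTER implementation), one can picture a broker that matches an application's resource request to a provider's offer and appends each agreement to a tamper-evident log, the role the blockchain-recorded smart contracts play in the real platform. All names, fields, and prices in the Python sketch below are invented for illustration.

```python
import hashlib
import json
import time

ledger = []  # hash-chained agreement records (a stand-in for the blockchain log)

def log_agreement(request, offer):
    """Append an agreement to the ledger, chaining it to the previous entry."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {"request": request, "offer": offer,
              "timestamp": time.time(), "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    ledger.append(record)
    return record

def match(request, offers):
    """Pick the cheapest offer that satisfies the requested CPU and memory."""
    viable = [o for o in offers
              if o["cpu"] >= request["cpu"] and o["mem_gb"] >= request["mem_gb"]]
    return min(viable, key=lambda o: o["price"]) if viable else None

offers = [
    {"provider": "fog-node-a", "cpu": 2, "mem_gb": 4, "price": 0.03},
    {"provider": "edge-gw-b",  "cpu": 1, "mem_gb": 1, "price": 0.01},
]
request = {"app": "video-analytics", "cpu": 2, "mem_gb": 2}

offer = match(request, offers)
if offer is not None:
    print(log_agreement(request, offer))   # the matched provider would be rewarded
```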

    Reply
  45. Tomi Engdahl says:

    To comply with GDPR, most data should remain at the edge
    https://iot.eetimes.com/to-comply-with-gdpr-most-data-should-remain-at-the-edge/

    Companies using IoT devices are struggling to comply with the requirements of the European General Data Protection Regulation (GDPR). To avoid breaking the new law and thus being fined, companies should keep most of the data collected out of the cloud and process it at the edge.

    If one theme has captured the attention of technology conference attendees over the past two years, it is GDPR.

    Whether the conversation is about wearables, 5G networks, edge computing, payment systems, CCTV, or transit cards, at some point in the discussion the well-known four-letter acronym pops up.

    The recent IoT Solutions World Congress was no exception. During the session “IoT, data and AI as key levers of digital transformation,” Patrice Slupowski, Senior Vice-President of Digital Innovation at Orange, argued that GDPR compliance is one of the most challenging issues companies face when implementing IoT technologies.

    During his recent speech at the European Conference of Data Protection and Privacy Commissioners, Apple’s CEO Tim Cook praised the GDPR as an example that other governments should follow.

    “This year, you’ve shown the world that good policy and political will can come together to protect the rights of everyone. We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR…. It is time for the rest of the world, including my home country, to follow your lead,” Cook said.

    IoT devices are designed to collect data, lots of it!

    Worldwide, close to 8,000 new devices connect to the Internet every minute, roughly 345 million per month, and the total number of connected devices is expected to reach 50 billion by 2020. All these devices are designed to collect and transmit data.

    For example, most households in Europe and the US already have at least one connected device (wanted or not): the smart meter. Whether for electricity, gas, or water, smart meters collect thousands of data points per month, and sometimes per day.

    That data, attached to the resident’s account number, can be used to help the user choose a better service plan, adjust consumption, and improve the utility company’s forecasting of demand in the area. The same data, however, can also be used to determine the resident’s behavior: when the house or apartment is empty, what appliances are in use and, in some cases, how many people are on the premises. All of that without WiFi or any other internet connection service present in the house.

    Smart devices can help, but most usage data should remain at the edge

    The GDPR is explicit: Data collected should be used only to provide the service or services that users signed up for. The only exception that doesn’t require explicit user consent is data needed to comply with government regulations.

    Otherwise, any company collecting data requires opt-in consent from the user or customer. And a customer’s refusal to opt in cannot be used to deny access to the product or service.

    That’s why it is essential to implement privacy-by-design and keep raw collected data at the edge.
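    As a concrete sketch of what keeping raw data at the edge can mean for something like a smart meter, the hypothetical gateway below stores interval readings locally and forwards only a coarse daily aggregate upstream. The class, field names, and numbers are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EdgeMeterGateway:
    """Privacy-by-design sketch: raw interval readings never leave the device;
    only a coarse aggregate is shared with the utility's backend."""
    raw_readings_wh: List[float] = field(default_factory=list)  # stays local

    def record(self, reading_wh: float) -> None:
        self.raw_readings_wh.append(reading_wh)

    def daily_summary(self) -> dict:
        # Enough for billing and demand forecasting, but too coarse to infer
        # occupancy or individual appliance usage from.
        total_kwh = sum(self.raw_readings_wh) / 1000.0
        summary = {"total_kwh": round(total_kwh, 2),
                   "samples": len(self.raw_readings_wh)}
        self.raw_readings_wh.clear()   # raw data is discarded after aggregation
        return summary

gateway = EdgeMeterGateway()
for wh in [12.0, 15.5, 9.8, 14.2]:   # toy interval readings in watt-hours
    gateway.record(wh)
print(gateway.daily_summary())        # only this payload leaves the edge
```

    Whether such aggregation alone satisfies the regulation still depends on the purpose and on consent, but it reduces both the amount of personal data in transit and the compliance surface in the cloud.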

    Reply
