November 3, 2020

Edge Excitement! Innovation & Collaboration (Q4 2020)

Written by Ryan Anderson, Member and Contributor of LF Edge’s Open Horizon Project and IBM Architect in Residence, CTO Group for IBM Edge

This article originally ran on Ryan’s LinkedIn page.

Rapid innovation in edge computing

It is an exciting time for the edge computing community! Since my first post in April 2019, we have seen a rapid acceleration of innovation driven by the convergence of multiple factors:

·     a convergence towards shared “mental models” in the edge solution space;

·     the increasing power of edge devices – pure CPU, as well as CPU plus GPU/VPU;

·     enormous investments in 5G infrastructures by network and communications stakeholders;

·     new collaborations across several IT/OT/IoT/edge ecosystems;

·     increasing participation by major players in, and support for, open source foundations such as LF Edge; and

·     widespread adoption of Kubernetes and Docker containers as a core layer of the edge.

With this convergence, innovation, and accelerating adoption, Gartner’s prediction that 75% of enterprise-generated data will be created and processed at the edge appears prescient.

Edge nodes – from data centers to devices

Much like “AI” and “IT” – edge computing is a broad and nebulous term that means different things to different stakeholders. In the diagram below, we consider four points of view for edge:

  1. Industrial Edge
  2. Enterprise Network Cloud Edge
  3. 5G / Telco Edge
  4. Consumer and Retail Edge
[Diagram: the four edge quadrants – Industrial, Enterprise Network Cloud, 5G/Telco, and Consumer/Retail]

 

This model illustrates a few key ideas:

·     Some edge use cases fall squarely within one quadrant – whereas others span two, or sometimes three.

·     Solution mapping will help shape architecture discussions and may inform which stakeholders should be involved in conversations.

·     Edge can mean very different things to different people, and consequently value propositions (and ROI/KPIs) will also vary dramatically.

Technology tools for next-generation edge computing must be flexible enough to work across different edge quadrants and across different types of edge nodes.

Terminology. And what is edge computing?  

At IBM our edge computing definition is “act on insights closer to where data is created.”

We define edge node generically as any edge device, edge cluster, or edge server/gateway on which computing (workload, applications, analytics) can be performed, outside of public or private cloud infrastructure, or the central IT data center.

An edge device is a special-purpose piece of equipment with compute capacity integrated into it, on which interesting work can be performed. An assembly machine on the factory floor, an ATM, an intelligent camera, or a next-generation automobile are all examples of edge devices. It is common to find edge devices with ARM- or x86-class CPUs with one or two cores, 128 MB of memory, and perhaps 1 GB of local persistent storage.

Sometimes edge devices include GPUs (graphics processing units) and VPUs (vision processing units) – optimized chips that are very good at running AI models and inferencing on edge devices.

Fixed-function IoT equipment that lacks general-purpose compute is not typically considered an edge node, but rather an IoT sensor. IoT sensors often interoperate with edge devices – but they are not the same thing, as we see on the left side of the diagram below.

 

[Diagram: IoT sensors and edge nodes – devices, clusters, and servers/gateways]

An edge cluster is a general-purpose IT computer located on remote premises such as a factory, retail store, hotel, distribution center, or bank – and is typically used to run enterprise application workloads and shared services.

Edge nodes can also live within network facilities such as central offices, regional data centers, and hub locations operated by a network provider, or a metro facility operated by a colocation provider.

An edge cluster is typically an industrial PC, a racked server, or an IT appliance.

Often, edge clusters include GPU/VPU hardware.

Tools for devices to data centers

IBM Edge Application Manager (IEAM) and Red Hat have created reference architectures and tools to manage workloads across CPU and GPU/VPU compute resources.

Customers want simplicity. IEAM provides it with a single pane of glass to manage and orchestrate workloads from core to edge, across multiple clouds.

For edge clusters running Red Hat OpenShift Container Platform (OCP), a Kubernetes-based GPU/VPU Operator solves the problem of needing unique operating system (OS) images for GPU and CPU nodes. Instead, the GPU Operator bundles everything you need to support the GPU – the driver, container runtime, device plug-in, and monitoring – and deploys it with a Helm chart. Now, a single gold master image covers both CPU and GPU nodes.
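
As a rough sketch only – assuming the NVIDIA GPU Operator and its publicly hosted Helm chart (on OCP the same operator is more typically installed from OperatorHub) – deploying the operator onto a cluster comes down to a few commands:

  # Add NVIDIA's chart repository and install the GPU Operator into its own namespace
  helm repo add nvidia https://helm.ngc.nvidia.com/nvidia
  helm repo update
  helm install gpu-operator nvidia/gpu-operator --namespace gpu-operator --create-namespace

Once running, the operator labels the nodes that have GPUs and schedules the driver, device plug-in, and monitoring containers onto only those nodes – which is what lets the cluster keep a single gold master OS image everywhere.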

Caution: Avoid fragmentation and friction with open source

This is indeed an exciting time for the edge computing community, as seen by the acceleration of innovation and emerging use cases and architectures.

However, there is an area of concern as it relates to fragmentation and friction in this emerging space.

Because the emerging edge market is enormous, there is a risk that some incumbents or niche players may be tempted to “go it alone,” trying to secure and defend a small corner (fragment) of a large space with a proprietary solution. If too many stakeholders do this – edge computing may fail to reach its potential.

This approach can be dangerous for companies for three reasons:

(1)  While an isolated, walled-garden (defensive) approach may work in the short term, over time isolated technology stacks may get left behind.

(2)  Customers are increasingly wary of attempts at vendor lock-in and will source more flexible solutions.

(3)  Innovation is a team sport (e.g. Linux, Python).

Historically, emergent technologies can also encounter friction when key industry participants or standards organizations are not working closely enough together (GSM/CDMA; VHS/Beta or HD-DVD/Blu-ray; Industrial IoT; Digital Twins).

So, what can we do to encourage collaboration?

The answer is open source.

Open source to reduce friction and increase collaboration

The IBM Edge team believes working with and through the open source community is the right approach to help edge computing evolve and reach its potential in the coming years.

IBM has a long history and strong commitment to open source. IBM was one of the earliest champions of communities like Linux, Apache, and Eclipse, pushing for open licenses, open governance, and open standards.

IBM engineers began contributing to Linux and helped to establish the Linux Foundation in 2000. In 1999, we helped to create the Apache Software Foundation (ASF) and supported the creation of the Eclipse Foundation in 2004 – providing open source developers a neutral place to collaborate and innovate in the open.

Continuing our tradition of support for open source collaboration, IBM and Red Hat are active members of the Linux Foundation’s LF Edge:

  • LF Edge is an umbrella organization for several projects that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system.
  • By bringing together industry leaders, LF Edge will create a common framework for hardware and software standards and best practices critical to sustaining current and future generations of IoT and edge devices.
  • LF Edge fosters collaboration and innovation across multiple industries, including industrial manufacturing, cities and government, energy, transportation, retail, home and building automation, automotive, logistics, and health care.

IBM is an active contributor to Open Horizon – one of the LF Edge projects and the core of IBM Edge Application Manager. LF Edge’s Open Horizon is an open source platform for managing the service software lifecycle of containerized workloads and related machine learning assets. It enables autonomous management of applications deployed to distributed, web-scale fleets of edge computing nodes – clusters and devices based on Kubernetes and Docker – all from a central management hub.
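
To make this concrete, here is a hedged sketch of registering a node against the management hub – assuming an edge node that already has the Open Horizon agent and hzn CLI installed, and a hypothetical organization and pattern name (myorg/pattern-example):

  # List the deployment patterns published in the organization's exchange
  hzn exchange pattern list
  # Register this node; the agent negotiates agreements with the hub and pulls the containers
  hzn register --pattern "myorg/pattern-example"
  # Confirm which workloads have been agreed and deployed to this node
  hzn agreement list

The key design point is autonomy: once registered, the node pulls its own workloads according to the patterns and policies it matches – nobody pushes containers to individual devices by hand.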

Open Horizon is already working with several other LF Edge projects, including EdgeX Foundry, Fledge, and SDO (Secure Device Onboard).

SDO makes it easy and secure to configure edge devices and associate them with an edge management hub. Devices built with SDO can be added as edge nodes simply by importing their associated ownership vouchers and then powering on the devices.
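
As a rough, hedged illustration – assuming the SDO components are enabled on the management hub, that voucher.json is a hypothetical ownership voucher file supplied with the device, and noting that the exact hzn subcommand and flags can vary between releases – the flow looks something like this:

  # Import the device's ownership voucher into the management hub
  hzn sdo voucher import voucher.json
  # Power on the device: it contacts the hub, is configured as an edge node,
  # and begins receiving its assigned workloads

From then on, the device behaves like any other registered edge node.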

Additional Resources for Open Horizon

Open-Horizon documentation: https://open-horizon.github.io

Open-Horizon GitHub (source code): https://github.com/open-horizon

Example programs for Open-Horizon: https://github.com/open-horizon/examples

Open-Horizon Playlist on YouTube: https://bit.ly/34Xf0Ge