April 6, 2021

How Edge computing enables Artificial Intelligence


By Utpal Mangla (VP & Senior Partner; Global Leader: IBM’s Telecom Media Entertainment Industry Center of Competence), 
Ashek Mahmood (IoT Edge AI – Partner & Practice Lead, IBM), and Luca Marchi (Associate Partner – Innovation Leader – IBM’s Telecom Media Entertainment Industry Center of Competence)

Edge computing as an enabler

Artificial Intelligence is the engine of the most recent changes in business. Enterprises are infusing AI into their processes to digitally transform the way they operate.

Operations-intensive industries are transforming with AI-powered IoT applications running at the edge, in the physical environment. The number of connected, intelligent devices already runs into the billions, and they generate more data every day. These devices are, or soon will be, connected via 5G, enabling them to share enriched data such as video and live streams.

These use cases raise concerns about latency, network bandwidth, privacy, and user experience. Edge computing, as an enabler of the dynamic delivery of computing power, answers these concerns.

Edge is faster, allowing near-instantaneous computation even in the absence of 5G. Low latency is fundamental in mission-critical use cases such as self-driving cars, remote healthcare, and certain manufacturing applications; AI alone would not be sufficient without the ability to make a decision in a matter of milliseconds. Edge also improves efficiency: local computation reduces the load on network bandwidth, generating savings for the enterprise.
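The bandwidth savings can be made concrete with a back-of-the-envelope estimate. All figures below (camera count, bitrate, event sizes) are illustrative assumptions, not numbers from the article:

```python
# Illustrative estimate of bandwidth saved by running inference at the
# edge and shipping only compact detection events instead of raw video.
# Every number here is an assumption chosen for the example.

def uplink_bytes_per_day(cameras, bitrate_mbps):
    """Bytes/day needed to stream raw video from all cameras to the cloud."""
    seconds_per_day = 24 * 60 * 60
    return cameras * bitrate_mbps * 1_000_000 / 8 * seconds_per_day

def event_bytes_per_day(cameras, events_per_hour, event_kb):
    """Bytes/day if only small event messages leave the edge."""
    return cameras * events_per_hour * 24 * event_kb * 1_000

raw = uplink_bytes_per_day(cameras=50, bitrate_mbps=4)      # ~2.16 TB/day
events = event_bytes_per_day(cameras=50, events_per_hour=10, event_kb=2)
savings = 1 - events / raw
print(f"raw: {raw / 1e12:.2f} TB/day, events: {events / 1e6:.0f} MB/day, "
      f"saved: {savings:.4%}")
```

Even with modest assumptions, processing locally cuts uplink traffic by several orders of magnitude, which is what makes many of these use cases financially viable.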

In addition, edge boosts use-case efficiency by allowing architects to deploy computational power and AI where they make the most sense on the network: for example, directly on an oil rig to support a worker-safety use case.

Finally, edge computing enables data security: deploying computational capabilities to a specific location at the edge of the network makes it possible to leverage data that is not supposed to leave that location and cannot be processed on a public cloud. 
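The data-security point amounts to a routing policy: data that must not leave a site is processed on the local edge node rather than uploaded. A minimal sketch, with hypothetical tag names chosen for the example:

```python
# Minimal sketch of a data-locality policy: records carrying restricted
# tags are processed on the edge node and never handed to a cloud
# uploader. The tag names are hypothetical, for illustration only.

LOCAL_ONLY = {"pii", "medical", "site-restricted"}

def route(record):
    """Return 'edge' for restricted data, 'cloud' otherwise."""
    tags = set(record.get("tags", []))
    return "edge" if tags & LOCAL_ONLY else "cloud"

print(route({"tags": ["medical"]}))    # stays on site
print(route({"tags": ["telemetry"]}))  # safe to send to the cloud
```

In a real deployment the policy would be enforced by the edge platform itself, not by application code, but the decision logic is the same.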

Applying edge computing to mission critical situations

One good example of an edge application is the monitoring of public venues, for instance a train station. A train station is a complex and critical piece of infrastructure: a large, densely crowded area where security is paramount. Edge computing allows the implementation of an innovative protection system through real-time video analysis and data processing, designed to recognize specific situations such as unattended objects, people lying on the ground, or abnormal flows of people. This information is critical for triggering a prompt reaction from the authorities when an incident occurs or someone needs help.

The use case entails 24/7 video streams from several cameras spread across the station. Centralizing the AI models in the cloud would be very inefficient: streaming and storing the video would be very expensive. Edge computing, instead, enables the deployment of the visual recognition machine learning model on a network node in the station: the data never leaves the station, the need for storage and bandwidth is reduced, and the use case becomes financially viable. Keeping the AI models at the edge also reduces latency and speeds up the alert and reaction processes.
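The station scenario boils down to a loop that analyzes frames locally and forwards only actionable alerts. A hedged sketch, where the detection model is a stub standing in for a real vision model:

```python
# Sketch of an on-premises analytics loop for the station use case.
# `detect` is a stub standing in for a real visual recognition model;
# frames here are plain dicts rather than actual camera images.

def detect(frame):
    """Stub for an edge-deployed vision model: returns event labels."""
    return frame.get("events", [])

ACTIONABLE = {"unattended_object", "person_down"}

def monitor(frames, alert_sink):
    """Analyze every frame locally; forward only actionable alerts."""
    for frame in frames:
        for event in detect(frame):
            if event in ACTIONABLE:
                alert_sink.append({"ts": frame["ts"], "event": event})

alerts = []
frames = [
    {"ts": 0, "events": []},
    {"ts": 1, "events": ["unattended_object"]},
    {"ts": 2, "events": ["crowd_flow_normal"]},  # detected, but not forwarded
]
monitor(frames, alerts)
print(alerts)  # [{'ts': 1, 'event': 'unattended_object'}]
```

The raw frames are discarded on site; only the small alert records ever cross the network, which is exactly the storage and bandwidth saving described above.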

Continuing with the theme of the rail industry, safety inspection of trains is being automated with AI-powered vision systems running at the edge of the railway tracks. When a train passes through a portal of cameras, it can generate 10,000+ images capturing a 360° view of the cars, and there can be 200+ visual inspection use cases. Complex cases may require object detection, measurement, image stitching, and augmentation through deep learning neural networks. An intelligent edge solution for this case will place the AI systems in close proximity to the cameras and will rely on a full-stack technology architecture to optimize data- and compute-intensive workloads.
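The portal workflow described above can be sketched as: group the images from one train pass by car, then fan each group out to the relevant inspection checks. The check names and pass/fail logic below are illustrative assumptions, not the real inspection suite:

```python
# Hedged sketch of the camera-portal workflow: images from one train
# pass are grouped per car, then each group is run through a set of
# inspection checks. Check names and logic are illustrative only.

from collections import defaultdict

INSPECTIONS = {
    "brake_pad": lambda imgs: all(i["ok"] for i in imgs),
    "coupler": lambda imgs: all(i["ok"] for i in imgs),
}

def inspect_train(images):
    """Group images by car, run every inspection per car, and
    return the cars that failed at least one check."""
    by_car = defaultdict(list)
    for img in images:
        by_car[img["car"]].append(img)
    failures = {}
    for car, imgs in by_car.items():
        failed = [name for name, check in INSPECTIONS.items()
                  if not check(imgs)]
        if failed:
            failures[car] = failed
    return failures

images = [
    {"car": 1, "ok": True},
    {"car": 2, "ok": False},  # simulated defect on car 2
]
print(inspect_train(images))  # {2: ['brake_pad', 'coupler']}
```

At real scale, each check would be a deep learning model and the fan-out would be parallelized across local accelerators, but the aggregation structure is the same.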

The image below gives a high-level view of the architecture components that come into play here.

An integrated pipeline of edge AI applications is required to run autonomous aggregation and scheduling of these AI models. Hardware acceleration and GPU/CPU optimization practices are often needed to achieve the desired performance and throughput. Different technology solutions provide distinct competitive advantages; this is as true of AI at the edge as it is of high-precision computing and transaction processing. The future is one of heterogeneous compute architectures, orchestrated in an ecosystem based on open standards, helping to unleash the potential of data and advanced computation to create immense economic value.
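One simple form the scheduling of models across heterogeneous hardware can take is a greedy placer: each model goes to the least-loaded device of the kind it needs. The device and model names below are made up for the example:

```python
# Illustrative greedy scheduler for placing AI models on heterogeneous
# devices: each model is assigned to the least-loaded device of the
# required kind. Device/model names and costs are assumptions.

def schedule(models, devices):
    """models: list of (name, cost, required_device_kind).
    devices: dict name -> {"kind": ..., "load": float}.
    Returns a {model_name: device_name} placement."""
    placement = {}
    # Place the most expensive models first to balance load better.
    for name, cost, kind in sorted(models, key=lambda m: -m[1]):
        candidates = [d for d, info in devices.items() if info["kind"] == kind]
        best = min(candidates, key=lambda d: devices[d]["load"])
        devices[best]["load"] += cost
        placement[name] = best
    return placement

devices = {"gpu0": {"kind": "gpu", "load": 0.0},
           "gpu1": {"kind": "gpu", "load": 0.0},
           "cpu0": {"kind": "cpu", "load": 0.0}}
models = [("detector", 5.0, "gpu"), ("stitcher", 3.0, "gpu"),
          ("classifier", 1.0, "cpu")]
print(schedule(models, devices))
# {'detector': 'gpu0', 'stitcher': 'gpu1', 'classifier': 'cpu0'}
```

A production scheduler would also weigh memory, batching, and model affinity, but the core trade-off (matching workload kind to accelerator and balancing load) is the one sketched here.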

A step forward

A robust edge computing architecture will develop into and support a hybrid AI Ops architecture that manages workloads running on heterogeneous edge, core, and cloud nodes. These advances will allow artificial intelligence to take a step forward toward adoption in mission-critical applications. These next-generation use cases will strengthen the partnership between humans and AI toward a sustainable future.
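The edge/core/cloud decision such a hybrid architecture makes for each workload can be sketched as a placement rule: run at the cheapest (most central) tier whose latency still meets the budget, unless the data must stay on site. The tier latencies below are illustrative assumptions:

```python
# Hedged sketch of workload placement across a hybrid edge/core/cloud
# topology. Each workload declares a latency budget and whether its
# data may leave the site. Tier round-trip times are assumed values.

TIERS = [("cloud", 120), ("core", 30), ("edge", 5)]  # (tier, round trip in ms)

def place(latency_budget_ms, data_may_leave_site=True):
    """Pick the most central tier that meets both constraints."""
    for tier, rtt in TIERS:  # ordered from most central to most local
        if rtt <= latency_budget_ms and (data_may_leave_site or tier == "edge"):
            return tier
    return None  # budget unachievable even at the edge

print(place(200))                              # cloud
print(place(50))                               # core
print(place(10))                               # edge
print(place(200, data_may_leave_site=False))   # edge
```

Real placement also weighs cost, node capacity, and failover, but latency budgets and data locality are the two constraints the article singles out, and they alone already force workloads down toward the edge.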

