Akraino Release 9: Powering the Future of Physical AI at the Edge
The LF Edge project Akraino is preparing for its upcoming Release 9, which will mark a major step toward enabling Physical AI, the convergence of artificial intelligence, edge computing, and real-world interaction.
What Is Physical AI?
Physical AI represents a new phase in intelligent systems, combining recognition, prediction, and action in the physical world. Unlike purely digital AI systems, Physical AI connects sensing and decision-making with tangible movements and real-time feedback through robotics, drones, and other smart systems operating in real environments.
Akraino is evolving to meet this next phase by combining the fundamentals of edge computing (high performance, energy efficiency, cloud-native architecture, telecom-grade reliability, IoT integration, and standards compliance) with Physical AI capabilities. As this vision advances, seamless coordination across edge, cloud, network, and disconnected (autonomous) operations will play an increasingly critical role.
Introducing the “Physical AI at the Edge” Blueprint Family
As part of Release 9, Akraino is introducing a new blueprint family: Physical AI at the Edge, designed to enable real-time, intelligent coordination between machines, humans, and their environments.
These blueprints will focus on several key areas:
1. Machine-to-Machine Communication and Collaboration
As fleets of drones, robots, and autonomous systems expand, reliable coordination becomes essential. The new Akraino blueprints aim to define open frameworks for collaboration that ensure safety, interoperability, and efficiency across vendors and operators.
Example use cases:
- Drone deliveries by multiple operators, coordinated to avoid collisions and property damage
- Remote landing procedures that adapt to battery depletion or changing weather conditions
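To make the machine-to-machine coordination idea concrete, here is a minimal sketch of the kind of position report two drone operators might exchange, together with a naive separation check. The message fields, thresholds, and flat-earth distance approximation are all illustrative assumptions, not part of any Akraino blueprint or specification.

```python
import json
import math
from dataclasses import dataclass, asdict

# Hypothetical coordination message; field names are illustrative,
# not drawn from any Akraino-defined schema.
@dataclass
class PositionReport:
    drone_id: str
    lat: float          # degrees
    lon: float          # degrees
    alt_m: float        # metres above ground
    heading_deg: float

def too_close(a: PositionReport, b: PositionReport,
              min_horizontal_m: float = 50.0,
              min_vertical_m: float = 10.0) -> bool:
    """Flat-earth approximation of separation; adequate at the
    short ranges where a collision check actually matters."""
    m_per_deg = 111_320.0  # metres per degree of latitude
    dx = (a.lon - b.lon) * m_per_deg * math.cos(math.radians(a.lat))
    dy = (a.lat - b.lat) * m_per_deg
    horizontal = math.hypot(dx, dy)
    vertical = abs(a.alt_m - b.alt_m)
    return horizontal < min_horizontal_m and vertical < min_vertical_m

a = PositionReport("op1-drone7", 35.6812, 139.7671, 30.0, 90.0)
b = PositionReport("op2-drone3", 35.6813, 139.7672, 32.0, 270.0)
print(json.dumps(asdict(a)))                 # what would go on the wire
print("separation alert:", too_close(a, b))  # -> separation alert: True
```

An open framework would standardise the message schema and the safety envelope so that drones from different vendors can run the same check against each other's reports.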
2. Machine-to-Human Interaction
Physical AI systems must interact safely and effectively with people. Akraino’s upcoming blueprints will introduce human-aware interaction models to help systems respond to voice commands, environmental cues, and safety alerts.
Example use cases:
- Voice-aware machines that respond to human instructions in safety-critical scenarios
- Automated notifications, reporting, and logging for transparency and traceability
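As a sketch of the transparency and traceability goal, the snippet below builds a structured audit record for a machine-to-human event such as a recognised voice command. The schema and field names are illustrative assumptions, not an Akraino-defined format.

```python
import json
import time
import uuid

# Minimal structured audit record for machine-to-human events;
# the schema here is illustrative, not an Akraino-defined format.
def audit_event(actor: str, action: str, detail: str) -> dict:
    return {
        "event_id": str(uuid.uuid4()),   # unique, for traceability
        "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "actor": actor,                  # which machine acted
        "action": action,               # what it did
        "detail": detail,               # why (e.g. the trigger)
    }

entry = audit_event("forklift-12", "emergency_stop",
                    "voice command 'stop' recognised")
print(json.dumps(entry, indent=2))
```

Emitting every safety-relevant action as a structured record like this is what makes later reporting and review possible.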
3. Environmental Sensing and Management
Akraino edge architectures enable drones and mobile robots to perform environmental monitoring, prediction, and response in real time.
Example use cases:
- Flood and wildfire detection, prediction, and rescue coordination
- First-responder machine assistants supporting emergency operations
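A toy version of the detection-and-prediction idea: a rolling-average monitor over a river-level sensor feed that raises an alert when the latest reading jumps above the recent baseline. The window size and threshold are illustrative assumptions, not values from any Akraino blueprint.

```python
from collections import deque

# Toy anomaly detector for a river-level sensor feed; the window
# and threshold are illustrative, not from any Akraino blueprint.
class LevelMonitor:
    def __init__(self, window: int = 10, rise_alert_m: float = 0.5):
        self.readings = deque(maxlen=window)
        self.rise_alert_m = rise_alert_m

    def update(self, level_m: float) -> bool:
        """Return True when the new reading exceeds the rolling
        average by more than the threshold (possible flood onset)."""
        alert = (len(self.readings) > 0 and
                 level_m - sum(self.readings) / len(self.readings)
                 > self.rise_alert_m)
        self.readings.append(level_m)
        return alert

monitor = LevelMonitor()
for level in [1.0, 1.1, 1.0, 1.1, 1.9]:
    if monitor.update(level):
        print("rising water alert at", level)  # fires only at 1.9
```

In a real deployment this logic would run on the edge node next to the sensor, so an alert can be raised even when the link to the cloud is down.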
Built for the Next Generation of Connectivity
Release 9 will align with LTE, 5G-Advanced, and Beyond-5G specifications to support ultra-reliable, low-latency Physical AI applications. It will also explore cellular and satellite communication for operations where cloud connectivity is unavailable, leveraging multicast and peer-to-peer (P2P) methods for continued coordination in disconnected environments.
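To illustrate the disconnected-operation idea, here is a minimal sketch of a status heartbeat that edge nodes could exchange over link-local UDP multicast when the backhaul is unavailable. The multicast group, port, and message fields are illustrative assumptions, not part of any Akraino specification.

```python
import json
import socket
import struct
import time

# Illustrative group/port, not defined by any Akraino release.
MCAST_GROUP = "239.255.42.42"
MCAST_PORT = 5007

def encode_heartbeat(node_id: str, battery_pct: float) -> bytes:
    """Serialise a minimal status heartbeat for peers to consume."""
    return json.dumps({
        "node": node_id,
        "battery": battery_pct,
        "ts": time.time(),
    }).encode("utf-8")

def decode_heartbeat(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))

def open_sender() -> socket.socket:
    """UDP socket configured for multicast with TTL=1, which keeps
    the traffic on the local segment (useful when the cloud link
    is down and only nearby peers can hear you)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL,
                    struct.pack("b", 1))
    return sock

msg = encode_heartbeat("edge-node-1", 87.5)
print(decode_heartbeat(msg)["node"])  # -> edge-node-1
# open_sender().sendto(msg, (MCAST_GROUP, MCAST_PORT))  # actual send
```

Multicast lets one node reach all local peers with a single datagram; a P2P overlay would take over where nodes are not on the same segment.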
Why It Matters
As industries move toward autonomous systems and intelligent infrastructure, the need for open, interoperable, and edge-optimized platforms becomes increasingly vital. Akraino’s upcoming Release 9 positions the project as a key enabler of this evolution, providing the open source foundation for a world where intelligent systems can act, learn, and collaborate in the physical world.
Get Involved
Interested in contributing to the Akraino community? Reach out to the Akraino TSC members at tsc@lists.akraino.org.
The community can help you get started with Akraino blueprints, share your use cases, and publish your open source software stacks.
