Circuits and Systems for Embodied AI: Exploring uJ Multi-Modal Perception for Nano-UAVs on the Kraken Shield
Date
2024
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Embodied AI requires pushing complex multi-modal models to the extreme edge for time-constrained tasks such as autonomous navigation of robots and vehicles. On small form-factor devices, e.g., nano-UAVs, these challenges are exacerbated by stringent constraints on energy efficiency and weight. In this paper, we explore embodied multi-modal AI-based perception for nano-UAVs with the Kraken shield, a 7 g multi-sensor (frame-based and event-based imagers) board based on Kraken, a 22 nm SoC featuring multiple acceleration engines for multi-modal event- and frame-based inference using spiking (SNN) and ternary (TNN) neural networks, respectively. Kraken can execute SNN real-time inference for depth estimation at 1.02 k inf/s and 18 μJ/inf, TNN real-time inference for object classification at 10 k inf/s and 6 μJ/inf, and real-time inference for obstacle avoidance at 221 frame/s and 750 μJ/inf.
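Assuming continuous operation at the quoted inference rates, the energy-per-inference figures in the abstract imply average power budgets for each engine. A minimal sketch (the rates and energies are from the abstract; the power computation is a back-of-the-envelope estimate, not a figure reported in the paper):

```python
# Implied average power per Kraken inference engine, assuming
# continuous inference at the rates quoted in the abstract.
# Power [W] = rate [inf/s] * energy [J/inf]
engines = {
    "SNN depth estimation": (1.02e3, 18e-6),       # 1.02 k inf/s, 18 uJ/inf
    "TNN object classification": (10e3, 6e-6),     # 10 k inf/s, 6 uJ/inf
    "obstacle avoidance": (221, 750e-6),           # 221 frame/s, 750 uJ/inf
}

for name, (rate, energy) in engines.items():
    power_mw = rate * energy * 1e3  # convert W to mW
    print(f"{name}: {power_mw:.2f} mW")
```

This gives roughly 18 mW, 60 mW, and 166 mW respectively, illustrating why per-inference energy in the μJ range is what makes such workloads feasible on a 7 g platform.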
Publication status
published
Book title
2024 IEEE European Solid-State Electronics Research Conference (ESSERC)
Pages / Article No.
1 - 4
Publisher
IEEE
Event
50th IEEE European Solid-State Electronics Research Conference (ESSERC 2024)
Subject
embodied AI; Nano-UAVs; Multi-Modal AI
Organisational unit
03996 - Benini, Luca