- Conference Paper
Rights / license: In Copyright - Non-Commercial Use Permitted
Abstract: Visual-Inertial Odometry (VIO) is widely used and researched to control and aid the navigation of robots, especially in the absence of absolute position measurements such as GPS. However, when the observable landmarks in the scene lie far away, as in high-altitude flights for example, the fidelity of the metric scale estimate in VIO degrades greatly. To tackle this issue, in this work we utilize the virtual stereo setup formed by two Unmanned Aerial Vehicles (UAVs), each equipped with one camera and one Inertial Measurement Unit (IMU), exploiting their view overlap and the relative distance measurements between them from onboard Ultra-wideband (UWB) modules to enable collaborative VIO. In particular, we propose a decentralized collaborative estimation scheme, in which each agent holds its own local map, achieving low pose-estimation latency while ensuring the consistency of each agent's estimates via consensus-based optimization. Following a thorough evaluation in photorealistic simulations, we demonstrate the effectiveness of the approach at high-altitude flights of up to 160 m, going significantly beyond the capabilities of state-of-the-art VIO methods. Finally, we show the advantage of actively adjusting the baseline on the fly over keeping a fixed target baseline, resulting in a significant reduction of the estimation error.
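The intuition behind the abstract's virtual-stereo argument can be illustrated with the standard pinhole stereo relations. The sketch below is not the paper's implementation; the focal length, scene depth, and disparity-noise values are illustrative assumptions. It shows why a wide UAV-to-UAV baseline (measurable via UWB ranging) gives a far better-conditioned depth, and hence scale, estimate at high altitude than the small baseline of a single onboard stereo rig.

```python
def virtual_stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px


def depth_std(focal_px: float, baseline_m: float, depth_m: float, disparity_std_px: float) -> float:
    """First-order depth uncertainty: sigma_Z = Z^2 * sigma_d / (f * B)."""
    return depth_m ** 2 * disparity_std_px / (focal_px * baseline_m)


if __name__ == "__main__":
    f = 600.0        # focal length in pixels (assumed)
    Z = 160.0        # scene depth in metres, matching the high-altitude regime
    sigma_d = 0.5    # disparity matching noise in pixels (assumed)

    # Compare a typical onboard stereo baseline with a wide inter-UAV baseline.
    for B in (0.2, 20.0):
        d = f * B / Z                       # disparity this baseline would produce
        print(f"B = {B:5.1f} m -> disparity {d:6.2f} px, "
              f"depth std ~ {depth_std(f, B, Z, sigma_d):7.2f} m")
```

With these numbers, the 0.2 m baseline yields a depth standard deviation of roughly 107 m at 160 m altitude (i.e. depth, and therefore metric scale, is essentially unobservable), while a 20 m baseline brings it down to about 1 m. This is the degradation, and the remedy, that the abstract describes qualitatively.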
Subject: Collaborative SLAM; Visual Inertial Odometry; Unmanned Aerial Vehicle (UAV); Sensor Fusion
Organisational unit: 09559 - Chli, Margarita (SNF-Professur)
183720 - Collaborative Vision-based Perception, Towards Intelligent Robotic Teams (SNF)