A 3D Mixed Reality Interface for Human-Robot Teaming
METADATA ONLY
Date
2024
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
This paper presents a mixed-reality human-robot teaming system. It allows human operators to see in real time where robots are located, even when the robots are out of line of sight. The operator can also visualize the map that the robots build of their environment and can easily send robots to new goal positions. The system consists mainly of a mapping module and a control module. The mapping module is a real-time multi-agent visual SLAM system that co-localizes all robots and mixed-reality devices in a common reference frame. Visualizations in the mixed-reality device then allow operators to see a virtual life-sized representation of the cumulative 3D map overlaid onto the real environment. As such, the operator can effectively "see through" walls into other rooms. To control robots and send them to new locations, we propose a drag-and-drop interface: an operator can grab any robot hologram in a 3D mini-map and drag it to a new desired goal pose. We validate the proposed system through a user study and real-world deployments. We make the mixed-reality application publicly available at github.com/cvg/HoloLens_ros.
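The co-localization described in the abstract can be illustrated with a minimal sketch: once the multi-agent SLAM system has estimated each robot's pose in the common map frame, any point expressed in a robot's local frame (e.g. a dragged goal pose) can be mapped into the shared frame with a homogeneous transform. The function names and the example transform below are hypothetical illustrations, not code from the paper's released application.

```python
import math

def make_transform(yaw, tx, ty, tz):
    """Build a 4x4 homogeneous transform: yaw rotation about z plus translation.

    This stands in for the robot's SLAM-estimated pose in the common map frame.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c, -s, 0.0, tx],
        [s,  c, 0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def to_common_frame(T_map_robot, p_robot):
    """Express a point from the robot's local frame in the common map frame."""
    x, y, z = p_robot
    return tuple(
        row[0] * x + row[1] * y + row[2] * z + row[3]
        for row in T_map_robot[:3]
    )

# Example: a robot localized 2 m along x in the map, rotated 90 degrees about z.
T = make_transform(math.pi / 2, 2.0, 0.0, 0.0)
goal_in_map = to_common_frame(T, (1.0, 0.0, 0.0))  # a point 1 m ahead of the robot
```

Because every robot and mixed-reality device shares this map frame, a goal dragged in the operator's 3D mini-map can be handed to any robot without per-device calibration.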
Publication status
published
Book title
2024 IEEE International Conference on Robotics and Automation (ICRA)
Pages / Article No.
11327–11333
Publisher
IEEE
Event
41st IEEE International Conference on Robotics and Automation (ICRA 2024)
Organisational unit
03766 - Pollefeys, Marc / Pollefeys, Marc
Related publications and datasets
Is supplemented by: https://github.com/cvg/HoloLens_ros