Mixed Reality Human-Robot Interface to Generate and Visualize 6DoF Trajectories: Application to Omnidirectional Aerial Vehicles
METADATA ONLY
Date
2023
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Omnidirectional aerial vehicles are an attractive tool for automated inspection tasks. Planning suitable trajectories in industrial environments is not trivial, however, and often requires human input. Existing trajectory planning tools generally rely on prior and accurate models of both the environment and the vehicle. Furthermore, their common 2D visualization for human operators is generally unsuitable for intuitive understanding of motions in SE(3). In this work, we exploit Mixed Reality to improve and simplify mission planning by allowing the user to generate and perceive a trajectory directly in the real environment. The operator can precisely and intuitively plan a dynamically feasible 6DoF trajectory by adding and modifying waypoints. Each waypoint is visualized as a holographic representation of the physical robot, including the camera frustum for visual inspection tasks. Dynamic and static holograms corresponding to spatial and temporal information of the resulting trajectory are also overlaid onto the real world, allowing an operator to quickly assess potential collisions and inspection coverage. We experimentally demonstrate the effectiveness of the developed application and indicate its efficiency based on related work. The encouraging results motivate future quantitative evaluations in the form of user studies.
Publication status
published
Book title
2023 International Conference on Unmanned Aircraft Systems (ICUAS)
Pages / Article No.
395–400
Publisher
IEEE
Event
International Conference on Unmanned Aircraft Systems (ICUAS 2023)