Metadata only
Date
2024
Type
- Conference Paper
ETH Bibliography
yes
Abstract
Radiance field methods such as Neural Radiance Fields (NeRFs) or 3D Gaussian Splatting (3DGS) have revolutionized graphics and novel view synthesis. Their ability to synthesize novel viewpoints with photorealistic quality, and to capture complex volumetric and specular scenes, makes them an ideal visualization for robotic teleoperation setups. Direct camera teleoperation provides high-fidelity operation at the cost of maneuverability, while reconstruction-based approaches offer controllable scenes with lower fidelity. With this in mind, we propose replacing the traditional reconstruction-visualization components of the robotic teleoperation pipeline with online Radiance Fields, offering highly maneuverable scenes with photorealistic quality. This work makes three main contributions to the state of the art: (1) online training of Radiance Fields using live data from multiple cameras, (2) support for a variety of radiance methods, including NeRF and 3DGS, and (3) a visualization suite for these methods, including a virtual reality scene. To enable seamless integration with existing setups, these components were tested with multiple robots in multiple configurations and were displayed using traditional tools as well as a VR headset. The results across methods and robots were compared quantitatively against a mesh-reconstruction baseline, and a user study was conducted to compare the different visualization methods.
Publication status
published
Book title
2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Publisher
IEEE
Organisational unit
02261 - Center for Sustainable Future Mobility
Related publications and datasets
Is supplemented by: https://leggedrobotics.github.io/rffr.github.io/
Notes
Conference lecture held on October 18, 2024.