Self-Supervised Learning of LiDAR Odometry for Robotic Applications
Date
2020-11-10
Publication Type
Working Paper
ETH Bibliography
yes
Abstract
Reliable robot pose estimation is a key building block of many robot autonomy pipelines, with LiDAR localization being an active research domain. In this work, a versatile self-supervised LiDAR odometry estimation method is presented, enabling efficient utilization of all available LiDAR data while maintaining real-time performance. The proposed approach selectively applies geometric losses during training, being cognizant of the amount of information that can be extracted from scan points. In addition, no labeled or ground-truth data is required, making the presented approach suitable for pose estimation in applications where accurate ground truth is difficult to obtain. Furthermore, the presented network architecture is applicable to a wide range of environments and sensor modalities without requiring any network or loss-function adjustments. The proposed approach is thoroughly tested in both indoor and outdoor real-world applications through a variety of experiments using legged, tracked, and wheeled robots, demonstrating the suitability of learning-based LiDAR odometry for complex robotic applications.
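The paper itself is not reproduced in this record, but the "geometric losses" mentioned in the abstract are, in self-supervised LiDAR odometry, commonly point-to-plane residuals between a pose-transformed source scan and its target scan. A minimal NumPy sketch follows; the function name, the pre-computed point correspondences, and the normals are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def point_to_plane_loss(source_pts, target_pts, target_normals, R, t):
    """Mean squared point-to-plane residual (illustrative, not the paper's code).

    source_pts:     (N, 3) points from the source scan
    target_pts:     (N, 3) pre-associated points from the target scan
    target_normals: (N, 3) unit surface normals at the target points
    R, t:           estimated rotation (3x3) and translation (3,)
    """
    # Transform the source scan with the estimated pose.
    transformed = source_pts @ R.T + t
    # Project each point-to-point error onto the local surface normal.
    residuals = np.einsum("ij,ij->i", transformed - target_pts, target_normals)
    return np.mean(residuals ** 2)
```

With a perfect pose estimate the residuals vanish, which is what a self-supervised training signal exploits: no ground-truth poses are needed, only geometric consistency between consecutive scans.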
Publication status
published
Pages / Article No.
2011.05418
Publisher
Cornell University
Subject
deep learning; LiDAR data processing; robot localization; robotics
Organisational unit
09570 - Hutter, Marco / Hutter, Marco
Funding
852044 - Learning Mobility for Real Legged Robots (EC)
188596 - Perceptive Dynamic Locomotion on Rough Terrain (SNF)