LiDAR-guided Object Search and Detection in Subterranean Environments
OPEN ACCESS
Date
2022
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Detecting objects of interest, such as human survivors, safety equipment, and structure access points, is critical to any search-and-rescue operation. Robots deployed in such time-sensitive efforts rely on their onboard sensors to perform their designated tasks. However, because disaster response operations are predominantly conducted under perceptually degraded conditions, the performance of commonly used sensors such as visual cameras and LiDARs deteriorates. In response, this work presents a method that exploits the complementary nature of vision and depth sensors, leveraging multi-modal information to aid object detection at longer distances. In particular, depth and intensity values from sparse LiDAR returns are used to generate proposals for objects present in the environment. A Pan-Tilt-Zoom (PTZ) camera system then performs a directed search over these proposals, adjusting its pose and zoom level to detect and classify objects in difficult environments. The proposed approach has been thoroughly verified using an ANYmal quadruped robot in underground settings and on datasets collected during the DARPA Subterranean Challenge finals.
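To illustrate the pipeline the abstract describes, the sketch below shows one plausible minimal form of it: high-intensity LiDAR returns within range are kept as object proposals, and each proposal's 3D position is converted into a pan/tilt command and a zoom factor for a PTZ camera. All function names, thresholds, and field-of-view parameters here are hypothetical illustrations, not the paper's actual implementation.

```python
import math

def lidar_proposals(points, intensity_thresh=0.8, max_range=40.0):
    """Filter sparse LiDAR returns into object proposals.

    points: iterable of (x, y, z, intensity) tuples in the sensor frame.
    Returns (x, y, z, range) tuples for returns that are both bright
    enough and close enough to be worth a directed camera look.
    Thresholds are illustrative placeholders, not values from the paper.
    """
    proposals = []
    for x, y, z, i in points:
        r = math.sqrt(x * x + y * y + z * z)
        if i >= intensity_thresh and r <= max_range:
            proposals.append((x, y, z, r))
    return proposals

def ptz_command(proposal, hfov_deg=60.0, target_fov_deg=5.0):
    """Convert one 3D proposal into a (pan, tilt, zoom) command.

    Pan and tilt (degrees) point the camera's optical axis at the
    proposal; zoom narrows the assumed wide field of view hfov_deg
    down to target_fov_deg so a distant object fills more pixels.
    """
    x, y, z, _r = proposal
    pan = math.degrees(math.atan2(y, x))                  # yaw toward target
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))  # pitch toward target
    zoom = max(1.0, hfov_deg / target_fov_deg)            # optical zoom factor
    return pan, tilt, zoom
```

In a full system, each zoomed-in view would then be passed to an image-based object detector for classification; the sketch only covers the proposal and camera-pointing steps.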
Publication status
published
Book title
2022 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)
Pages / Article No.
41 - 46
Publisher
IEEE
Event
2022 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR 2022)
Subject
Robotics; Search and rescue robots; LiDAR; Object detection; Computer vision
Organisational unit
09570 - Hutter, Marco
Notes
Conference lecture held on November 8, 2022.