Type: Conference Paper
Rights / license: In Copyright - Non-Commercial Use Permitted
Abstract: In this work, we present a perception-aware path-planning pipeline for Unmanned Aerial Vehicle (UAV) navigation in challenging environments. The objective is to reach a given destination safely and accurately while relying on monocular camera-based state estimators, such as keyframe-based Visual-Inertial Odometry (VIO) systems. Motivated by recent advances in deep-learning-based semantic segmentation, our path-planning architecture takes into account the semantic classes of scene regions that are perceptually more informative than others. Using this semantic information to compute the next best action with respect to perception quality, the proposed planning strategy avoids both texture-less regions and problematic areas, such as lakes and oceans, that may cause large drift or failures in the robot's pose estimation. We design a hierarchical planner composed of an A* path-search step followed by B-Spline trajectory optimization. While the A* search steers the UAV towards informative areas, the optimizer keeps the most promising landmarks in the camera's field of view. We extensively evaluate our approach in a set of photo-realistic simulations, showing a remarkable improvement over the state of the art in active perception.
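The two-stage structure described in the abstract can be sketched in miniature: an A* search over a grid whose per-cell cost penalizes perceptually poor regions (e.g. water or texture-less terrain), followed by a smoothing pass. The grid, the cost values, and the use of Chaikin corner-cutting (whose limit curve is a quadratic B-Spline) as a stand-in for the paper's B-Spline trajectory optimization are illustrative assumptions, not the authors' implementation.

```python
import heapq
import itertools

def a_star(cost, start, goal):
    """A* over a 2D grid; cost[r][c] penalizes perceptually poor cells,
    so cheap paths stay over informative terrain."""
    rows, cols = len(cost), len(cost[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()                                  # heap tie-breaker
    frontier = [(h(start), next(tie), 0.0, start, None)]
    parent, best_g = {}, {start: 0.0}
    while frontier:
        _, _, g, cur, prev = heapq.heappop(frontier)
        if cur in parent:
            continue                                         # already expanded
        parent[cur] = prev
        if cur == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols:
                ng = g + 1.0 + cost[nxt[0]][nxt[1]]          # step + semantic cost
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), next(tie), ng, nxt, cur))
    path = [goal]                                            # walk parents back to start
    while parent.get(path[-1]) is not None:
        path.append(parent[path[-1]])
    return path[::-1]

def chaikin(path, iters=2):
    """Corner-cutting smoothing; the limit curve is a quadratic B-Spline,
    used here as a lightweight stand-in for trajectory optimization."""
    pts = [(float(r), float(c)) for r, c in path]
    for _ in range(iters):
        out = [pts[0]]
        for a, b in zip(pts, pts[1:]):
            out.append((0.75 * a[0] + 0.25 * b[0], 0.75 * a[1] + 0.25 * b[1]))
            out.append((0.25 * a[0] + 0.75 * b[0], 0.25 * a[1] + 0.75 * b[1]))
        out.append(pts[-1])
        pts = out
    return pts

# A 5x5 map with a high-cost "lake" column broken by one traversable gap:
# the planner detours through the gap rather than crossing the water.
lake = [
    [0, 0, 9, 0, 0],
    [0, 0, 9, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 9, 0, 0],
    [0, 0, 9, 0, 0],
]
path = a_star(lake, (0, 0), (0, 4))
smooth = chaikin(path)
```

The semantic map here is a hand-written cost grid; in the paper it would come from a learned segmentation, and the smoothing stage would additionally optimize for keeping promising landmarks in the camera's field of view.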
Book title: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Pages / Article No.
Subject: ROBOTICS; Path Planning; Visual Inertial Odometry
Organisational unit: 09559 - Chli, Margarita (former)
Notes: Conference lecture held on October 27, 2020. Due to the coronavirus (COVID-19) pandemic, the conference was conducted virtually.