Vision-Based Navigation and Parameter Identification for Unmanned Aerial Vehicles
- Doctoral Thesis
Rights / license: In Copyright - Non-Commercial Use Permitted
With the ability to navigate freely in 3D space, Micro Aerial Vehicles (MAVs) can reach places that are otherwise inaccessible to humans. This is especially important for industrial inspection, where structures are large and need to be checked regularly. However, performing such inspections remotely is difficult for a human pilot because of the fast dynamics of MAVs and the difficulty of estimating distances to faraway objects. In this thesis, we therefore aim to advance the capabilities of such systems by developing a framework that can generate a 3D map of the environment and navigate between any two points within this map without human intervention. Since MAVs are heavily restricted in terms of payload, the primary sensors used for navigation in this work are a stereo camera and an Inertial Measurement Unit (IMU). These sensors introduce interesting challenges and must be considered in the design of each part of the visual navigation framework.

The first contribution of this thesis is in state estimation and control. We show that both can benefit strongly from a good dynamic model, which provides valuable information about the possible motions. Including this model in the state estimator, combined with a pressure sensor, renders the velocity and the two inclination angles observable, which is critical for control. Model Predictive Control (MPC) allows the future trajectory, as well as the constraints imposed by the actuators and sensors, to be included in the optimization. The result is good trajectory-tracking performance with smooth motions, which in turn improves the vision-based state estimation. For fully autonomous navigation, state estimation must be extended with the capability to build a map of the environment and to re-localize with respect to this map. In this thesis, we show how such a global mapping system can be used for visual navigation tasks on board MAVs and how it can be extended for global planning.
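To illustrate the receding-horizon idea behind MPC, the following minimal sketch tracks a reference with a 1-D double-integrator model, solving the finite-horizon tracking problem as an unconstrained least-squares at every step and applying only the first control. The dynamics, weights, and horizon here are illustrative assumptions, not the thesis's actual MAV model or solver:

```python
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])     # acceleration input

def mpc_step(x0, x_ref, horizon=20, u_weight=0.1):
    """Return the first control of a finite-horizon plan that trades off
    tracking error against control effort (a soft actuator penalty)."""
    n, m = 2, 1
    # Stacked prediction: x_{k+1} = A^{k+1} x0 + sum_j A^{k-j} B u_j
    Phi = np.zeros((horizon * n, horizon * m))
    free = np.zeros(horizon * n)
    Ak = np.eye(n)
    for k in range(horizon):
        Ak = A @ Ak
        free[k*n:(k+1)*n] = Ak @ x0
        for j in range(k + 1):
            Phi[k*n:(k+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, k - j) @ B
    ref = np.tile(x_ref, horizon)
    # Solve min ||Phi u - (ref - free)||^2 + u_weight ||u||^2 in closed form
    H = Phi.T @ Phi + u_weight * np.eye(horizon * m)
    u = np.linalg.solve(H, Phi.T @ (ref - free))
    return u[0]

# Closed loop: drive the position from 0 to 1 and come to rest
x = np.array([0.0, 0.0])
for _ in range(100):
    u = mpc_step(x, np.array([1.0, 0.0]))
    x = A @ x + (B * u).ravel()
```

In a real MAV controller the least-squares step would be replaced by a constrained QP so that actuator limits enter as hard constraints rather than a penalty, which is the property the abstract highlights.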
We carefully consider the implications of re-localization and of possible failures in the mapping process. This consideration has enabled the development of a visual navigation framework that is well suited for industrial inspection.

Many aspects of MAV navigation benefit from an accurate model; our final contribution is therefore a method for identifying the model parameters. Based on Maximum Likelihood (ML) estimation, we develop two approaches for parameter identification. The first yields the parameters that maximize the likelihood of all measurements and is used for off-line identification. The second uses a similar formulation but directly marginalizes out old states, keeping only the current mean and covariance. This allows on-line estimation of the parameters and enables direct feedback to the controller. One of our key observations is that the quality of the parameter identification depends strongly on the executed motion. We validate this finding by developing a planner that generates informative trajectories which improve the identification. This leads to repeatable results and eliminates the need for expert knowledge about how best to move the MAV.

The theoretical contributions of this thesis are validated on real MAVs using our custom-built camera for state estimation. We demonstrate that our visual navigation framework can create consistent 3D maps of an industrial environment and safely navigate the MAV between any two points. Furthermore, we evaluate our parameter identification with the automatically generated informative trajectories. With trajectories only 30 s long, we are able to identify the parameters of an accurate MAV model. Together, these components form key building blocks for a vision-based navigation framework in a variety of applications.
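The on-line variant described above, which marginalizes out old states and keeps only the current mean and covariance, can be sketched with a scalar toy problem: estimating an unknown drag coefficient from noisy velocity measurements with a Kalman-filter-style recursion. The first-order dynamics, noise levels, and the persistently exciting input are all illustrative assumptions, not the thesis's actual MAV model:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, c_true = 0.05, 0.7            # assumed true drag coefficient (illustrative)

# Simulate 30 s of noisy velocity data under a known thrust input u,
# with first-order dynamics v_{k+1} = v_k + dt*(u_k - c*v_k)
v, us, vs = 0.0, [], []
for k in range(600):
    u = 2.0 + np.sin(0.5 * k * dt)          # persistently exciting input
    v = v + dt * (u - c_true * v)
    us.append(u)
    vs.append(v + 0.01 * rng.standard_normal())

# Recursive ML estimate of c: each pair of measurements gives the linear model
#   y_k := v_{k+1} - v_k - dt*u_k = (-dt * v_k) * c + noise
# Only the running mean/variance of c are kept; old states are marginalized out.
mean, var, R = 0.0, 1.0, 0.02**2
for k in range(len(vs) - 1):
    H = -dt * vs[k]
    y = vs[k + 1] - vs[k] - dt * us[k]
    S = H * var * H + R                     # innovation variance
    K = var * H / S                         # Kalman gain
    mean = mean + K * (y - H * mean)
    var = (1.0 - K * H) * var
```

Because the filter state stays tiny (one mean and one variance here, rather than a growing measurement history), the estimate can be updated at sensor rate and fed back to the controller, which is the property the abstract emphasizes. The dependence on motion is also visible in this sketch: with a constant input and near-steady velocity, the regressor barely varies and the estimate converges far more slowly.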
Organisational unit: 03737 - Siegwart, Roland Y.