Doctoral Thesis
Rights / license: In Copyright - Non-Commercial Use Permitted
Many applications in robotics require awareness of the state of the robot and its environment. Faced with ambiguities in the measurements of any single sensor, many applications integrate multiple sensors with complementary characteristics. This work addresses the calibration of multi-sensor systems, focusing on popular combinations of devices with numerous applications in robotics. Specifically, it investigates sensor suites comprising cameras and an IMU; cameras, an IMU, and an LRF; and cameras and an LRF. In this context, it pursues two objectives: providing accurate estimates of the spatial and temporal relations between these sensors, and advancing the understanding of the individual measurement models to further improve robustness and accuracy in state estimation. The thesis builds on a large body of previous work on continuous-time estimation and formalizes each calibration problem in terms of probabilistic sensor models. Consequently, each calibration solution lives in the domain of maximum likelihood estimation (MLE), which, under the condition of accurate sensor models, yields the most probable set of parameters to explain the sensor measurements. To this end, the thesis introduces a novel approach to modeling range measurements recorded by an LRF. This model allows for accurate spatial and temporal calibration of the popular combination of cameras and LRF, yielding precisions on the order of 2 mm and a tenth of a degree for the spatial parameters, and 1/20 ms for the temporal parameter. In contrast to established approaches, which commonly employ an algebraic error formulation, the probabilistic model is extensible, which enables improvements in the understanding of deterministic errors in range measurements. This capability is demonstrated for a deterministic range bias which, if accounted for, improves calibration precision.
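The MLE viewpoint described above can be sketched as follows; the notation here is illustrative and not taken from the thesis itself. Writing the calibration parameters (extrinsics, time offsets) as a vector and stacking all sensor measurements, calibration seeks the parameter set that makes the observed measurements most probable:

```latex
% Illustrative notation: \theta collects the spatial/temporal calibration
% parameters, z_{1:N} the stacked sensor measurements, assumed
% conditionally independent given \theta.
\hat{\theta}
  = \arg\max_{\theta} \; p(z_{1:N} \mid \theta)
  = \arg\min_{\theta} \; \sum_{i=1}^{N} -\log p(z_i \mid \theta)
```

Under Gaussian measurement models, the negative log-likelihood reduces to a nonlinear least-squares problem whose residuals are weighted by the measurement covariances, which is why accurate (probabilistic) sensor models translate directly into well-founded weighting of each measurement.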
In many robotic systems, state estimation and low-level control employ separate IMUs, creating a need for an accurate estimate of the transformation between the two devices. This thesis proposes a novel estimator that uses measurements from all cameras and IMUs in a joint calibration. Building on the same underlying principle, it further advances the accelerometer model by accounting for the different displacements of the sensor structures that perceive specific forces along individual axes. The resulting calibration determines the spatial parameters to precisions of 0.2 mm and a hundredth of a degree, and the temporal parameters to 2 microseconds, to date the most precise for this class of approaches. Joint calibration is limited in the novel insights it can generate by the least sophisticated sensor model. Consequently, this work explores a more elaborate formulation of image sensor measurements. Drawing inspiration from similar approaches in state estimation, it models these measurements directly in terms of intensities rather than abstracted quantities such as interest point locations. However, currently established direct methods neglect a number of important effects. This thesis introduces a chain of models accounting for factors ranging from target illumination through the point spread function (PSF) of the optics to motion blur from camera movement. Results highlight the potential of this approach, for example in rolling-shutter camera calibration, but also the challenges in matching the precision delivered by classical interest point methods.
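The chain of image-formation effects mentioned above can be sketched schematically; all symbols here are chosen for illustration and are not the thesis's own notation. A predicted pixel intensity is obtained by mapping the illuminated target radiance into the image through the time-varying camera pose and convolving with the optics' PSF and a motion-blur kernel, and the MLE residual is then formed directly on intensities:

```latex
% Illustrative model: L is the target radiance under illumination,
% \pi_{T(t)} the projection under the (time-varying) camera pose T(t),
% h_{\mathrm{PSF}} the optics blur, h_{\mathrm{mb}} the motion-blur
% kernel accumulated over the exposure, and I the measured image.
\hat{I}(u) = \bigl( h_{\mathrm{mb}} * h_{\mathrm{PSF}} *
             (L \circ \pi_{T(t)}^{-1}) \bigr)(u),
\qquad
e(u) = I(u) - \hat{I}(u)
```

Forming residuals per pixel in this way, rather than on abstracted interest point locations, is what makes effects such as rolling shutter, defocus, and motion blur expressible within a single probabilistic measurement model.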
Subjects: spatial/temporal sensor calibration; camera calibration; maximum likelihood estimation; inertial measurement unit (IMU) calibration; laser range finder (LRF) calibration; camera/IMU calibration; multi-sensor systems
Organisational unit: 03737 - Siegwart, Roland Y.