Show simple item record

dc.contributor.author
Rehder, Joern
dc.contributor.supervisor
Siegwart, Roland
dc.contributor.supervisor
Kelly, Jonathan
dc.date.accessioned
2019-04-01T06:29:56Z
dc.date.available
2019-03-30T11:25:52Z
dc.date.available
2019-04-01T06:29:56Z
dc.date.issued
2018
dc.identifier.uri
http://hdl.handle.net/20.500.11850/334970
dc.identifier.doi
10.3929/ethz-b-000334970
dc.description.abstract
Many applications in robotics require awareness of the state of the robot and its environment. Faced with ambiguities arising in measurements from any single sensor, many applications turn to integrating multiple sensors with often complementary characteristics. This work addresses the calibration of multi-sensor systems. It focuses on popular combinations of devices with numerous applications in robotics. Specifically, this work investigates sensor suites comprising cameras and inertial measurement units (IMUs); cameras, IMUs and laser range finders (LRFs); and cameras and LRFs. In this context, it pursues the objectives of providing accurate estimates of the spatial and temporal relations between these sensors and of advancing the understanding of individual measurement models to further improve robustness and accuracy in state estimation. This thesis builds on a large body of previous work on continuous-time estimation and formalizes each calibration problem in terms of probabilistic sensor models. Consequently, each calibration solution lives in the domain of maximum likelihood estimation (MLE), which, under the condition of accurate sensor models, yields the most probable set of parameters to explain the sensor measurements. To this end, it introduces a novel approach to modeling the range measurements recorded by an LRF. This model allows for accurate spatial and temporal calibration of the popular combination of cameras and LRF, yielding precisions on the order of 2 mm and a tenth of a degree for the spatial parameters and 1/20 ms for the temporal parameters. In contrast to established approaches that commonly employ an algebraic error formulation, the probabilistic model is extensible, which enables a better understanding of deterministic errors in range measurements. This capability is demonstrated for a deterministic range bias which, if accounted for, improves calibration precision. In many robotic systems, state estimation and low-level control employ separate IMUs, creating a need for an accurate estimate of the transformation between these two devices. This thesis proposes a novel estimator that makes use of measurements from all cameras and IMUs in a joint calibration. Building on the same underlying principle, it further advances the accelerometer model by accounting for the different displacements of the sensor structures that each perceive the specific force along a single axis. The resulting calibration determines spatial and temporal parameters to precisions of 0.2 mm and a hundredth of a degree for the spatial parameters and 2 microseconds for the temporal parameters, the most precise to date for this class of approaches. The novel insights that joint calibration can generate are limited by its least sophisticated sensor model. Consequently, this work explores a more elaborate approach to formulating image sensor measurements. Drawing inspiration from similar approaches in state estimation, it models these measurements directly in terms of intensities rather than abstracted quantities such as interest point locations. However, currently established direct methods neglect a number of important factors. This thesis introduces a chain of models accounting for factors ranging from target illumination through the point spread function (PSF) of the optics to motion blur caused by camera movement. Results highlight the potential of this approach, for example in rolling-shutter camera calibration, but also the challenges in matching the precision delivered by classical interest point methods.
en_US
dc.format
application/pdf
en_US
dc.language.iso
en
en_US
dc.publisher
ETH Zurich
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
spatial/temporal sensor calibration
en_US
dc.subject
camera calibration
en_US
dc.subject
maximum likelihood estimation
en_US
dc.subject
inertial measurement unit (IMU) calibration
en_US
dc.subject
laser range finder (LRF) calibration
en_US
dc.subject
camera/IMU calibration
en_US
dc.subject
multi-sensor systems
en_US
dc.title
Multi-Sensor System Calibrations
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
dc.date.published
2019-04-01
ethz.size
171 p.
en_US
ethz.code.ddc
DDC - DDC::6 - Technology, medicine and applied sciences::620 - Engineering & allied operations
ethz.identifier.diss
24957
en_US
ethz.publication.place
Zurich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
en_US
ethz.leitzahl.certified
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
en_US
ethz.date.deposited
2019-03-30T11:25:56Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2019-04-01T06:30:37Z
ethz.rosetta.lastUpdated
2021-02-15T04:11:54Z
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Multi-Sensor%20System%20Calibrations&rft.date=2018&rft.au=Rehder,%20Joern&rft.genre=unknown&rft.btitle=Multi-Sensor%20System%20Calibrations
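For context, the maximum likelihood formulation referred to in the abstract above can be sketched in general terms as follows. This is an editor's illustrative sketch, not notation from the thesis: the measurements z_i, sensor models h_i, continuous-time trajectory x(t), temporal offset d, calibration parameters theta and measurement covariances Sigma_i are assumed names, and independent Gaussian measurement noise is assumed.
\hat{\theta} = \arg\max_{\theta} \prod_{i} p(\mathbf{z}_i \mid \theta)
             = \arg\min_{\theta} \sum_{i} \big\| \mathbf{z}_i - h_i\big(x(t_i + d),\, \theta\big) \big\|^{2}_{\Sigma_i}
Under accurate sensor models, the minimizer of this negative log-likelihood is the most probable set of spatial and temporal calibration parameters given the recorded measurements, which is the property the abstract attributes to the MLE formulation.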