Show simple item record

dc.contributor.author
Scaramuzza, Davide
dc.contributor.supervisor
Van Gool, Luc
dc.contributor.supervisor
Siegwart, Roland
dc.date.accessioned
2017-10-25T10:10:19Z
dc.date.available
2017-06-08T21:20:28Z
dc.date.available
2017-10-25T10:10:19Z
dc.date.issued
2007
dc.identifier.uri
http://hdl.handle.net/20.500.11850/14333
dc.identifier.doi
10.3929/ethz-a-005567197
dc.description.abstract
For mobile robots to be able to work with and for people, and thus operate in our everyday environments, they need to be able to acquire knowledge through perception. In other words, they need to collect sensor measurements from which they extract meaningful information. This thesis covers some of the essential components of a robot perception system combining omnidirectional vision, odometry, and 3D laser range finders, from modeling to extrinsic calibration, from feature extraction to ego-motion estimation. We cover all these topics from the “point of view” of an omnidirectional camera. The contributions of this work are several and are listed here.

The thesis starts with an overview of the geometry of central omnidirectional cameras and also reviews previous calibration methods. The contributions of this section are three. The first two are a new generalized model for describing both dioptric and catadioptric cameras and a calibration method that takes advantage of planar grids shown around the camera, like the method in use for standard perspective cameras. The third contribution is the implementation of a toolbox for Matlab (called OCamCalib and freely available online), which implements the proposed calibration procedure.

The second part of the thesis is dedicated to the extraction and matching of vertical features from omnidirectional images. Vertical features are usually very predominant in indoor and outdoor structured environments and can therefore be very useful for robot navigation. The contribution of this part is a new method for matching vertical lines. The proposed method takes advantage of a descriptor that is very distinctive for each feature. Furthermore, this descriptor is invariant to rotation and slight changes of illumination.

The third part of the thesis is devoted to the extrinsic calibration of an omnidirectional camera with the odometry (i.e., wheel encoders) of a mobile robot. The contribution of this part is a new method of automatic self-calibration while the robot is moving. The method is based on an extended Kalman filter that combines the encoder readings with the bearing angle observations of one or more vertical features in the environment. Furthermore, an example of robot motion estimation is shown using the calibrated camera-odometry system.

The fourth part of the thesis is dedicated to the extrinsic calibration of an omnidirectional camera with a 3D laser range finder. The contribution of this method is that it uses no calibration object. Conversely, calibration is performed using laser-camera correspondences of natural points that are manually selected by the user. The novelty of the method resides in a new technique to visualize the usually ambiguous 3D information of range finders. We show that it is possible to transform the range information into a new image where natural features of the environment are highlighted. Therefore, finding laser-camera correspondences becomes as easy as image pairing.

The last part of the thesis is devoted to visual odometry for outdoor ground vehicles. We show a new method to recover the trajectory of a calibrated omnidirectional camera over several hundred meters by combining a feature-based with an appearance-based approach. All the contributions of this thesis are validated through experimental results using both simulated and real data.
en_US
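The generalized camera model mentioned in the abstract maps each image point to a 3D viewing ray whose z-component is a polynomial in the point's radial distance from the image centre, so that dioptric and catadioptric cameras are handled uniformly. A minimal illustrative sketch of that back-projection idea follows; the function name, the coefficient values, and the image centre below are hypothetical placeholders, not values from the thesis or the OCamCalib toolbox.

```python
import numpy as np

def back_project(u, v, poly_coeffs, cx=0.0, cy=0.0):
    """Map an image point (u, v) to a unit 3D viewing ray under a
    polynomial omnidirectional model: the ray is (x, y, f(rho)) with
    f(rho) = a0 + a1*rho + a2*rho^2 + ...  All numbers here are
    illustrative, not calibrated values."""
    x, y = u - cx, v - cy
    rho = np.hypot(x, y)                    # radial distance from the centre
    z = np.polyval(poly_coeffs[::-1], rho)  # evaluate a0 + a1*rho + a2*rho^2 + ...
    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)        # unit viewing direction

# Hypothetical coefficients [a0, a1, a2] for a catadioptric camera
coeffs = [-200.0, 0.0, 1.6e-3]
d = back_project(420.0, 310.0, coeffs, cx=400.0, cy=300.0)
```

Calibration, in this framing, amounts to estimating the polynomial coefficients and the image centre from observations of a planar grid shown around the camera.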
dc.format
application/pdf
dc.language.iso
en
en_US
dc.publisher
ETH
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
MOTION PLANNING + MOTION COURSE + TRAJECTORY PLANNING (ROBOTICS)
en_US
dc.subject
BEWEGUNGSVERLAUF + BEWEGUNGSABLAUF + BAHNPLANUNG (ROBOTIK)
en_US
dc.subject
BEWEGUNGSWAHRNEHMUNG (COMPUTERVISION)
en_US
dc.subject
MOTION PERCEPTION (COMPUTER VISION)
en_US
dc.subject
MOBILE ROBOTS
en_US
dc.subject
MOBILE ROBOTER
en_US
dc.title
Omnidirectional Vision
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
ethz.title.subtitle
From Calibration to Robot Motion Estimation
en_US
ethz.size
1 Band
en_US
ethz.code.ddc
DDC - DDC::6 - Technology, medicine and applied sciences::621.3 - Electric engineering
en_US
ethz.identifier.diss
17635
en_US
ethz.identifier.nebis
005567197
ethz.publication.place
Zürich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
en_US
ethz.leitzahl.certified
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02620 - Inst. f. Robotik u. Intelligente Systeme / Inst. Robotics and Intelligent Systems::03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
ethz.date.deposited
2017-06-08T21:20:39Z
ethz.source
ECOL
ethz.source
ECIT
ethz.identifier.importid
imp59364c3cbaa3942575
ethz.identifier.importid
imp59366abd26a3458406
ethz.ecolpid
eth:30301
ethz.ecitpid
pub:25933
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2017-07-20T17:24:37Z
ethz.rosetta.lastUpdated
2021-02-14T19:42:39Z
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Omnidirectional%20Vision&rft.date=2007&rft.au=Scaramuzza,%20Davide&rft.genre=unknown&rft.btitle=Omnidirectional%20Vision