
Open access
Date: 2019
Type: Conference Paper
Abstract
This work addresses the problem of semantic scene understanding under foggy road conditions. Although marked progress has been made in semantic scene understanding in recent years, it has mainly concentrated on clear-weather outdoor scenes. Extending semantic segmentation methods to adverse weather conditions such as fog is crucial for outdoor applications such as self-driving cars. In this paper, we propose a novel method that uses purely synthetic data to improve performance on unseen real-world foggy scenes captured in the streets of Zurich and its surroundings. Our results highlight the potential and power of photo-realistic synthetic images for training and, especially, fine-tuning deep neural networks. Our contributions are threefold: 1) we create a purely synthetic, high-quality foggy dataset of 25,000 unique outdoor scenes, which we call Foggy Synscapes and plan to release publicly; 2) we show that with this data we outperform previous approaches on real-world foggy test data; 3) we show that a combination of our data and previously used data can further improve performance on real-world foggy data.
Permanent link: https://doi.org/10.3929/ethz-b-000387150
Publication status: published
Book title: 2019 IEEE Intelligent Transportation Systems Conference (ITSC)
Publisher: IEEE
Subject: machine learning; computer vision; autonomous driving; adverse weather
Organisational unit: 03514 - Van Gool, Luc (emeritus) / Van Gool, Luc (emeritus)
Related publications and datasets: Is part of https://doi.org/10.3929/ethz-b-000578470
Notes: Conference lecture held on October 30, 2019