Conference Paper
Rights / license: In Copyright - Non-Commercial Use Permitted
Abstract: In this paper, we explore the possibility of achieving more accurate depth estimation by fusing monocular images and Radar points using a deep neural network. We present a comprehensive study of the fusion between RGB images and Radar measurements from different aspects and propose a working solution based on our observations. We find that the noise in Radar measurements is one of the main reasons preventing the direct application of existing fusion methods, developed for LiDAR data and images, to the new fusion problem of Radar data and images. The experiments are conducted on the nuScenes dataset, one of the first datasets to feature Camera, Radar, and LiDAR recordings in diverse scenes and weather conditions. Extensive experiments demonstrate that our method outperforms existing fusion methods. We also provide detailed ablation studies to show the effectiveness of each component in our method. © 2020 IEEE.
Book title: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Organisational unit: 03514 - Van Gool, Luc / Van Gool, Luc
Notes: Conference lecture held on October 28, 2020. Due to the Coronavirus (COVID-19) pandemic, the conference was conducted virtually.