Conference Paper
In this work, we address the difficulty of matching local features between images captured at distant points in time, which results in a global appearance change. Inspired by recent neural style transfer techniques, we propose to use an image transformation network to translate night images into a day-like appearance, with the objective of better matching performance. We extend traditional style transfer, which optimizes for content and style, with a keypoint matching loss function. The joint optimization of these losses allows our model to generate images that significantly improve the performance of local feature matching, in a self-supervised way. As a result, our approach is flexible and does not require paired training data, which is difficult to obtain in practice. We show how our method can be used as an extension to a state-of-the-art differentiable feature extractor to improve its performance in challenging scenarios. This is demonstrated in our evaluation on day-night image matching and visual localization tasks with night-rain image queries. © 2020 IEEE
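The abstract describes jointly optimizing content, style, and keypoint matching losses. A minimal sketch of how such a joint objective is commonly combined is given below; the simple weighted-sum form, the function name `total_loss`, and the weight parameters are assumptions for illustration, not the paper's exact formulation.

```python
# Hypothetical sketch of a joint objective combining three loss terms,
# as in "content + style + keypoint matching" style-transfer training.
# The weighted-sum form and all names/weights here are assumptions.

def total_loss(l_content, l_style, l_match,
               w_content=1.0, w_style=1.0, w_match=1.0):
    """Weighted sum of content, style, and matching losses (illustrative)."""
    return (w_content * l_content
            + w_style * l_style
            + w_match * l_match)

# With unit weights, the terms simply add up:
print(total_loss(1.0, 2.0, 3.0))  # → 6.0
```

In practice each term would be computed from network activations (e.g. content and style from a pretrained encoder, matching from detected keypoints), and the weights tuned so no single term dominates the optimization.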
Book title: 2020 International Conference on 3D Vision (3DV)
Notes: Due to the Coronavirus (COVID-19), the conference was conducted virtually.