NeRF On-the-go: Exploiting Uncertainty for Distractor-free NeRFs in the Wild
Date
2024
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Neural Radiance Fields (NeRFs) have shown remarkable success in synthesizing photorealistic views from multi-view images of static scenes, but face challenges in dynamic, real-world environments with distractors like moving objects, shadows, and lighting changes. Existing methods manage controlled environments and low occlusion ratios but fall short in render quality, especially under high occlusion scenarios. In this paper, we introduce NeRF On-the-go, a simple yet effective approach that enables the robust synthesis of novel views in complex, in-the-wild scenes from only casually captured image sequences. Delving into uncertainty, our method not only efficiently eliminates distractors, even when they are predominant in captures, but also achieves a notably faster convergence speed. Through comprehensive experiments on various scenes, our method demonstrates a significant improvement over state-of-the-art techniques. This advancement opens new avenues for NeRF in diverse and dynamic real-world applications.
Publication status
published
Book title
2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Pages / Article No.
8931–8940
Publisher
IEEE
Event
2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2024)
Organisational unit
03766 - Pollefeys, Marc / Pollefeys, Marc