Temporal- and Viewpoint-Invariant Registration for Under-Canopy Footage using Deep-Learning-based Bird's-Eye View Prediction
Open access
Date
2024
Type
Conference Paper
ETH Bibliography
yes
Abstract
Conducting visual assessments under the canopy using mobile robots is an emerging task in smart farming and forestry. However, it is challenging to register images across different data-collection days, especially across seasons, due to the self-occluding geometry and temporal dynamics of forests and orchards. This paper proposes a new approach for registering under-canopy image sequences under these conditions. Our methodology leverages standard GPS data and deep-learning-based perspective-to-bird's-eye-view conversion to provide an initial estimate of the positions of the trees in the images and their association across datasets. Furthermore, it introduces an innovative strategy for extracting tree trunks and clean ground surfaces from the noisy and sparse 3D reconstructions created from the image sequences, utilizing these features to achieve precise alignment. Our robust alignment method effectively mitigates position and scale drift, which may arise from GPS inaccuracies and sparse Structure-from-Motion (SfM) limitations. We evaluate our approach on three challenging real-world datasets, demonstrating that our method outperforms ICP-based methods by 50% on average, and surpasses FGR and TEASER++ by over 90% in alignment accuracy. These results highlight our method's cost efficiency and robustness, even in the presence of severe outliers and sparsity. https://github.com/VIS4ROBlab/bev_undercanopy_registration
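The abstract notes that the alignment must correct both position and scale drift between sessions. As a minimal sketch of that subproblem only (not the authors' pipeline; see the linked repository for the actual implementation), the snippet below fits a 2D similarity transform (scale, rotation, translation) to already-associated tree-trunk positions using the closed-form Umeyama solution. All function and variable names here are hypothetical, and the input is assumed to be matched per-tree coordinates from two data-collection days.

```python
import numpy as np

def fit_similarity_transform(src: np.ndarray, dst: np.ndarray):
    """Closed-form (Umeyama, 1991) least-squares fit of dst ~ s * R @ src + t.

    src, dst: (N, d) arrays of already-associated point coordinates,
    e.g. per-tree trunk positions estimated from two sessions.
    Returns scale s, rotation R (d, d), and translation t (d,).
    """
    n = src.shape[0]
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst

    # Cross-covariance between the centered point sets.
    cov = dst_c.T @ src_c / n
    U, D, Vt = np.linalg.svd(cov)

    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    S = np.eye(cov.shape[0])
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[-1, -1] = -1.0

    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / n
    scale = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - scale * R @ mu_src
    return scale, R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trees = rng.uniform(0.0, 50.0, size=(40, 2))  # synthetic trunk map (metres)
    theta = np.deg2rad(12.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    # Simulate a second session with scale drift, a pose offset,
    # and GPS-like positional noise.
    obs = 1.08 * trees @ R_true.T + np.array([3.0, -1.5])
    obs += rng.normal(scale=0.05, size=obs.shape)

    s, R, t = fit_similarity_transform(trees, obs)
    err = np.linalg.norm(s * trees @ R.T + t - obs, axis=1).mean()
    print(f"scale={s:.3f}, mean residual={err:.3f} m")
```

A least-squares fit like this is sensitive to outliers, which is precisely why the paper reports large gains over ICP, FGR, and TEASER++ on sparse, outlier-heavy forest reconstructions; the sketch is meant only to make the scale-aware alignment objective concrete.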
Persistent Link
https://doi.org/10.3929/ethz-b-000693951
Publication Status
accepted
Conference
Organisational Unit
03737 - Siegwart, Roland Y. / Siegwart, Roland Y.
09570 - Hutter, Marco / Hutter, Marco
Notes
This work has been partly funded by the European Research Council (ERC) as part of the project SkEyes (Grant agreement no. 101089328), by ETH Zurich Research Grant No. 21-1 ETH-27, and by Unity Technologies.