Domain-Adaptive Semantic Segmentation with Memory-Efficient Cross-Domain Transformers
Open access
Date
2023
Type
Conference Paper
ETH Bibliography
yes
Abstract
Unsupervised Domain Adaptation (UDA), a process by which a model trained on a well-annotated source dataset is adapted to an unlabeled target dataset, has emerged as a promising solution for deploying semantic segmentation models in scenarios where annotating extensive amounts of data is cost-prohibitive. Although the recent development of UDA strategies exploiting Transformer-based architectures has represented a major advance in the field, current approaches struggle to effectively learn context dependencies in the target domain, leading to suboptimal semantic label predictions. To address this issue, we introduce a generic three-branch Transformer block that combines self- and cross-attention mechanisms for better source and target feature alignment.
We then show how the proposed architecture can be seamlessly incorporated into state-of-the-art self-training UDA schemes for semantic segmentation, yielding enhanced adaptation capabilities without increasing the GPU memory footprint during training. The resulting framework significantly outperforms its baseline on benchmark datasets for synthetic-to-real (+1.4 mIoU on GTA→Cityscapes and +1.1 mIoU on SYNTHIA→Cityscapes) and clear-to-adverse-weather (+3.4 mIoU on Cityscapes→ACDC) UDA. In addition, it achieves superior robustness compared to existing cross-domain Transformer architectures that require substantially more GPU memory for training.
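The abstract describes the three-branch block only at a high level. As a rough illustration of the general idea, the following minimal PyTorch sketch combines per-domain self-attention with a cross-attention branch in which target queries attend to source keys/values. The class name ThreeBranchBlock, the branch wiring, and all hyperparameters are illustrative assumptions, not the architecture proposed in the paper.

import torch
import torch.nn as nn

class ThreeBranchBlock(nn.Module):
    # Hypothetical sketch of a three-branch Transformer block: one
    # self-attention branch per domain plus a cross-attention branch
    # aligning target features with source features. The wiring is an
    # illustrative assumption, not the paper's exact design.
    def __init__(self, dim, num_heads=8):
        super().__init__()
        self.self_attn_src = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.self_attn_tgt = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_src = nn.LayerNorm(dim)
        self.norm_tgt = nn.LayerNorm(dim)
        self.norm_cross = nn.LayerNorm(dim)

    def forward(self, src_tokens, tgt_tokens):
        # Branch 1: self-attention over source tokens (residual connection).
        s, _ = self.self_attn_src(src_tokens, src_tokens, src_tokens)
        src_out = self.norm_src(src_tokens + s)
        # Branch 2: self-attention over target tokens (residual connection).
        t, _ = self.self_attn_tgt(tgt_tokens, tgt_tokens, tgt_tokens)
        tgt_sa = self.norm_tgt(tgt_tokens + t)
        # Branch 3: cross-attention, target queries against source
        # keys/values, to pull target features toward the source domain.
        c, _ = self.cross_attn(tgt_sa, src_out, src_out)
        tgt_out = self.norm_cross(tgt_sa + c)
        return src_out, tgt_out

# Example usage on token sequences of shape (batch, tokens, channels):
src = torch.randn(2, 196, 256)
tgt = torch.randn(2, 196, 256)
src_out, tgt_out = ThreeBranchBlock(256)(src, tgt)

Note that cross-attention of this kind only changes which keys and values the attention sees, so it reuses the same attention machinery as the self-attention branches; any claim about the training-time memory footprint applies to the paper's specific design, not to this sketch.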
Permanent link
https://doi.org/10.3929/ethz-b-000642496
Publication status
published
Book title
34th British Machine Vision Conference Proceedings
Publisher
BMVA Press
Subject
Semantic Scene Understanding; Unsupervised Domain Adaptation
Organisational unit
09559 - Chli, Margarita (former)
Related publications and datasets
Is part of: https://doi.org/10.3929/ethz-b-000646462
Notes
Poster presentation on October 23, 2023.