Two Is Better Than One: Aligned Clusters Improve Anomaly Detection
OPEN ACCESS
Date
2024-12-14
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Anomaly detection focuses on identifying samples that deviate from the norm. When working with high-dimensional data such as images, a crucial requirement for detecting anomalous patterns is learning lower-dimensional representations that capture concepts of normality. Recent advances in self-supervised learning have shown great promise in this regard. However, many successful self-supervised anomaly detection methods assume prior knowledge about anomalies to create synthetic outliers during training. Yet, in real-world applications, we often do not know what to expect from unseen data, and we can solely leverage knowledge about normal data. In this work, we propose Con2, which learns representations through context augmentations that model invariances of normal data while letting us observe samples from two distinct perspectives. At test time, representations of anomalies that do not adhere to these invariances deviate from the representation structure learned during training, allowing us to detect anomalies without relying on prior knowledge about them.
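The scoring idea described in the abstract (anomalies deviate from the representation structure of normal training data) can be illustrated with a minimal sketch. Everything here is an assumption for illustration: `encode` stands in for whatever trained self-supervised encoder is used, the data are synthetic 2-D points, and the score is a simple k-nearest-neighbor distance in representation space, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x):
    # placeholder for a trained encoder; identity mapping for illustration
    return x

# representations of normal training data (synthetic points around the origin)
train_repr = encode(rng.normal(loc=0.0, scale=1.0, size=(500, 2)))

def anomaly_score(x, train_repr, k=5):
    """Mean distance to the k nearest normal representations (higher = more anomalous)."""
    z = encode(np.atleast_2d(x))
    # pairwise distances between test representations and all normal representations
    d = np.linalg.norm(train_repr[None, :, :] - z[:, None, :], axis=-1)
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

normal_score = anomaly_score(np.array([0.0, 0.0]), train_repr)
outlier_score = anomaly_score(np.array([10.0, 10.0]), train_repr)
# a point far from the normal data scores much higher than an in-distribution one
assert outlier_score[0] > normal_score[0]
```

In this sketch no synthetic outliers are needed at training time: the score depends only on how a test representation relates to the structure of normal representations, mirroring the paper's stated goal of detecting anomalies without prior knowledge about them.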
Publication status
published
Book title
NeurIPS 2024 Workshop: Self-Supervised Learning - Theory and Practice
Publisher
OpenReview
Event
NeurIPS 2024 Workshop: Self-Supervised Learning - Theory and Practice
Organisational unit
09670 - Vogt, Julia / Vogt, Julia
Related publications and datasets
Is identical to: https://doi.org/10.3929/ethz-c-000788565