Learning Layer-wise Equivariances Automatically using Gradients


Date

2024-07

Publication Type

Conference Paper

ETH Bibliography

yes

Abstract

Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance. However, symmetries provide fixed hard constraints on the functions a network can represent; they need to be specified in advance and cannot be adapted. Our goal is to allow flexible symmetry constraints that can be learned automatically from data using gradients. Learning symmetry and the associated weight connectivity structures from scratch is difficult for two reasons. First, it requires efficient and flexible parameterisations of layer-wise equivariances. Second, symmetries act as constraints and are therefore not encouraged by training losses that measure data fit. To overcome these challenges, we improve parameterisations of soft equivariance and learn the amount of equivariance in layers by optimising the marginal likelihood, estimated using differentiable Laplace approximations. The objective balances data fit against model complexity, enabling layer-wise symmetry discovery in deep networks. We demonstrate the ability to automatically learn layer-wise equivariances on image classification tasks, achieving equivalent or improved performance over baselines with hard-coded symmetry.

Publication status

published

Book title

Advances in Neural Information Processing Systems 36

Pages / Article No.

28365–28377

Publisher

Curran

Event

37th Annual Conference on Neural Information Processing Systems (NeurIPS 2023)

Organisational unit

09568 - Rätsch, Gunnar / Rätsch, Gunnar

Notes

Spotlight poster presented on December 13, 2023.
