Metadata only
Date
2021
Type
Conference Paper
ETH Bibliography
yes
Abstract
Combining different models is a widely used paradigm in machine learning applications. While the most common approach is to form an ensemble of models and average their individual predictions, this approach is often rendered infeasible by resource constraints, as memory and computation requirements grow linearly with the number of models. We present a layer-wise model fusion algorithm for neural networks that utilizes optimal transport to (soft-)align neurons across the models before averaging their associated parameters.
We show that this can successfully yield "one-shot" knowledge transfer (i.e., without requiring any retraining) between neural networks trained on heterogeneous non-i.i.d. data. In both i.i.d. and non-i.i.d. settings, we illustrate that our approach significantly outperforms vanilla averaging, and that, with moderate fine-tuning, it can serve as an efficient replacement for the ensemble, for standard convolutional networks (such as VGG11), residual networks (such as ResNet18), and multi-layer perceptrons on CIFAR10, CIFAR100, and MNIST. Finally, our approach also provides a principled way to combine the parameters of neural networks with different widths, and we explore its application to model compression.
The code is available at https://github.com/sidak/otfusion.
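For intuition, the following is a minimal sketch of the core idea for a single fully connected layer: compute an optimal transport plan between the two models' neurons (using the distance between their incoming-weight vectors as the ground cost), map model B's neurons onto model A's via barycentric projection, and average. This is an illustrative sketch, not the repository's implementation; the function fuse_layer, the uniform marginals, and the squared-Euclidean cost are assumptions, and it uses the POT library (pip install pot) for the transport computation.

import numpy as np
import ot  # POT: Python Optimal Transport

def fuse_layer(W_a, W_b):
    # W_a: (n_a, d) weights of model A's layer (rows = neurons).
    # W_b: (n_b, d) weights of model B's layer; widths may differ.
    n_a, n_b = W_a.shape[0], W_b.shape[0]
    mu_a = np.full(n_a, 1.0 / n_a)           # uniform mass over A's neurons
    mu_b = np.full(n_b, 1.0 / n_b)           # uniform mass over B's neurons
    C = ot.dist(W_a, W_b)                    # squared-Euclidean ground cost
    T = ot.emd(mu_a, mu_b, C)                # exact OT plan, shape (n_a, n_b)
    W_b_aligned = (T / mu_a[:, None]) @ W_b  # barycentric projection onto A
    return 0.5 * (W_a + W_b_aligned)         # average in A's coordinate system

# Example: fuse a 16-neuron layer with a 24-neuron layer over 8 inputs.
rng = np.random.default_rng(0)
W_fused = fuse_layer(rng.normal(size=(16, 8)), rng.normal(size=(24, 8)))

In the full layer-wise procedure described in the abstract, the plan computed for one layer must also be used to re-map the incoming weight columns of model B's next layer before that layer is fused, so that alignments propagate consistently through the network; when the two layers have equal width and uniform marginals, the exact plan reduces to a permutation, recovering a hard neuron matching.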
Publication status
published
Editor(s)
Book title
Advances in Neural Information Processing Systems 33
Pages / Article no.
Publisher
Curran
Conference
Organisational unit
09462 - Hofmann, Thomas / Hofmann, Thomas
Notes
Due to the Coronavirus (COVID-19) pandemic, the conference was conducted virtually.