dc.contributor.author: Bense, Alexandre
dc.contributor.author: Joudaki, Amir
dc.contributor.author: Rudner, Tim G.J.
dc.contributor.author: Fortuin, Vincent
dc.date.accessioned: 2024-01-31T09:48:16Z
dc.date.available: 2024-01-29T15:35:15Z
dc.date.available: 2024-01-31T09:48:16Z
dc.date.issued: 2021
dc.identifier.uri: http://hdl.handle.net/20.500.11850/656241
dc.description.abstract:
Bayesian neural networks are often sought after for their strong and trustworthy predictive power. However, inference in these models is often computationally expensive; its cost can be reduced through dimensionality reduction, where the key goal is to find a subspace in which to perform inference while retaining significant predictive power. In this work, we present a theoretical comparison of Principal Component Analysis (PCA) and random projections for Bayesian linear regression. We find that PCA is not always the optimal dimensionality reduction method and that random projections can in fact be superior, especially when the data distribution is shifted and the labels have a small norm. We then confirm these results experimentally. This work therefore suggests considering dimensionality reduction by random projection for Bayesian inference when noisy data are expected.
dc.language.iso: en
dc.publisher: OpenReview
dc.title: PCA Subspaces Are Not Always Optimal for Bayesian Learning
dc.type: Conference Paper
ethz.book.title: NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
ethz.size: 14 p.
ethz.event: NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications (NeurIPS 2021 Workshop DistShift)
ethz.event.location: Online
ethz.event.date: December 13, 2021
ethz.notes: Poster presentation
ethz.publication.place: s.l.
ethz.publication.status: published
ethz.leitzahl: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning::09568 - Rätsch, Gunnar / Rätsch, Gunnar
ethz.identifier.url: https://openreview.net/forum?id=iPYHorHDtPh
ethz.identifier.url: https://neurips.cc/virtual/2021/35438
ethz.date.deposited: 2024-01-29T15:35:15Z
ethz.source: BATCH
ethz.eth: yes
ethz.availability: Metadata only
ethz.rosetta.installDate: 2024-01-31T09:48:17Z
ethz.rosetta.lastUpdated: 2024-01-31T09:48:17Z
ethz.rosetta.versionExported: true
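The abstract above contrasts two ways of choosing the inference subspace: PCA on the training inputs versus a random projection. The following is a minimal sketch of that comparison, not code from the paper: the data, dimensions (n, d, k), noise variance, prior precision, and the magnitude of the test-set shift are all illustrative assumptions. It fits Bayesian linear regression in a k-dimensional subspace chosen either way and reports test error under a mean-shifted test distribution.

```python
# Minimal sketch (not the authors' code): Bayesian linear regression in a
# k-dimensional subspace chosen by PCA vs. a random projection. All sizes,
# the noise variance, the prior precision, and the shift are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 50, 5          # samples, ambient dimension, subspace dimension
sigma2, alpha = 0.1, 1.0      # observation noise variance, prior precision

# Synthetic data; the test inputs are mean-shifted to mimic the
# distribution shift discussed in the abstract.
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n, d))
X_test = rng.normal(size=(n, d)) + 2.0      # shifted test distribution
y_train = X_train @ w_true + np.sqrt(sigma2) * rng.normal(size=n)
y_test = X_test @ w_true + np.sqrt(sigma2) * rng.normal(size=n)

def subspace_blr_mse(P):
    """Test MSE of the Bayesian linear regression posterior-mean
    predictor fit in the column space of the d x k projection P."""
    Z_train, Z_test = X_train @ P, X_test @ P
    # Standard Gaussian posterior for the k-dimensional weight vector.
    A = alpha * np.eye(k) + Z_train.T @ Z_train / sigma2
    mean = np.linalg.solve(A, Z_train.T @ y_train / sigma2)
    return np.mean((Z_test @ mean - y_test) ** 2)

# PCA subspace: top-k right singular vectors of the centered training data.
Xc = X_train - X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P_pca = Vt[:k].T

# Random projection: a Gaussian matrix with orthonormalized columns.
P_rand, _ = np.linalg.qr(rng.normal(size=(d, k)))

print(f"test MSE, PCA subspace:    {subspace_blr_mse(P_pca):.3f}")
print(f"test MSE, random subspace: {subspace_blr_mse(P_rand):.3f}")
```

Which subspace wins in any single run of this toy depends on the sampled data; the paper's claim is the analytical characterization of when each choice is preferable, which this sketch only illustrates empirically.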