Show simple item record

dc.contributor.author: Borsos, Zalán
dc.contributor.author: Mutný, Mojmír
dc.contributor.author: Krause, Andreas
dc.contributor.editor: Larochelle, Hugo
dc.contributor.editor: Ranzato, Marc'Aurelio
dc.contributor.editor: Hadsell, Raia
dc.contributor.editor: Balcan, Maria F.
dc.contributor.editor: Lin, H.
dc.date.accessioned: 2021-07-21T07:25:03Z
dc.date.available: 2020-12-14T07:13:54Z
dc.date.available: 2020-12-14T07:40:41Z
dc.date.available: 2021-03-02T14:41:39Z
dc.date.available: 2021-07-21T07:25:03Z
dc.date.issued: 2021
dc.identifier.isbn: 978-1-7138-2954-6
dc.identifier.uri: http://hdl.handle.net/20.500.11850/456148
dc.description.abstract: Coresets are small data summaries that are sufficient for model training. They can be maintained online, enabling efficient handling of large data streams under resource constraints. However, existing constructions are limited to simple models such as k-means and logistic regression. In this work, we propose a novel coreset construction via cardinality-constrained bilevel optimization. We show how our framework can efficiently generate coresets for deep neural networks, and demonstrate its empirical benefits in continual learning and in streaming settings.
dc.language.iso: en
dc.publisher: Curran
dc.title: Coresets via Bilevel Optimization for Continual Learning and Streaming
dc.type: Conference Paper
dc.date.published: 2020
ethz.book.title: Advances in Neural Information Processing Systems 33
ethz.pages.start: 14879
ethz.pages.end: 14890
ethz.event: 34th Annual Conference on Neural Information Processing Systems (NeurIPS 2020)
ethz.event.location: Online
ethz.event.date: December 6-12, 2020
ethz.notes: Due to the coronavirus (COVID-19) pandemic, the conference was conducted virtually.
ethz.grant: Scaling Up by Scaling Down: Big ML via Small Coresets
ethz.grant: Reliable Data-Driven Decision Making in Cyber-Physical Systems
ethz.publication.place: Red Hook, NY
ethz.publication.status: published
ethz.leitzahl: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning::03908 - Krause, Andreas / Krause, Andreas
ethz.leitzahl.certified: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning::03908 - Krause, Andreas / Krause, Andreas
ethz.identifier.url: https://papers.nips.cc/paper/2020/hash/aa2a77371374094fe9e0bc1de3f94ed9-Abstract.html
ethz.grant.agreementno: 167212
ethz.grant.agreementno: 815943
ethz.grant.fundername: SNF
ethz.grant.fundername: EC
ethz.grant.funderDoi: 10.13039/501100001711
ethz.grant.funderDoi: 10.13039/501100000780
ethz.grant.program: NFP 75: Gesuch
ethz.grant.program: H2020
ethz.date.deposited: 2020-12-14T07:14:03Z
ethz.source: BATCH
ethz.eth: yes
ethz.availability: Metadata only
ethz.rosetta.installDate: 2021-03-02T14:41:49Z
ethz.rosetta.lastUpdated: 2022-03-29T10:33:28Z
ethz.rosetta.versionExported: true
ethz.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Coresets%20via%20Bilevel%20Optimization%20for%20Continual%20Learning%20and%20Streaming&rft.date=2021&rft.spage=14879&rft.epage=14890&rft.au=Borsos,%20Zal%C3%A1n&Mutn%C3%BD,%20Mojm%C3%ADr&Krause,%20Andreas&rft.isbn=978-1-7138-2954-6&rft.genre=proceeding&rft.btitle=Advances%20in%20Neural%20Information%20Processing%20Systems%2033
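
Note (illustrative, not part of the record): the abstract refers to a coreset construction via cardinality-constrained bilevel optimization. A generic sketch of such an objective, with assumed symbols w (coreset weights), m (coreset size budget), theta (model parameters), ell (per-example loss), and n (dataset size), is: the weights select at most m training points, the inner problem fits the model to the weighted points, and the outer problem asks that the fitted model also perform well on the full dataset:

\min_{w \ge 0,\; \|w\|_0 \le m} \; \sum_{i=1}^{n} \ell\big(x_i, y_i; \theta^*(w)\big)
\quad \text{where} \quad
\theta^*(w) \in \operatorname*{arg\,min}_{\theta} \; \sum_{j=1}^{n} w_j \, \ell\big(x_j, y_j; \theta\big)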

Files in this item

There are no files associated with this item.
