
Open access
Date
2021
Type
Conference Paper
ETH Bibliography
yes
Altmetrics
Abstract
Meta-learning improves generalization of machine learning models when faced with previously unseen tasks by leveraging experiences from different, yet related prior tasks. To allow for better generalization, we propose a novel task representation called model-aware task embedding (MATE) that incorporates not only the data distributions of different tasks, but also the complexity of the tasks through the models used. The task complexity is taken into account by a novel variant of kernel mean embedding, combined with an instance-adaptive attention mechanism inspired by an SVM-based feature selection algorithm. Together with conditioning layers in deep neural networks, MATE can be easily incorporated into existing meta-learners as a plug-and-play module. While MATE is widely applicable to general tasks where the concept of task/environment is involved, we demonstrate its effectiveness in few-shot learning by improving a state-of-the-art model consistently on two benchmarks. Source codes for this paper are available at https://github.com/VITA-Group/MATE.
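The abstract builds on kernel mean embedding, which represents a task's data distribution by the empirical mean of kernel features. The sketch below is not MATE itself (MATE additionally weights the embedding by model complexity via an attention mechanism, which is not reproduced here); it is a minimal illustration of the plain kernel-mean-embedding baseline, comparing two tasks by the squared distance between their embeddings (the MMD). All function names are illustrative, not from the paper's code.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian RBF kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def mean_embedding_distance(task_a, task_b, gamma=1.0):
    """Squared distance between the empirical kernel mean embeddings
    of two tasks (the squared MMD):
        ||mu_A - mu_B||^2 = E[k(a,a')] - 2 E[k(a,b)] + E[k(b,b')]
    Each task is an (n_samples, n_features) array."""
    k_aa = np.mean([rbf_kernel(a, a2, gamma) for a in task_a for a2 in task_a])
    k_bb = np.mean([rbf_kernel(b, b2, gamma) for b in task_b for b2 in task_b])
    k_ab = np.mean([rbf_kernel(a, b, gamma) for a in task_a for b in task_b])
    return k_aa - 2 * k_ab + k_bb

# Two hypothetical tasks: same task embeds at distance ~0,
# tasks with different data distributions embed far apart.
rng = np.random.default_rng(0)
task_x = rng.normal(0.0, 1.0, size=(10, 3))
task_y = rng.normal(5.0, 1.0, size=(10, 3))
print(mean_embedding_distance(task_x, task_x, gamma=0.5))  # ~0.0
print(mean_embedding_distance(task_x, task_y, gamma=0.5))  # clearly > 0
```

A model-aware variant, as the abstract describes, would reweight the per-sample kernel features according to how the model treats each instance, rather than averaging them uniformly.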
Permanent link
https://doi.org/10.3929/ethz-b-000466892
Publication status
published
External links
Book title
Advances in Neural Information Processing Systems 33
Pages / Article No.
Publisher
Curran
Event
Organisational unit
09686 - Tang, Siyu / Tang, Siyu
Notes
Due to the Coronavirus (COVID-19), the conference was conducted virtually.