Type: Conference Paper
Rights / license: In Copyright - Non-Commercial Use Permitted
Abstract: Meta-learning improves the generalization of machine learning models to previously unseen tasks by leveraging experience from different yet related prior tasks. To allow for better generalization, we propose a novel task representation called model-aware task embedding (MATE) that incorporates not only the data distributions of different tasks, but also the complexity of the tasks through the models used. Task complexity is taken into account by a novel variant of kernel mean embedding, combined with an instance-adaptive attention mechanism inspired by an SVM-based feature selection algorithm. Together with conditioning layers in deep neural networks, MATE can easily be incorporated into existing meta-learners as a plug-and-play module. While MATE is broadly applicable to any setting that involves the notion of a task or environment, we demonstrate its effectiveness in few-shot learning by consistently improving a state-of-the-art model on two benchmarks. Source code is available at https://github.com/VITA-Group/MATE.
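For background on the kernel mean embedding the abstract builds on, the following is a minimal, self-contained sketch of the standard (unweighted) empirical kernel mean embedding, comparing two task data distributions via the squared maximum mean discrepancy (MMD). This is illustrative only: the paper's MATE variant additionally weights instances through a model-aware attention mechanism, which is not reproduced here, and the RBF kernel, bandwidth, and function names are assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq_dists = ((X ** 2).sum(1)[:, None]
                + (Y ** 2).sum(1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

def mmd2(X, Y, gamma=1.0):
    """Squared MMD between the empirical kernel mean embeddings of two
    samples X and Y, i.e. ||mu_X - mu_Y||^2 in the RKHS, where
    mu_X = (1/n) * sum_i phi(x_i). A weighted variant, as suggested by
    MATE's instance-adaptive attention, would replace the uniform 1/n
    averages with learned per-instance weights."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())
```

As a sanity check, the squared MMD of a sample with itself is zero, while well-separated samples yield a strictly positive value.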
Book title: Advances in Neural Information Processing Systems 33
Pages / Article No.
Organisational unit: 09686 - Tang, Siyu / Tang, Siyu
Notes: Due to the Coronavirus (COVID-19) pandemic, the conference was conducted virtually.