Date
2020
Type
Conference Paper
ETH Bibliography
yes
Altmetrics
Abstract
Sharing knowledge between tasks is vital for efficient learning in a multi-task setting. However, most research so far has focused on the easier case where knowledge transfer is not harmful, i.e., where knowledge from one task cannot negatively impact the performance on another task. In contrast, we present an approach to multi-task deep reinforcement learning based on attention that does not require any a priori assumptions about the relationships between tasks. Our attention network automatically groups task knowledge into sub-networks at state-level granularity. It thereby achieves positive knowledge transfer where possible, and avoids negative transfer in cases where tasks interfere. We test our algorithm against two state-of-the-art multi-task/transfer learning approaches and show comparable or superior performance while requiring fewer network parameters.
Publication status
published
Book title
Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2019
Journal / series
Lecture Notes in Computer Science
Publisher
Springer
Organisational unit
03604 - Wattenhofer, Roger / Wattenhofer, Roger
Related publications and datasets
Is part of: https://ecmlpkdd2019.org/programme/accepted/
Is part of: http://hdl.handle.net/20.500.11850/452698
Notes
Presented on Wednesday, 18 September 2019.