Adversarial Continual Learning
Publication: Contribution to book/conference proceedings/edited volume/report › Contribution to conference proceedings › Contributed › Peer-reviewed
Contributors
Abstract
Continual learning aims to learn new tasks without forgetting previously learned ones. We hypothesize that representations learned to solve each task in a sequence have a shared structure while containing some task-specific properties. We show that shared features are significantly less prone to forgetting and propose a novel hybrid continual learning framework that learns a disjoint representation for task-invariant and task-specific features required to solve a sequence of tasks. Our model combines architecture growth to prevent forgetting of task-specific skills and an experience replay approach to preserve shared skills. We demonstrate that our hybrid approach is effective in avoiding forgetting and show it is superior to both architecture-based and memory-based approaches on class-incremental learning of a single dataset as well as a sequence of multiple datasets in image classification. Our code is available at https://github.com/facebookresearch/Adversarial-Continual-Learning.
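To make the described architecture concrete, the sketch below shows one possible way the shared/private split, the adversarial objective on the shared features, and a small replay buffer could fit together. It is not the authors' implementation (see the repository linked above for that); the module sizes, the gradient-reversal discriminator, the `HybridContinualLearner` and `train_step` names, and the toy replay buffer are illustrative assumptions.

```python
# Minimal illustrative sketch (not the authors' code) of the idea in the abstract:
# a shared, task-invariant encoder plus one private, task-specific encoder per task,
# a task discriminator applied to the shared features only, and a small replay
# buffer that preserves shared skills. All sizes and names are assumptions.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass, negated gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output.neg()


class HybridContinualLearner(nn.Module):
    def __init__(self, in_dim=784, feat_dim=64, n_classes_per_task=10):
        super().__init__()
        self.in_dim, self.feat_dim, self.n_cls = in_dim, feat_dim, n_classes_per_task
        # Task-invariant encoder, shared across all tasks.
        self.shared = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Architecture growth: one private encoder and one head appended per task.
        self.private = nn.ModuleList()
        self.heads = nn.ModuleList()
        # Discriminator predicting the task id from shared features; re-created
        # in add_task() as the number of tasks grows.
        self.discriminator = nn.Linear(feat_dim, 1)
        self.replay_buffer = []  # stores (x, y, task_id) tuples

    def add_task(self):
        self.private.append(nn.Sequential(nn.Linear(self.in_dim, self.feat_dim), nn.ReLU()))
        self.heads.append(nn.Linear(2 * self.feat_dim, self.n_cls))
        self.discriminator = nn.Linear(self.feat_dim, len(self.private))

    def forward(self, x, task_id):
        s = self.shared(x)              # task-invariant features
        p = self.private[task_id](x)    # task-specific features
        logits = self.heads[task_id](torch.cat([s, p], dim=1))
        # Adversarial branch: reversed gradients push the shared encoder toward
        # features from which the task id cannot be predicted.
        task_logits = self.discriminator(GradientReversal.apply(s))
        return logits, task_logits


def train_step(model, optimizer, x, y, task_id, replay_size=8):
    """One hypothetical update: task loss + adversarial loss + replayed examples."""
    model.train()
    logits, task_logits = model(x, task_id)
    task_target = torch.full((x.size(0),), task_id, dtype=torch.long)
    loss = F.cross_entropy(logits, y) + F.cross_entropy(task_logits, task_target)
    # Experience replay on a few stored samples to preserve shared skills.
    for rx, ry, rt in random.sample(model.replay_buffer,
                                    min(replay_size, len(model.replay_buffer))):
        r_logits, _ = model(rx.unsqueeze(0), rt)
        loss = loss + F.cross_entropy(r_logits, ry.unsqueeze(0))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Keep one sample from the current batch for future replay.
    model.replay_buffer.append((x[0].detach(), y[0].detach(), task_id))
    return loss.item()


if __name__ == "__main__":
    model = HybridContinualLearner()
    for t in range(3):  # a toy sequence of three tasks with random data
        model.add_task()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        x, y = torch.randn(16, 784), torch.randint(0, 10, (16,))
        print(f"task {t}: loss {train_step(model, optimizer, x, y, t):.3f}")
```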
Details
Original language | English |
---|---|
Title | Computer Vision – ECCV 2020 - 16th European Conference, 2020, Proceedings |
Editors | Andrea Vedaldi, Horst Bischof, Thomas Brox, Jan-Michael Frahm |
Publisher | Springer, Berlin [et al.] |
Pages | 386-402 |
Number of pages | 17 |
ISBN (Print) | 9783030586201 |
Publication status | Published - 2020 |
Peer-reviewed | Yes |
Publication series
Series | Lecture Notes in Computer Science, Volume 12356 |
---|---|
ISSN | 0302-9743 |
Conference
Title | 16th European Conference on Computer Vision, ECCV 2020 |
---|---|
Duration | 23 - 28 August 2020 |
City | Glasgow |
Country | United Kingdom |
External IDs
ORCID | /0000-0001-9430-8433/work/146646289 |
---|---|