Adversarial Continual Learning
Research output: Contribution to book/Conference proceedings/Anthology/Report › Conference contribution › Contributed › peer-review
Abstract
Continual learning aims to learn new tasks without forgetting previously learned ones. We hypothesize that representations learned to solve each task in a sequence have a shared structure while containing some task-specific properties. We show that shared features are significantly less prone to forgetting and propose a novel hybrid continual learning framework that learns a disjoint representation for task-invariant and task-specific features required to solve a sequence of tasks. Our model combines architecture growth, to prevent forgetting of task-specific skills, with an experience replay approach, to preserve shared skills. We demonstrate that our hybrid approach is effective in avoiding forgetting and show that it is superior to both architecture-based and memory-based approaches on class-incremental learning of a single dataset as well as on a sequence of multiple datasets in image classification. Our code is available at https://github.com/facebookresearch/Adversarial-Continual-Learning.
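The hybrid scheme the abstract describes combines two mechanisms: architecture growth (a fresh task-specific module per task, so task-specific skills cannot be overwritten) and experience replay (a bounded buffer of past samples used to preserve the shared, task-invariant module). The toy sketch below illustrates only that structure; it is a hypothetical illustration, not the repository's code. All names (`HybridContinualLearner`, `observe`, `replay_batch`) are invented, and the actual training, including the adversarial disentanglement of shared vs. private features, is omitted:

```python
import random

class HybridContinualLearner:
    """Structural toy sketch of a hybrid continual learner:
    one shared (task-invariant) module, per-task private modules
    added on demand, and a bounded experience-replay buffer."""

    def __init__(self, buffer_size=100):
        self.shared = {"weights": [0.0] * 8}   # task-invariant parameters
        self.private = {}                      # task_id -> task-specific parameters
        self.replay = []                       # experience-replay buffer
        self.buffer_size = buffer_size

    def add_task(self, task_id):
        # Architecture growth: allocate a fresh private module per task,
        # so later tasks cannot overwrite task-specific skills.
        self.private[task_id] = {"weights": [0.0] * 4}

    def observe(self, task_id, sample):
        if task_id not in self.private:
            self.add_task(task_id)
        # Reservoir-style storage keeps the replay buffer bounded.
        if len(self.replay) < self.buffer_size:
            self.replay.append((task_id, sample))
        else:
            i = random.randrange(len(self.replay))
            self.replay[i] = (task_id, sample)

    def replay_batch(self, k=4):
        # Replayed samples from earlier tasks would be used to update
        # only the shared module, preserving task-invariant features.
        k = min(k, len(self.replay))
        return random.sample(self.replay, k)
```

In this sketch, growing `self.private` mirrors the architecture-growth side of the method, while `replay_batch` mirrors the memory side; the paper's contribution is training these two representations to stay disjoint, which a structural sketch cannot capture.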
Details
| Original language | English |
| --- | --- |
| Title of host publication | Computer Vision – ECCV 2020: 16th European Conference, Proceedings |
| Editors | Andrea Vedaldi, Horst Bischof, Thomas Brox, Jan-Michael Frahm |
| Publisher | Springer, Berlin [et al.] |
| Pages | 386–402 |
| Number of pages | 17 |
| ISBN (print) | 978-3-030-58620-1 |
| Publication status | Published - 2020 |
| Peer-reviewed | Yes |
Publication series
| Series | Lecture Notes in Computer Science, Volume 12356 |
| --- | --- |
| ISSN | 0302-9743 |
Conference
| Title | 16th European Conference on Computer Vision, ECCV 2020 |
| --- | --- |
| Duration | 23–28 August 2020 |
| City | Glasgow |
| Country | United Kingdom |
External IDs
| ORCID | /0000-0001-9430-8433/work/146646289 |
| --- | --- |