Adversarial Continual Learning

Publication: Contribution to book/conference proceedings/anthology/report › Contribution to conference proceedings › Contributed › Peer-reviewed

Contributors

Abstract

Continual learning aims to learn new tasks without forgetting previously learned ones. We hypothesize that representations learned to solve each task in a sequence have a shared structure while containing some task-specific properties. We show that shared features are significantly less prone to forgetting and propose a novel hybrid continual learning framework that learns a disjoint representation for task-invariant and task-specific features required to solve a sequence of tasks. Our model combines architecture growth to prevent forgetting of task-specific skills and an experience replay approach to preserve shared skills. We demonstrate that our hybrid approach is effective in avoiding forgetting and show that it is superior to both architecture-based and memory-based approaches on class-incremental learning of a single dataset as well as a sequence of multiple datasets in image classification. Our code is available at https://github.com/facebookresearch/Adversarial-Continual-Learning.
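The abstract only summarizes the approach. As an illustration, the sketch below shows one way the described components (a shared encoder pushed toward task-invariant features via an adversarial task discriminator with gradient reversal, per-task private modules that grow with the task sequence, and a small replay memory) could fit together in PyTorch. This is a minimal sketch, not the released implementation; all names here (ACLSketch, GradReverse, train_step, the linear encoders, the replay list) are illustrative assumptions, and the authors' actual code is available at the repository linked above.

```python
# Illustrative sketch only; not the authors' implementation.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer used for adversarial feature alignment."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output


class ACLSketch(nn.Module):
    def __init__(self, in_dim, feat_dim, n_classes, n_tasks):
        super().__init__()
        # Shared (task-invariant) encoder, fixed in size across tasks.
        self.shared = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # One private (task-specific) encoder per task: architecture growth.
        self.private = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
            for _ in range(n_tasks)
        )
        # Task-specific heads consume the concatenated [shared, private] features.
        self.heads = nn.ModuleList(
            nn.Linear(2 * feat_dim, n_classes) for _ in range(n_tasks)
        )
        # Discriminator tries to predict the task id from shared features;
        # gradient reversal makes the shared encoder try to fool it.
        self.discriminator = nn.Linear(feat_dim, n_tasks)

    def forward(self, x, task_id):
        s = self.shared(x)
        p = self.private[task_id](x)
        logits = self.heads[task_id](torch.cat([s, p], dim=1))
        task_logits = self.discriminator(GradReverse.apply(s))
        return logits, task_logits


def train_step(model, opt, x, y, task_id, replay, replay_size=200):
    """One step: classification loss plus adversarial task-confusion loss,
    mixed with a replayed mini-batch from earlier tasks when available."""
    batches = [(x, y, task_id)]
    if replay:
        batches.append(random.choice(replay))
    loss = 0.0
    for bx, by, bt in batches:
        logits, task_logits = model(bx, bt)
        loss = loss + F.cross_entropy(logits, by)
        loss = loss + F.cross_entropy(
            task_logits, torch.full((bx.size(0),), bt, dtype=torch.long)
        )
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Keep a small episodic memory of raw samples for replay.
    if len(replay) < replay_size:
        replay.append((x.detach(), y.detach(), task_id))
    return loss.item()


# Example usage (toy dimensions):
# model = ACLSketch(in_dim=784, feat_dim=64, n_classes=10, n_tasks=5)
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# replay = []
# loss = train_step(model, opt, torch.randn(32, 784),
#                   torch.randint(0, 10, (32,)), task_id=0, replay=replay)
```

The gradient reversal trick is one standard way to realize the adversarial objective implied by the title: the discriminator learns to identify the task from the shared features, the reversed gradient pushes the shared encoder toward task-invariant features, and the replayed samples help keep those shared features useful for earlier tasks.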

Details

Original language: English
Title: Computer Vision – ECCV 2020 - 16th European Conference, 2020, Proceedings
Editors: Andrea Vedaldi, Horst Bischof, Thomas Brox, Jan-Michael Frahm
Publisher: Springer, Berlin [et al.]
Pages: 386-402
Number of pages: 17
ISBN (Print): 9783030586201
Publication status: Published - 2020
Peer-review status: Yes

Publication series

Series: Lecture Notes in Computer Science, Volume 12356
ISSN: 0302-9743

Conference

Title: 16th European Conference on Computer Vision, ECCV 2020
Duration: 23 - 28 August 2020
City: Glasgow
Country: United Kingdom

External IDs

ORCID /0000-0001-9430-8433/work/146646289

Keywords

Library keywords