Conference paper, Year: 2020

Online Continual Learning under Extreme Memory Constraints

Enrico Fini
  • Role: Author
Stéphane Lathuilière
  • Role: Author
Enver Sangineto
  • Role: Author
Moin Nabi
  • Role: Author
Elisa Ricci
  • Role: Author

Abstract

Continual Learning (CL) aims to develop agents that emulate the human ability to learn new tasks sequentially while retaining knowledge from past experiences. In this paper, we introduce the novel problem of Memory-Constrained Online Continual Learning (MC-OCL), which imposes strict constraints on the memory overhead an algorithm may use to avoid catastrophic forgetting. As most, if not all, previous CL methods violate these constraints, we propose an algorithmic solution to MC-OCL: Batch-level Distillation (BLD), a regularization-based CL approach that effectively balances stability and plasticity in order to learn from data streams while preserving the ability to solve old tasks through distillation. Our extensive experimental evaluation, conducted on three publicly available benchmarks, empirically demonstrates that our approach successfully addresses the MC-OCL problem and achieves accuracy comparable to prior distillation methods that require higher memory overhead.
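The page gives only the abstract, not the details of BLD. As a hypothetical sketch of the general mechanism the abstract describes (distillation used as a regularizer that trades stability against plasticity), a minimal PyTorch-style snippet of one common distillation formulation might look as follows; the function names, the temperature, and the loss weighting lambda_distill are illustrative assumptions, not the authors' method.

    # Hypothetical sketch: distillation as a regularizer for continual learning.
    # Names (distillation_loss, training_step, lambda_distill) are illustrative,
    # not taken from the BLD paper.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        """Soft-target KL divergence between current and past model outputs."""
        soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
        log_student = F.log_softmax(student_logits / temperature, dim=1)
        # Scale by T^2 so gradients are comparable to the hard-label loss
        # (as in Hinton et al.'s knowledge distillation).
        return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2

    def training_step(model, teacher, batch, labels, lambda_distill=1.0):
        logits = model(batch)
        with torch.no_grad():
            old_logits = teacher(batch)  # frozen snapshot of the past model
        # Plasticity: fit the new task. Stability: stay close to old outputs.
        return F.cross_entropy(logits, labels) + lambda_distill * distillation_loss(logits, old_logits)

In such a scheme, larger values of lambda_distill favor stability (retaining old-task behavior) while smaller values favor plasticity (fitting the incoming stream); the MC-OCL setting additionally constrains how much memory may be spent on the teacher's signal.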

Dates and versions

hal-02941923, version 1 (17-09-2020)

Identifiers

Cite

Enrico Fini, Stéphane Lathuilière, Enver Sangineto, Moin Nabi, Elisa Ricci. Online Continual Learning under Extreme Memory Constraints. European Conference on Computer Vision, Aug 2020, Edinburgh, United Kingdom. ⟨hal-02941923⟩