
Online Continual Learning under Extreme Memory Constraints

Abstract: Continual Learning (CL) aims to develop agents that emulate the human ability to learn new tasks sequentially while retaining knowledge from past experiences. In this paper, we introduce the novel problem of Memory-Constrained Online Continual Learning (MC-OCL), which imposes strict constraints on the memory overhead an algorithm may use to avoid catastrophic forgetting. As most, if not all, previous CL methods violate these constraints, we propose an algorithmic solution to MC-OCL: Batch-level Distillation (BLD), a regularization-based CL approach that effectively balances stability and plasticity in order to learn from data streams, while preserving the ability to solve old tasks through distillation. Our extensive experimental evaluation, conducted on three publicly available benchmarks, empirically demonstrates that our approach successfully addresses the MC-OCL problem and achieves accuracy comparable to that of prior distillation methods requiring higher memory overhead.
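
The core mechanism the abstract references, a distillation regularizer that trades off stability (not forgetting old tasks) against plasticity (fitting the current task), can be sketched generically. The snippet below is a minimal illustrative sketch in PyTorch, not the paper's Batch-level Distillation (BLD) algorithm; the function name, temperature, and weighting coefficient alpha are assumptions introduced purely for illustration.

import torch
import torch.nn.functional as F

def distillation_cl_loss(new_logits: torch.Tensor,
                         old_logits: torch.Tensor,
                         targets: torch.Tensor,
                         temperature: float = 2.0,
                         alpha: float = 0.5) -> torch.Tensor:
    # NOTE: illustrative sketch only, not the paper's BLD method.
    # Plasticity term: standard cross-entropy on the current task's labels.
    ce = F.cross_entropy(new_logits, targets)
    # Stability term: KL divergence to the frozen old model's softened
    # predictions, scaled by temperature**2 as in standard distillation.
    kd = F.kl_div(
        F.log_softmax(new_logits / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # alpha balances learning the new task against preserving old behavior.
    return (1.0 - alpha) * ce + alpha * kd

Here old_logits would come from a frozen copy of the model taken before training on the current data; how BLD avoids storing such a copy within the MC-OCL memory budget is the paper's contribution and is not reproduced here.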
Document type: Conference papers

https://hal.telecom-paris.fr/hal-02941923
Contributor: Stéphane Lathuilière
Submitted on: Thursday, September 17, 2020 - 2:22:46 PM
Last modification on: Friday, October 16, 2020 - 10:30:54 AM


Identifiers

  • HAL Id: hal-02941923, version 1
  • arXiv: 2008.01510

Citation

Enrico Fini, Stéphane Lathuilière, Enver Sangineto, Moin Nabi, Elisa Ricci. Online Continual Learning under Extreme Memory Constraints. European Conference on Computer Vision, Aug 2020, Edinburgh, United Kingdom. ⟨hal-02941923⟩
