

Poster

F-OAL: Forward-only Online Analytic Learning with Fast Training and Low Memory Footprint in Class Incremental Learning

HUIPING ZHUANG · Yuchen Liu · Run He · Kai Tong · Ziqian Zeng · Cen Chen · Yi Wang · Lap-Pui Chau

West Ballroom A-D #5910
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Online Class Incremental Learning (OCIL) aims to train models incrementally, where data arrive in mini-batches and previous data are not accessible. A major challenge in OCIL is catastrophic forgetting, i.e., the loss of previously learned knowledge. Among existing baselines, replay-based methods show competitive results but compromise data privacy, while exemplar-free methods safeguard privacy but often lack accuracy. In this paper, we propose an exemplar-free approach, Forward-only Online Analytic Learning (F-OAL). Unlike traditional methods, F-OAL does not rely on back-propagation and is forward-only, significantly reducing memory usage and training time. Paired with a frozen pre-trained encoder with feature fusion, F-OAL only needs to update a linear classifier by recursive least squares. This approach simultaneously achieves high accuracy, low resource consumption, and data privacy protection. Extensive experiments on benchmark datasets demonstrate F-OAL's robust performance in OCIL scenarios.
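To make the forward-only update concrete, the following is a minimal sketch of a recursive-least-squares (RLS) update for a linear classifier on top of frozen-encoder features, the general kind of step the abstract describes. The function names, shapes, and the ridge regularizer `gamma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def init_state(feat_dim, num_classes, gamma=1.0):
    """W: linear classifier weights; R: inverse of the regularized
    feature autocorrelation matrix accumulated so far."""
    W = np.zeros((feat_dim, num_classes))
    R = np.eye(feat_dim) / gamma  # gamma is the ridge regularizer
    return W, R

def rls_update(W, R, X, Y):
    """One forward-only mini-batch update (no back-propagation).
    X: (batch, feat_dim) features from the frozen encoder
    Y: (batch, num_classes) one-hot labels
    The Woodbury identity keeps the inversion at (batch x batch)
    instead of (feat_dim x feat_dim)."""
    K = np.linalg.inv(np.eye(X.shape[0]) + X @ R @ X.T)
    R = R - R @ X.T @ K @ X @ R
    W = W + R @ X.T @ (Y - X @ W)
    return W, R
```

With this initialization, streaming the mini-batches through `rls_update` reproduces the closed-form ridge-regression classifier on all data seen so far, which is why no replay buffer or gradient state is needed.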
