Ye, Fei and Bors, Adrian Gheorghe (orcid.org/0000-0001-7838-0021) (2024) Continual compression model for online continual learning. Applied Soft Computing, 112427. ISSN 1568-4946
Abstract
Task-Free Continual Learning (TFCL) is a demanding but realistic continual learning setting, in which the model must address catastrophic forgetting in sequential learning without access to task boundaries. In this paper, we tackle catastrophic forgetting by introducing a dynamic expansion framework that adaptively enhances the model's capacity for learning novel data while retaining previously acquired knowledge, all within a minimal-size processing architecture. The proposed framework incorporates three key mechanisms to mitigate forgetting: (1) a Maximum Mean Discrepancy (MMD)-based expansion mechanism that measures the disparity between previously acquired knowledge and the new training data, serving as a signal for expanding the model's architecture; (2) a component discarding mechanism that eliminates components carrying redundant information, thereby optimizing the model size while fostering knowledge diversity; (3) a novel training sample selection strategy that increases the diversity of the training data for each task. We conduct a series of TFCL experiments demonstrating that the proposed framework outperforms all baselines while utilizing fewer components than alternative dynamic expansion models. On the Split Mini-ImageNet dataset, obtained by splitting the original dataset into multiple tasks, the results improve by more than 2% over the closest baseline.
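The abstract describes using Maximum Mean Discrepancy (MMD) as a signal for architecture expansion. As a rough illustration (not the authors' implementation), the following sketch estimates the squared MMD between two sample sets with an RBF kernel and triggers expansion when the discrepancy exceeds a hypothetical threshold; the function names, `sigma`, and `threshold` values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two feature vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def mmd_squared(X, Y, sigma=1.0):
    """Biased estimate of squared MMD between sample sets X and Y."""
    k_xx = np.mean([[rbf_kernel(a, b, sigma) for b in X] for a in X])
    k_yy = np.mean([[rbf_kernel(a, b, sigma) for b in Y] for a in Y])
    k_xy = np.mean([[rbf_kernel(a, b, sigma) for b in Y] for a in X])
    return k_xx + k_yy - 2 * k_xy

def should_expand(stored, incoming, threshold=0.1, sigma=1.0):
    # Expansion rule sketch: add a new component when the incoming
    # data distribution diverges sufficiently from stored knowledge.
    # The threshold here is a placeholder, not a value from the paper.
    return mmd_squared(stored, incoming, sigma) > threshold
```

Intuitively, samples drawn from the same distribution yield an MMD estimate near zero, while a distribution shift (e.g. a new task) yields a larger value, prompting the model to allocate a new component rather than overwrite existing ones.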
Metadata

| Field | Value |
|---|---|
| Item Type: | Article |
| Authors/Creators: | Ye, Fei; Bors, Adrian Gheorghe |
| Copyright, Publisher and Additional Information: | © 2024 The Authors |
| Dates: | Published 2024 |
| Institution: | The University of York |
| Academic Units: | The University of York > Faculty of Sciences (York) > Computer Science (York) |
| Depositing User: | Pure (York) |
| Date Deposited: | 20 Dec 2024 11:50 |
| Last Modified: | 21 Dec 2024 00:26 |
| Published Version: | https://doi.org/10.1016/j.asoc.2024.112427 |
| Status: | Published |
| Refereed: | Yes |
| Identification Number: | 10.1016/j.asoc.2024.112427 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:221042 |
Download
Filename: 1-s2.0-S1568494624012018-main.pdf
Description: Continual compression model for online continual learning
Licence: CC-BY 2.5