TY - JOUR
T1 - Class Incremental Learning via Feature Space Calibration
AU - Kim, Jeonghoon
AU - Cao, Jinming
AU - Kim, Jihie
AU - Zimmermann, Roger
AU - Park, Seongsik
N1 - Publisher Copyright:
© 2024 Tsinghua University Press.
PY - 2025
Y1 - 2025
N2 - Class incremental learning (CIL) has attracted a great deal of attention as an effective way to realize lifelong learning. However, existing works still face catastrophic forgetting, i.e., performance degradation on old tasks after learning new category information. In this work, we aim to alleviate this problem through feature space calibration. Specifically, we propose a novel loss function that allows the network to focus more on inter- and intra-class information to extract effective features. The balance between remembering old classes and learning new classes is achieved by learning class relationships rather than just information about a particular class, which can effectively alleviate catastrophic forgetting. Unlike existing methods, the approach proposed in this paper is highly general and flexible and can be directly integrated into existing CIL methods to effectively improve their performance. Our proposed approach is shown to be effective through comparative experiments on three popular datasets: CIFAR100, ImageNet100, and ImageNet1k. To ensure a robust comparison, we utilized three state-of-the-art methods as our baseline models. The results of these experiments demonstrate that our approach outperforms the baseline models on a range of benchmark datasets, showcasing its superiority and potential for broader application.
AB - Class incremental learning (CIL) has attracted a great deal of attention as an effective way to realize lifelong learning. However, existing works still face catastrophic forgetting, i.e., performance degradation on old tasks after learning new category information. In this work, we aim to alleviate this problem through feature space calibration. Specifically, we propose a novel loss function that allows the network to focus more on inter- and intra-class information to extract effective features. The balance between remembering old classes and learning new classes is achieved by learning class relationships rather than just information about a particular class, which can effectively alleviate catastrophic forgetting. Unlike existing methods, the approach proposed in this paper is highly general and flexible and can be directly integrated into existing CIL methods to effectively improve their performance. Our proposed approach is shown to be effective through comparative experiments on three popular datasets: CIFAR100, ImageNet100, and ImageNet1k. To ensure a robust comparison, we utilized three state-of-the-art methods as our baseline models. The results of these experiments demonstrate that our approach outperforms the baseline models on a range of benchmark datasets, showcasing its superiority and potential for broader application.
KW - deep learning
KW - image classification
KW - incremental learning
KW - loss function
UR - https://www.scopus.com/pages/publications/105021035247
U2 - 10.26599/CVM.2025.9450426
DO - 10.26599/CVM.2025.9450426
M3 - Article
AN - SCOPUS:105021035247
SN - 2096-0433
VL - 11
SP - 1025
EP - 1039
JO - Computational Visual Media
JF - Computational Visual Media
IS - 5
ER -