Isolation and impartial aggregation: a paradigm of incremental learning without interference
Wang, Yabin, Ma, Zhiheng, Huang, Zhiwu, Wang, Yaowei, Su, Zhou and Hong, Xiaopeng (2023) Isolation and impartial aggregation: a paradigm of incremental learning without interference. In Association for the Advancement of Artificial Intelligence, pp. 10209-10217. (doi:10.1609/aaai.v37i8.26216).
Record type: Conference or Workshop Item (Paper)
Abstract
This paper focuses on the prevalent problems of stage interference and stage performance imbalance in incremental learning. To avoid obvious stage learning bottlenecks, we propose a new incremental learning framework that leverages a series of stage-isolated classifiers to perform the learning task at each stage, without interference from the others. Concretely, to aggregate the multiple stage classifiers into a unified one impartially, we first introduce a temperature-controlled energy metric that indicates the confidence-score levels of the stage classifiers. We then propose an anchor-based energy self-normalization strategy to ensure that the stage classifiers work at the same energy level. Finally, we design a voting-based inference augmentation strategy for robust inference. The proposed method is rehearsal-free and can work in almost all incremental learning scenarios. We evaluate the proposed method on four large datasets, and extensive results demonstrate its superiority, setting a new state of the art in overall performance. Code is available at https://github.com/iamwangyabin/ESN.
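The full paper is not reproduced in this record, but the temperature-controlled energy metric mentioned in the abstract is, in standard energy-based confidence scoring, typically a temperature-scaled log-sum-exp over a classifier's logits. A minimal sketch under that assumption (the function and variable names below are illustrative and not taken from the authors' released code):

import torch

def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Temperature-controlled energy of classifier logits.

    Lower (more negative) energy generally corresponds to higher confidence.
    `logits` has shape (batch, num_classes); the result has shape (batch,).
    """
    # E(x; T) = -T * log sum_c exp(logit_c / T)
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

# Illustrative use: score a batch of predictions from one stage classifier.
logits = torch.randn(4, 10)  # hypothetical batch of 4 samples, 10 classes
print(energy_score(logits, temperature=2.0))

In such a scheme, comparing or normalizing these per-stage energies (as the abstract's anchor-based self-normalization suggests) is what allows independently trained stage classifiers to be aggregated on a common confidence scale.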
More information
Published date: 7 February 2023
Venue - Dates: The Association for the Advancement of Artificial Intelligence (AAAI), Washington, United States, 2023-02-07
Identifiers
Local EPrints ID: 501689
URI: http://eprints.soton.ac.uk/id/eprint/501689
PURE UUID: e7715f62-d1fd-4238-811d-4e14182d4235
Catalogue record
Date deposited: 05 Jun 2025 16:58
Last modified: 06 Jun 2025 02:06
Contributors
Author: Yabin Wang
Author: Zhiheng Ma
Author: Zhiwu Huang
Author: Yaowei Wang
Author: Zhou Su
Author: Xiaopeng Hong