University of Southampton Institutional Repository

Dedicated class subnetworks for SNN class incremental learning

Warr, Katy, Hare, Jonathon and Thomas, David (2025) Dedicated class subnetworks for SNN class incremental learning. In 2025 Neuro Inspired Computational Elements Conference (NICE). IEEE. 10 pp. (In Press)

Record type: Conference or Workshop Item (Paper)

Abstract

We explore an unconventional approach to the Class Incremental Learning (CIL) problem that dedicates a separate subnetwork for the independent learning of each class. This separation ensures that the learning of a new class has no effect on the model’s previously acquired knowledge. We present a learning strategy loosely inspired by biological neuron apoptosis (neuron death) and neurogenesis (neuron birth) combined with other methods to achieve effective learning and minimize the model’s resource requirements. The network is entirely feed-forward and uses low precision inter-neuron spikes combined with simple neuron behaviors, making it suitable for very low power Spiking Neural Network (SNN) realization on appropriate future neuromorphic platforms. We demonstrate the model in an abstract setting to explore the tradeoffs between optimizing for accuracy, network size, and run-time costs and show that, despite no competition between the classes during learning, it is possible to achieve top-1 accuracy of 85% on MNIST.
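To make the per-class-subnetwork idea concrete, the sketch below is a minimal, hypothetical illustration (not the authors' implementation, and a plain NumPy model rather than a spiking one): each class gets its own small subnetwork trained only on that class's data, so adding a new class never modifies previously learned classes, and inference simply selects the subnetwork with the strongest response. The class names, unit counts, and prototype-based scoring are illustrative assumptions.

    # Minimal sketch of one-subnetwork-per-class incremental learning (assumed design).
    import numpy as np

    class ClassSubnetwork:
        """One-class model: prototype units learned from a single class only."""
        def __init__(self, n_units=16, rng=None):
            self.n_units = n_units
            self.rng = rng or np.random.default_rng(0)
            self.prototypes = None  # (n_units, n_features), set during fit

        def fit(self, x):
            # x: (n_samples, n_features) binary, spike-count-like inputs for ONE class.
            # Random samples stand in for the paper's neurogenesis/apoptosis-driven
            # selection of units; no data from other classes is ever seen here.
            idx = self.rng.choice(len(x), size=min(self.n_units, len(x)), replace=False)
            self.prototypes = x[idx].astype(float)

        def score(self, x):
            # Mean overlap between each input and this subnetwork's prototype units.
            return (x @ self.prototypes.T).mean(axis=1)

    class IncrementalClassifier:
        """Adds one independent subnetwork per class; no joint training or competition."""
        def __init__(self):
            self.subnets = {}

        def learn_class(self, label, x_class):
            net = ClassSubnetwork(rng=np.random.default_rng(label))
            net.fit(x_class)
            self.subnets[label] = net  # existing subnetworks are left untouched

        def predict(self, x):
            labels = sorted(self.subnets)
            scores = np.stack([self.subnets[c].score(x) for c in labels], axis=1)
            return np.array(labels)[scores.argmax(axis=1)]

    if __name__ == "__main__":
        # Toy binary data: two "classes" with different active-feature patterns.
        rng = np.random.default_rng(1)
        x0 = (rng.random((100, 64)) < 0.2).astype(float); x0[:, :8] = 1.0
        x1 = (rng.random((100, 64)) < 0.2).astype(float); x1[:, -8:] = 1.0
        clf = IncrementalClassifier()
        clf.learn_class(0, x0)   # class 0 learned first...
        clf.learn_class(1, x1)   # ...class 1 added later; class 0 is unaffected
        preds = clf.predict(np.vstack([x0, x1]))
        print((preds == np.array([0] * 100 + [1] * 100)).mean())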

Text
IEEE-NICE2025-Warr - Accepted Manuscript
Available under License Creative Commons Attribution.
Download (2MB)

More information

Submitted date: 1 December 2024
Accepted/In Press date: 3 March 2025
Keywords: Class Incremental Learning, low power, Spiking Neural Networks, CIL, SNN, neuromorphic, neurogenesis, feed-forward

Identifiers

Local EPrints ID: 499752
URI: http://eprints.soton.ac.uk/id/eprint/499752
PURE UUID: 3e775ad7-753c-4dec-adda-42fa3c27b12e
ORCID for Katy Warr: orcid.org/0009-0006-2444-9701
ORCID for Jonathon Hare: orcid.org/0000-0003-2921-4283
ORCID for David Thomas: orcid.org/0000-0002-9671-0917

Catalogue record

Date deposited: 02 Apr 2025 16:54
Last modified: 03 Apr 2025 04:02

Contributors

Author: Katy Warr
Author: Jonathon Hare
Author: David Thomas
