Evolutionary data subset selection for class-incremental learning on memory-constrained systems
Baikas, Epifanios, Tarapore, Danesh and Thomas, David (2024) Evolutionary data subset selection for class-incremental learning on memory-constrained systems. Genetic and Evolutionary Computation Conference 2024, Melbourne Convention and Exhibition Centre, Melbourne, Australia, 14-18 Jul 2024, pp. 255-258. (In Press) (doi:10.1145/3638530.3654109)
Record type: Conference or Workshop Item (Paper)
Abstract
Training Machine Learning (ML) classifiers on extreme edge devices with ≤ 10 MB of non-volatile memory is challenging because of (a) the small number of data examples that can be preserved on-device and (b) the dynamic nature of the training dataset caused by the continual collection of new examples. Learning from a stream of data batches, each consisting of examples from a new class, is the setting studied by class-incremental Continual Learning. We address the challenge of training an ML classifier on an extreme edge device that collects examples in a class-incremental fashion and propose a framework for selecting which examples should be preserved in non-volatile memory. We apply a genetic algorithm to identify data subsets with minimal degradation in the top-1 accuracy of a k-Nearest-Neighbors classifier. We evaluate the proposed subset selection method on the MNIST and FashionMNIST datasets and observe top-1 accuracy gains of 26.3% and 18.6%, respectively, relative to the mean accuracy obtained by random balanced subsets containing only 0.05% of the training set. When applied in a class-incremental setting, the genetic algorithm achieved accuracy comparable to random balanced selection while using, in specific cases, 8× less non-volatile memory.
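
To make the method concrete, the following Python sketch evolves a fixed-size exemplar subset whose fitness is the top-1 validation accuracy of a scikit-learn k-Nearest-Neighbors classifier trained on that subset alone. This is an illustrative reconstruction of the idea described in the abstract, not the authors' implementation; the function names (evolve_subset, fitness) and all hyperparameters are assumptions.

    # Minimal sketch (assumed names and hyperparameters, not the authors' code):
    # a genetic algorithm over fixed-size index sets into the candidate pool,
    # so the non-volatile memory budget is respected by construction.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    def fitness(subset_idx, X_pool, y_pool, X_val, y_val, k=3):
        # Top-1 validation accuracy of kNN fit on the candidate exemplar subset.
        knn = KNeighborsClassifier(n_neighbors=k)
        knn.fit(X_pool[subset_idx], y_pool[subset_idx])
        return knn.score(X_val, y_val)

    def evolve_subset(X_pool, y_pool, X_val, y_val, subset_size=30,
                      pop_size=20, generations=50, tournament=3,
                      mutation_rate=0.1):
        n = len(X_pool)
        pop = [rng.choice(n, size=subset_size, replace=False)
               for _ in range(pop_size)]
        for _ in range(generations):
            scores = [fitness(ind, X_pool, y_pool, X_val, y_val) for ind in pop]
            new_pop = [pop[int(np.argmax(scores))]]  # elitism: keep the best
            while len(new_pop) < pop_size:
                # Tournament selection of two parents.
                parents = []
                for _ in range(2):
                    contenders = rng.choice(pop_size, size=tournament,
                                            replace=False)
                    parents.append(pop[max(contenders, key=lambda i: scores[i])])
                # Crossover: draw the child from the union of parent indices.
                union = np.unique(np.concatenate(parents))
                child = rng.choice(union, size=subset_size, replace=False)
                # Mutation: occasionally swap an exemplar for one outside the subset.
                for j in range(subset_size):
                    if rng.random() < mutation_rate:
                        child[j] = rng.choice(np.setdiff1d(np.arange(n), child))
                new_pop.append(child)
            pop = new_pop
        scores = [fitness(ind, X_pool, y_pool, X_val, y_val) for ind in pop]
        return pop[int(np.argmax(scores))]

    # Illustrative class-incremental loop (names assumed): whenever a batch of
    # a new class arrives, re-select the budget-sized buffer from the union of
    # the current buffer and the new examples.
    # for X_new, y_new in class_batches:
    #     pool_X = np.concatenate([buffer_X, X_new])
    #     pool_y = np.concatenate([buffer_y, y_new])
    #     keep = evolve_subset(pool_X, pool_y, X_val, y_val, subset_size=budget)
    #     buffer_X, buffer_y = pool_X[keep], pool_y[keep]

Representing each individual as a fixed-size index set (rather than a free binary mask) keeps every candidate within the storage budget without needing a penalty term in the fitness function.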
Text: evodss - Author's Original (restricted to Repository staff only)
More information
Accepted/In Press date: 11 April 2024
Venue - Dates: Genetic and Evolutionary Computation Conference 2024, Melbourne Convention and Exhibition Centre, Melbourne, Australia, 2024-07-14 - 2024-07-18
Keywords: Genetic algorithms (GA), Machine Learning, Microcontrollers
Identifiers
Local EPrints ID: 489891
URI: http://eprints.soton.ac.uk/id/eprint/489891
PURE UUID: 6f69c131-2616-431f-9722-eee5668b4c99
Catalogue record
Date deposited: 07 May 2024 16:36
Last modified: 03 Sep 2024 02:06
Contributors
Author: Epifanios Baikas
Author: Danesh Tarapore
Author: David Thomas