FedTMOS: efficient one-shot federated learning with Tsetlin machine
How, Shannon Shi Qi, Chauhan, Jagmohan, Merrett, Geoff and Hare, Jonathon (2025) FedTMOS: efficient one-shot federated learning with Tsetlin machine. Seventh UK Mobile, Wearable and Ubiquitous Systems Research Symposium, University of Edinburgh, Edinburgh, United Kingdom, 07 - 08 Jul 2025. 1 pp.
Record type: Conference or Workshop Item (Paper)
Abstract
One-Shot Federated Learning (OFL) has emerged as a promising alternative for addressing the communication bottleneck in federated learning. OFL restricts communication to a single round, minimizing communication errors and costs and reducing the risk of interference caused by iterative updates [1]. Current OFL methods that rely on Knowledge Distillation (KD) and ensemble learning aggregate local models into an ensemble before distilling it into a global model. A key challenge with these methods is their dependence on public datasets, which may be unsuitable for certain tasks [1]. Data-free methods that use generative models [2] incur additional computational overhead. Neuron matching and model fusion techniques eliminate the need for server-side training, but their performance degrades when models are trained on heterogeneous data distributions because the resulting models are misaligned [3]. On the client side, existing methods rely on Deep Neural Networks (DNNs), which are resource-intensive and impractical for clients with limited computational capabilities, such as edge devices. Therefore, we propose a solution based on the Tsetlin Machine (TM) for efficient OFL [4], [5].
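To make the one-shot pattern concrete, the sketch below mimics the generic ensemble-style OFL baseline the abstract contrasts against: each client trains locally on its own (non-IID) data, uploads its model exactly once, and the server combines the uploaded models by averaging their class probabilities, with no further rounds, no public dataset, and no server-side training. This is a self-contained illustration in Python using synthetic data and a softmax classifier; it is not the FedTMOS/Tsetlin Machine method, and all names, shapes, and hyperparameters in it are illustrative assumptions.

# Minimal sketch of the one-shot FL pattern described in the abstract:
# clients train locally, upload their model ONCE, and the server forms
# an ensemble. Generic illustration only, NOT the FedTMOS/TM algorithm;
# every identifier and constant here is made up for the example.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_CLASSES, DIM = 4, 3, 10


def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


def train_local(x, y, epochs=200, lr=0.5):
    """Client-side training of a simple softmax classifier."""
    w = np.zeros((DIM, NUM_CLASSES))
    y_onehot = np.eye(NUM_CLASSES)[y]
    for _ in range(epochs):
        grad = x.T @ (softmax(x @ w) - y_onehot) / len(x)
        w -= lr * grad
    return w


# Synthetic, label-skewed (non-IID) client datasets.
centroids = rng.normal(size=(NUM_CLASSES, DIM))
skew = np.array([0.6, 0.2, 0.2])
clients = []
for c in range(NUM_CLIENTS):
    labels = rng.choice(NUM_CLASSES, size=200, p=np.roll(skew, c % NUM_CLASSES))
    feats = centroids[labels] + rng.normal(scale=0.8, size=(200, DIM))
    clients.append((feats, labels))

# One-shot round: every client uploads its trained weights exactly once.
uploaded = [train_local(x, y) for x, y in clients]

# Server-side ensemble: average class probabilities across client models.
x_test = centroids[np.arange(NUM_CLASSES).repeat(50)] + rng.normal(scale=0.8, size=(150, DIM))
y_test = np.arange(NUM_CLASSES).repeat(50)
probs = np.mean([softmax(x_test @ w) for w in uploaded], axis=0)
print("ensemble accuracy:", (probs.argmax(axis=1) == y_test).mean())

Averaging client predictions server-side needs no public data and no server training, which is the property the KD-based and generative approaches discussed above trade away; the paper's contribution is to replace the resource-intensive DNN clients with Tsetlin Machines, which this sketch does not model.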
Text: S5_P3_How_FedTMOS - Version of Record
More information
Submitted date: 6 May 2025
Accepted/In Press date: 2 June 2025
Published date: 8 July 2025
Venue - Dates:
Seventh UK Mobile, Wearable and Ubiquitous Systems Research Symposium, University of Edinburgh, Edinburgh, United Kingdom, 2025-07-07 - 2025-07-08
Identifiers
Local EPrints ID: 506168
URI: http://eprints.soton.ac.uk/id/eprint/506168
PURE UUID: 14f236db-1760-428d-a35c-22ce2dc55b38
Catalogue record
Date deposited: 29 Oct 2025 17:42
Last modified: 30 Oct 2025 02:39
Contributors
Author: Shannon Shi Qi How
Author: Jagmohan Chauhan
Author: Geoff Merrett
Author: Jonathon Hare