University of Southampton Institutional Repository

Data-local autonomous LLM-guided neural architecture search for multiclass multimodal time-series classification

Hardarson, Emil
bbdcb067-9e1b-4995-9340-cab3e48b981e
Biedebach, Luka
35f63dbe-4f6f-4f27-b007-52708d3a89af
Ómarsson, Ómar Bessi
42e5a495-e6c1-4701-a340-e4c8d846af22
Hrólfsson, Teitur
9a316d47-cc3c-426c-9228-3050b9a1e931
Islind, Anna Sigridur
aefc8cda-7d3e-4367-bfca-e9a2261fe87f
Óskarsdóttir, María
d159ed8f-9dd3-4ff3-8b00-d43579ab71be


Record type: UNSPECIFIED

Abstract

Applying machine learning to sensitive time-series data is often bottlenecked by the iteration loop: performance depends strongly on preprocessing and architecture choices, yet training often has to run on-premise under strict data-local constraints. This is a common problem in healthcare and other privacy-constrained domains (e.g., a hospital developing deep learning models on patient EEG). The bottleneck is particularly acute in multimodal fusion, where each sensor modality must be preprocessed individually before the modalities are combined. LLM-guided neural architecture search (NAS) can automate this exploration, but most existing workflows assume cloud execution or access to data-derived artifacts that cannot be exposed. We present a novel data-local, LLM-guided search framework in which candidate pipelines are proposed remotely while all training and evaluation execute locally under a fixed protocol. The controller observes only trial-level summaries, such as pipeline descriptors, metrics, learning-curve statistics, and failure logs, and never accesses raw samples or intermediate feature representations. Our framework targets multiclass, multimodal learning via one-vs-rest binary experts per class and modality, a lightweight fusion MLP, and a joint search over expert architectures and modality-specific preprocessing. We evaluate the method in two regimes: the UEA30 archive of public multivariate time-series classification datasets, and SleepEDFx sleep staging with heterogeneous clinical modalities such as EEG, EOG, and EMG. The results show that the modular baseline model is already strong and that the LLM-guided NAS further improves it. Notably, our method finds models that perform within published ranges on most benchmark datasets. Across both settings, the method reduces manual intervention by enabling unattended architecture search while keeping sensitive data on-premise.
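The abstract's architecture (one-vs-rest binary experts per class and modality, followed by a lightweight fusion MLP) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the experts are stood in for by linear scorers with random weights, and all dimensions (`N_CLASSES`, `N_MODALITIES`, `FEAT_DIM`, `HIDDEN`) are assumed placeholder values, since the searched architectures and preprocessing are not specified in the record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed placeholder dimensions for illustration only.
N_CLASSES, N_MODALITIES, FEAT_DIM, HIDDEN = 5, 3, 16, 32

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-vs-rest binary expert per (class, modality) pair. A linear scorer
# stands in for whatever architecture the NAS would select per expert.
expert_w = rng.normal(size=(N_CLASSES, N_MODALITIES, FEAT_DIM))

# Lightweight fusion MLP over the flattened (class x modality) score matrix.
fusion_w1 = rng.normal(size=(N_CLASSES * N_MODALITIES, HIDDEN))
fusion_w2 = rng.normal(size=(HIDDEN, N_CLASSES))

def predict(modality_feats):
    """modality_feats: (N_MODALITIES, FEAT_DIM) preprocessed features,
    one row per modality (e.g. EEG, EOG, EMG)."""
    # (N_CLASSES, N_MODALITIES) matrix of one-vs-rest probabilities.
    scores = sigmoid(np.einsum("cmf,mf->cm", expert_w, modality_feats))
    # Fuse all expert scores into multiclass logits.
    hidden = np.maximum(scores.reshape(-1) @ fusion_w1, 0.0)  # ReLU
    return hidden @ fusion_w2  # (N_CLASSES,) class logits

logits = predict(rng.normal(size=(N_MODALITIES, FEAT_DIM)))
print(logits.shape)
```

Under this sketch, the joint search described in the abstract would vary each expert's architecture and each modality's preprocessing independently, while the fusion MLP combines the resulting one-vs-rest scores into a single multiclass prediction.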

Text
2603.15939v1 - Author's Original
Download (505kB)

More information

Published date: 16 March 2026
Keywords: cs.LG, cs.AI

Identifiers

Local EPrints ID: 510767
URI: http://eprints.soton.ac.uk/id/eprint/510767
PURE UUID: bd9d2913-7b9b-4005-ab30-bb89f5b10def
ORCID for María Óskarsdóttir: orcid.org/0000-0001-5095-5356

Catalogue record

Date deposited: 21 Apr 2026 16:50
Last modified: 22 Apr 2026 02:14


Contributors

Author: Emil Hardarson
Author: Luka Biedebach
Author: Ómar Bessi Ómarsson
Author: Teitur Hrólfsson
Author: Anna Sigridur Islind
Author: María Óskarsdóttir


