University of Southampton Institutional Repository

MLPs are all you need for human activity recognition

Ojiako, Kamsiriochukwu and Farrahi, Katayoun (2023) MLPs are all you need for human activity recognition. Applied Sciences (Switzerland), 13 (20), [11154]. (doi:10.3390/app132011154).

Record type: Article

Abstract

Convolution-, recurrence-, and attention-based deep learning techniques have produced the most recent state-of-the-art results on multiple sensor-based human activity recognition (HAR) datasets. However, these techniques have high computational costs, restricting their use on low-powered devices. Various methods have been employed to make these techniques more efficient, but this often comes at the cost of performance. Recently, pure multi-layer perceptron (MLP) architectures have demonstrated competitive performance on vision-based tasks at lower computational cost than other deep learning techniques. The MLP-Mixer is a pioneering pure-MLP architecture that produces results competitive with state-of-the-art models on computer vision tasks. This paper shows the viability of the MLP-Mixer for sensor-based HAR. Furthermore, experiments are performed to gain insight into which Mixer modules are essential for HAR, and a visual analysis of the Mixer's weights is provided, validating the Mixer's learning capabilities. As a result, the Mixer achieves F1 scores of 97%, 84.2%, 91.2%, and 90% on the PAMAP2, Daphnet Gait, Opportunity Gestures, and Opportunity Locomotion datasets, respectively, outperforming state-of-the-art models on all datasets except Opportunity Gestures.
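
For context, the core of the MLP-Mixer referenced in the abstract alternates two small MLPs per block: one mixing information across the sequence axis (tokens) and one across the feature axis (channels). The sketch below is a minimal PyTorch rendering of that idea applied to a window of sensor data; the module names, hidden sizes, and the choice to treat each timestep as one token are illustrative assumptions, not necessarily the paper's exact configuration.

import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    # One Mixer block: a token-mixing MLP over the time axis, then a
    # channel-mixing MLP over the sensor-feature axis, each with a
    # residual connection and pre-LayerNorm (Tolstikhin et al., 2021).
    def __init__(self, num_tokens, num_channels,
                 token_hidden=64, channel_hidden=128):
        super().__init__()
        self.norm1 = nn.LayerNorm(num_channels)
        self.token_mlp = nn.Sequential(
            nn.Linear(num_tokens, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, num_tokens))
        self.norm2 = nn.LayerNorm(num_channels)
        self.channel_mlp = nn.Sequential(
            nn.Linear(num_channels, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, num_channels))

    def forward(self, x):                          # x: (batch, tokens, channels)
        y = self.norm1(x).transpose(1, 2)          # (batch, channels, tokens)
        x = x + self.token_mlp(y).transpose(1, 2)  # mix across time steps
        x = x + self.channel_mlp(self.norm2(x))    # mix across channels
        return x

# Hypothetical usage: a 128-timestep window from 40 sensor channels,
# treating each timestep as one token.
block = MixerBlock(num_tokens=128, num_channels=40)
out = block(torch.randn(8, 128, 40))               # -> (8, 128, 40)

A full HAR model would stack several such blocks, pool over the token axis, and attach a linear classifier over the activity classes.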

Text: applsci-13-11154 (Version of Record)
Available under License: Creative Commons Attribution
Download (869kB)

More information

Accepted/In Press date: 29 September 2023
Published date: 11 October 2023
Keywords: efficiency, human activity recognition, MLP-Mixer

Identifiers

Local EPrints ID: 500499
URI: http://eprints.soton.ac.uk/id/eprint/500499
ISSN: 2076-3417
PURE UUID: cbd0846a-ea55-407f-b267-93c34ac04090
ORCID for Katayoun Farrahi: orcid.org/0000-0001-6775-127X

Catalogue record

Date deposited: 02 May 2025 16:31
Last modified: 22 Aug 2025 02:20

Contributors

Author: Kamsiriochukwu Ojiako
Author: Katayoun Farrahi
