University of Southampton Institutional Repository

DEff-ARTS: differentiable efficient ARchiTecture search

Sadiq, Sulaiman
e82e1fe2-6b8c-4c49-b051-8aef0dabe99a
Maji, Partha
d9041c15-c0b1-4a97-98a1-d1f023b48162
Hare, Jonathon
65ba2cda-eaaf-4767-a325-cd845504e5a9
Merrett, Geoff
89b3a696-41de-44c3-89aa-b0aa29f54020

Sadiq, Sulaiman, Maji, Partha, Hare, Jonathon and Merrett, Geoff (2020) DEff-ARTS: differentiable efficient ARchiTecture search. NeurIPS 2020 Workshop on ML for Systems, Vancouver, Canada.

Record type: Conference or Workshop Item (Paper)

Abstract

Manual design of efficient Deep Neural Networks (DNNs) for mobile and edge devices is an involved process that requires expert human knowledge to improve efficiency in different dimensions. In this paper, we present DEff-ARTS, a differentiable efficient architecture search method for automatically deriving CNN architectures for resource-constrained devices. We frame the search as a multi-objective optimisation problem where we minimise the classification loss and the computational complexity of performing inference on the target hardware. Our formulation allows for easy trading-off between the sub-objectives depending on user requirements. Experimental results on CIFAR-10 classification showed that our approach achieved a highly competitive test error rate of 3.24% with 30% fewer parameters and multiply and accumulate (MAC) operations compared to Differentiable ARchiTecture Search (DARTS).
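The multi-objective formulation described above can be sketched as a weighted sum of the classification loss and a differentiable complexity penalty. The following is a minimal illustrative sketch, not the authors' code: the candidate operations, their MAC counts, and the `lam` trade-off parameter are all hypothetical, and the soft MAC count follows the DARTS-style continuous relaxation (a softmax over architecture parameters).

```python
import math

# Illustrative candidate operations for one edge, with assumed MAC counts.
CANDIDATE_OPS = {"skip_connect": 0, "conv_3x3": 9_000, "conv_5x5": 25_000}

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def expected_macs(alphas):
    """Soft (differentiable) MAC count of one mixed edge:
    the softmax-weighted sum of each candidate op's MACs."""
    weights = softmax(alphas)
    return sum(w * m for w, m in zip(weights, CANDIDATE_OPS.values()))

def total_loss(cls_loss, alphas, lam=0.1, mac_scale=25_000):
    """Multi-objective loss: classification loss plus a normalised
    complexity penalty; `lam` trades off the two sub-objectives."""
    return cls_loss + lam * expected_macs(alphas) / mac_scale

# With equal architecture weights, the soft MAC count is the plain
# average of the candidates' MACs: (0 + 9000 + 25000) / 3.
print(round(expected_macs([0.0, 0.0, 0.0]), 1))
```

Raising `lam` biases the search toward cheaper operations (here, toward `skip_connect`), while `lam = 0` recovers a plain accuracy-only search.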

Text
DEff_ARTS_NeurIPS2020_MLforSys_CamReady - Accepted Manuscript
Restricted to Repository staff only

More information

Accepted/In Press date: 6 November 2020
Published date: 12 December 2020
Venue - Dates: NeurIPS 2020 Workshop on ML for Systems, Vancouver, Canada, 2020-12-12

Identifiers

Local EPrints ID: 446198
URI: http://eprints.soton.ac.uk/id/eprint/446198
PURE UUID: 9573607d-a803-434e-8d12-c6e220a144e6
ORCID for Jonathon Hare: orcid.org/0000-0003-2921-4283
ORCID for Geoff Merrett: orcid.org/0000-0003-4980-3894

Catalogue record

Date deposited: 27 Jan 2021 17:33
Last modified: 18 Feb 2021 17:07


Contributors

Author: Sulaiman Sadiq
Author: Partha Maji
Author: Jonathon Hare
Author: Geoff Merrett



