University of Southampton Institutional Repository

Deep optimisation: multi-scale evolution by inducing and searching in deep representations


Caldwell, Jamie, Knowles, Joshua, Thies, Christoph, Kubacki, Filip and Watson, Richard (2021) Deep optimisation: multi-scale evolution by inducing and searching in deep representations. In, Castillo, Pedro A. and Jiménez Laredo, Juan Luis (eds.) Applications of Evolutionary Computation. (Applications of Evolutionary Computation, 12694) 1 ed. Springer Cham, pp. 506-521. (doi:10.1007/978-3-030-72699-7_32).

Record type: Book Section

Abstract

The ability of evolutionary processes to innovate and scale up over long periods of time, observed in nature, remains a central mystery in evolutionary biology, and a challenge for algorithm designers to emulate and explain in evolutionary computation (EC). The Major Transitions in Evolution is a compelling theory that explains evolvability through a multi-scale process whereby individuality (and hence selection and variation) is continually revised by the formation of associations between formerly independent entities, a process still not fully explored in EC. Deep Optimisation (DO) is a new type of model-building optimization algorithm (MBOA) that exploits deep learning methods to enable multi-scale optimization. DO uses an autoencoder model to induce a multi-level representation of solutions, capturing the relationships between the lower-level units that contribute to the quality of a solution. Variation and selection are then performed within the induced representations, causing model-informed changes to multiple solution variables simultaneously. Here, we first show that DO has impressive performance compared with other leading MBOAs (and other rival methods) on multiple knapsack problems, a standard combinatorial optimization problem of general interest. Going deeper, we then carry out a detailed investigation to understand the differences between DO and other MBOAs, identifying key problem characteristics where other MBOAs are afflicted by exponential running times, and DO is not. This study serves to concretize our understanding of the Major Transitions theory, and why that leads to evolvability, and also provides a strong motivation for further investigation of deep learning methods in optimization.
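The core loop the abstract describes — induce a compressed representation of good solutions, then apply variation and selection in that representation so that single latent moves change many solution variables at once — can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's method: a linear autoencoder fitted in closed form via SVD (i.e. PCA) stands in for DO's deep autoencoder, and a toy modular bit-string problem stands in for the knapsack benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy modular fitness: 16 bits in 4 blocks of 4; a block scores 4 if all
# its bits agree (all 0s or all 1s), else the size of its majority.
def fitness(x):
    total = 0
    for b in x.reshape(4, 4):
        s = int(b.sum())
        total += 4 if s in (0, 4) else max(s, 4 - s)
    return total

def hill_climb(x):
    # Single-bit-flip local search to a local optimum (lower-level units).
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x.copy()
            y[i] ^= 1
            if fitness(y) > fitness(x):
                x, improved = y, True
    return x

# 1) Build a population of local optima with low-level search.
pop = np.array([hill_climb(rng.integers(0, 2, 16)) for _ in range(60)])

# 2) "Model building": fit a linear autoencoder in closed form (PCA),
#    standing in for the deep autoencoder used in DO.
mean = pop.mean(axis=0)
_, _, Vt = np.linalg.svd(pop - mean, full_matrices=False)
W = Vt[:4]  # 4 latent units; ideally one per correlated block

encode = lambda x: W @ (x - mean)
decode = lambda h: (W.T @ h + mean > 0.5).astype(int)

# 3) Model-informed variation: perturb one latent unit, decode back to a
#    bit string (flipping several bits at once), keep only non-worsening
#    changes (selection at the higher level).
x = pop[0].copy()
for _ in range(200):
    h = encode(x)
    h[rng.integers(4)] += rng.normal(scale=2.0)
    y = decode(h)
    if fitness(y) >= fitness(x):
        x = y

print(fitness(x))
```

Because selection only accepts non-worsening decoded solutions, the fitness of `x` can never fall below that of the starting local optimum; the latent moves are what allow it to escape single-bit-flip traps that the hill climber cannot.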

Text: DO 978-3-030-72699-7 - Version of Record (restricted to repository staff only)

More information

Published date: 1 April 2021

Identifiers

Local EPrints ID: 468234
URI: http://eprints.soton.ac.uk/id/eprint/468234
ISSN: 0302-9743
PURE UUID: 9d3564f4-80c5-4380-8b08-9dc58b2fa5a6
ORCID for Richard Watson: orcid.org/0000-0002-2521-8255

Catalogue record

Date deposited: 08 Aug 2022 16:39
Last modified: 13 May 2025 01:39


Contributors

Author: Jamie Caldwell
Author: Joshua Knowles
Author: Christoph Thies
Author: Filip Kubacki
Author: Richard Watson
Editor: Pedro A. Castillo
Editor: Juan Luis Jiménez Laredo



