University of Southampton Institutional Repository

Model predictive control with self-supervised representation learning

arXiv
Matthies, Jonas
955e8a72-d54e-407e-adc6-a7e0251819c8
Hafez, Muhammad Burhan
e8c991ab-d800-46f2-abeb-cb169a1ed47e
Kotb, Mostafa
8b832bc0-91bf-46ed-b15d-aa69445431da
Wermter, Stefan
80682cc6-4251-420a-af8a-f4d616fb0fcc


Record type: UNSPECIFIED

Abstract

Over the last few years, neither model-free nor model-based learning methods have seen developments that would make one obsolete relative to the other. In most cases, the choice of technique depends heavily on the use-case scenario or on other attributes such as the environment. Each approach has its own advantages, for example sample efficiency or computational efficiency, and combining the two unites these advantages and can achieve better performance. The TD-MPC framework is an example of this approach: on the one hand, a world model in combination with model predictive control is used to obtain a good initial estimate of the value function; on the other hand, a Q-function provides a good long-term estimate. Similar to algorithms like MuZero, a latent state representation is used in which only task-relevant information is encoded, reducing complexity. In this paper, we propose adding a reconstruction function to the TD-MPC framework, so that the agent can reconstruct the original observation from its internal state representation. This gives the agent a more stable learning signal during training and also improves sample efficiency. The proposed additional loss term leads to improved performance on both state- and image-based tasks from the DeepMind Control Suite.
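
The following is a minimal sketch, not the authors' implementation, of how such a reconstruction term could be attached to a TD-MPC-style latent encoder in PyTorch. The names (Encoder, Decoder, model_loss, recon_coef) and the network sizes are illustrative assumptions; the existing TD-MPC consistency, reward, and value losses are represented by a single placeholder tensor.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps an observation to a compact latent state z (sizes are assumptions)."""
    def __init__(self, obs_dim: int, latent_dim: int = 50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ELU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, obs):
        return self.net(obs)

class Decoder(nn.Module):
    """Reconstructs the observation from the latent state (the proposed addition)."""
    def __init__(self, latent_dim: int, obs_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ELU(),
            nn.Linear(256, obs_dim),
        )

    def forward(self, z):
        return self.net(z)

def model_loss(encoder, decoder, obs, td_mpc_loss, recon_coef=1.0):
    """Combine an existing TD-MPC objective with a reconstruction term.

    td_mpc_loss: scalar tensor holding the usual consistency/reward/value losses.
    recon_coef:  weight of the added reconstruction loss (assumed hyperparameter).
    """
    z = encoder(obs)                         # latent state representation
    obs_hat = decoder(z)                     # reconstruct the raw observation
    recon_loss = F.mse_loss(obs_hat, obs)    # added reconstruction loss term
    return td_mpc_loss + recon_coef * recon_loss

# Example usage (shapes illustrative):
# enc, dec = Encoder(obs_dim=24), Decoder(latent_dim=50, obs_dim=24)
# loss = model_loss(enc, dec, obs_batch, td_mpc_loss=existing_losses)

In a setup like this, the decoder would typically act only as a training-time auxiliary: planning with MPC and the Q-function still operate on the latent state z, while the reconstruction term shapes the latent space toward retaining information about the observation.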

Text
preprint - Author's Original
Available under License Other.

More information

Published date: 14 April 2023

Identifiers

Local EPrints ID: 496838
URI: http://eprints.soton.ac.uk/id/eprint/496838
PURE UUID: 0e6823ef-09f5-426f-8b93-f6b1ec19a555
ORCID for Muhammad Burhan Hafez: orcid.org/0000-0003-1670-8962

Catalogue record

Date deposited: 08 Jan 2025 08:13
Last modified: 22 Aug 2025 02:42

Contributors

Author: Jonas Matthies
Author: Muhammad Burhan Hafez
Author: Mostafa Kotb
Author: Stefan Wermter
