University of Southampton Institutional Repository

Efficient intrinsically motivated robotic grasping with learning-adaptive imagination in latent space

Hafez, Muhammad Burhan, Weber, Cornelius, Kerzel, Matthias and Wermter, Stefan (2019) Efficient intrinsically motivated robotic grasping with learning-adaptive imagination in latent space. Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Oslo, Norway, 19-22 Aug 2019. pp. 240-246. (doi:10.1109/DEVLRN.2019.8850723).

Record type: Conference or Workshop Item (Paper)

Abstract

Combining model-based and model-free deep reinforcement learning has shown great promise for improving sample efficiency on complex control tasks while still retaining high performance. Incorporating imagination is a recent effort in this direction inspired by human mental simulation of motor behavior. We propose a learning-adaptive imagination approach which, unlike previous approaches, takes into account the reliability of the learned dynamics model used for imagining the future. Our approach learns an ensemble of disjoint local dynamics models in latent space and derives an intrinsic reward based on learning progress, motivating the controller to take actions leading to data that improves the models. The learned models are used to generate imagined experiences, augmenting the training set of real experiences. We evaluate our approach on learning vision-based robotic grasping and show that it significantly improves sample efficiency and achieves near-optimal performance in a sparse reward environment.
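
The abstract describes two mechanisms that a short sketch can make concrete: an intrinsic reward derived from the learning progress of local dynamics models, and imagined rollouts from those models that augment a buffer of real transitions. The Python sketch below is an illustration under strong assumptions, not the paper's implementation: the latent encoder, the assignment of states to local models, the environment, and the policy are replaced by stand-ins, and every class and function name (LocalDynamicsModel, learning_progress_reward, imagine) is hypothetical.

    # Illustrative sketch only (hypothetical names and stand-ins); not the implementation from the paper.
    import numpy as np

    class LocalDynamicsModel:
        """Linear model of latent transitions, z_next ~ W [z; a], fit to one local region."""
        def __init__(self, z_dim, a_dim, lr=0.01):
            self.W = np.zeros((z_dim, z_dim + a_dim))
            self.lr = lr

        def predict(self, z, a):
            return self.W @ np.concatenate([z, a])

        def update(self, z, a, z_next):
            """One gradient step on the squared prediction error; returns error before/after."""
            x = np.concatenate([z, a])
            err_before = float(np.mean((z_next - self.W @ x) ** 2))
            self.W += self.lr * 2.0 * np.outer(z_next - self.W @ x, x)
            err_after = float(np.mean((z_next - self.W @ x) ** 2))
            return err_before, err_after

    def learning_progress_reward(err_before, err_after):
        """Intrinsic reward: how much the responsible local model improved on this transition."""
        return max(0.0, err_before - err_after)

    def imagine(model, z, policy, horizon=3):
        """Roll the learned model forward to produce imagined latent transitions."""
        rollout = []
        for _ in range(horizon):
            a = policy(z)
            z_next = model.predict(z, a)
            rollout.append((z, a, z_next))
            z = z_next
        return rollout

    # Toy usage: one model per latent region, a random stand-in policy, and a replay
    # buffer that mixes real and imagined transitions.
    rng = np.random.default_rng(0)
    z_dim, a_dim, n_regions = 4, 2, 3
    models = [LocalDynamicsModel(z_dim, a_dim) for _ in range(n_regions)]
    policy = lambda z: rng.normal(size=a_dim)
    replay = []

    for step in range(200):
        z = rng.normal(size=z_dim)                 # stand-in for an encoded observation
        a = policy(z)
        z_next = z + 0.1 * np.concatenate([a, np.zeros(z_dim - a_dim)])  # stand-in environment
        k = step % n_regions                       # stand-in for assigning z to its local region
        err_before, err_after = models[k].update(z, a, z_next)
        r_int = learning_progress_reward(err_before, err_after)
        replay.append((z, a, z_next, r_int, "real"))
        if err_after < 0.05:                       # only imagine with a currently reliable model
            for (zi, ai, zi_next) in imagine(models[k], z_next, policy):
                replay.append((zi, ai, zi_next, 0.0, "imagined"))

The reliability guard before calling imagine mirrors the abstract's emphasis on only using a learned model for imagination when its recent prediction error makes it trustworthy.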

Text: hafez_epirob2019 - Accepted Manuscript (restricted to repository staff only)

More information

Published date: 30 September 2019
Venue - Dates: Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Oslo, Norway, 2019-08-19 - 2019-08-22

Identifiers

Local EPrints ID: 495849
URI: http://eprints.soton.ac.uk/id/eprint/495849
PURE UUID: 30b62611-ad92-4117-a540-0b7024e53a41
ORCID for Muhammad Burhan Hafez: orcid.org/0000-0003-1670-8962

Catalogue record

Date deposited: 25 Nov 2024 17:50
Last modified: 26 Nov 2024 03:10

Contributors

Author: Muhammad Burhan Hafez
Author: Cornelius Weber
Author: Matthias Kerzel
Author: Stefan Wermter
