Applying vision-guided graph neural networks for adaptive task planning in dynamic human robot collaborative scenarios
The Assemble-To-Order (ATO) strategy is becoming increasingly prevalent in the manufacturing sector due to the high demand for high-volume personalized and customized goods. Human-Robot Collaborative (HRC) systems are increasingly being investigated as a way to exploit the dexterity of human hands while also making use of the ability of robots to carry heavy loads. However, current HRC systems struggle to adapt dynamically to varying human actions and cluttered workspaces.
In this paper, we propose a novel neural network framework that integrates a Graph Neural Network (GNN) and a Long Short-Term Memory (LSTM) network for adaptive response in HRC scenarios. Our framework enables a robot to interpret human actions and generate detailed action plans while dealing with objects in a cluttered workspace, thereby addressing the challenges of dynamic human-robot collaboration. Experimental results demonstrate improvements in assembly efficiency and flexibility, making our approach the first integration of iterative grasping and flexible HRC within a unified neural network architecture.
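The GNN + LSTM pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch only, not the paper's implementation: the layer sizes, node features, graph connectivity, and pooling choice are all assumptions. It shows the general pattern of spatial reasoning over a scene graph per frame, followed by temporal reasoning over frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(H, A, W):
    """One round of mean-aggregation message passing: each node mixes
    its neighbours' features (A is a row-normalised adjacency matrix)."""
    return np.tanh(A @ H @ W)

def lstm_step(x, h, c, Wx, Wh, b):
    """A single standard LSTM cell step applied to a pooled graph embedding."""
    z = Wx @ x + Wh @ h + b
    i, f, o, g = np.split(z, 4)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)          # update cell memory
    h = o * np.tanh(c)                  # emit new hidden state
    return h, c

# Toy scene: 4 objects, 3 features each (e.g. pose / occlusion cues -- assumed).
n, d, hdim = 4, 3, 8
A = np.ones((n, n)) / n                  # fully connected, row-normalised graph
W = rng.normal(size=(d, d))
Wx = rng.normal(size=(4 * hdim, d))
Wh = rng.normal(size=(4 * hdim, hdim))
b = np.zeros(4 * hdim)

h, c = np.zeros(hdim), np.zeros(hdim)
for _ in range(5):                        # 5 observed frames of the workspace
    H = rng.normal(size=(n, d))           # per-frame node features (stand-in for vision)
    H = gnn_layer(H, A, W)                # spatial reasoning over the scene graph
    x = H.mean(axis=0)                    # pool nodes into one graph embedding
    h, c = lstm_step(x, h, c, Wx, Wh, b)  # temporal reasoning across frames

# h is the hidden state a planning head would map to robot action logits.
print(h.shape)
```

In practice each component would be a trained network (e.g. a learned scene-graph encoder and a planning head over robot actions); the sketch only illustrates how per-frame graph embeddings feed a recurrent state.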
Ma, Ruidong
5bc4e62f-9ec7-41f6-8939-e13a63bcaabe
Liu, Yanan
b0b0c73b-61e5-4198-ad04-8897cb589c3e
Graf, Erich W.
1a5123e2-8f05-4084-a6e6-837dcfc66209
Oyekan, John
6f644c7c-eeb0-4abc-ade0-53a126fe769a
Ma, Ruidong, Liu, Yanan, Graf, Erich W. and Oyekan, John
(2024)
Applying vision-guided graph neural networks for adaptive task planning in dynamic human robot collaborative scenarios.
Advanced Robotics.
(doi:10.1080/01691864.2024.2407115).
Text
Graph_based_robot_planning_in_human_robot_collaboraiton
- Accepted Manuscript
Text
Applying vision-guided graph neural networks for adaptive task planning in dynamic human robot collaborative scenarios
- Version of Record
More information
Accepted/In Press date: 30 August 2024
e-pub ahead of print date: 30 September 2024
Identifiers
Local EPrints ID: 494188
URI: http://eprints.soton.ac.uk/id/eprint/494188
PURE UUID: d05a0e0b-a0a1-4a28-b174-dc3f8ca0967a
Catalogue record
Date deposited: 26 Sep 2024 17:05
Last modified: 09 Oct 2024 01:43
Contributors
Author:
Ruidong Ma
Author:
Yanan Liu
Author:
Erich W. Graf
Author:
John Oyekan