University of Southampton Institutional Repository

Humans can visually judge grasp quality and refine their judgments through visual and haptic feedback



Maiello, Guido, Schepko, Marcel, Klein, Lina K., Paulun, Vivian C. and Fleming, Roland W. (2021) Humans can visually judge grasp quality and refine their judgments through visual and haptic feedback. Frontiers in Neuroscience, 14, [591898]. (doi:10.3389/fnins.2020.591898).

Record type: Article

Abstract

How humans visually select where to grasp objects is determined by the physical object properties (e.g., size, shape, weight), the degrees of freedom of the arm and hand, and the task to be performed. We recently demonstrated that human grasps are near-optimal with respect to a weighted combination of different cost functions that make grasps uncomfortable, unstable, or impossible, e.g., due to unnatural grasp apertures or large torques. Here, we ask whether humans can consciously access these rules. We test whether humans can explicitly judge grasp quality derived from rules regarding grasp size, orientation, torque, and visibility. More specifically, we test whether grasp quality can be inferred (i) by using visual cues and motor imagery alone, (ii) from watching grasps executed by others, and (iii) through performing grasps, i.e., receiving visual, proprioceptive, and haptic feedback. Stimuli were novel objects made of 10 cubes of brass and wood (side length 2.5 cm) in various configurations. On each object, one near-optimal and one sub-optimal grasp were selected based on one cost function (e.g., torque), while the other constraints (grasp size, orientation, and visibility) were kept approximately constant or counterbalanced. Participants were visually cued to the location of the selected grasps on each object and verbally reported which of the two grasps was best. Across three experiments, participants were required to either (i) passively view the static objects and imagine executing the two competing grasps, (ii) passively view videos of other participants grasping the objects, or (iii) actively grasp the objects themselves. Our results show that, for a majority of tested objects, participants could already judge grasp optimality from simply viewing the objects and imagining grasping them, but were significantly better in the video and grasping sessions.
These findings suggest that humans can determine grasp quality even without performing the grasp, perhaps through motor imagery, and can further refine their understanding of how to correctly grasp an object through sensorimotor feedback, but also by passively viewing others grasp objects.

Text: fnins-14-591898 - Version of Record
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 16 November 2020
Published date: 12 January 2021
Additional Information: Funding Information: This research was supported by the DFG (IRTG-1901: “The Brain in Action” and SFB-TRR-135: “Cardinal Mechanisms of Perception,” and project PA 3723/1-1), and an ERC Consolidator Award (ERC-2015-CoG-682859: “SHAPE”). GM was supported by a Marie-Skłodowska-Curie Actions Individual Fellowship (H2020-MSCA-IF-2017: “VisualGrasping” Project ID: 793660). Publisher Copyright: © Copyright © 2021 Maiello, Schepko, Klein, Paulun and Fleming.
Keywords: action observation, grasping, material, motor imagery, precision grip, shape, visual grasp selection

Identifiers

Local EPrints ID: 484867
URI: http://eprints.soton.ac.uk/id/eprint/484867
ISSN: 1662-4548
PURE UUID: b5f0cc48-03fd-44ca-b0e2-7ea3ce790355
ORCID for Guido Maiello: orcid.org/0000-0001-6625-2583

Catalogue record

Date deposited: 23 Nov 2023 17:56
Last modified: 18 Mar 2024 04:11


Contributors

Author: Guido Maiello
Author: Marcel Schepko
Author: Lina K. Klein
Author: Vivian C. Paulun
Author: Roland W. Fleming

