Stereo viewing modulates three-dimensional shape processing during object recognition: a high-density ERP study
Oliver, Zoe J. (7e07aa6e-f5a5-45a4-a8fc-05495ac29ed5)
Cristino, Filipe (b47224fa-e770-4e31-9371-4737be3e1e50)
Roberts, Mark V. (cfbeb631-bda8-471e-8b76-af330292c9bb)
Pegna, Alan J. (c24ee439-41ee-4014-bab8-7296e9d24a3d)
Leek, Elwyn (6f63c405-e28f-4f8c-8ead-3b0a79c7dc88)
Oliver, Zoe J., Cristino, Filipe, Roberts, Mark V., Pegna, Alan J. and Leek, Elwyn (2018) Stereo viewing modulates three-dimensional shape processing during object recognition: a high-density ERP study. Journal of Experimental Psychology: Human Perception and Performance, 44 (4), 518-534. (doi:10.1037/xhp0000444).
Abstract
The role of stereo disparity in the recognition of 3-dimensional (3D) object shape remains an unresolved issue for theoretical models of the human visual system. We examined this issue using high-density (128 channel) recordings of event-related potentials (ERPs). A recognition memory task was used in which observers were trained to recognize a subset of complex, multipart, 3D novel objects under conditions of either (bi-) monocular or stereo viewing. In a subsequent test phase they discriminated previously trained targets from untrained distractor objects that shared either local parts, 3D spatial configuration, or neither dimension, across both previously seen and novel viewpoints. The behavioral data showed a stereo advantage for target recognition at untrained viewpoints. ERPs showed early differential amplitude modulations to shape similarity defined by local part structure and global 3D spatial configuration. This occurred initially during an N1 component around 145–190 ms poststimulus onset, and subsequently during an N2/P3 component around 260–385 ms poststimulus onset. For monocular viewing, amplitude modulation during the N1 was greatest between targets and distractors with different local parts, for trained views only. For stereo viewing, amplitude modulation during the N2/P3 was greatest between targets and distractors with different global 3D spatial configurations, and generalized across trained and untrained views. The results show that image classification is modulated by stereo information about both the local part structure and the global 3D spatial configuration of object shape. The findings challenge current theoretical models that do not attribute functional significance to stereo input during the computation of 3D object shape.
Text: 2017-46029-001 - Version of Record
More information
e-pub ahead of print date: 12 October 2017
Published date: 2018
Identifiers
Local EPrints ID: 494418
URI: http://eprints.soton.ac.uk/id/eprint/494418
PURE UUID: 9a94d9ca-f9b7-4a59-b81a-44dfed55d8f3
Catalogue record
Date deposited: 07 Oct 2024 17:26
Last modified: 08 Oct 2024 02:11
Contributors
Author: Zoe J. Oliver
Author: Filipe Cristino
Author: Mark V. Roberts
Author: Alan J. Pegna
Author: Elwyn Leek