University of Southampton Institutional Repository

Classification images for aerial images capture visual expertise for binocular disparity and a prior for lighting from above

Skog, Emil
06559ab0-6015-4d3a-bd0b-fe9cbd51e30d
Meese, Timothy S.
c81251c4-8513-4d82-b55b-6336ebe33a55
Sargent, Isabel M. J.
5d618872-c1aa-4758-9c63-84c3c923fc2f
Ormerod, Andrew
e9eb0254-dc71-4235-b5c4-831436574392
Schofield, Andrew J.
b7d4ef7d-7f03-47a9-9236-39839a953b79

Skog, Emil, Meese, Timothy S., Sargent, Isabel M. J., Ormerod, Andrew and Schofield, Andrew J. (2024) Classification images for aerial images capture visual expertise for binocular disparity and a prior for lighting from above. Journal of Vision, 24 (4), [11]. (doi:10.1167/jov.24.4.11).

Record type: Article

Abstract

Using a novel approach to classification images (CIs), we investigated the visual expertise of surveyors for luminance and binocular disparity cues simultaneously after screening for stereoacuity. Stereoscopic aerial images of hedges and ditches were classified in 10,000 trials by six trained remote sensing surveyors and six novices. Images were heavily masked with luminance and disparity noise simultaneously. Hedge and ditch images had reversed disparity on around half the trials, meaning hedges became ditch-like and vice versa. The hedge and ditch images were also flipped vertically on around half the trials, changing the direction of the light source and completing a 2 × 2 × 2 stimulus design. CIs were generated by accumulating the noise textures associated with “hedge” and “ditch” classifications, respectively, and subtracting one from the other. Typical CIs had a central peak with one or two negative side-lobes. We found clear differences in the amplitudes and shapes of perceptual templates across groups and noise type, with experts prioritizing binocular disparity and using it more effectively. Conversely, novices used luminance cues more than experts, meaning that task motivation alone could not explain group differences. Asymmetries in the luminance CIs revealed individual differences in lighting interpretation, with experts less prone to assume lighting from above, consistent with their training on aerial images of UK scenes lit by a southerly sun. Our results show that (i) dual noise in images can be used to produce simultaneous CI pairs, (ii) expertise for disparity cues does not depend on stereoacuity, (iii) CIs reveal the visual strategies developed by experts, (iv) top-down perceptual biases can be overcome with long-term learning effects, and (v) CIs have practical potential for directing visual training.

This record has no associated files available for download.

More information

Published date: 12 April 2024

Identifiers

Local EPrints ID: 499513
URI: http://eprints.soton.ac.uk/id/eprint/499513
ISSN: 1534-7362
PURE UUID: ad1af9f9-886b-4e71-8f89-06f6816dfeb7

Catalogue record

Date deposited: 24 Mar 2025 17:30
Last modified: 24 Mar 2025 17:30

Contributors

Author: Emil Skog
Author: Timothy S. Meese
Author: Isabel M. J. Sargent
Author: Andrew Ormerod
Author: Andrew J. Schofield

