Using multidimensional scaling to quantify visual similarity in visual search and beyond
Hout, M.C., Godwin, H.J., Fitzsimmons, G., Robbins, A., Menneer, T. and Goldinger, S.D.
(2016)
Using multidimensional scaling to quantify visual similarity in visual search and beyond.
Attention, Perception, & Psychophysics, 78 (1), 3-20.
(doi:10.3758/s13414-015-1010-6).
Abstract
Visual search is one of the most widely studied topics in vision science, both as an independent topic of interest, and as a tool for studying attention and visual cognition. A wide literature exists that seeks to understand how people find things under varying conditions of difficulty and complexity, and in situations ranging from the mundane (e.g., looking for one's keys) to those with significant societal importance (e.g., baggage or medical screening). A primary determinant of the ease and probability of success during search is the set of similarity relationships that exist in the search environment, such as the similarity between the background and the target, or the likeness of the non-targets to one another. A sense of similarity is often intuitive, but it is seldom quantified directly. This presents a problem in that similarity relationships are imprecisely specified, limiting the capacity of the researcher to examine their influence adequately. In this article, we present a novel approach to overcoming this problem that combines multi-dimensional scaling (MDS) analyses with behavioral and eye-tracking measurements. We propose a method whereby MDS can be repurposed to successfully quantify the similarity of experimental stimuli, thereby opening up theoretical questions in visual search and attention that cannot currently be addressed. These quantifications, in conjunction with behavioral and oculomotor measures, allow for critical observations about how similarity affects performance, information selection, and information processing. We provide a demonstration and tutorial of the approach, identify documented examples of its use, discuss how complementary computer vision methods could also be adopted, and close with a discussion of potential avenues for future application of this technique.
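The article's own tutorial and materials are not reproduced on this record page. As a rough, hypothetical sketch of the general idea only (the stimulus set, dissimilarity values, and variable names below are illustrative assumptions, not data or code from the paper), pairwise dissimilarity ratings can be submitted to a non-metric MDS routine such as scikit-learn's MDS to recover low-dimensional coordinates, and the distances between those coordinates can then serve as quantitative similarity predictors for behavioral or oculomotor measures.

# Hypothetical sketch: quantifying stimulus similarity with non-metric MDS.
# Values below are illustrative placeholders, not data from the article.
import numpy as np
from sklearn.manifold import MDS

# Pairwise dissimilarity ratings for four search stimuli
# (0 = identical, 1 = maximally different), e.g., averaged across raters.
dissimilarities = np.array([
    [0.0, 0.2, 0.7, 0.9],
    [0.2, 0.0, 0.6, 0.8],
    [0.7, 0.6, 0.0, 0.3],
    [0.9, 0.8, 0.3, 0.0],
])

# Non-metric MDS preserves only the rank order of dissimilarities, which suits
# ordinal similarity judgments; dissimilarity="precomputed" tells MDS the input
# is already a distance-like matrix rather than raw feature vectors.
mds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarities)

# The Euclidean distance between two recovered points gives a continuous
# target-distractor similarity value that can be regressed against response
# times, accuracy, or fixation measures.
target_distractor_distance = np.linalg.norm(coords[0] - coords[2])
print(coords)
print(target_distractor_distance)

Non-metric (ordinal) MDS is a natural default here because similarity ratings are typically ordinal; a metric variant could be substituted if the ratings are treated as interval-scaled.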
Text
Hout et al APP2015 Rev2.docx
- Author's Original
More information
Accepted/In Press date: 10 October 2015
e-pub ahead of print date: 22 October 2015
Published date: January 2016
Keywords:
methods, similarity, multi-dimensional scaling, visual search, eye-movements
Identifiers
Local EPrints ID: 388782
URI: http://eprints.soton.ac.uk/id/eprint/388782
ISSN: 1943-3921
PURE UUID: 648d2609-e0c6-4db1-81bc-aac2ccd11203
Catalogue record
Date deposited: 03 Mar 2016 10:12
Last modified: 15 Mar 2024 03:34
Contributors
Author:
M.C. Hout
Author:
H.J. Godwin
Author:
Gemma Fitzsimmons
Author:
A. Robbins
Author:
T. Menneer
Author:
S.D. Goldinger