Egocentric visual distance estimation in vista space in real and virtual environments
Dataset supporting Experiment 1 of Doctoral thesis: "Auditory fitness for duty: Acoustic stealth awareness".
DESCRIPTION OF THE DATA
Participants estimated the distance between themselves and a human target at distances of 25, 50, 75, 100 and 125 m, both on Southampton Common (real environment) and in a virtual model of Southampton Common (virtual environment, viewed via an Oculus Rift virtual reality headset). Each of the 23 participants viewed the target at three azimuths relative to straight ahead: -10, 0 and 10 degrees (labelled L/C/R in the dataset). They estimated the egocentric distance at each target location three times, giving their estimates verbally. The procedure was identical in the real and virtual environments.
This dataset contains the verbal estimates for each environment, distance, azimuth and repeat. Each row represents a participant; each column represents a condition. The dataset should be opened with Microsoft Excel.
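The layout described above implies 2 environments x 5 distances x 3 azimuths x 3 repeats = 90 condition columns per participant row. As a sketch of that structure (the actual column headers in the spreadsheet are not specified here, so the naming scheme below is a hypothetical illustration):

```python
from itertools import product

# Hypothetical condition labels mirroring the described design:
# 2 environments x 5 distances x 3 azimuths (L/C/R) x 3 repeats.
# The real headers in MBlyth_ThesisData_Experiment1.xlsx may differ.
environments = ["Real", "Virtual"]
distances_m = [25, 50, 75, 100, 125]
azimuths = ["L", "C", "R"]
repeats = [1, 2, 3]

columns = [
    f"{env}_{dist}m_{az}_rep{rep}"
    for env, dist, az, rep in product(environments, distances_m, azimuths, repeats)
]

print(len(columns))   # number of condition columns per participant row
print(columns[0])     # first condition label under this naming scheme
```

With one row per participant (n = 23), the full table would therefore hold 23 x 90 verbal estimates.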
This experiment was approved by the University Ethics Board (ERGO ID: 26625).
University of Southampton
Blyth, Matthew
6ef1dfce-6fb5-435c-a48c-bd296306aa6a
Blyth, Matthew
(2019)
Egocentric visual distance estimation in vista space in real and virtual environments.
University of Southampton
doi:10.5258/SOTON/D1110
[Dataset]
Files:
MBlyth_ThesisData_Experiment1.xlsx (Spreadsheet) - Dataset
README_Exp1_MBlyth.rtf (Text) - Dataset
26625_ConsentForm_V1.pdf (Text) - Dataset
26625_ParticipantInfoSheet_V1.pdf (Text) - Dataset
More information
Published date: October 2019
Identifiers
Local EPrints ID: 435253
URI: http://eprints.soton.ac.uk/id/eprint/435253
PURE UUID: 4ec7ed26-9007-40c7-a5a5-918861271142
Catalogue record
Date deposited: 28 Oct 2019 21:47
Last modified: 05 May 2023 15:24
Contributors
Creator:
Matthew Blyth