University of Southampton Institutional Repository

Aural detectability judgements in normal-hearing civilians

Blyth, Matthew
6ef1dfce-6fb5-435c-a48c-bd296306aa6a

Blyth, Matthew (2019) Aural detectability judgements in normal-hearing civilians. University of Southampton. doi:10.5258/SOTON/D1109 [Dataset]

Record type: Dataset

Abstract

This dataset supports the thesis entitled: “Auditory fitness for duty: Acoustic stealth awareness”
AWARDED BY: University of Southampton
DATE OF AWARD: 2019

DESCRIPTION OF THE DATA
This dataset contains the aural detectability judgements of normal-hearing civilians which underpin experiment 2 of the thesis “Auditory fitness for duty: Acoustic stealth awareness”. The participant stood in an anechoic chamber wearing a virtual reality headset, viewing a virtual environment resembling Southampton Common. The participant viewed a human target located 25, 50 or 100 m away, then listened to a sound presented at their feet or 1 m behind them (at head height), and responded “Yes” or “No” (via an Xbox controller) according to whether they judged that sound to be aurally detectable by the target (Yes = the sound was audible to the target; No = the sound was not audible to the target). Three sounds were used (whispered digits, white noise and pine cone crunching sounds). There were 9 conditions in total.

The method of constant stimuli was used in each condition:
- Sounds were presented at six different levels.
- Each level had 10 repeats, in order to calculate the proportion of “Yes” responses at each level so the psychometric function could be obtained.
- Levels for each condition were selected based on a practice period of 108 trials prior to the main data collection period.
- All trials for all conditions were presented in a random order.
- The subject’s view of the target was obscured between trials.

This dataset contains a MATLAB .mat file containing the structure ‘PFs’. Within this structure there are two nested structures for each participant (n = 28), one for each session (2 sessions in total). Each session structure (for each participant) contains:
- a condition identifier (note, each row represents a different condition)
- the sound levels presented in that condition, expressed as: 1. the sensation level (dB SL) at the target, predicted by a sound propagation and loudness model; 2. the RMS level (dB SPL) at the subject; 3. the peak level (dB SPL) at the subject
- the number of repeats
- the number of ‘yes’ responses at each level
- the psychometric function parameters and fit, as fitted by the Palamedes toolbox.

Note, this dataset can only be opened using MATLAB.

This experiment was approved by the University of Southampton ethics board (ERGO ID: 23783).
Date of data collection: October 2018 - December 2018
Legal: Copyright: CC BY
This doctoral project was funded by the Royal Centre for Defence Medicine.
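For readers who want to inspect the data, the sketch below shows one way to load the .mat file in MATLAB and, optionally, refit a psychometric function from the stored counts with the Palamedes toolbox mentioned above. Only the top-level structure name ‘PFs’ and the general layout are documented in this record; the variable names, example values and fitting settings (logistic function, search grid, free parameters) in the sketch are illustrative assumptions rather than the configuration used in the thesis.

% Minimal MATLAB sketch for inspecting the dataset; nested field names
% and Palamedes settings below are illustrative assumptions.
S   = load('MBlyth_ThesisData_Experiment2.mat');  % file supplied with this record
PFs = S.PFs;
disp(fieldnames(PFs));   % discover the actual layout: two nested
                         % structures per participant, one per session

% Illustrative refit of a single condition, assuming a logistic function
% with threshold and slope free. Replace the hypothetical vectors below
% with the corresponding values read from a session structure
% (6 levels, 10 repeats per level, number of 'yes' responses per level).
stimLevels = [-10 -5 0 5 10 15];     % predicted dB SL at the target (example values)
outOfNum   = 10 * ones(1, 6);        % 10 repeats per level
numYes     = [1 2 5 8 9 10];         % example 'yes' counts

searchGrid.alpha  = -10:0.5:15;      % threshold candidates
searchGrid.beta   = 10.^(-1:0.1:1);  % slope candidates
searchGrid.gamma  = 0;               % guess rate (fixed)
searchGrid.lambda = 0.02;            % lapse rate (fixed)
paramsFree = [1 1 0 0];              % fit threshold and slope only

[params, LL, exitflag] = PAL_PFML_Fit(stimLevels, numYes, outOfNum, ...
    searchGrid, paramsFree, @PAL_Logistic);
fprintf('Threshold = %.1f dB SL, slope = %.2f (exit flag %d)\n', ...
    params(1), params(2), exitflag);

The dataset already stores the fitted parameters and fits for every condition, so a refit like this is only needed as a check or as a starting point for further analysis.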

MBlyth_ThesisData_Experiment2.mat - Dataset (Other, 6MB). Available under License Creative Commons Attribution.
README_Exp2_MBlyth.rtf - Dataset (Text, 3kB)
ERGO23783_Consent_form_v1.pdf - Dataset (Text, 58kB)
ERGO23783_PIS.pdf - Dataset (Text, 77kB)

More information

Published date: October 2019

Identifiers

Local EPrints ID: 435252
URI: http://eprints.soton.ac.uk/id/eprint/435252
PURE UUID: 0b17aea8-4211-4b86-8c6d-e024949da1ff
ORCID for Matthew Blyth: orcid.org/0000-0002-1584-3946

Catalogue record

Date deposited: 28 Oct 2019 21:47
Last modified: 05 May 2023 15:24


Contributors

Creator: Matthew Blyth

