University of Southampton Institutional Repository

An Investigation of Methods for Improving Survey Quality

Kibuchi, Eliud Muriithi (2018) An Investigation of Methods for Improving Survey Quality. University of Southampton, Doctoral Thesis, 242pp.

Record type: Thesis (Doctoral)

Abstract

Survey data can reduce the risk of making poor public policy and business decisions. It is therefore essential that we continually seek to understand how survey practices affect data quality. The quality of survey data depends both on how well survey questions measure the constructs of interest and on how well the data generalise to the target population. This thesis consists of three papers, each of which addresses how survey data quality is affected by different methodological choices.

The first paper assesses the effectiveness of a Bayesian framework for improving predictions of survey nonresponse from response propensity models. Response propensity models generally exhibit low predictive power for survey nonresponse, which limits their usefulness for monitoring and controlling the survey processes that, in turn, affect survey data quality. The paper explores the utility of a Bayesian approach in which predictions of response propensities are improved using informative priors derived from historical response data: estimates from response propensity models fitted to existing data are used to specify prior distributions in subsequent data collection rounds. The results show that informative priors lead to only a slight improvement in the predictions and discriminative ability of response propensity models.
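The prior-updating idea described above can be sketched as follows. This is a minimal illustration on simulated data with hypothetical covariates, not the models estimated in the thesis: coefficients from a wave-1 logistic response propensity model become the mean of a Gaussian prior in wave 2, estimated here by a simple MAP (penalised-likelihood) gradient ascent.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_logistic_map(X, y, prior_mean, prior_var, lr=0.5, n_iter=3000):
    """MAP estimate for a logistic response propensity model with an
    independent Gaussian prior N(prior_mean, prior_var) on each coefficient."""
    beta = prior_mean.astype(float).copy()
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))              # predicted propensities
        # Gradient of the log-posterior: likelihood part minus prior penalty
        grad = X.T @ (y - p) - (beta - prior_mean) / prior_var
        beta += lr * grad / len(y)
    return beta

# Wave 1: a diffuse prior (mean 0, large variance), close to maximum likelihood
n, k = 500, 3
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
true_beta = np.array([0.5, 1.0, -0.8])                   # hypothetical coefficients
y1 = rng.binomial(1, 1 / (1 + np.exp(-X1 @ true_beta)))
beta_w1 = fit_logistic_map(X1, y1, np.zeros(k), prior_var=100.0)

# Wave 2: an informative prior centred on the wave-1 estimates
X2 = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y2 = rng.binomial(1, 1 / (1 + np.exp(-X2 @ true_beta)))
beta_w2 = fit_logistic_map(X2, y2, prior_mean=beta_w1, prior_var=1.0)
```

In practice the prior variance controls how strongly historical waves inform the current one; the thesis evaluates this on real survey response data rather than simulations.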

The second paper investigates whether interviewers moderate the effect of monetary incentives on response and cooperation rates in household interview surveys. Incentives play an important role in maintaining response rates, and interviewers are the key conduit of information about the existence and level of the incentives offered. The paper uses multilevel models to assess whether some interviewers are more successful than others in deploying incentives to leverage survey response and cooperation, and whether this interviewer variability is systematically related to interviewer characteristics. The results show significant and substantial variability between interviewers in the effectiveness of monetary incentives on the probability of response and cooperation, but no observed interviewer characteristics are related to this tendency.
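The quantity of interest here — variation across interviewers in how much an incentive shifts the response probability — can be illustrated with a small simulation. This is not the multilevel model fitted in the thesis (which estimates random incentive slopes jointly); it simply generates interviewer-specific incentive effects, with all numbers hypothetical, and summarises the between-interviewer spread.

```python
import numpy as np

rng = np.random.default_rng(7)

n_interviewers, cases_each = 50, 200
# Hypothetical random slopes: each interviewer's incentive effect on the
# log-odds of response varies around a common mean effect.
mean_effect, sd_effect = 0.4, 0.3
slopes = rng.normal(mean_effect, sd_effect, n_interviewers)

effects = []
for b1 in slopes:
    incentive = rng.binomial(1, 0.5, cases_each)         # incentive offered or not
    logit = -0.2 + b1 * incentive                        # interviewer-specific effect
    responded = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    # Crude per-interviewer incentive effect: response rate with incentive
    # minus response rate without it
    effects.append(responded[incentive == 1].mean() - responded[incentive == 0].mean())

effects = np.array(effects)
print(f"mean incentive effect: {effects.mean():.3f}, "
      f"spread across interviewers: {effects.std():.3f}")
```

A multilevel model would separate the genuine between-interviewer variance from the sampling noise in these per-interviewer differences, which this naive summary conflates.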

The third paper examines whether low response rate online probability surveys provide data of comparable quality to high response rate face-to-face interviews. Declining response rates and increasing survey costs have prompted many surveys to switch from face-to-face interviewing to online administration, but the available evidence on the relative data quality of the two modes is mixed. The paper examines measurement differences between online and face-to-face surveys while adjusting for selection effects using propensity score matching, and evaluates different ways of handling survey weights in the propensity score models and outcome analyses. The results show that measurement effects contribute the majority of mode differences, with differences in sample composition playing a secondary role. However, propensity score matching had only a minimal effect on the magnitude of mode effects for the surveys considered.
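The logic of separating selection from measurement effects via propensity score matching can be sketched as follows. Everything here is hypothetical simulated data (a single covariate, a known selection model, and a built-in mode effect); the thesis estimates the propensity scores from real survey covariates and additionally handles survey weights, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical pooled sample: mode 1 = online, mode 0 = face-to-face.
n = 1000
age = rng.normal(45, 15, n)
p_online = 1 / (1 + np.exp(0.05 * (age - 45)))       # younger people more likely online
online = rng.binomial(1, p_online)

# Simulated outcome: depends on age (a selection effect) and carries a
# built-in mode (measurement) effect of +0.5 for online interviews.
y = 0.02 * age + 0.5 * online + rng.normal(0, 1, n)

# In practice the propensity score is estimated, e.g. by logistic regression
# on respondent covariates; for clarity we reuse the known selection model.
ps = p_online

# Nearest-neighbour matching on the propensity score, with replacement:
# each online case is paired with the face-to-face case closest in ps.
treated = np.flatnonzero(online == 1)
control = np.flatnonzero(online == 0)
matches = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]

raw_diff = y[online == 1].mean() - y[online == 0].mean()  # selection + measurement
matched_diff = (y[treated] - y[matches]).mean()           # ~ measurement effect only
```

After matching, the two groups are balanced on the covariates driving mode selection, so the remaining outcome difference approximates the measurement (mode) effect alone.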

Text
Kibuchi Thesis - Version of Record
Available under License University of Southampton Thesis Licence.
Download (2MB)

More information

Published date: December 2018

Identifiers

Local EPrints ID: 444046
URI: http://eprints.soton.ac.uk/id/eprint/444046
PURE UUID: e0d1c57f-11b2-4f8e-b86e-3bdb717c7b27

Catalogue record

Date deposited: 23 Sep 2020 16:49
Last modified: 12 Dec 2021 10:38


Contributors

Author: Eliud Muriithi Kibuchi
Thesis advisor: Gabriele Durrant


