On the use of double sampling schemes to correct for measurement error in discrete longitudinal data
Longitudinal surveys provide a key source of information for analysing dynamic phenomena. Typical examples of longitudinal data are gross flows, defined as transition counts between a finite number of states from one point in time to another. There are, however, a number of methodological problems associated with the use of longitudinal surveys. This thesis focuses on the measurement error problem, or, more naturally in a discrete framework, the misclassification problem.
We investigate the use of double sampling for correcting discrete longitudinal data for misclassification. In a double sampling context, we assume that along with the main measurement device, which is affected by misclassification, we can use a secondary measurement device (validation survey), which is free of error but more expensive to apply. Due to its higher cost, the secondary measurement device is employed only for a subset of units. Inference, using double sampling, is based on combining information from both measurement devices.
Traditional moment-based inference is reviewed and alternative moment-type estimators, which attempt to overcome the drawbacks of the traditional approach, are proposed. We subsequently argue that a more efficient parameterisation is offered in a likelihood-based framework by simultaneously modelling the true transition process and the measurement error process within the context of a missing data problem. Variants of likelihood-based inference, which allow for alternative double sampling schemes, for a complex survey design and for observed heterogeneity, are investigated. Constrained maximum likelihood estimation is also considered for relaxing some of the model assumptions. Variance estimation for the moment-type and the likelihood-based estimators is illustrated. In addition, empirical research aimed at identifying optimal design characteristics for validation surveys is presented.
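The traditional moment-based approach referred to above is commonly illustrated by the so-called matrix method: misclassification probabilities are estimated from the validation subsample and the observed distribution is adjusted by inverting them. The following is a minimal sketch of that generic idea, with entirely hypothetical numbers (three labour-force states assumed for illustration); it is not the thesis's own estimator.

```python
import numpy as np

# Hypothetical illustration of the moment-based (matrix-method)
# misclassification correction. All numbers below are invented.

# Misclassification matrix Q estimated from the validation subsample,
# where Q[i, j] = P(observed state j | true state i) for the states
# (employed, unemployed, inactive). Rows sum to one.
Q = np.array([
    [0.95, 0.03, 0.02],
    [0.05, 0.90, 0.05],
    [0.04, 0.06, 0.90],
])

# Observed state distribution from the large, error-prone main sample.
p_obs = np.array([0.60, 0.10, 0.30])

# Moment-based correction: p_obs = Q' p_true, so solve the linear
# system Q' p_true = p_obs for the true distribution.
p_true = np.linalg.solve(Q.T, p_obs)
print(p_true)
```

A well-known drawback of this inversion is that the corrected estimates are not guaranteed to lie in [0, 1], which is one motivation for the likelihood-based parameterisation investigated in the thesis.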
The methodology is applied in the context of the UK Labour Force Survey (LFS) by estimating labour force gross flows adjusted for misclassification. Results from Monte Carlo simulation experiments indicate that the proposed likelihood-based parameterisation offers significant gains in efficiency over the traditional moment-based parameterisation, while interval estimation for the adjusted estimates can be reliably performed using the proposed variance estimators.
University of Southampton
Tzavidis, Nikolaos
6e83b47a-ebde-49ec-9e76-93c52804bdd8
2004
Tzavidis, Nikolaos (2004) On the use of double sampling schemes to correct for measurement error in discrete longitudinal data. University of Southampton, Doctoral Thesis.
Record type: Thesis (Doctoral)
Text: 949082.pdf - Version of Record
More information
Published date: 2004
Identifiers
Local EPrints ID: 465384
URI: http://eprints.soton.ac.uk/id/eprint/465384
PURE UUID: 091a5304-948d-490c-b14d-bdcfa4842961
Catalogue record
Date deposited: 05 Jul 2022 00:41
Last modified: 16 Mar 2024 20:08
Contributors
Author: Nikolaos Tzavidis
Download statistics
Downloads from ePrints over the past year. Other digital versions may also be available to download e.g. from the publisher's website.
View more statistics