Chambers, Marcus J. and Kyriacou, Maria
Jackknife bias reduction in autoregressive models with a unit root. London, GB: Cass Business School, 29pp.
(CEA@Cass Working Paper Series, WP–CEA–02-2012).
- Author's Original
This paper is concerned with the application of jackknife methods as a means of bias reduction in the estimation of autoregressive models with a unit root. It is shown that the usual jackknife estimator based on non-overlapping sub-samples does not fully remove the first-order bias as intended, but that an ‘optimal’ jackknife estimator can be defined that is capable of removing this bias. The results rest on a demonstration that the sub-sample estimators converge to different limiting distributions; the joint moment generating function of the numerator and denominator of these distributions (which are functionals of a Wiener process over a sub-interval of [0,1]) is derived and utilised to extract the optimal weights. Simulations demonstrate the ability of the jackknife estimator to produce substantial bias reductions in the parameter of interest. It is also shown that incorporating an intercept in the regressions allows the standard jackknife estimator to be used, and that it too produces substantial bias reduction, despite the distributions of the full-sample and sub-sample estimators having greater bias in this case. Of interest, too, is the fact that the jackknife estimators can also reduce the overall root mean squared error compared to the ordinary least squares estimator, although this requires a larger (though still small) number of sub-samples than the value that produces maximum bias reduction (which is typically equal to two).
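The standard non-overlapping sub-sample jackknife discussed in the abstract can be sketched as follows. This minimal Python illustration (the function names and simulation design are my own, not the paper's) combines the full-sample OLS estimator with the mean of m sub-sample estimators using the usual jackknife weights m/(m-1) and -1/(m-1); this is the "standard" estimator that the paper shows does not fully remove first-order bias at a unit root, not the optimal-weight version derived there.

```python
import numpy as np

def ols_ar1(y):
    """OLS estimate of rho in y_t = rho * y_{t-1} + e_t (no intercept)."""
    ylag, ycur = y[:-1], y[1:]
    return ylag @ ycur / (ylag @ ylag)

def jackknife_ar1(y, m=2):
    """Standard jackknife: weight m/(m-1) on the full-sample estimate and
    -1/(m-1) on the average of m non-overlapping sub-sample estimates."""
    ell = len(y) // m
    sub = np.mean([ols_ar1(y[j * ell:(j + 1) * ell]) for j in range(m)])
    return (m / (m - 1)) * ols_ar1(y) - (1 / (m - 1)) * sub

# Small Monte Carlo at the unit root (true rho = 1): the OLS estimator is
# biased downwards, and the jackknife removes much (though, per the paper,
# not all) of that bias.
rng = np.random.default_rng(0)
n, reps = 100, 4000
ols_err, jk_err = [], []
for _ in range(reps):
    y = np.cumsum(rng.standard_normal(n))  # random walk
    ols_err.append(ols_ar1(y) - 1.0)
    jk_err.append(jackknife_ar1(y, m=2) - 1.0)
mean_bias_ols = np.mean(ols_err)
mean_bias_jk = np.mean(jk_err)
print(f"OLS bias: {mean_bias_ols:.4f}, jackknife (m=2) bias: {mean_bias_jk:.4f}")
```

The choice m=2 reflects the abstract's remark that two sub-samples typically maximise bias reduction; reducing root mean squared error as well would call for a somewhat larger (though still small) m.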