Author:
Han Chirok, Phillips Peter C. B., Sul Donggyu
Abstract
While differencing transformations can eliminate nonstationarity, they typically reduce signal strength and correspondingly reduce rates of convergence in unit root autoregressions. The present paper shows that aggregating moment conditions that are formulated in differences provides an orderly mechanism for preserving information and signal strength in autoregressions with some very desirable properties. In first order autoregression, a partially aggregated estimator based on moment conditions in differences is shown to have a limiting normal distribution that holds uniformly in the autoregressive coefficient ρ, including stationary and unit root cases. The rate of convergence is $\sqrt{n}$ when $|\rho| < 1$ and the limit distribution is the same as the Gaussian maximum likelihood estimator (MLE), but when ρ = 1 the rate of convergence to the normal distribution is within a slowly varying factor of n. A fully aggregated estimator (FAE) is shown to have the same limit behavior in the stationary case and to have nonstandard limit distributions in unit root and near integrated cases, which reduce both the bias and the variance of the MLE. This result shows that it is possible to improve on the asymptotic behavior of the MLE without using an artificial shrinkage technique or otherwise accelerating convergence at unity at the cost of performance in the neighborhood of unity. Confidence intervals constructed from the FAE using local asymptotic theory around unity also lead to improvements over the MLE.
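As a rough illustration of how a moment condition formulated in differences can identify ρ (a minimal sketch of the differencing idea only, not the paper's partially or fully aggregated estimators), suppose $y_t = \rho y_{t-1} + u_t$ with iid innovations $u_t$, a stationary initialization or ρ = 1, and no intercept. Then $E[\Delta y_{t-1}\,\Delta u_t] = -\sigma_u^2$ and $E[(\Delta y_{t-1})^2] = 2\sigma_u^2/(1+\rho)$, so $E[\Delta y_{t-1}(2\Delta y_t + \Delta y_{t-1})] = \rho\,E[(\Delta y_{t-1})^2]$, a moment condition in differences that remains valid at ρ = 1. The Python sketch below simulates the estimator implied by this single moment condition against least squares near the unit root; the function names and Monte Carlo settings are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, n, sigma=1.0):
    """Simulate y_t = rho * y_{t-1} + u_t with iid normal innovations."""
    u = rng.normal(0.0, sigma, n)
    y = np.empty(n)
    y[0] = u[0]
    for t in range(1, n):
        y[t] = rho * y[t - 1] + u[t]
    return y

def differenced_moment_estimator(y):
    """Estimate rho from E[dy_{t-1} * (2*dy_t + dy_{t-1})] = rho * E[dy_{t-1}^2],
    a single moment condition in differences (illustrative sketch, not the FAE)."""
    dy = np.diff(y)
    num = np.sum(dy[:-1] * (2.0 * dy[1:] + dy[:-1]))
    den = np.sum(dy[:-1] ** 2)
    return num / den

def ols_estimator(y):
    """Standard least-squares autoregression estimate of rho in levels."""
    return np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# Small Monte Carlo comparison at the unit root (illustrative settings).
n, reps, rho = 200, 2000, 1.0
diff_est = np.array([differenced_moment_estimator(simulate_ar1(rho, n)) for _ in range(reps)])
ols_est = np.array([ols_estimator(simulate_ar1(rho, n)) for _ in range(reps)])
print(f"differenced-moment: bias {np.mean(diff_est) - rho:+.4f}, sd {np.std(diff_est):.4f}")
print(f"least squares:      bias {np.mean(ols_est) - rho:+.4f}, sd {np.std(ols_est):.4f}")
```

In this toy comparison the single differenced moment yields an estimate that is roughly centred at ρ = 1 but noisier than least squares, reflecting the loss of signal strength from differencing noted in the abstract; aggregating many such differenced moment conditions to recover that signal is precisely the mechanism the paper develops.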
Publisher
Cambridge University Press (CUP)
Subject
Economics and Econometrics, Social Sciences (miscellaneous)
Cited by
18 articles.