Abstract
The Akaike information criterion, AIC, for autoregressive model selection is derived by adopting −2T times the expected predictive density of a future observation of an independent process as a loss function, where T is the length of the observed time series. The conditions under which AIC provides an asymptotically unbiased estimator of the corresponding risk function are derived. When the unbiasedness property fails, the use of AIC is justified heuristically. However, a method for estimating the risk function, applicable for all fitted orders, is also given. A derivation of the generalized information criterion, AICα, is given as well, with the loss function obtained by a modification of the Kullback-Leibler information measure. Results paralleling those for AIC are obtained for the AICα criterion.
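As a concrete illustration of the order-selection procedure discussed in the abstract, the sketch below assumes the familiar working forms AIC(k) = T log σ̂²_k + 2k and AICα(k) = T log σ̂²_k + αk, where σ̂²_k is the residual (prediction-error) variance of a fitted AR(k) model. The least-squares fitting routine, function names, and simulated series are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_ar_ls(x, k):
    """Fit an AR(k) model by conditional least squares and return the
    residual-variance estimate sigma^2_k (the prediction-error variance)."""
    T = len(x)
    if k == 0:
        resid = x - x.mean()
        return resid @ resid / T
    # Design matrix of lagged values: row for time t holds x[t-1], ..., x[t-k].
    X = np.column_stack([x[k - j - 1:T - j - 1] for j in range(k)])
    y = x[k:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return resid @ resid / (T - k)

def select_order(x, max_order, alpha=2.0):
    """Return the orders minimizing AIC and AIC_alpha over 0..max_order."""
    T = len(x)
    aic, aic_a = {}, {}
    for k in range(max_order + 1):
        s2 = fit_ar_ls(x, k)
        aic[k] = T * np.log(s2) + 2 * k        # Akaike's criterion (alpha = 2)
        aic_a[k] = T * np.log(s2) + alpha * k  # generalized AIC_alpha
    return min(aic, key=aic.get), min(aic_a, key=aic_a.get)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate an AR(2) process x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t.
    T = 500
    x = np.zeros(T)
    for t in range(2, T):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
    print(select_order(x, max_order=10, alpha=4.0))
```

Taking α larger than 2 penalizes high orders more heavily, which is the sense in which AICα generalizes AIC; α = 2 recovers the ordinary criterion.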
Publisher
Cambridge University Press (CUP)
Subject
Applied Mathematics, Statistics and Probability
Cited by
16 articles.