Properties of predictors in overdifferenced nearly nonstationary autoregression (Q2722251)
From MaRDI portal
scientific article; zbMATH DE number 1617478
Statements
Publication date: 11 July 2001

Keywords: autoregressive processes; near nonstationarity; parsimony; predictive mean squared error; unit roots
The consequences, for estimation and prediction, of overdifferencing a stationary \(AR(p+1)\) process with a root close to unity are investigated. Differencing is normally used to transform a homogeneous, linear nonstationary time series into a stationary process that is often modeled as an \(ARMA(p,q)\) process. The original series is then said to follow an \(ARIMA(p,d,q)\) process, where \(d\) is the number of differences required to obtain stationarity. It is assumed that the process is not a long-memory process [\textit{C.W.J. Granger} and \textit{R. Joyeux}, J. Time Ser. Anal. 1, 15-29 (1980; Zbl 0503.62079)], so that \(d\) is an integer equal to the number of unit roots in the autoregressive characteristic equation. When a stationary process has an autoregressive characteristic equation with a root close to unity, it is said to be nearly nonstationary. Given a small or moderate sample from such a process, the low power of unit root tests in this situation makes it very likely that one will conclude a difference should be applied. The differenced series is then noninvertible, and the process is called overdifferenced.

This paper theoretically justifies the advantages of the overdifferenced predictor in a general autoregression and analyzes the effect of other factors such as the remaining roots, the sample size \((T)\) and the forecast horizon \((H)\). Since a root of the \(AR(p+1)\) is assumed to be close to unity, the overdifferenced predictor is based on the \(ARIMA(p,1,0)\) model, in which no moving-average component is involved. It is proved that the prediction mean squared error (PMSE) of the overdifferenced model \(ARIMA(p,1,0)\) is lower, to terms of small order, than the PMSE of the correct model \(AR(p+1)\) if the root that is closer to unity, \(\rho^{-1}\), satisfies \(\rho=\exp(-c/T^{\beta})\), \(\beta>1\). The advantage of the overdifferenced predictor is due to its parsimony.
An important consequence of these results is that, for forecasting purposes, it is better to overdifference than to underdifference.
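The effect described in the review can be illustrated with a small Monte Carlo sketch for the simplest case \(p=0\): the true process is a nearly nonstationary \(AR(1)\) with \(\rho=\exp(-c/T^{\beta})\), the "correct" predictor estimates the autoregressive coefficient by OLS, and the overdifferenced \(ARIMA(0,1,0)\) predictor is the parameter-free random-walk (flat) forecast. This is an assumption-laden toy simulation, not the paper's proof; the specific values of \(c\), \(\beta\), \(T\), \(H\) and the replication count are arbitrary choices.

```python
import numpy as np

# Toy simulation (not the paper's derivation): compare H-step prediction
# mean squared error of the estimated AR(1) predictor vs. the
# overdifferenced random-walk predictor when the root is near unity.
rng = np.random.default_rng(0)
T, H, reps = 50, 5, 4000
c, beta = 1.0, 1.5
phi = np.exp(-c / T**beta)          # rho = exp(-c/T^beta), beta > 1

se_ar, se_od = 0.0, 0.0
for _ in range(reps):
    e = rng.standard_normal(T + H)
    y = np.empty(T + H)
    y[0] = e[0]
    for t in range(1, T + H):
        y[t] = phi * y[t - 1] + e[t]          # nearly nonstationary AR(1)
    sample, y_future = y[:T], y[T + H - 1]

    # "Correct" model AR(1): estimate phi by OLS, iterate H steps ahead.
    phi_hat = (sample[1:] @ sample[:-1]) / (sample[:-1] @ sample[:-1])
    se_ar += (phi_hat**H * sample[-1] - y_future) ** 2

    # Overdifferenced ARIMA(0,1,0): flat random-walk forecast, no
    # parameters to estimate -- this is the parsimony advantage.
    se_od += (sample[-1] - y_future) ** 2

print(f"PMSE, estimated AR(1) predictor : {se_ar / reps:.3f}")
print(f"PMSE, overdifferenced predictor : {se_od / reps:.3f}")
```

In this regime the estimation noise in \(\hat\phi\) is amplified over the horizon, so the overdifferenced predictor typically achieves the lower PMSE, consistent with the result reviewed above.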