Mean estimation bias in least squares estimation of autoregressive processes (Q1058799)

scientific article; zbMATH DE number 3901863

    Statements

    Mean estimation bias in least squares estimation of autoregressive processes (English)
    1985
    Let \(X_{ti}\) \((t=1,\dots,n;\ i=1,\dots,r)\) be given constants (e.g., \(X_{ti}=t^{i-1}\)) and let \(\{e_t\}\) be i.i.d. \(N(0,\sigma^2)\) variables. Consider the model \[ Y_t=\sum^{r}_{i=1}X_{ti}\beta_i+P_t,\quad P_t=\sum^{p}_{j=1}\alpha_j P_{t-j}+e_t, \] where \(\beta_i\) and \(\alpha_j\) are unknown parameters such that \(\{P_t\}\) is a stationary AR(p) process. Let \({\tilde\alpha}\) be the estimator of \(\alpha=(\alpha_1,\dots,\alpha_p)\) given by the regression of \(Y_t\) on \((X_{t1},\dots,X_{tr},Y_{t-1},\dots,Y_{t-p})\). The other estimator \({\hat\alpha}\) of \(\alpha\) arises from the regression of \(\hat P_t\) on \((\hat P_{t-1},\dots,\hat P_{t-p})\), where the \(\hat P_t\) are the least squares residuals. The authors prove that \(E({\hat\alpha}-{\tilde\alpha})=O(n^{-2})\) and propose a reparametrization that isolates the bias of the estimators. A Monte Carlo study of the second-order autoregressive process is presented which also includes the case of the generalized least squares estimator of the mean function.
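    The following Python sketch (not part of the paper; the function names, the trend design \(X_{ti}=t^{i-1}\), and the AR(2) parameter values are illustrative assumptions) simulates the model and contrasts the two regressions that define \({\tilde\alpha}\) and \({\hat\alpha}\):

```python
import numpy as np

# Illustrative sketch of the two least squares estimators of alpha described above.
# Assumptions (not from the paper): trend regressors X_ti = t^(i-1), AR(2) errors,
# and the particular parameter values used in the small Monte Carlo check below.

rng = np.random.default_rng(0)

def simulate(n, beta, alpha, sigma=1.0, burn=200):
    """Simulate Y_t = sum_i X_ti beta_i + P_t with stationary AR(p) errors P_t."""
    p = len(alpha)
    e = rng.normal(0.0, sigma, size=n + burn)
    P = np.zeros(n + burn)
    for t in range(p, n + burn):
        P[t] = sum(alpha[j] * P[t - 1 - j] for j in range(p)) + e[t]
    P = P[burn:]                                   # discard burn-in
    t_idx = np.arange(1, n + 1, dtype=float)
    X = np.column_stack([t_idx ** i for i in range(len(beta))])  # X_ti = t^(i-1)
    return X, X @ beta + P

def alpha_tilde(X, Y, p):
    """Regress Y_t on (X_t1,...,X_tr, Y_{t-1},...,Y_{t-p}); return the AR part."""
    n = len(Y)
    Z = np.column_stack([X[p:]] + [Y[p - j:n - j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(Z, Y[p:], rcond=None)
    return coef[X.shape[1]:]

def alpha_hat(X, Y, p):
    """Detrend by OLS first, then regress the residuals P_hat_t on their own lags."""
    n = len(Y)
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    P_hat = Y - X @ b                              # least squares residuals
    Z = np.column_stack([P_hat[p - j:n - j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(Z, P_hat[p:], rcond=None)
    return coef

# Small Monte Carlo check that the two estimators differ only slightly in mean,
# consistent with E(alpha_hat - alpha_tilde) = O(n^{-2}).
beta = np.array([1.0, 0.1])      # intercept and linear trend (illustrative)
alpha = np.array([0.6, -0.3])    # stationary AR(2) coefficients (illustrative)
for n in (50, 100, 200):
    diffs = [alpha_hat(X, Y, 2) - alpha_tilde(X, Y, 2)
             for X, Y in (simulate(n, beta, alpha) for _ in range(500))]
    print(n, np.mean(diffs, axis=0))
```

    Under these assumptions the printed mean differences should shrink rapidly as \(n\) grows, which is the qualitative behaviour the \(O(n^{-2})\) result describes.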
    approximate expressions
    stationary AR(p) process
    least squares residuals
    reparametrization
    bias
    Monte Carlo study
    second-order autoregressive process
    generalized least squares estimator of the mean function

    Identifiers