Necessary conditions of \(L_1\)-convergence of kernel regression estimators (Q1111288)

From MaRDI portal





scientific article; zbMATH DE number 4076343

    Statements

    Necessary conditions of \(L_1\)-convergence of kernel regression estimators (English)
    1987
    Let \((X_1,Y_1),\dots,(X_n,Y_n)\) be i.i.d. \({\mathbb{R}}^d\times {\mathbb{R}}\)-valued copies of a random pair \((X,Y)\), and let \[ m_n(x)=\sum^{n}_{i=1}Y_iK((X_i-x)/h_n)\Big/\sum^{n}_{j=1}K((X_j-x)/h_n) \] be the kernel estimator of the regression function \(m(x)=E(Y\mid X=x)\), which is assumed to exist. Many authors have established the convergence of \(m_n(x)\) to \(m(x)\) in various senses under the conditions \(h_n\to 0\) and \(nh^d_n\to \infty\). The author raises the question of whether these conditions are necessary, and shows that when the kernel \(K\) is nonnegative and bounded they are indeed necessary for \(L_1\)-convergence.
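    The estimator \(m_n(x)\) above can be sketched directly. A minimal illustration in Python, assuming a Gaussian kernel (an illustrative choice of \(K\); it is nonnegative and bounded, as the necessity result requires) and a simulated one-dimensional sample; the bandwidth \(h_n = n^{-1/5}\) is a common textbook rate satisfying \(h_n\to 0\) and \(nh_n\to\infty\), not a choice made in the paper:

```python
import numpy as np

def nadaraya_watson(x, X, Y, h):
    """Kernel regression estimate m_n(x): the same kernel weights
    K((X_i - x)/h_n) appear in numerator and denominator."""
    u = (X - x) / h                            # shape (n, d)
    w = np.exp(-0.5 * np.sum(u**2, axis=-1))   # Gaussian K, shape (n,)
    s = w.sum()
    return (w * Y).sum() / s if s > 0 else 0.0  # convention if no mass near x

rng = np.random.default_rng(0)
n = 2000
h = n ** (-1.0 / 5.0)          # h_n -> 0 while n * h_n^d -> infinity (d = 1)
X = rng.uniform(0.0, 1.0, size=(n, 1))
Y = X[:, 0] ** 2 + 0.1 * rng.normal(size=n)   # m(x) = x^2 plus noise

est = nadaraya_watson(np.array([0.5]), X, Y, h)  # close to m(0.5) = 0.25
```

Note that if the \(Y_i\) are constant, the weights cancel and the estimate reproduces that constant exactly, which is a quick sanity check on the weighting.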
    necessary conditions
    regression estimators
    \(L_1\)-convergence
    kernel estimator