A lower bound on the error in nonparametric regression type problems
Publication: 1106583
DOI: 10.1214/aos/1176350954
zbMath: 0651.62028
OpenAlex: W2082102720
MaRDI QID: Q1106583
Publication date: 1988
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1176350954
Keywords: conditional density; nonparametric regression; optimal rates of convergence; local behavior of the Kullback information; lower bound of loss in probability; lower bound on minimax risk
Related Items (15)
Directional mixture models and optimal estimation of the mixing density ⋮ Nonparametric matrix regression function estimation over symmetric positive definite matrices ⋮ Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression ⋮ Optimal global rate of convergence in nonparametric regression with left-truncated and right-censored data ⋮ Logspline density estimation for binned data ⋮ Kernel estimation of discontinuous regression functions ⋮ Global nonparametric estimation of conditional quantile functions and their derivatives ⋮ \(L_1\)-optimal estimates for a regression type function in \(R^d\) ⋮ Some theoretical results on neural spike train probability models ⋮ A general lower bound of minimax risk for absolute-error loss ⋮ Dependence and the dimensionality reduction principle ⋮ On consistent statistical procedures in regression ⋮ Convergence rates for kernel regression in infinite-dimensional spaces ⋮ Information-theoretic determination of minimax rates of convergence ⋮ Optimal spherical deconvolution