On estimating a density using Hellinger distance and some other strange facts
From MaRDI portal
Publication:2266536
DOI: 10.1007/BF00332312
zbMath: 0561.62029
MaRDI QID: Q2266536
Publication date: 1986
Published in: Probability Theory and Related Fields
Keywords: Hellinger distance; loss function; density estimation; metric entropy; minimax risk; rates of convergence of estimators; densities with compact support; methods for computing lower bounds
Nonparametric estimation (62G05)
Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Related Items
- Rates of convergence for minimum contrast estimators
- Another proof of a slow convergence result of Birgé
- Lower bounds in estimation at a point under multi-index constraint
- Resampling: consistency of substitution estimators
- Estimating a density and its derivatives via the minimum distance method
- Statistical estimation with model selection
- Extrema of some Gaussian processes with large trends and density estimation in \(L_{\infty}\)-norm
- Mutual information, metric entropy and cumulative relative entropy risk
- Optimal estimation of high-dimensional Gaussian location mixtures
- Uncompactly supported density estimation with \(L^1\) risk
- On the risk of estimates for block decreasing densities
- Minimax rates for conditional density estimation via empirical entropy
- A central limit theorem for the Hellinger loss of Grenander‐type estimators
- Comparing and Weighting Imperfect Models Using D-Probabilities
- Parametric or nonparametric? A parametricness index for model selection
- Anisotropic function estimation using multi-bandwidth Gaussian processes
- Lower bounds on the rate of convergence of nonparametric regression estimates
- Optimal learning with anisotropic Gaussian SVMs
- A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
- Asymptotic theory for maximum likelihood in nonparametric mixture models
- Entropy estimate for high-dimensional monotonic functions
- Optimal estimation of variance in nonparametric regression with random design
- Robust Bayes-like estimation: rho-Bayes estimation
- Convergence properties of functional estimates for discrete distributions
- Adaptive Bayesian density regression for high-dimensional data
- Linear and convex aggregation of density estimators
- Estimating the intensity of a random measure by histogram type estimators
- Information theory and superefficiency
- LOCALIZED MODEL SELECTION FOR REGRESSION
- On the impossibility of estimating densities in the extreme tail
- Nonparametric density estimates with improved performance on given sets of densities
- On the rate of convergence of the maximum likelihood estimator of a \(k\)-monotone density
- Information-theoretic determination of minimax rates of convergence
- The statistical work of Lucien Le Cam
- Density estimation by kernel and wavelets methods: optimality of Besov spaces
- Lower bounds for the rate of convergence in nonparametric pattern recognition
Cites Work
- Asymptotic methods in statistical decision theory
- Optimal rates of convergence for nonparametric estimators
- Smoothing techniques for curve estimation. Proceedings of a workshop held in Heidelberg, April 2-4, 1979
- Convergence of estimates under dimensionality restrictions
- [https://portal.mardi4nfdi.de/wiki/Publication:3048064 Estimation des densités: risque minimax]
- A Lower Bound on the Risks of Non-Parametric Estimates of Densities in the Uniform Metric
- On arbitrarily slow rates of global convergence in density estimation
- [https://portal.mardi4nfdi.de/wiki/Publication:4743580 Approximation dans les espaces métriques et théorie de l'estimation]
- Metric entropy and approximation