Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
Publication: 434565
DOI: 10.1016/j.jspi.2012.01.002
zbMath: 1242.62026
OpenAlex: W2005899734
MaRDI QID: Q434565
Publication date: 16 July 2012
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2012.01.002
Keywords: \(\mathcal P\)-bias, \(\mathcal P\)-variance, distribution unbiased, dual KL risk, KL bias, KL mean, KL risk, KL variance
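The keywords reflect the paper's central construction: expected Kullback-Leibler loss (KL risk) split into a KL-variance term about a KL mean plus a KL-bias term, with distribution unbiasedness meaning the KL bias vanishes. Below is a minimal Monte Carlo sketch of such a decomposition in Python, under the simplifying assumption of a normal location family with known variance, where KL divergence reduces to scaled squared error and the split is exact; all function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_normal(mu1, mu2, sigma=1.0):
    """KL divergence D(N(mu1, sigma^2) || N(mu2, sigma^2))."""
    return (mu1 - mu2) ** 2 / (2 * sigma ** 2)

# True parameter; the estimator is the MLE (sample mean) from n draws.
mu_true, sigma, n, reps = 1.5, 1.0, 10, 200_000
mu_hat = rng.normal(mu_true, sigma, size=(reps, n)).mean(axis=1)

# KL risk: expected divergence from the true distribution
# to the estimated distribution.
kl_risk = kl_normal(mu_true, mu_hat, sigma).mean()

# KL mean: in this family, the distribution indexed by E[mu_hat]
# minimizes the expected KL divergence to the estimates
# (an exact property here, used as an illustrative stand-in).
mu_bar = mu_hat.mean()

# Decomposition: risk = KL variance (spread of the estimates about
# the KL mean) + KL bias (KL mean versus the true distribution).
kl_variance = kl_normal(mu_bar, mu_hat, sigma).mean()
kl_bias = kl_normal(mu_true, mu_bar, sigma)

print(f"KL risk        : {kl_risk:.6f}")
print(f"variance + bias: {kl_variance + kl_bias:.6f}")  # equal up to MC error
```

Running the sketch shows the KL risk matching kl_variance + kl_bias up to Monte Carlo error; since the sample mean is unbiased here, the KL-bias term is near zero, the parameter-free analogue of unbiasedness that the keywords call "distribution unbiased".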
Mathematics Subject Classification:
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
- Nonparametric estimation (62G05)
- Genetics and epigenetics (92D10)
Related Items (4)
- Using geometry to select one dimensional exponential families that are monotone likelihood ratio in the sample space, are weakly unimodal and can be parametrized by a measure of central tendency
- Letter to the Editor: Zhang, J. (2021), “The Mean Relative Entropy: An Invariant Measure of Estimation Error,” The American Statistician, 75, 117–123: comment by Vos and Wu
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- Generalized estimators, slope, efficiency, and Fisher information bounds
Cites Work
- Sanov property, generalized I-projection and a conditional limit theorem
- Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss
- On Kullback-Leibler loss and density estimation
- Approximation of density functions by sequences of exponential families
- Differential-geometrical methods in statistics.
- I-divergence geometry of probability distributions and minimization problems
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
- Decomposing posterior variance
- Intrinsic analysis of statistical estimation
- Recursive nonlinear estimation. A geometric approach
- Desiderata for a predictive theory of statistics
- Distribution estimation consistent in total variation and in two types of information divergence
- Simultaneous estimation of the Hardy-Weinberg proportions