Pages that link to "Item:Q434565"
From MaRDI portal
The following pages link to Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators (Q434565):
Displaying 7 items.
- Using geometry to select one dimensional exponential families that are monotone likelihood ratio in the sample space, are weakly unimodal and can be parametrized by a measure of central tendency (Q296490)
- Estimators for the binomial distribution that dominate the MLE in terms of Kullback-Leibler risk (Q421424)
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families (Q888476)
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE (Q1336540)
- Letter to the Editor: Zhang, J. (2021), "The Mean Relative Entropy: An Invariant Measure of Estimation Error," The American Statistician, 75, 117–123: comment by Vos and Wu (Q5057006)
- Generalized estimators, slope, efficiency, and Fisher information bounds (Q6138793)
- Lower bounds for the trade-off between bias and mean absolute deviation (Q6589425)