The Mean Relative Entropy: An Invariant Measure of Estimation Error
From MaRDI portal
Publication: 5056956
DOI: 10.1080/00031305.2018.1543139
OpenAlex: W2905183503
Wikidata: Q128767840
Scholia: Q128767840
MaRDI QID: Q5056956
Publication date: 14 December 2022
Published in: The American Statistician
Full work available at URL: https://doi.org/10.1080/00031305.2018.1543139
Cites Work
- A Mathematical Theory of Communication
- On entropy-based goodness-of-fit tests
- Statistical decision theory and Bayesian analysis (2nd ed.)
- Simultaneous estimation of parameters under entropy loss
- Density-free convergence properties of various estimators of entropy
- Simultaneous estimation of Poisson means under entropy loss
- On Kullback-Leibler loss and density estimation
- On the estimation of entropy
- Entropy-Based Tests of Uniformity
- Order Statistics
- Powerful Goodness-of-fit Tests Based on the Likelihood Ratio
- Linear Statistical Inference and its Applications
- On Information and Sufficiency