Generalized estimators, slope, efficiency, and Fisher information bounds
From MaRDI portal
Publication: 6138793
DOI: 10.1007/s41884-022-00085-7
arXiv: 2208.03630
OpenAlex: W4311402481
MaRDI QID: Q6138793
No author found.
Publication date: 16 January 2024
Published in: Information Geometry
Full work available at URL: https://arxiv.org/abs/2208.03630
Cites Work
- Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- Differential-geometrical methods in statistics.
- Interval estimation for a binomial proportion (with comments and a rejoinder)
- Variance of the Median of Samples from a Cauchy Distribution
- Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information
- Some pioneers of modern statistical theory: a personal reflection
- An Optimum Property of Regular Maximum Likelihood Estimation
This page was built for publication: Generalized estimators, slope, efficiency, and Fisher information bounds