On testing hypotheses with divergence statistics
From MaRDI portal
Publication: 4337136
DOI: 10.1080/03610929608831710 · zbMath: 0875.62031 · OpenAlex: W2151704624 · MaRDI QID: Q4337136
Publication date: 19 May 1997
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610929608831710
entropy · asymptotic distribution · maximum likelihood estimators · testing statistical hypotheses · divergence statistics
Asymptotic distribution theory in statistics (62E20) · Parametric hypothesis testing (62F03) · Statistical aspects of information-theoretic topics (62B10)
Related Items (1)
Cites Work
- A Mathematical Theory of Communication
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Trigonometric entropies, Jensen difference divergence measures, and error bounds
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- On the convexity of some divergence measures based on entropy functions
- Charakterisierung der Entropien positiver Ordnung und der shannonschen Entropie
- Information radius
- Information-theoretical considerations on estimation problems