Implications of the Cressie-Read family of additive divergences for information recovery
From MaRDI portal
Publication: Q406232
DOI: 10.3390/e14122427 · zbMath: 1305.94018 · OpenAlex: W2083326001
Ron C. Mittelhammer, George G. Judge
Publication date: 8 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e14122427
Keywords: conditional moment equations; Cressie-Read divergence; information functionals; information theoretic methods; minimum power divergence
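As an illustration of the keyword topic (a sketch not drawn from the paper itself), the Cressie-Read family of power divergences between discrete distributions p and q is conventionally written as I(p, q, γ) = 1/(γ(γ+1)) Σᵢ pᵢ[(pᵢ/qᵢ)^γ − 1], with the Kullback-Leibler divergence KL(p‖q) recovered in the limit γ → 0 and the reverse divergence KL(q‖p) at γ → −1. The function name and tolerance below are illustrative choices:

```python
import math

def cressie_read(p, q, gamma, eps=1e-10):
    """Cressie-Read power divergence I(p, q, gamma) for discrete p, q.

    The singular points gamma = 0 and gamma = -1 are replaced by their
    analytic limits (forward and reverse Kullback-Leibler divergence).
    """
    if abs(gamma) < eps:        # limit gamma -> 0: KL(p || q)
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    if abs(gamma + 1) < eps:    # limit gamma -> -1: KL(q || p)
        return sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    s = sum(pi * ((pi / qi) ** gamma - 1) for pi, qi in zip(p, q))
    return s / (gamma * (gamma + 1))

p = [0.5, 0.5]
q = [0.25, 0.75]
print(cressie_read(p, q, 1.0))  # gamma = 1: half the Pearson chi-square statistic
print(cressie_read(p, q, 0.0))  # gamma -> 0 limit: KL(p || q)
```

Choosing γ traces out the familiar special cases (Pearson chi-square at γ = 1, likelihood-ratio statistic at γ → 0), which is the sense in which the family unifies minimum power divergence estimation.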
Cites Work
- A Mathematical Theory of Communication
- Entropy: the Markov ordering approach
- Goodness-of-fit statistics for discrete multivariate data
- A new class of metric divergences on probability spaces and its applicability in statistics
- Possible generalization of Boltzmann-Gibbs statistics
- Typical support and Sanov large deviations of correlated states
- Large Deviation Strategy for Inverse Problem I
- Empirical likelihood as a goodness-of-fit measure
- On Information and Sufficiency
- Notes on bias in estimators for simultaneous equation models