\((h,\Psi)\)-entropy differential metric
Publication: 1265614
DOI: 10.1023/A:1022214326758 · zbMath: 0898.62005 · OpenAlex: W1590757293 · MaRDI QID: Q1265614
Domingo Morales, Miguel Salicrú, M. Luisa Menendez, Leandro Pardo
Publication date: 28 September 1998
Published in: Applications of Mathematics
Full work available at URL: https://eudml.org/doc/32970
Mathematics Subject Classification: Asymptotic distribution theory in statistics (62E20) ⋮ Statistical aspects of information-theoretic topics (62B10) ⋮ Local Riemannian geometry (53B20)
Related Items (6)
Rényi information measure for a used item ⋮ Asymptotic minimum scoring rule prediction ⋮ Different closed-form expressions for generalized entropy rates of Markov chains ⋮ Formulas for Rényi information and related measures for univariate distributions. ⋮ A novel nonparametric distance estimator for densities with error bounds ⋮ Estimators based on sample quantiles using \((h,\phi)\)-entropy measures
Cites Work
- A Mathematical Theory of Communication
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Informative geometry of probability spaces
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- New parametric measures of information
- On the convexity of some divergence measures based on entropy functions
- On the convexity of higher order Jensen differences based on entropy functions (Corresp.)
- Asymptotic distribution of \((h,\varphi)\)-entropies
- Information-theoretical considerations on estimation problems