The unifying frameworks of information measures
From MaRDI portal
Publication: 1720478
DOI: 10.1155/2018/1791954 | zbMath: 1426.94078 | OpenAlex: W2789778392 | MaRDI QID: Q1720478
Publication date: 8 February 2019
Published in: Mathematical Problems in Engineering
Full work available at URL: https://doi.org/10.1155/2018/1791954
Classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Cites Work
- A Mathematical Theory of Communication
- Feature selection with SVD entropy: some modification and extension
- Information measures based on fractional calculus
- On cumulative entropies
- Markov's inequality and \(C^{\infty}\) functions on sets with polynomial cusps
- On generalized means and generalized convex functions
- Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies
- The world according to Rényi: Thermodynamics of multifractal systems
- Possible generalization of Boltzmann-Gibbs statistics.
- On the dynamic cumulative residual entropy
- Uncertainty and Information
- Cumulative Residual Entropy: A New Measure of Information
- Survival Exponential Entropies
- Exponential entropy as a measure of extent of a distribution
- Three approaches to the quantitative definition of information
- Sharpening Hölder's inequality