MAXIMIZABLE INFORMATIONAL ENTROPY AS A MEASURE OF PROBABILISTIC UNCERTAINTY
From MaRDI portal
Publication:3062755
DOI: 10.1142/S0217979210054713
zbMath: 1203.82004
arXiv: 0803.3110
OpenAlex: W1965482824
MaRDI QID: Q3062755
Aziz El Kaabouchi, Laurent Nivanen, François Tsobnang, Qiuping A. Wang, Alain Le Méhauté, Jincan Chen, Congjie Ou
Publication date: 28 December 2010
Published in: International Journal of Modern Physics B
Full work available at URL: https://arxiv.org/abs/0803.3110
Related Items
- A LINK BETWEEN THE MAXIMUM ENTROPY APPROACH AND THE VARIATIONAL ENTROPY FORM
- A counterexample against the Lesche stability of a generic entropy functional
Cites Work
- A Mathematical Theory of Communication
- Can the maximum entropy principle be explained as a consistency requirement?
- Incomplete statistics: Nonextensive generalizations of statistical mechanics
- Possible generalization of Boltzmann-Gibbs statistics
- Maximum entropy approach to stretched exponential probability distributions
- Generalized entropy optimized by a given arbitrary distribution
- Gibbs vs Boltzmann Entropies
- Probability distribution and entropy as a measure of uncertainty
- Generalized information functions
- Non-linear kinetics underlying generalized statistics