A free analogue of Shannon's problem on monotonicity of entropy
Publication: 861057
DOI: 10.1016/j.aim.2006.03.014
zbMath: 1106.94015
arXiv: math/0510103
OpenAlex: W2594079813
MaRDI QID: Q861057
Publication date: 9 January 2007
Published in: Advances in Mathematics
Full work available at URL: https://arxiv.org/abs/math/0510103
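For context, a brief statement of the problem named in the title (a sketch reconstructed from the title and the cited works; notation is standard):

```latex
% Classical monotonicity of entropy (Artstein--Ball--Barthe--Naor,
% ``Solution of Shannon's problem on the monotonicity of entropy''):
% for i.i.d. square-integrable random variables $X_1, X_2, \dots$,
\[
  h\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right)
  \;\ge\;
  h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right),
\]
% where $h$ denotes Shannon differential entropy. The free analogue
% replaces $h$ by Voiculescu's free entropy $\chi$ and the $X_i$ by
% freely independent self-adjoint elements:
\[
  \chi\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right)
  \;\ge\;
  \chi\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right).
\]
```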
Classification (MSC):
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Free Convolution Powers Via Roots of Polynomials
- Volumes of subset Minkowski sums and the Lyusternik region
- Two Remarks on Generalized Entropy Power Inequalities
- Remarks on a semicircular perturbation of the free Fisher information
- Shannon's monotonicity problem for free and classical entropy
- The convexification effect of Minkowski summation
- Remarks on the free relative entropy and the free Fisher information distance
- Local limit theorems in free probability theory
- From Boltzmann to random matrices and beyond
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy
- Evaluation model for service life of dam based on time-varying risk probability
- Fractional free convolution powers
Cites Work
- The analogues of entropy and of Fisher's information measure in free probability theory. V: Noncommutative Hilbert transforms
- Proof of an entropy conjecture of Wehrl
- The analogues of entropy and of Fisher's information measure in free probability theory. I
- The analogues of entropy and of Fisher's information measure in free probability theory. II
- Volumes of restricted Minkowski sums and the free analogue of the entropy power inequality
- Free entropy
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Solution of Shannon's problem on the monotonicity of entropy