An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting
DOI: 10.3150/21-BEJ1403
zbMath: 1489.60035
arXiv: 2007.04547
OpenAlex: W3177456179
MaRDI QID: Q2137047
Publication date: 16 May 2022
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2007.04547
Keywords: entropy; non-convex optimization; log-likelihood; concentration inequality; source coding theorem; typical set
MSC classification:
- Inequalities; stochastic orderings (60E15)
- Nonconvex programming, global optimization (90C26)
- Random matrices (algebraic aspects) (15B52)
- Measures of information, entropy (94A17)
- Multilinear algebra, tensor calculus (15A69)
- Source coding (94A29)
Related Items (2)
Cites Work
- A Mathematical Theory of Communication
- Consistent community detection in multi-relational data through restricted multi-layer stochastic blockmodel
- A note on new Bernstein-type inequalities for the log-likelihood function of Bernoulli variables
- Concentration inequalities for random tensors
- Concentration of Measure Inequalities in Information Theory, Communications, and Coding
- Stochastic blockmodels with a growing number of classes
- The Individual Ergodic Theorem of Information Theory
- Infinite Shannon entropy
- Probability
- High-Dimensional Statistics
- High-Dimensional Probability
- Probability Inequalities for Sums of Bounded Random Variables
- Elements of Information Theory
- The Basic Theorems of Information Theory
- Information Theory
- Concentration of Measure for the Analysis of Randomized Algorithms