Entropy and the discrete central limit theorem
Publication: 6123274
DOI: 10.1016/j.spa.2023.104294 · arXiv: 2106.00514 · OpenAlex: W3172580395 · MaRDI QID: Q6123274
Lampros Gavalakis, Ioannis Kontoyiannis
Publication date: 4 March 2024
Published in: Stochastic Processes and their Applications
Full work available at URL: https://arxiv.org/abs/2106.00514
Keywords: entropy; relative entropy; Fisher information; central limit theorem; lattice distribution; convolution inequality; Bernoulli part decomposition
MSC classification: Inequalities; stochastic orderings (60E15) · Central limit and other weak theorems (60F05) · Measures of information, entropy (94A17)
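As an illustration of the paper's theme (a sketch, not code from the publication): the discrete entropic central limit theorem says that for i.i.d. lattice summands of span 1, the Shannon entropy of the sum S_n approaches the differential entropy of the matching Gaussian, which is the maximum-entropy limit. A minimal numerical check with Bernoulli(1/2) summands, so that S_n is Binomial(n, 1/2):

```python
from math import comb, log, pi, e

def binomial_entropy(n, p=0.5):
    # Shannon entropy (in nats) of S_n = X_1 + ... + X_n with X_i ~ Bernoulli(p),
    # i.e. of the Binomial(n, p) distribution.
    H = 0.0
    for k in range(n + 1):
        q = comb(n, k) * p**k * (1 - p)**(n - k)
        if q > 0:
            H -= q * log(q)
    return H

def gaussian_entropy(var):
    # Differential entropy (in nats) of N(0, var), the maximum-entropy density
    # with the same variance as S_n.
    return 0.5 * log(2 * pi * e * var)

# The entropy gap is nonnegative and shrinks as n grows, consistent with
# H(S_n) -> (1/2) log(2*pi*e*n*sigma^2) for span-1 lattice variables.
p = 0.5
for n in (4, 16, 64, 256):
    gap = gaussian_entropy(n * p * (1 - p)) - binomial_entropy(n, p)
    print(n, gap)
```

Running this shows the gap decreasing monotonically toward zero, a finite-n glimpse of the convergence the paper quantifies.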
Cites Work
- A Mathematical Theory of Communication
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Compound Poisson approximation via information functionals
- Entropy and the central limit theorem
- An elementary proof of the local central limit theorem
- Fourier analysis of distribution functions. A mathematical study of the Laplace-Gaussian law
- A New Entropy Power Inequality for Integer-Valued Random Variables
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Entropy and the Law of Small Numbers
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Coding for T-user multiple-access channels
- On Local Limit Theorem for Integer-Valued Random Variables
- Binomial and Poisson distributions as maximum entropy distributions
- Entropy computations via analytic depoissonization
- Solution of Shannon’s problem on the monotonicity of entropy
- Nearly optimal multiuser codes for the binary adder channel
- Sumset and Inverse Sumset Theory for Shannon Entropy
- Variations on a Theme by Massey
- The convolution inequality for entropy powers
- A criterion for tail events for sums of independent random variables
- Concentration functions and entropy bounds for discrete log-concave distributions