The convergence of the Rényi entropy of the normalized sums of IID random variables
From MaRDI portal
Publication: Q984005
DOI: 10.1016/j.spl.2010.03.012
zbMath: 1191.94070
OpenAlex: W2068985618
MaRDI QID: Q984005
Publication date: 13 July 2010
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2010.03.012
Related Items (2)
- A comment on rates of convergence for density function in extreme value theory and Rényi entropy
- An informatic approach to a long memory stationary process
Cites Work
- A Mathematical Theory of Communication
- Inequalities for characteristic functions involving Fisher information
- Some results concerning maximum Rényi entropy distributions
- A class of Rényi information estimators for multidimensional densities
- Entropy and the central limit theorem
- On the rate of convergence in the entropic central limit theorem
- Fisher information inequalities and the central limit theorem
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Cramér–Rao and Moment-Entropy Inequalities for Rényi Entropy and Generalized Fisher Information
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Solution of Shannon’s problem on the monotonicity of entropy
- The convolution inequality for entropy powers