Rényi divergence and the central limit theorem
MaRDI QID: Q1731889
DOI: 10.1214/18-AOP1261
zbMath: 1466.60065
arXiv: 1608.01805
Friedrich Götze, Gennadiy P. Chistyakov, Sergey G. Bobkov
Publication date: 14 March 2019
Published in: The Annals of Probability
Full work available at URL: https://arxiv.org/abs/1608.01805
Mathematics Subject Classification:
- Probability distributions: general theory (60E05)
- Strong limit theorems (60F15)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (8)
- Prohorov-type local limit theorems on abstract Wiener spaces
- On the equivalence of statistical distances for isotropic convex measures
- Richter's local limit theorem, its refinement, and related results
- Decay of convolved densities via Laplace transform
- Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
- Two Remarks on Generalized Entropy Power Inequalities
- Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- Common Information, Noise Stability, and Their Extensions
Cites Work
- The fractional Fisher information and the central limit theorem for stable laws
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Berry-Esseen bounds in the entropic central limit theorem
- Entropy inequalities for stable densities and strengthened central limit theorems
- Non-uniform bounds in local limit theorems in case of fractional moments. I
- Asymptotic development for the CLT in total variation distance
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- The entropic Erdős-Kac limit theorem
- Entropy and the central limit theorem
- Asymptotic methods in statistical decision theory
- Theory of statistical inference and information. Transl. from the Slovak by the author
- Exponential integrability and transportation cost related to logarithmic Sobolev inequalities
- Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality
- On the rate of convergence in the entropic central limit theorem
- Fisher information inequalities and the central limit theorem
- Fisher information and the central limit theorem
- The sub-Gaussian constant and concentration inequalities
- Rényi Divergence and Kullback-Leibler Divergence
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Information theoretic inequalities
- On Convergence in the Mean for Densities
- Some convexity and subadditivity properties of entropy
- Information gain within nonextensive thermostatistics
- A Note on the Local Limit Theorem for Large Deviations
- Solution of Shannon’s problem on the monotonicity of entropy
- On Choosing and Bounding Probability Metrics
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- A Miniature Theory in Illustration of the Convolution Transform
- On Information and Sufficiency