Fisher information inequalities and the central limit theorem
Publication: 1881640
DOI: 10.1007/s00440-004-0344-0
zbMath: 1047.62005
arXiv: math/0111020
OpenAlex: W3098545202
Wikidata: Q60522102
Scholia: Q60522102
MaRDI QID: Q1881640
Oliver Johnson, Andrew R. Barron
Publication date: 5 October 2004
Published in: Probability Theory and Related Fields
Full work available at URL: https://arxiv.org/abs/math/0111020
Mathematics Subject Classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Asymptotic properties of parametric tests (62F05)
Related Items (46)

Bayesian information in an experiment and the Fisher information distance
On the time-dependent Fisher information of a density function
The fractional Fisher information and the central limit theorem for stable laws
Coordinate-wise transformation of probability distributions to achieve a Stein-type identity
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
Convergence in distribution norms in the CLT for non identical distributed random variables
A stroll along the gamma
Score functions, generalized relative Fisher information and applications
An extension of entropy power inequality for dependent random variables
Fisher information and convergence to stable laws
Berry-Esseen bounds in the entropic central limit theorem
A comment on rates of convergence for density function in extreme value theory and Rényi entropy
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
An integral representation of the relative entropy
Poincaré-type inequalities for stable densities
Fokker–Planck equations in the modeling of socio-economic phenomena
Inequalities in Statistics and Information Measures
Covariance representations, \(L^p\)-Poincaré inequalities, Stein's kernels, and high-dimensional CLTs
The entropic Erdős-Kac limit theorem
Information-theoretic convergence of extreme values to the Gumbel distribution
Entropy jumps in the presence of a spectral gap
Estimates of the approximation of weighted sums of conditionally independent random variables by the normal law
The CLT in high dimensions: quantitative bounds via martingale embedding
Optimal nonlinear transformations of random variables
On a Fokker-Planck equation for wealth distribution
Convergence of densities of some functionals of Gaussian processes
Shannon's monotonicity problem for free and classical entropy
Autour de l'inégalité de Brunn-Minkowski
Rényi divergence and the central limit theorem
Log-concavity and strong log-concavity: a review
Entropy inequalities for stable densities and strengthened central limit theorems
Fisher information and the central limit theorem
Parameter-based Fisher's information of orthogonal polynomials
Entropy and the fourth moment phenomenon
Exploring the statistical applicability of the Poincaré inequality: a test of normality
Remarks on the free relative entropy and the free Fisher information distance
The convergence of the Rényi entropy of the normalized sums of IID random variables
Entropy-type inequalities for generalized gamma densities
Convergence of Markov chains in information divergence
A reverse entropy power inequality for log-concave random vectors
Existence of Stein kernels under a spectral gap, and discrepancy bounds
Strong Log-concavity is Preserved by Convolution
Stein's method, logarithmic Sobolev and transport inequalities
An invariance principle under the total variation distance
Lower Bounds for Divergence in Central Limit Theorem
Central limit theorem and convergence to stable laws in Mallows distance