Rates of Fisher information convergence in the central limit theorem for nonlinear statistics
From MaRDI portal
Publication: 6632857
DOI: 10.1007/s00440-024-01331-y · MaRDI QID: Q6632857
Publication date: 5 November 2024
Published in: Probability Theory and Related Fields
Mathematics Subject Classification:
- Central limit and other weak theorems (60F05)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Fisher information and the fourth moment theorem
- Stein's method for comparison of univariate distributions
- Nonnormal approximation by Stein's method of exchangeable pairs with application to the Curie-Weiss model
- Moment inequalities for sums of dependent random variables under projective conditions
- Density formula and concentration inequalities with Malliavin calculus
- A central limit theorem for generalized quadratic forms
- An Edgeworth expansion for symmetric statistics
- Berry-Esseen bounds of normal and nonnormal approximation for unbounded exchangeable pairs
- Stein approximation for functionals of independent random sequences
- Efron's monotonicity property for measures on \(\mathbb{R}^2\)
- On the rate of convergence in the entropic central limit theorem
- Fisher information inequalities and the central limit theorem
- Transportation cost for Gaussian and other product measures
- Berry-Esseen bounds for functionals of independent random variables
- First-order covariance inequalities via Stein's method
- Stein's method for normal approximation in Wasserstein distances with application to the multivariate central limit theorem
- Fisher information and the central limit theorem
- Entropy and the fourth moment phenomenon
- Stein kernels and moment maps
- Stein's method, logarithmic Sobolev and transport inequalities
- Monotonicity of entropy and Fisher information: a quick proof via maximal correlation
- On the isoperimetric constant, covariance inequalities and \(L_{p}\)-Poincaré inequalities in dimension one
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem
- Random Fields and Geometry
- Stein's density method for multivariate continuous distributions