scientific article
From MaRDI portal
Publication: 3158591
zbMath: 1061.60019
MaRDI QID: Q3158591
Publication date: 28 January 2005
Full work available at URL: http://ebooks.worldscinet.com/ISBN/9781860945373/toc.shtml
Title: unavailable (zbMATH Open Web Interface contents restricted due to conflicting licenses)
Mathematics Subject Classification:
- Central limit and other weak theorems (60F05)
- Information theory (general) (94A15)
- Statistical aspects of information-theoretic topics (62B10)
- Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory (94-01)
- Research exposition (monographs, survey articles) pertaining to probability theory (60-02)
- Limit theorems in probability theory (60Fxx)
Related Items (56)
- A Trajectorial Approach to the Gradient Flow Properties of Langevin--Smoluchowski Diffusions
- On the time-dependent Fisher information of a density function
- Fisher information and the fourth moment theorem
- Quantitative CLTs on a Gaussian space: a survey of recent developments
- Entropy for semi-Markov processes with Borel state spaces: asymptotic equirepartition properties and invariance principles
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- On kurtoses of two symmetric or asymmetric populations
- Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem
- Convergence to stable laws in relative entropy
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Fisher information in different types of perfect and imperfect ranked set samples from finite mixture models
- Fisher information and convergence to stable laws
- Berry-Esseen bounds in the entropic central limit theorem
- A comment on rates of convergence for density function in extreme value theory and Rényi entropy
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Stability of Cramer's Characterization of Normal Laws in Information Distances
- An integral representation of the relative entropy
- Entropy and the discrete central limit theorem
- Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
- A de Bruijn's identity for dependent random variables based on copula theory
- Stein's density method for multivariate continuous distributions
- The entropic Erdős-Kac limit theorem
- Information-theoretic convergence of extreme values to the Gumbel distribution
- Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
- Two Remarks on Generalized Entropy Power Inequalities
- Poisson approximation in \(\chi^2\) distance by the Stein-Chen approach
- On Fuzzy Theory for Econometrics
- Convergence of densities of some functionals of Gaussian processes
- On a connection between information and group lattices
- On the rate of convergence in the central limit theorem for hierarchical Laplacians
- Asymptotic approximation of nonparametric regression experiments with unknown variances
- Direct approach to quantum extensions of Fisher information
- Rényi divergence and the central limit theorem
- Notes on superadditivity of Wigner-Yanase-Dyson information
- A novel method to generating two-sided class of probability distributions
- Fisher information and the central limit theorem
- Entropy and the fourth moment phenomenon
- Bounds on the maximum of the density for sums of independent random variables
- On simulating truncated stable random variables
- Generalized Cramér-Rao relations for non-relativistic quantum systems
- The generalized von Mises distribution
- Multifractal diffusion entropy analysis: optimal bin width of probability histograms
- Local limit theorems in free probability theory
- Local limit theorems for densities in Orlicz spaces
- From Boltzmann to random matrices and beyond
- Convergence of Markov chains in information divergence
- Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- Majorization and Rényi entropy inequalities via Sperner theory
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
- Existence of Stein kernels under a spectral gap, and discrepancy bounds
- Information functionals with applications to random walk and statistics
- Upper bounds for Fisher information
- Stein's method, logarithmic Sobolev and transport inequalities
- Quasi-log concavity conjecture and its applications in statistics
- Theory of \(\phi\)-Jensen variance and its applications in higher education
- On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs