On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
DOI: 10.1016/j.jspi.2012.02.023 · zbMath: 1408.62015 · OpenAlex: W2013232082 · MaRDI QID: Q419270
Jorge F. Silva, Patricio Parada
Publication date: 18 May 2012
Published in: Journal of Statistical Planning and Inference
Full work available at URL: http://hdl.handle.net/10533/130530
Keywords: convergence of probability measures; density estimation; strong consistency; consistency in information divergence; differential entropy estimation; histogram-based estimators; Shannon information measures
MSC: Density estimation (62G07); Convergence of probability measures (60B10); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (2)
Cites Work
- A Mathematical Theory of Communication
- On convergence properties of Shannon entropy
- Information divergence estimation based on data-dependent partitions
- Density-free convergence properties of various estimators of entropy
- Consistency of data-driven histogram methods for density estimation and classification
- Histogram regression estimation using data-dependent partitions
- Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
- Distribution estimation consistent in total variation and in two types of information divergence
- Distribution Estimates Consistent in χ²-Divergence
- About the asymptotic accuracy of Barron density estimates
- Optimization of Barron density estimates
- Nonproduct Data-Dependent Partitions for Mutual Information Estimation: Strong Consistency and Applications
- Estimation of the information by an adaptive partitioning of the observation space
- On the Discontinuity of the Shannon Information Measures
- The Interplay Between Entropy and Variational Distance
- A Consistent Nonparametric Multivariate Density Estimator Based on Statistically Equivalent Blocks
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- A Useful Convergence Theorem for Probability Distributions
- On Information and Sufficiency
- Information Theory and Statistics: A Tutorial
- Combinatorial methods in density estimation