Information dependency: strong consistency of Darbellay-Vajda partition estimators
DOI: 10.1016/j.jspi.2013.08.007 · zbMath: 1408.62014 · OpenAlex: W1974829031 · MaRDI QID: Q393626
Publication date: 23 January 2014
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2013.08.007
Keywords: Kullback-Leibler divergence; Darbellay-Vajda partition scheme; data-dependent partition scheme; Gessaman's partition scheme; information dependency estimation; Lugosi and Nobel inequality; strongly consistent estimator
MSC classification: Nonparametric estimation (62G05); Measures of association (correlation, canonical correlation, etc.) (62H20); Information theory (general) (94A15); Statistical aspects of information-theoretic topics (62B10)
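The record concerns strongly consistent estimation of information dependency (mutual information) via data-dependent partition schemes such as the Darbellay-Vajda and Gessaman constructions. As a rough illustration of the plug-in idea, the sketch below estimates mutual information on a simple quantile-based product partition (equal-count marginal cells). This is a simplified stand-in, not the Darbellay-Vajda adaptive scheme itself; the function name and bin count are illustrative choices.

```python
import numpy as np

def mi_quantile_partition(x, y, bins=8):
    """Plug-in mutual information estimate (in nats) on a data-dependent
    product partition: each marginal axis is split into `bins` equal-count
    (quantile) cells. A simplified illustration of partition-based MI
    estimation, not the Darbellay-Vajda scheme."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    # Quantile bin edges give roughly equiprobable marginal cells.
    x_edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    y_edges = np.quantile(y, np.linspace(0.0, 1.0, bins + 1))
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    pxy = counts / n                         # empirical joint cell masses
    px = pxy.sum(axis=1, keepdims=True)      # marginal masses over x-cells
    py = pxy.sum(axis=0, keepdims=True)      # marginal masses over y-cells
    mask = pxy > 0                           # skip empty cells (0 log 0 = 0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
z = rng.standard_normal(20000)
x = z + 0.1 * rng.standard_normal(20000)
y = z + 0.1 * rng.standard_normal(20000)   # strongly dependent pair
u = rng.standard_normal(20000)             # independent of x
print(mi_quantile_partition(x, y))         # large positive estimate
print(mi_quantile_partition(x, u))         # estimate near zero
```

For dependent data the estimate is well above zero, while for an independent pair it stays near zero up to the usual positive plug-in bias, which shrinks as the sample grows relative to the number of cells.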
Cites Work
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- On convergence properties of Shannon entropy
- Information divergence estimation based on data-dependent partitions
- Consistency of data-driven histogram methods for density estimation and classification
- An informational measure of correlation
- Information gain and a general measure of correlation
- Relative Entropy Measures of Multivariate Dependence
- Measures of Dependence and Tests of Independence
- Nonproduct Data-Dependent Partitions for Mutual Information Estimation: Strong Consistency and Applications
- Estimation of the information by an adaptive partitioning of the observation space
- A Consistent Nonparametric Multivariate Density Estimator Based on Statistically Equivalent Blocks
- Mutual Information and Maximal Correlation as Measures of Dependence
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- A NEW MEASURE OF RANK CORRELATION
- A Non-Parametric Test of Independence