Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
From MaRDI portal
Publication:1731757
DOI: 10.1214/18-AOS1688 · zbMath: 1473.62177 · arXiv: 1606.00304 · MaRDI QID: Q1731757
Ming Yuan, Richard J. Samworth, Thomas B. Berrett
Publication date: 14 March 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1606.00304
Mathematics Subject Classification:
- Estimation in multivariate analysis (62H12)
- Nonparametric estimation (62G05)
- Statistical aspects of information-theoretic topics (62B10)
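The record concerns \(k\)-nearest-neighbour entropy estimation in the Kozachenko-Leonenko tradition. As a minimal illustration, the sketch below implements the classical unweighted Kozachenko-Leonenko estimator in pure Python (the paper itself studies efficient weighted generalisations of it); the names `kl_entropy` and `digamma_int` are illustrative, not from the paper.

```python
import heapq
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant


def digamma_int(k):
    # Digamma at a positive integer: psi(k) = -gamma + sum_{j=1}^{k-1} 1/j.
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, k))


def kl_entropy(points, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats."""
    n = len(points)
    d = len(points[0])
    # Log-volume of the unit Euclidean ball in R^d.
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    log_rho_sum = 0.0
    for i, x in enumerate(points):
        # Distance to the k-th nearest neighbour of x (brute force, O(n) per point).
        rho_k = heapq.nsmallest(
            k, (math.dist(x, y) for j, y in enumerate(points) if j != i)
        )[-1]
        log_rho_sum += math.log(rho_k)
    return math.log(n - 1) - digamma_int(k) + log_vd + (d / n) * log_rho_sum


# Sanity check on N(0,1) draws; the true entropy is 0.5*log(2*pi*e) ~ 1.419 nats.
random.seed(0)
sample = [(random.gauss(0.0, 1.0),) for _ in range(1000)]
print(f"{kl_entropy(sample, k=3):.3f}")
```

The brute-force neighbour search keeps the sketch self-contained; a k-d tree would be used at scale, and the weighted estimators analysed in the paper replace the single \(k\)-th distance with a weighted combination over several neighbour orders.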
Related Items
- Entropy-based test for generalised Gaussian distributions
- A test for independence via Bayesian nonparametric estimation of mutual information
- Ultra-High Dimensional Variable Selection for Doubly Robust Causal Inference
- Dimensional measures of generalized entropy
- Entropy estimation via uniformization
- Local nearest neighbour classification with applications to semi-supervised learning
- Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors
- Efficient functional estimation and the super-oracle phenomenon
- On Azadkia-Chatterjee's conditional dependence coefficient
- The Hellinger Correlation
- Variance Reduction for Estimation of Shapley Effects and Adaptation to Unknown Input Distribution
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Exact upper bound on the sum of squared nearest-neighbor distances between points in a rectangle
- A Bayesian nonparametric estimation to entropy
- Statistical estimation of mutual information for mixed model
- On the estimation of entropy for non-negative data
- On the Kozachenko-Leonenko entropy estimator
- Large-scale multiple inference of collective dependence with applications to protein function
- Optimal rates of entropy estimation over Lipschitz balls
- A nearest-neighbor based nonparametric test for viral remodeling in heterogeneous single-cell proteomic data
- Semiparametric density testing in the contamination model
- A density based empirical likelihood approach for testing bivariate normality
- Statistical estimation of conditional Shannon entropy
- Theoretical analysis of cross-validation for estimating the risk of the k-Nearest Neighbor classifier
- The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond
Cites Work
- On the Kozachenko-Leonenko entropy estimator
- Testing composite hypotheses, Hermite polynomials and optimal estimation of a nonsmooth functional
- \(k_n\)-nearest neighbor estimators of entropy
- Lectures on the nearest neighbor method
- Sample estimate of the entropy of a random vector
- On the estimation of entropy
- On estimation of the \(L_r\) norm of a regression function
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Efficient estimation of integral functionals of a density
- On entropy estimation by \(m\)-spacing method
- Empirical Processes with Applications to Statistics
- Undersmoothed Kernel Entropy Estimators
- On the logarithms of high-order spacings
- Asymptotic Statistics
- DOI: 10.1162/jmlr.2003.4.7-8.1271
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- An Introduction to Stein's Method
- Demystifying Fixed \(k\)-Nearest Neighbor Information Estimators
- Estimation of Entropy and Mutual Information
- Ensemble Estimators for Multivariate Entropy Estimation