Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances

From MaRDI portal
Publication:1731757

DOI: 10.1214/18-AOS1688 · zbMath: 1473.62177 · arXiv: 1606.00304 · MaRDI QID: Q1731757

Thomas B. Berrett, Richard J. Samworth, Ming Yuan

Publication date: 14 March 2019

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1606.00304
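The paper studies efficient entropy estimation built on \(k\)-nearest neighbour distances, in the tradition of the Kozachenko-Leonenko estimator (the paper's main contribution is a weighted variant; the sketch below is only the classical unweighted estimator, with the function name `kl_entropy` and parameter defaults chosen for illustration):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Classical Kozachenko-Leonenko k-NN estimate of differential entropy (nats).

    x : (n, d) array of i.i.d. samples from a continuous density.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbours because each point is its own nearest neighbour.
    dist, _ = tree.query(x, k=k + 1)
    rho = dist[:, k]  # distance to the k-th nearest *other* sample point
    # log of the volume of the unit ball in R^d: pi^{d/2} / Gamma(d/2 + 1)
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # H_hat = (d/n) * sum_i log rho_i + log V_d + psi(n) - psi(k)
    return d * np.mean(np.log(rho)) + log_vd + digamma(n) - digamma(k)

# Sanity check against a known value: a standard 2-D Gaussian has
# differential entropy (d/2) * log(2*pi*e) ~= 2.8379 nats.
rng = np.random.default_rng(0)
sample = rng.standard_normal((5000, 2))
estimate = kl_entropy(sample, k=3)
```

The weighted estimator analysed in the paper replaces the single \(k\)-th neighbour distance by a weighted combination over several values of \(k\), which removes higher-order bias terms in moderate dimensions; the unweighted sketch above conveys only the basic construction.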



Related Items

Entropy-based test for generalised Gaussian distributions
A test for independence via Bayesian nonparametric estimation of mutual information
Ultra-High Dimensional Variable Selection for Doubly Robust Causal Inference
Dimensional measures of generalized entropy
Entropy estimation via uniformization
Local nearest neighbour classification with applications to semi-supervised learning
Optimal Nonparametric Inference with Two-Scale Distributional Nearest Neighbors
Efficient functional estimation and the super-oracle phenomenon
On Azadkia-Chatterjee's conditional dependence coefficient
The Hellinger Correlation
Variance Reduction for Estimation of Shapley Effects and Adaptation to Unknown Input Distribution
Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
Exact upper bound on the sum of squared nearest-neighbor distances between points in a rectangle
A Bayesian nonparametric estimation to entropy
Statistical estimation of mutual information for mixed model
On the estimation of entropy for non-negative data
On the Kozachenko-Leonenko entropy estimator
Large-scale multiple inference of collective dependence with applications to protein function
Optimal rates of entropy estimation over Lipschitz balls
A nearest-neighbor based nonparametric test for viral remodeling in heterogeneous single-cell proteomic data
Semiparametric density testing in the contamination model
A density based empirical likelihood approach for testing bivariate normality
Statistical estimation of conditional Shannon entropy
Theoretical analysis of cross-validation for estimating the risk of the k-Nearest Neighbor classifier
The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond



Cites Work