Demystifying Fixed $k$-Nearest Neighbor Information Estimators
From MaRDI portal
Publication: 4682867
DOI: 10.1109/TIT.2018.2807481
zbMath: 1401.94072
arXiv: 1604.03006
OpenAlex: W2789700415
MaRDI QID: Q4682867
Pramod Viswanath, Sewoong Oh, Weihao Gao
Publication date: 19 September 2018
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/1604.03006
Mathematics Subject Classification:
Nonparametric estimation (62G05) ⋮ Measures of information, entropy (94A17) ⋮ Information theory (general) (94A15) ⋮ Statistical aspects of information-theoretic topics (62B10)
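The publication analyzes fixed-$k$ nearest neighbor information estimators, of which the Kozachenko–Leonenko differential entropy estimator is the basic instance: $\hat{H} = \psi(N) - \psi(k) + \log c_d + \frac{d}{N}\sum_{i=1}^{N} \log \epsilon_i$, where $\epsilon_i$ is the distance from sample $i$ to its $k$-th nearest neighbor and $c_d$ is the volume of the $d$-dimensional unit ball. A minimal sketch of this estimator (the function name and parameters are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko differential entropy estimate in nats.

    x : array of shape (n_samples, dim)
    k : fixed number of nearest neighbors (does not grow with n)
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbors because each point is its own nearest neighbor
    # at distance 0; the last column is the distance to the k-th neighbor
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # log volume of the d-dimensional Euclidean unit ball
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

For a standard normal sample in one dimension the estimate should approach the true differential entropy $\frac{1}{2}\log(2\pi e) \approx 1.419$ nats as the sample size grows.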
Related Items (7)
Entropy-based test for generalised Gaussian distributions
Entropy estimation via uniformization
Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
Exact upper bound on the sum of squared nearest-neighbor distances between points in a rectangle
Quantifying Information Conveyed by Large Neuronal Populations
Optimal rates of entropy estimation over Lipschitz balls
The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond