Entropy estimation via uniformization
DOI: 10.1016/j.artint.2023.103954 · arXiv: 2304.09700 · OpenAlex: W4380356568 · MaRDI QID: Q6136099
Publication date: 28 August 2023
Published in: Artificial Intelligence
Full work available at URL: https://arxiv.org/abs/2304.09700
MSC classifications: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Cites Work
- A Mathematical Theory of Communication
- Lectures on the nearest neighbor method
- Estimation of entropy and other functionals of a multivariate density
- Minimum-entropy estimation in semi-parametric models
- Density-free convergence properties of various estimators of entropy
- Sample estimate of the entropy of a random vector
- The jackknife estimate of variance
- On the estimation of entropy
- Towards Bayesian experimental design for nonlinear models that require a large number of sampling times
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Estimation of integral functionals of a density
- Optimal rates of entropy estimation over Lipschitz balls
- Combinatorics of partial derivatives
- Geometric k-nearest neighbor estimation of entropy and mutual information
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- Demystifying fixed \(k\)-nearest neighbor information estimators
- Maximum Entropy Sampling and Optimal Bayesian Experimental Design
- Sensitivity analysis for stochastic simulators using differential entropy
- Ensemble Estimators for Multivariate Entropy Estimation