Sample estimate of the entropy of a random vector
From MaRDI portal
Publication: 1096264
zbMath: 0633.62005 · MaRDI QID: Q1096264
L. F. Kozachenko, Nikolai N. Leonenko
Publication date: 1987
Published in: Problems of Information Transmission
Keywords: consistency; independent observations; asymptotic unbiasedness; absolutely continuous random vector; estimator of the unknown entropy
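The keywords above summarise the paper's contribution: a consistent, asymptotically unbiased nearest-neighbour estimator of the differential entropy of an absolutely continuous random vector from independent observations. A minimal one-dimensional sketch of that idea is given below; the function name and the uniform-sample check are illustrative choices, not taken from the paper itself.

```python
import math
import random

def kl_entropy_1d(samples):
    """Nearest-neighbour (Kozachenko-Leonenko style) entropy estimate, 1-D case.

    H_hat = [psi(n) - psi(1)] + ln(V_1) + (1/n) * sum_i ln(eps_i),
    where eps_i is the distance from x_i to its nearest neighbour,
    V_1 = 2 is the volume of the 1-D unit ball, and
    psi(n) - psi(1) equals the harmonic number sum_{k=1}^{n-1} 1/k.
    """
    n = len(samples)
    xs = sorted(samples)
    mean_log_eps = 0.0
    for i, x in enumerate(xs):
        # In 1-D the nearest neighbour is the closer of the two sorted neighbours.
        left = x - xs[i - 1] if i > 0 else math.inf
        right = xs[i + 1] - x if i < n - 1 else math.inf
        mean_log_eps += math.log(min(left, right)) / n
    harmonic = sum(1.0 / k for k in range(1, n))  # psi(n) - psi(1)
    return harmonic + math.log(2.0) + mean_log_eps

random.seed(0)
# Sanity check: uniform samples on [0, 1] have differential entropy 0,
# so the estimate should be close to 0 for a large sample.
est = kl_entropy_1d([random.random() for _ in range(2000)])
```

The asymptotic unbiasedness and consistency referenced in the keywords mean that, as the sample size grows, estimates like `est` above concentrate around the true entropy; the harmonic-number correction term is what removes the bias of the raw nearest-neighbour log-distances.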
Related Items (63)
Hysteresis and disorder-induced order in continuous kinetic-like opinion dynamics in complex networks ⋮ Estimating mutual information for feature selection in the presence of label noise ⋮ Non-parametric estimation of mutual information through the entropy of the linkage ⋮ Similarity of interspike interval distributions and information gain in a stationary neuronal firing ⋮ Minimax estimation of norms of a probability density. I: Lower bounds ⋮ Particle dual averaging: optimization of mean field neural network with global convergence rate analysis* ⋮ Entropy-based test for generalised Gaussian distributions ⋮ Non-parametric entropy estimators based on simple linear regression ⋮ Limit theory for point processes in manifolds ⋮ Statistical Inference for Rényi Entropy Functionals ⋮ A test for independence via Bayesian nonparametric estimation of mutual information ⋮ Nonparametric estimation of information-based measures of statistical dispersion ⋮ On a functional of the number of nonoverlapping chains appearing in the polynomial scheme and its connection with entropy ⋮ An information-theoretic approach to assess practical identifiability of parametric dynamical systems ⋮ Entropy estimation via uniformization ⋮ Detecting anomalies in fibre systems using 3-dimensional image data ⋮ Local nearest neighbour classification with applications to semi-supervised learning ⋮ Efficient functional estimation and the super-oracle phenomenon ⋮ Causality analysis of large-scale structures in the flow around a wall-mounted square cylinder ⋮ Asymptotics for Euclidean functionals of mixing processes ⋮ The Hellinger Correlation ⋮ A model-free Bayesian classifier ⋮ Information-Maximization Clustering Based on Squared-Loss Mutual Information ⋮ Is mutual information adequate for feature selection in regression? 
⋮ Information estimators for weighted observations ⋮ Parametric Bayesian estimation of differential entropy and relative entropy ⋮ Nearest neighbor estimates of entropy for multivariate circular distributions ⋮ A Nonparametric Clustering Algorithm with a Quantile-Based Likelihood Estimator ⋮ On mutual information estimation for mixed-pair random variables ⋮ On entropy estimation by \(m\)-spacing method ⋮ Entropy production and Vlasov equation for self-gravitating systems ⋮ Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances ⋮ Entropy propagation analysis in stochastic structural dynamics: application to a beam with uncertain cross sectional area ⋮ Statistical estimation of the Shannon entropy ⋮ Decomposition in derivative-free optimization ⋮ Entropy production in systems with long range interactions ⋮ A Bayesian nonparametric estimation to entropy ⋮ Entropy-based inhomogeneity detection in fiber materials ⋮ Statistical estimation of mutual information for mixed model ⋮ On the Kozachenko-Leonenko entropy estimator ⋮ Large-scale multiple inference of collective dependence with applications to protein function ⋮ A class of Rényi information estimators for multidimensional densities ⋮ Design of computer experiments: space filling and beyond ⋮ Optimal Latin hypercube designs for the Kullback-Leibler criterion ⋮ Parametric generation of conditional geological realizations using generative neural networks ⋮ Mixture-based estimation of entropy ⋮ Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy ⋮ Effect of neuromodulation of short-term plasticity on information processing in hippocampal interneuron synapses ⋮ A nearest-neighbor based nonparametric test for viral remodeling in heterogeneous single-cell proteomic data ⋮ Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles ⋮ \(k_n\)-nearest neighbor estimators of entropy ⋮ Spatio-chromatic 
information available from different neural layers via gaussianization ⋮ A density based empirical likelihood approach for testing bivariate normality ⋮ The relation between Granger causality and directed information theory: a review ⋮ Calculating the Mutual Information between Two Spike Trains ⋮ Statistical estimation of conditional Shannon entropy ⋮ The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond ⋮ Reliability of coupled oscillators ⋮ Causality of energy-containing eddies in wall turbulence ⋮ TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions ⋮ A Note on Bayesian Inference for Long-Range Dependence of a Stationary Two-State Process