Sample estimate of the entropy of a random vector

From MaRDI portal
Publication: Q1096264

zbMath: 0633.62005
MaRDI QID: Q1096264

L. F. Kozachenko, Nikolai N. Leonenko

Publication date: 1987

Published in: Problems of Information Transmission
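This entry is the original reference for the nearest-neighbour (Kozachenko-Leonenko) estimator of differential entropy. Below is a minimal illustrative sketch of the k = 1 form of the estimator in Python, assuming Euclidean distances, natural logarithms, and distinct sample points; the function name kl_entropy and the use of SciPy's cKDTree are implementation choices for this sketch, not part of the publication.

# Minimal sketch of the Kozachenko-Leonenko nearest-neighbour entropy
# estimator (k = 1, Euclidean distances, entropy in nats). Illustrative
# reconstruction of the standard formula, not code from the publication.
import numpy as np
from math import gamma, log, pi
from scipy.spatial import cKDTree


def kl_entropy(samples: np.ndarray) -> float:
    """Estimate the differential entropy (in nats) of an i.i.d. sample.

    samples: array of shape (n, d) holding n observations of a
    d-dimensional random vector (assumed pairwise distinct).
    """
    n, d = samples.shape
    # Distance from each point to its nearest neighbour; k=2 because the
    # closest point returned by the k-d tree is the point itself.
    nn_dist = cKDTree(samples).query(samples, k=2)[0][:, 1]
    # Volume of the d-dimensional unit ball.
    unit_ball_volume = pi ** (d / 2) / gamma(d / 2 + 1)
    # Kozachenko-Leonenko estimate: mean log nearest-neighbour distance,
    # corrected by the ball volume, sample size, and Euler's constant.
    return (d * np.mean(np.log(nn_dist))
            + log(unit_ball_volume)
            + log(n - 1)
            + np.euler_gamma)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((5000, 2))
    # True entropy of a 2-D standard Gaussian: ln(2*pi*e) ~ 2.8379 nats.
    print(kl_entropy(x))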




Related Items (63)

Hysteresis and disorder-induced order in continuous kinetic-like opinion dynamics in complex networks
Estimating mutual information for feature selection in the presence of label noise
Non-parametric estimation of mutual information through the entropy of the linkage
Similarity of interspike interval distributions and information gain in a stationary neuronal firing
Minimax estimation of norms of a probability density. I: Lower bounds
Particle dual averaging: optimization of mean field neural network with global convergence rate analysis*
Entropy-based test for generalised Gaussian distributions
Non-parametric entropy estimators based on simple linear regression
Limit theory for point processes in manifolds
Statistical Inference for Rényi Entropy Functionals
A test for independence via Bayesian nonparametric estimation of mutual information
Nonparametric estimation of information-based measures of statistical dispersion
On a functional of the number of nonoverlapping chains appearing in the polynomial scheme and its connection with entropy
An information-theoretic approach to assess practical identifiability of parametric dynamical systems
Entropy estimation via uniformization
Detecting anomalies in fibre systems using 3-dimensional image data
Local nearest neighbour classification with applications to semi-supervised learning
Efficient functional estimation and the super-oracle phenomenon
Causality analysis of large-scale structures in the flow around a wall-mounted square cylinder
Asymptotics for Euclidean functionals of mixing processes
The Hellinger Correlation
A model-free Bayesian classifier
Information-Maximization Clustering Based on Squared-Loss Mutual Information
Is mutual information adequate for feature selection in regression?
Information estimators for weighted observations
Parametric Bayesian estimation of differential entropy and relative entropy
Nearest neighbor estimates of entropy for multivariate circular distributions
A Nonparametric Clustering Algorithm with a Quantile-Based Likelihood Estimator
On mutual information estimation for mixed-pair random variables
On entropy estimation by \(m\)-spacing method
Entropy production and Vlasov equation for self-gravitating systems
Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
Entropy propagation analysis in stochastic structural dynamics: application to a beam with uncertain cross sectional area
Statistical estimation of the Shannon entropy
Decomposition in derivative-free optimization
Entropy production in systems with long range interactions
A Bayesian nonparametric estimation to entropy
Entropy-based inhomogeneity detection in fiber materials
Statistical estimation of mutual information for mixed model
On the Kozachenko-Leonenko entropy estimator
Large-scale multiple inference of collective dependence with applications to protein function
A class of Rényi information estimators for multidimensional densities
Design of computer experiments: space filling and beyond
Optimal Latin hypercube designs for the Kullback-Leibler criterion
Parametric generation of conditional geological realizations using generative neural networks
Mixture-based estimation of entropy
Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy
Effect of neuromodulation of short-term plasticity on information processing in hippocampal interneuron synapses
A nearest-neighbor based nonparametric test for viral remodeling in heterogeneous single-cell proteomic data
Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles
\(k_n\)-nearest neighbor estimators of entropy
Spatio-chromatic information available from different neural layers via gaussianization
A density based empirical likelihood approach for testing bivariate normality
The relation between Granger causality and directed information theory: a review
Calculating the Mutual Information between Two Spike Trains
Statistical estimation of conditional Shannon entropy
Unnamed Item
Unnamed Item
The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond
Reliability of coupled oscillators
Causality of energy-containing eddies in wall turbulence
TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
A Note on Bayesian Inference for Long-Range Dependence of a Stationary Two-State Process




This page was built for publication: Sample estimate of the entropy of a random vector