Non-parametric estimation of mutual information through the entropy of the linkage
From MaRDI portal
Publication: 280732
DOI: 10.3390/e15125154 · zbMath: 1338.62083 · OpenAlex: W2171789771 · MaRDI QID: Q280732
Roberta Sirovich, Maria Teresa Giraudo, Laura Sacerdote
Publication date: 10 May 2016
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15125154
Keywords: entropy; mutual information; kernel method; information measures; binless estimator; copula function; linkage function
Related Items
- Geometric k-nearest neighbor estimation of entropy and mutual information
- A novel approach of dependence measure for complex signals
- On dynamic mutual information for bivariate lifetimes
Cites Work
- Coincidences and estimation of entropies of random variables with large cardinalities
- Nonparametric estimation of information-based measures of statistical dispersion
- A class of Rényi information estimators for multidimensional densities
- Numerical solution of stochastic differential equations with jumps in finance
- Sample estimate of the entropy of a random vector
- An introduction to copulas. Properties and applications
- Linkages: A tool for the construction of multivariate distributions with given nonoverlapping multivariate marginals
- Multivariate information transmission
- A computationally efficient estimator for mutual information
- Estimation of conditional densities and sensitivity measures in nonlinear dynamical systems
- Estimation of the information by an adaptive partitioning of the observation space
- Nonparametric Estimation and Symmetry Tests for Conditional Density Functions
- The Shape of Neural Dependence
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Mutual information as a measure of multivariate association: analytical properties and statistical estimation