Estimation of the information by an adaptive partitioning of the observation space
From MaRDI portal
Publication:4701390
DOI: 10.1109/18.761290 · zbMath: 0957.94006 · OpenAlex: W2127234432 · MaRDI QID: Q4701390
Igor Vajda, Georges A. Darbellay
Publication date: 21 November 1999
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.761290
Related Items (37)
Optimal quantization of the support of a continuous multivariate distribution based on mutual information
A kernel-based calculation of information on a metric space
Non-parametric estimation of mutual information through the entropy of the linkage
KM-MIC: an improved maximum information coefficient based on K-medoids clustering
Mutual information as a measure of multivariate association: analytical properties and statistical estimation
EVALUATION OF MUTUAL INFORMATION ESTIMATORS FOR TIME SERIES
Estimation of Entropy and Mutual Information
Canonical kernel dimension reduction
Geometric k-nearest neighbor estimation of entropy and mutual information
Non-parametric estimation of copula based mutual information
Measuring synchronization in coupled model systems: a comparison of different approaches
GENERALIZED CELLULAR NEURAL NETWORKS (GCNNs) CONSTRUCTED USING PARTICLE SWARM OPTIMIZATION FOR SPATIO-TEMPORAL EVOLUTIONARY PATTERN IDENTIFICATION
Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
Information dependency: strong consistency of Darbellay-Vajda partition estimators
Successful network inference from time-series data using mutual information rate
Causation entropy from symbolic representations of dynamical systems
Density estimation of multivariate samples using Wasserstein distance
A computationally efficient estimator for mutual information
Causal inference for multivariate stochastic process prediction
On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
Quantification of effective connectivity in the brain using a measure of directed information
Conditional Lyapunov exponents and transfer entropy in coupled bursting neurons under excitation and coupling mismatch
Least-squares two-sample test
Copula index for detecting dependence and monotonicity between stochastic signals
Operational risk aggregation based on business line dependence: a mutual information approach
Model term selection for spatio-temporal system identification using mutual information
Dimensionless Measures of Variability and Dependence for Multivariate Continuous Distributions
An efficient algorithm for the computation of average mutual information: validation and implementation in Matlab
Information divergence estimation based on data-dependent partitions
Estimation of mutual information by the fuzzy histogram
Blind Extraction of Chaotic Sources from White Gaussian Noise Based on a Measure of Determinism
Machine learning with squared-loss mutual information
Unnamed Item
Causality of energy-containing eddies in wall turbulence
Information transfer between turbulent boundary layers and porous media
Hybrid variable monitoring: an unsupervised process monitoring framework with binary and continuous variables
Probing the linearity and nonlinearity in DNA sequences
This page was built for publication: Estimation of the information by an adaptive partitioning of the observation space