Estimating the errors on measured entropy and mutual information
From MaRDI portal
DOI: 10.1016/S0167-2789(98)00269-3 · zbMath: 0935.94013 · OpenAlex: W2067211778 · MaRDI QID: Q1962437
Publication date: 31 January 2000
Published in: Physica D
Full work available at URL: https://doi.org/10.1016/s0167-2789(98)00269-3
Applications of dynamical systems (37N99) ⋮ Measures of information, entropy (94A17) ⋮ Symbolic dynamics (37B10)
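The publication concerns the bias and statistical error of entropy and mutual information estimated from finite samples. As a purely illustrative sketch (not the paper's own derivation), the naive plug-in entropy estimator and the standard first-order bias correction of (M−1)/(2N) nats, where M is the number of observed symbols and N the sample size, can be written as:

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Naive (plug-in) entropy estimate in nats from observed symbol frequencies."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order (M - 1) / (2N) bias correction,
    where M is the number of distinct observed symbols and N the sample size.
    Illustrative only; finite-sample corrections can be refined further."""
    n = len(samples)
    m = len(set(samples))
    return plugin_entropy(samples) + (m - 1) / (2 * n)
```

The plug-in estimate systematically underestimates the true entropy because unseen symbols contribute nothing to the sum; the additive correction reduces this bias at the cost of relying on the observed alphabet size M.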
Related Items (20)
- Analysis of symbolic sequences using the Jensen-Shannon divergence
- Multi-camera piecewise planar object tracking with mutual information
- Investigation on the high-order approximation of the entropy bias
- Methods for quantifying the causal structure of bivariate time series
- Scaling invariance embedded in very short time series: a factorial moment based diffusion entropy approach
- A statistical dynamics approach to the study of human health data: Resolving population scale diurnal variation in laboratory data
- Analytical calculation of mutual information between weakly coupled Poisson-spiking neurons in models of dynamically gated communication
- A (econophysics) note on volatility in exchange rate time series
- Estimating entropy rates with Bayesian confidence intervals
- Nonparametric detection of dependences in stochastic point processes
- A nonlinear correlation measure for multivariable data set
- Recurrence plot statistics and the effect of embedding
- Transition matrix analysis of earthquake magnitude sequences
- A nonparametric causality test: Detection of direct causal effects in multivariate systems using corrected partial transfer entropy
- Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations
- Chaotic characteristics analysis of the sintering process system with unknown dynamic functions based on phase space reconstruction and chaotic invariables
- Macroeconomic simulation comparison with a multivariate extension of the Markov information criterion
- Tsallis conditional mutual information in investigating long range correlation in symbol sequences
- Factorized mutual information maximization
- Information transfer in continuous processes
Cites Work
- Singular-value decomposition in attractor reconstruction: Pitfalls and precautions
- Finite sample effects in sequence analysis
- Testing for nonlinearity using redundancies: Quantitative and qualitative aspects
- Significance testing of information theoretic functionals
- Coarse-grained entropy rates for characterization of complex time series
- Extraction of delay information from chaotic time series based on information entropy
- Detecting nonlinearity in multivariate time series
- Measuring statistical dependences in a time series
- Information and entropy in strange attractors
- Independent coordinates for strange attractors from mutual information