The relation between Granger causality and directed information theory: a review
Publication: 742659
DOI: 10.3390/e15010113
OpenAlex: W2084008199
MaRDI QID: Q742659
Olivier J. J. Michel, Pierre-Olivier Amblard
Publication date: 19 September 2014
Published in: Entropy
Full work available at URL: https://arxiv.org/abs/1211.3169
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Measures of information, entropy (94A17)
Related Items
- Informations in models of evolutionary dynamics
- Bubble transfer spectral entropy and its application in epilepsy EEG analysis
- Directed information graphs for the Granger causality of multivariate time series
- Granger causality from quantized measurements
- Sparse causality network retrieval from short time series
- A general theory to estimate information transfer in nonlinear systems
- Causal inference for multivariate stochastic process prediction
- Two stage approach to functional network reconstruction for binary time-series
- Measuring information transfer by dispersion transfer entropy
- Information thermodynamics for interacting stochastic systems without bipartite structure
- Application of time-delay multiscale symbolic phase compensated transfer entropy in analyzing cyclic alternating pattern (CAP) in sleep-related pathological data
- Theory and applications of financial chaos index
- A nonparametric efficient evaluation of partial directed coherence
Cites Work
- Graphical modelling of multivariate time series
- Escort entropies and divergences and related canonical distribution
- A class of Rényi information estimators for multidimensional densities
- Sample estimate of the entropy of a random vector
- Information transfer in continuous processes
- Evaluating causal relations in neural systems: Granger causality, directed transfer function and statistical assessment of significance
- Nonlinear analyses of interictal EEG map the brain interdependences in human focal epilepsy.
- Linear and nonlinear causality between signals: methods, examples and neurophysiological applications
- On the evaluation of information flow in multivariate systems by the directed transfer function
- Measures of mutual and causal dependence between two time series (Corresp.)
- Source Coding With Feed-Forward: Rate-Distortion Theorems and Error Exponents for a General Source
- A Coding Theorem for a Class of Stationary Channels With Feedback
- Mutual information rate, distortion, and quantization in metric spaces
- The General Equivalence of Granger and Sims Causality
- A Note on Noncausality
- Feedback between stationary stochastic processes
- On the Granger Condition for Non-Causality
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- Estimation of Entropy and Mutual Information
- The Capacity of Channels With Feedback
- Divergence Estimation for Multidimensional Densities Via $k$-Nearest-Neighbor Distances
- Estimation of Nonlinear Functionals of Densities With Confidence
- Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing
- Control Under Communication Constraints
- Investigating Causal Relations by Econometric Models and Cross-spectral Methods
- Testing Statistical Hypotheses
- Economic processes involving feedback