Network inference combining mutual information rate and statistical tests
From MaRDI portal
Publication:2094514
DOI: 10.1016/j.cnsns.2022.106896 · zbMath: 1501.94018 · arXiv: 2209.14063 · OpenAlex: W4296967830 · MaRDI QID: Q2094514
Publication date: 28 October 2022
Published in: Communications in Nonlinear Science and Numerical Simulation
Full work available at URL: https://arxiv.org/abs/2209.14063
mutual information; complex systems; statistical tests; Shannon entropy; false discovery rate; complex networks; time-series data; network inference; mutual information rate
Inference from stochastic processes and prediction (62M20); Measures of information, entropy (94A17); Information theory (general) (94A15); Sampling theory in information and communication theory (94A20)
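The publication indexed here infers network links from time-series data using mutual-information estimates vetted by statistical tests. As a purely illustrative sketch (not the paper's own estimator), a plug-in histogram estimate of the mutual information between two signals can be computed as follows; the bin count and the coupled/independent test signals below are arbitrary choices for demonstration:

```python
import math
import random
from collections import Counter

def discretize(xs, bins=8):
    """Map real values to equal-width bin indices in [0, bins-1]."""
    lo, hi = min(xs), max(xs)
    w = (hi - lo) / bins or 1.0
    return [min(int((v - lo) / w), bins - 1) for v in xs]

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of I(X;Y) in bits."""
    bx, by = discretize(x, bins), discretize(y, bins)
    n = len(bx)
    pxy = Counter(zip(bx, by))          # joint bin counts
    px, py = Counter(bx), Counter(by)   # marginal bin counts
    # I(X;Y) = sum p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
    return sum(c / n * math.log2(c * n / (px[i] * py[j]))
               for (i, j), c in pxy.items())

# A coupled pair should carry clearly more mutual information
# than an independent pair, suggesting a link between the nodes.
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(5000)]
y = [v + 0.5 * rng.gauss(0, 1) for v in x]      # driven by x
z = [rng.gauss(0, 1) for _ in range(5000)]      # independent of x
print(mutual_information(x, y) > mutual_information(x, z))
```

In practice such plug-in estimates are positively biased on finite samples, which is one motivation for the statistical tests (and false-discovery-rate control) combined with the mutual information rate in the work indexed above.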
Cites Work
- A Mathematical Theory of Communication
- Density of first Poincaré returns, periodic orbits, and Kolmogorov-Sinai entropy
- Clustering, coding, switching, hierarchical ordering, and control in a network of chaotic elements
- Asymptotically mean stationary measures
- Finite sample effects in sequence analysis
- The control of the false discovery rate in multiple testing under dependency
- Coarse-grained entropy rates for characterization of complex time series
- Inferring indirect coupling by means of recurrences
- Revealing networks from dynamics: an introduction
- Dynamical Processes on Complex Networks
- Efficient Algorithms for Shortest Paths in Sparse Networks
- Overview of coupled map lattices
- Ergodic theory of chaos and strange attractors
- Introduction to the Theory of Complex Systems
- Inferring network topology from complex dynamics
- Investigating Causal Relations by Econometric Models and Cross-spectral Methods
- General formulation of Shannon’s main theorem in information theory
- Successful network inference from time-series data using mutual information rate
- Granger Causality: Theory and Applications