Learning Theory for Dynamical Systems
DOI: 10.1137/22m1516865
zbMath: 1526.37089
arXiv: 2208.05349
OpenAlex: W4385658405
MaRDI QID: Q6132792
Publication date: 17 August 2023
Published in: SIAM Journal on Applied Dynamical Systems
Full work available at URL: https://arxiv.org/abs/2208.05349
Keywords: mixing, Lyapunov exponent, reservoir computing, matrix cocycle, direct forecast, delay-coordinates, iterative forecast
MSC classifications:
- Uniformly hyperbolic systems (expanding, Anosov, Axiom A, etc.) (37D20)
- Time series analysis of dynamical systems (37M10)
- Simulation of dynamical systems (37M05)
- Nonuniformly hyperbolic systems (Lyapunov exponents, Pesin theory, etc.) (37D25)
- Computational methods for ergodic theory (approximation of invariant measures, computation of Lyapunov exponents, entropy, etc.) (37M25)
- Dynamical systems in numerical analysis (37N30)
Cites Work
- Reservoir computing approaches to recurrent neural network training
- A variational approach to the consistency of spectral clustering
- Koopman spectra in reproducing kernel Hilbert spaces
- Random matrix products and measures on projective spaces
- The Lyapunov exponents of generic volume-preserving and symplectic maps
- Stable manifolds for nonautonomous equations without exponential dichotomy
- Smoothness of invariant manifolds for nonautonomous equations
- Identification and prediction of low dimensional dynamics
- Liapunov multipliers and decay of correlations in dynamical systems
- The metric entropy of diffeomorphisms. I: Characterization of measures satisfying Pesin's entropy formula
- Nonlinear prediction of chaotic time series
- Ergodic theory of differentiable dynamical systems
- Lyapunov exponents, entropy and periodic orbits for diffeomorphisms
- Statistical properties of dynamical systems with some hyperbolicity
- Embedology
- Recurrence times and rates of mixing
- Regularized local linear prediction of chaotic time series
- The topological invariance of Lyapunov exponents in embedded dynamics
- Embedding and approximation theorems for echo state networks
- Reproducing kernel Hilbert space compactification of unitary evolution groups
- Time-series learning of latent-space dynamics for reduced-order model closure
- Deep learning of conjugate mappings
- Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
- Operator-theoretic framework for forecasting nonlinear time series with kernel analog techniques
- Machine learning for prediction with missing dynamics
- Delay-coordinate maps and the spectra of Koopman operators
- Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality
- Lyapunov exponents of linear cocycles. Continuity via large deviations
- The metric entropy of diffeomorphisms. II: Relations between entropy, exponents and dimension
- Diffusion maps
- Markov structures and decay of correlations for non-uniformly expanding dynamical systems
- Is the largest Lyapunov exponent preserved in embedded dynamics?
- Analog forecasting with dynamics-adapted kernels
- Time-Scale Separation from Diffusion-Mapped Delay Coordinates
- Numerical orbits of chaotic processes represent true orbits
- Coherent structures and isolated spectrum for Perron–Frobenius cocycles
- Lyapunov exponents of hyperbolic measures and hyperbolic periodic orbits
- Nonlinear Regression
- Families of invariant manifolds corresponding to nonzero characteristic exponents
- Regularity of invariant graphs for forced systems
- Quantitative Pesin theory for Anosov diffeomorphisms and flows
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Super convergence of ergodic averages for quasiperiodic orbits
- Equivalence of physical and SRB measures in random dynamical systems
- Lyapunov exponents and rates of mixing for one-dimensional maps
- Erratum: “On explaining the surprising success of reservoir computing forecaster of chaos? The universal machine learning dynamical system with contrasts to VAR and DMD” [Chaos 31(1), 013108 (2021)]
- Model Reduction with Memory and the Machine Learning of Dynamical Systems
- A concept of homeomorphic defect for defining mostly conjugate dynamical systems
- Approximation of Bernoulli measures for non-uniformly hyperbolic systems
- (Dis)continuity of Lyapunov exponents
- Products of Random Matrices
- SRB measures as zero-noise limits
- Optimal Construction of Koopman Eigenfunctions for Prediction and Control
- Fading memory echo state networks are universal
- Learning strange attractors with reservoir systems