Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework
Publication: 1780633
DOI: 10.1016/j.jcp.2004.12.008
zbMath: 1088.62502
OpenAlex: W2140357575
MaRDI QID: Q1780633
Kyle Haven, Andrew J. Majda, Rafail V. Abramov
Publication date: 13 June 2005
Published in: Journal of Computational Physics
Full work available at URL: https://doi.org/10.1016/j.jcp.2004.12.008
Mathematics Subject Classification (MSC):
- Applications of statistics to environmental and related topics (62P12)
- Dynamical systems in fluid mechanics, oceanography and meteorology (37N10)
- Meteorology and atmospheric physics (86A10)
- Statistical aspects of information-theoretic topics (62B10)
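For orientation, the predictability measure at issue here, as in the cited item "A mathematical framework for quantifying predictability through relative entropy", is the relative entropy of a forecast (ensemble) density p with respect to the climatological prior π. The definition is sketched below together with the standard closed form when both densities are Gaussian; the Gaussian identity is quoted only for context and is not the paper's small-sample non-Gaussian estimator.

\[
  \mathcal{P}(p,\pi) \;=\; \int p(\mathbf{x})\,\ln\frac{p(\mathbf{x})}{\pi(\mathbf{x})}\,d\mathbf{x} \;\ge\; 0,
\]
and, for $p=\mathcal{N}(\mu_p,\Sigma_p)$ and $\pi=\mathcal{N}(\mu_\pi,\Sigma_\pi)$ in $d$ dimensions,
\[
  \mathcal{P}(p,\pi) \;=\; \tfrac{1}{2}\Bigl[\ln\frac{\det\Sigma_\pi}{\det\Sigma_p}
    \;-\; d \;+\; \operatorname{tr}\!\bigl(\Sigma_\pi^{-1}\Sigma_p\bigr)
    \;+\; (\mu_p-\mu_\pi)^{\top}\Sigma_\pi^{-1}(\mu_p-\mu_\pi)\Bigr].
\]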
Related Items (17)
- Challenges in Climate Science and Contemporary Applied Mathematics
- Best probability density function for random sampled data
- A maximum entropy method for particle filtering
- Entropy measures for biological signal analyses
- Filtering skill for turbulent signals for a suite of nonlinear and linear extended Kalman filters
- Filtering nonlinear spatio-temporal chaos with autoregressive linear stochastic models
- Mathematical strategies for filtering complex systems: Regularly spaced sparse observations
- Information theory and dynamical system predictability
- An improved algorithm for the multidimensional moment-constrained maximum entropy problem
- New approximations and tests of linear fluctuation-response for chaotic nonlinear forced-dissipative dynamical systems
- A homotopy training algorithm for fully connected neural networks
- Quantifying dynamical predictability: the pseudo-ensemble approach
- Mathematical test criteria for filtering complex systems: Plentiful observations
- An equation-by-equation method for solving the multidimensional moment constrained maximum entropy problem
- The multidimensional moment-constrained maximum entropy problem: A BFGS algorithm with constraint scaling
- SENSITIVITY ANALYSIS OF NONLINEAR MODELS TO PARAMETER PERTURBATIONS FOR SMALL SIZE ENSEMBLES OF MODEL OUTPUTS
- A practical computational framework for the multidimensional moment-constrained maximum entropy principle
Cites Work
- Statistical mechanics for truncations of the Burgers-Hopf equation: a model for intrinsic stochastic behavior with scaling
- A mathematical framework for quantifying predictability through relative entropy
- A practical computational framework for the multidimensional moment-constrained maximum entropy principle
- Weighing the Odds
- Information decay and the predictability of turbulent flows
- Moment-type estimation in the exponential family
- Remarkable statistical behavior for truncated Burgers–Hopf dynamics
- Quantifying Uncertainty for Non-Gaussian Ensembles in Complex Systems
- QUANTIFYING PREDICTABILITY IN A SIMPLE MODEL WITH COMPLEX FEATURES
- Hamiltonian structure and statistically relevant conserved quantities for the truncated Burgers‐Hopf equation
- Quantifying predictability in a model with statistical features of the atmosphere
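Several of the related and cited items above concern the multidimensional moment-constrained maximum entropy problem, which supplies the non-Gaussian densities entering the relative-entropy calculation. The following is a minimal one-dimensional sketch of the standard dual (Lagrange-multiplier) formulation, solved with BFGS on a fixed grid; the domain, sample, grid resolution, and number of moments K are illustrative assumptions and do not reproduce the multidimensional algorithms of the cited papers.

import numpy as np
from scipy.optimize import minimize

# Bounded grid and number of power-moment constraints (illustrative choices).
x = np.linspace(-4.0, 4.0, 2001)
dx = x[1] - x[0]
K = 4

# Target moments m_k = E[x^k], k = 1..K, taken from a small skewed sample
# (assumed data, standing in for a forecast ensemble); clipped so the sample
# stays inside the grid domain and the moments remain feasible.
rng = np.random.default_rng(0)
sample = np.clip(rng.gamma(4.0, 0.5, 200) - 2.0, -3.9, 3.9)
m = np.array([np.mean(sample**k) for k in range(1, K + 1)])

powers = np.vstack([x**k for k in range(1, K + 1)])  # shape (K, len(x))

def dual(lam):
    # Dual objective: log partition function minus lam . m (shifted for stability).
    a = lam @ powers
    amax = a.max()
    logZ = amax + np.log(np.sum(np.exp(a - amax)) * dx)
    return logZ - lam @ m

def dual_grad(lam):
    # Gradient: moments of the current max-entropy density minus the targets.
    a = lam @ powers
    w = np.exp(a - a.max())
    Z = np.sum(w) * dx
    model_m = (powers @ w) * dx / Z
    return model_m - m

res = minimize(dual, np.zeros(K), jac=dual_grad, method="BFGS")
lam = res.x

# Maximum-entropy density on the grid matching the K sample moments.
a = lam @ powers
p = np.exp(a - a.max())
p /= np.sum(p) * dx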