Statistical criticality arises in most informative representations
DOI: 10.1088/1742-5468/ab16c8 · zbMath: 1456.62011 · arXiv: 1808.00249 · OpenAlex: W2886808349 · MaRDI QID: Q3303364
Ryan John Cubero, Junghyo Jo, Yasser Roudi, Juyong Song, Matteo Marsili
Publication date: 11 August 2020
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/1808.00249
Mathematics Subject Classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (6)
- A random energy approach to deep learning
- The peculiar statistical mechanics of optimal learning machines
- Multiscale relevance and informative encoding in neuronal spike trains
- Maximal relevance and optimal learning machines
- Two halves of a meaningful text are statistically different
- Optimal work extraction and the minimum description length principle
Cites Work
- Statistical mechanics of the US Supreme Court
- Are biological systems poised at criticality?
- Learning and generalization with the information bottleneck
- Estimating the dimension of a model
- Predictability, Complexity, and Learning
- On sampling and modeling complex systems
- Criticality of mostly informative samples: a Bayesian model selection approach
- Resolution and relevance trade-offs in deep learning
- Entropy estimates of small data sets
- Zipf's Law for Cities: An Explanation
- Probability Theory
- Dissecting financial markets: sectors and states
- Toward a unified theory of efficient, predictive, and sparse coding
- The Deterministic Information Bottleneck