Predictability, Complexity, and Learning
Publication:2784814
DOI: 10.1162/089976601753195969 · zbMath: 0993.68045 · arXiv: physics/0007070 · OpenAlex: W2128957129 · Wikidata: Q40680026 · Scholia: Q40680026 · MaRDI QID: Q2784814
William Bialek, Ilya Nemenman, Naftali Tishby
Publication date: 24 April 2002
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/physics/0007070
Related Items
- Statistical signatures of structural organization: the case of long memory in renewal processes
- Permutation complexity and coupling measures in hidden Markov models
- Optimal prediction in the retina and natural motion statistics
- Symbols as self-emergent entities in an optimization process of feature extraction and predictions
- Predictive rate-distortion for infinite-order Markov processes
- Spectral simplicity of apparent complexity. I. The nondiagonalizable metadynamics of prediction
- A measure of statistical complexity based on predictive information with application to finite spin systems
- Computational mechanics of input-output processes: structured transformations and the ε-transducer
- Closure measures for coarse-graining of the tent map
- A free energy principle for biological systems
- Detecting direct associations in a network by information theoretic approaches
- Partially ordered permutation entropies
- Symbolic transfer entropy rate is equal to transfer entropy rate for bivariate finite-alphabet stationary ergodic Markov processes
- Graph-based predictable feature analysis
- Predictive information in a nonequilibrium critical model
- The Evolution of Representation in Simple Cognitive Networks
- Computation in finitary stochastic and quantum processes
- Delayed mutual information infers patterns of synaptic connectivity in a proprioceptive neural network
- Optimal Signal Estimation in Neuronal Models
- Statistical criticality arises in most informative representations
- Regularities unseen, randomness observed: Levels of entropy convergence
- Fluctuation-Dissipation Theorem and Models of Learning
- Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis
- Slowness as a Proxy for Temporal Predictability: An Empirical Comparison
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Predictive models and generative complexity
- A Revision of Coding Theory for Learning from Language
- QUANTIFYING EMERGENCE IN TERMS OF PERSISTENT MUTUAL INFORMATION
- Predictive information and explorative behavior of autonomous robots
- How should complexity scale with system size?
- Complexity through nonextensivity
- Pursuit of food versus pursuit of information in a Markovian perception-action loop model of foraging
- Chaos and complexity from quantum neural network. A study with diffusion metric in machine learning
- Essential conditions for evolution of communication within a species
- Predictive Coding and the Slowness Principle: An Information-Theoretic Approach
- Prediction and dissipation in nonequilibrium molecular sensors: conditionally Markovian channels driven by memoryful environments
- ON THE GENERATIVE NATURE OF PREDICTION
- On the computation of entropy prior complexity and marginal prior distribution for the Bernoulli model
- Maximal relevance and optimal learning machines
- Prediction, retrodiction, and the amount of information stored in the present
- Sophisticated Inference
- Factorized mutual information maximization
- Active inference, eye movements and oculomotor delays
- A geometric approach to complexity
- Excess entropy in natural language: Present state and perspectives
- Information symmetries in irreversible processes
- Natural complexity, computational complexity and depth
- Local entropy and structure in a two-dimensional frustrated system
- Surveying structural complexity in quantum many-body systems
Cites Work
- A Mathematical Theory of Communication
- Computation theory of cellular automata
- Stochastic complexity and modeling
- Toward a quantitative theory of self-generated complexity
- Modeling by shortest data description
- Mutual information, metric entropy and cumulative relative entropy risk
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Measures of statistical complexity: why?
- Physical complexity of symbolic sequences
- Guessing probability distributions from small samples
- Statistical Inference, Occam's Razor, and Statistical Mechanics on the Space of Probability Distributions
- Information-theoretic asymptotics of Bayes methods
- Prediction and Entropy of Printed English
- Universal coding, information, prediction, and estimation
- On stochastic complexity and nonparametric density estimation
- Minimum complexity density estimation
- Density estimation by stochastic complexity
- A Theory of Program Size Formally Identical to Information Theory
- A convergent gambling estimate of the entropy of English
- Field Theories for Learning Probability Distributions
- Reparametrization Invariant Statistical Inference and Gravity
- Minimum description length induction, Bayesianism, and Kolmogorov complexity
- Fisher information and stochastic complexity
- An Information Measure for Classification
- A formal theory of inductive inference. Part I
- A new look at the statistical model identification