A Revision of Coding Theory for Learning from Language
From MaRDI portal
Publication: 4923555
DOI: 10.1016/S1571-0661(05)82574-5
zbMath: 1263.68167
OpenAlex: W2188116396
MaRDI QID: Q4923555
Publication date: 24 May 2013
Published in: Electronic Notes in Theoretical Computer Science
Full work available at URL: https://doi.org/10.1016/s1571-0661(05)82574-5
Uses Software
Cites Work
- Time series: theory and methods
- Toward a quantitative theory of self-generated complexity
- Analytic models and ambiguity of context-free languages
- What is complexity?
- Prediction and entropy of nonlinear dynamical systems and symbolic sequences with LRO
- Mutual information functions versus correlation functions
- Predictability, Complexity, and Learning
- Information Theory and Statistical Mechanics
- Probability Theory
- Minimum description length induction, Bayesianism, and Kolmogorov complexity
- Regularities unseen, randomness observed: Levels of entropy convergence
- Entropic nonextensivity: A possible measure of complexity
This page was built for publication: A Revision of Coding Theory for Learning from Language