Excess entropy in natural language: Present state and perspectives
DOI: 10.1063/1.3630929
zbMath: 1317.68243
arXiv: 1105.1306
OpenAlex: W1777730981
Wikidata: Q37942686 (Scholia: Q37942686)
MaRDI QID: Q5264343
Publication date: 27 July 2015
Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science
Full work available at URL: https://arxiv.org/abs/1105.1306
Mathematics Subject Classification:
- Measures of information, entropy (94A17)
- Information theory (general) (94A15)
- Natural language processing (68T50)
- Communication theory (94A05)
Related Items (5)
- Predictive rate-distortion for infinite-order Markov processes
- Constant conditional entropy and related hypotheses
- On hidden Markov processes with infinite excess entropy
- Two halves of a meaningful text are statistically different
- Natural complexity, computational complexity and depth
Cites Work
- Variable-length coding of two-sided asymptotically mean stationary measures
- On a definition of random sequences with respect to conditional probability
- A general definition of conditional information and its application to ergodic decomposition
- Asymptotically mean stationary measures
- Word frequency and entropy of symbolic sequences: A dynamical perspective
- String matching bounds via coding
- Maximum entropy fundamentals
- Strong, weak and false inverse power laws
- Predictability, Complexity, and Learning
- On a Class of Skew Distribution Functions
- The Smallest Grammar Problem
- Prediction and Entropy of Printed English
- The complexity of songs
- A Theory of Program Size Formally Identical to Information Theory
- A convergent gambling estimate of the entropy of English
- Universal redundancy rates do not exist
- Grammar-based codes: a new class of universal lossless source codes
- On the Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts
- An Example of Statistical Investigation of the Text Eugene Onegin Concerning the Connection of Samples in Chains
- Least effort and the origins of scaling in human language
- Regularities unseen, randomness observed: Levels of entropy convergence
- Computational mechanics: pattern and prediction, structure and simplicity
- Complexity through nonextensivity