Shannon Entropy Rate of Hidden Markov Processes
DOI: 10.1007/S10955-021-02769-3
arXiv: 2008.12886
MaRDI QID: Q6347958
Alexandra M. Jurgens, James P. Crutchfield
Publication date: 28 August 2020
Abstract: Hidden Markov chains are widely applied statistical models of stochastic processes, from fundamental physics and chemistry to finance, health, and artificial intelligence. The hidden Markov processes they generate are notoriously complicated, however, even when the chain is finite-state: no finite expression for their Shannon entropy rate exists, as the set of their predictive features is generically infinite. As such, to date one cannot make general statements about how random they are or how structured they are. Here, we address the first part of this challenge by showing how to efficiently and accurately calculate their entropy rates. We also show how this method gives the minimal set of infinite predictive features. A sequel addresses the challenge's second part on structure.
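The central quantity in the abstract is the Shannon entropy rate h_μ = lim_{L→∞} H[X_0 | X_{−L} … X_{−1}], the asymptotic per-symbol uncertainty of the generated process. As a hedged illustration of the kind of computation involved (a minimal sketch of a standard belief-state estimate, not necessarily the paper's own algorithm), the following Python snippet estimates h_μ for a small hidden Markov chain by iterating the observer's belief distribution over hidden states and averaging −log2 of each emitted symbol's predicted probability. The labeled transition matrices are invented example parameters, not taken from the publication.

```python
import numpy as np

# Illustrative 2-state hidden Markov chain over a binary alphabet.
# T[x][i, j] = Pr(next state j, emit symbol x | current state i).
# These matrices are made-up example parameters (each row of T[0] + T[1] sums to 1).
T = {
    0: np.array([[0.5, 0.2],
                 [0.0, 0.3]]),
    1: np.array([[0.0, 0.3],
                 [0.4, 0.3]]),
}

rng = np.random.default_rng(0)


def estimate_entropy_rate(T, n_steps=100_000):
    """Monte Carlo estimate of the Shannon entropy rate in bits per symbol.

    Iterates the belief (mixed-state) distribution over hidden states,
    samples each symbol from its predicted probability, and averages
    -log2 Pr(symbol | past); for an ergodic mixed-state process this
    converges to the entropy rate.
    """
    n_states = next(iter(T.values())).shape[0]
    symbols = list(T.keys())

    # Start the belief at the stationary distribution of the full chain.
    M = sum(T.values())
    evals, evecs = np.linalg.eig(M.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    belief = pi / pi.sum()

    total = 0.0
    for _ in range(n_steps):
        # Predicted probability of each next symbol given the current belief.
        probs = np.array([belief @ T[x] @ np.ones(n_states) for x in symbols])
        probs /= probs.sum()
        x = rng.choice(symbols, p=probs)
        total += -np.log2(probs[x])
        # Bayesian update of the belief after observing symbol x.
        belief = belief @ T[x]
        belief /= belief.sum()
    return total / n_steps


print(f"estimated entropy rate: {estimate_entropy_rate(T):.4f} bits/symbol")
```

Longer runs (larger n_steps) reduce the Monte Carlo error; the sketch deliberately sidesteps the paper's point that the set of reachable belief states is generically infinite, which is why closed-form expressions for h_μ are unavailable.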
Mathematics Subject Classification:
Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.) (60J20)
Measures of information, entropy (94A17)