Conditional limit theorems under Markov conditioning
From MaRDI portal
Publication: 3764957
DOI: 10.1109/TIT.1987.1057385 · zbMath: 0628.60037 · MaRDI QID: Q3764957
Thomas M. Cover, Imre Csiszár, Byoung-Seon Choi
Publication date: 1987
Published in: IEEE Transactions on Information Theory
Keywords: Kullback-Leibler information · stationary distribution · conditional limit theorems · sliding block sample average
Central limit and other weak theorems (60F05) · Markov chains (discrete-time Markov processes on discrete state spaces) (60J10)
Related Items (19)
Predictive stochastic complexity and model estimation for finite-state processes ⋮ On the maximum entropy principle for a class of stochastic processes ⋮ Graph-combinatorial approach for large deviations of Markov chains ⋮ The maximum entropy rate description of a thermodynamic system in a stationary non-equilibrium state ⋮ Maximum entropy estimation of transition probabilities of reversible Markov chains ⋮ Refinements of the Gibbs conditioning principle ⋮ A lower bound on the quantum capacity of channels with correlated errors ⋮ On the VC-Dimension of Binary Codes ⋮ Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem ⋮ Entropy at a weight-per-symbol and embeddings of Markov chains ⋮ A Riemannian manifold analysis of endothelial cell monolayer impedance parameter precision ⋮ Nonequilibrium Markov processes conditioned on large deviations ⋮ An Elementary Derivation of the Large Deviation Rate Function for Finite State Markov Chains ⋮ A limit theorem for one-dimensional Gibbs measures under conditions on the empirical field ⋮ An alternative construction of normal numbers ⋮ On the maximum entropy principle for uniformly ergodic Markov chains ⋮ Maximum entropy principles for disordered spins ⋮ Expectations for nonreversible Markov chains ⋮ Information geometry of reversible Markov chains
This page was built for publication: Conditional limit theorems under Markov conditioning