Markov Chains and De-initializing Processes


DOI: 10.1111/1467-9469.00250
zbMath: 0985.60067
OpenAlex: W2128598176
MaRDI QID: Q2771553

Jeffrey S. Rosenthal, Gareth O. Roberts

Publication date: 17 February 2002

Published in: Scandinavian Journal of Statistics

Full work available at URL: https://doi.org/10.1111/1467-9469.00250



Related Items

Polynomial convergence rates of Markov chains
Geometric ergodicity of Pólya-Gamma Gibbs sampler for Bayesian logistic regression with a flat prior
Sandwich algorithms for Bayesian variable selection
Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models
Geometric ergodicity of Gibbs samplers for the horseshoe and its regularized variants
Convergence rates of two-component MCMC samplers
Analysis of the Pólya-gamma block Gibbs sampler for Bayesian logistic linear mixed models
On the convergence rate of the "out-of-order" block Gibbs sampler
Shrinkage with shrunken shoulders: Gibbs sampling shrinkage model posteriors with guaranteed convergence rates
The computational cost of blocking for sampling discretely observed diffusions
Strong invariance principles for ergodic Markov processes
Slice sampling. (With discussions and rejoinder)
Gibbs sampling for a Bayesian hierarchical general linear model
Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
Improving the convergence properties of the data augmentation algorithm with an application to Bayesian mixture modeling
On reparametrization and the Gibbs sampler
Stability of the Gibbs sampler for Bayesian hierarchical models
Convergence of Conditional Metropolis-Hastings Samplers
Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition
Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
Convergence complexity analysis of Albert and Chib's algorithm for Bayesian probit regression
On the convergence complexity of Gibbs samplers for a family of simple Bayesian random effects models
Block Gibbs samplers for logistic mixed models: convergence properties and a comparison with full Gibbs samplers
Convergence rate bounds for iterative random functions using one-shot coupling
Bayesian inference for high-dimensional linear regression under mnet priors
Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors
The polar slice sampler
Wasserstein-based methods for convergence complexity analysis of MCMC with applications