Convergence rates of two-component MCMC samplers
From MaRDI portal
Publication: Q2136999
DOI: 10.3150/21-BEJ1369
MaRDI QID: Q2136999
Publication date: 16 May 2022
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2006.14801
Cites Work
- Geometric ergodicity of the Bayesian Lasso
- Markov chain Monte Carlo estimation of quantiles
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Positivity of hit-and-run and related algorithms
- A gentle guide to the basics of two projections theory
- Gibbs sampling, exponential families and orthogonal polynomials
- Comment: ``Gibbs sampling, exponential families, and orthogonal polynomials''
- Comment: On random scan Gibbs samplers
- Markov chain Monte Carlo: can we trust the third significant figure?
- Spectral bounds for certain two-factor non-reversible MCMC algorithms
- On the Markov chain central limit theorem
- Error bounds for the method of alternating projections
- Comparing sweep strategies for stochastic relaxation
- On rates of convergence of stochastic relaxation for Gaussian and non-Gaussian distributions
- Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model
- A note on Metropolis-Hastings kernels for general state spaces
- Two convergence properties of hybrid samplers
- Information bounds for Gibbs samplers
- Geometric ergodicity and hybrid Markov chains
- Honest exploration of intractable probability distributions via Markov chain Monte Carlo
- Convergence control methods for Markov chain Monte Carlo algorithms
- Geometric ergodicity of Pólya-Gamma Gibbs sampler for Bayesian logistic regression with a flat prior
- Strong consistency of multivariate spectral variance estimators in Markov chain Monte Carlo
- Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
- Coordinate selection rules for Gibbs sampling
- Sufficient burn-in for Gibbs samplers for a hierarchical random effects model
- Markov chains for exploring posterior distributions. (With discussion)
- Rates of convergence of the Hastings and Metropolis algorithms
- Convergence properties of the Gibbs sampler for perturbations of Gaussians
- Convergence rates for MCMC algorithms for a robust Bayesian binary regression model
- Multivariate initial sequence estimators in Markov chain Monte Carlo
- Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions
- Batch means and spectral variance estimators in Markov chain Monte Carlo
- Markov Chains and De-initializing Processes
- Hybrid Samplers for Ill-Posed Inverse Problems
- Geometric L2 and L1 convergence are equivalent for reversible Markov chains
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Handbook of Markov Chain Monte Carlo
- The Calculation of Posterior Distributions by Data Augmentation
- Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
- Norm Inequalities for C*-algebras
- The Norm of the Sum of Two Projections
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- On the geometric ergodicity of hybrid samplers
- On the applicability of regenerative simulation in Markov chain Monte Carlo
- Multivariate output analysis for Markov chain Monte Carlo
- Assessing and Visualizing Simultaneous Simulation Error
- Convergence of Conditional Metropolis-Hastings Samplers
- On random- and systematic-scan samplers
- Geometric Ergodicity of van Dyk and Meng's Algorithm for the Multivariate Student's t Model
- On the Geometric Ergodicity of Two-Variable Gibbs Samplers
- Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition