On a Metropolis-Hastings importance sampling estimator
DOI: 10.1214/20-EJS1680
zbMath: 1450.62104
arXiv: 1805.07174
OpenAlex: W3098404235
MaRDI QID: Q2180048
Publication date: 13 May 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1805.07174
Keywords: Markov chains; importance sampling; central limit theorem; variance reduction; Metropolis-Hastings algorithm
MSC classification: Computational methods in Markov chains (60J22) Computational methods for problems pertaining to statistics (62-08) Central limit and other weak theorems (60F05) Bayesian inference (62F15) Markov processes: estimation; hidden Markov models (62M05) Discrete-time Markov processes on general state spaces (60J05)
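As background on the technique named in the title and keywords, here is a minimal Python sketch of importance sampling driven by a Metropolis-Hastings chain: a random-walk Metropolis chain targets an auxiliary distribution mu, and expectations under the target pi are recovered by self-normalized reweighting with w = dpi/dmu. The Gaussian toy densities, test function, step size, and chain length are illustrative assumptions, not taken from the paper; this sketches the general construction rather than the specific estimator analyzed there.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_mu(x):
    # Log-density (up to an additive constant) of the auxiliary
    # distribution mu = N(0, 1) that the Metropolis chain targets.
    return -0.5 * x**2

def log_weight(x):
    # Log of the unnormalized importance weight w = dpi/dmu; here the
    # target is pi(x) proportional to mu(x) * exp(-(x - 1)^2), with the
    # second factor playing the role of a toy likelihood (an assumption
    # made for this example only).
    return -(x - 1.0)**2

def mh_chain(n, step=1.0):
    # Random-walk Metropolis chain with N(0, step^2) increments targeting mu.
    xs = np.empty(n)
    x = 0.0
    for k in range(n):
        y = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_mu(y) - log_mu(x):
            x = y  # accept the proposal
        xs[k] = x
    return xs

# Self-normalized importance sampling estimate of E_pi[f] with f(x) = x,
# driven by the Metropolis chain for mu.
xs = mh_chain(50_000)
logw = log_weight(xs)
w = np.exp(logw - logw.max())          # shift log-weights for numerical stability
estimate = np.sum(w * xs) / np.sum(w)
print(estimate)                        # exact value is 2/3 for this toy model
```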
Related Items
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Markov Chain Importance Sampling—A Highly Efficient Estimator for MCMC
- Gradient-based adaptive importance samplers
- MCMC-driven importance samplers
Cites Work
- Curvature, concentration and error estimates for Markov chain Monte Carlo
- Rigorous confidence bounds for MCMC under a geometric drift condition
- Markov chains and stochastic stability
- Markov chain importance sampling with applications to rare event probability estimation
- Spectral bounds for certain two-factor non-reversible MCMC algorithms
- Optimal importance sampling for the approximation of integrals
- On the Markov chain central limit theorem
- Explicit error bounds for lazy reversible Markov chain Monte Carlo
- Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions
- A note on Metropolis-Hastings kernels for general state spaces
- Geometric ergodicity and hybrid Markov chains
- Optimal scaling for various Metropolis-Hastings algorithms
- The sample size required in importance sampling
- Importance sampling: intrinsic dimension and computational cost
- Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation
- Central limit theorems for additive functionals of Markov chains
- Markov chains for exploring posterior distributions (with discussion)
- Rates of convergence of the Hastings and Metropolis algorithms
- Geometric ergodicity and the spectral gap of non-reversible Markov chains
- Kinetic methods for inverse problems
- Layered adaptive importance sampling
- Inference in hidden Markov models
- A vanilla Rao-Blackwellization of Metropolis-Hastings algorithms
- Nonasymptotic bounds on the estimation error of MCMC algorithms
- Simple Monte Carlo and the Metropolis algorithm
- Explicit error bounds for Markov chain Monte Carlo
- Error bounds for computing the expectation by Markov chain Monte Carlo
- Does Waste Recycling Really Improve the Multi-Proposal Metropolis–Hastings Algorithm? An Analysis Based on Control Variates
- Analysis of the Ensemble and Polynomial Chaos Kalman Filters in Bayesian Inverse Problems
- Rao-Blackwellisation of sampling schemes
- Markov Chain Importance Sampling—A Highly Efficient Estimator for MCMC
- Interacting Langevin Diffusions: Gradient Structure and Ensemble Kalman Sampler