Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo
From MaRDI portal
Publication: 2221416
DOI: 10.1016/j.jcp.2019.04.043
zbMath: 1452.65007
arXiv: 1807.05507
OpenAlex: W2951193522
Wikidata: Q127999518
Scholia: Q127999518
MaRDI QID: Q2221416
Publication date: 26 January 2021
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/1807.05507
Keywords: dimension reduction; uncertainty quantification; Bayesian inverse problems; high-dimensional sampling; infinite-dimensional geometric Markov chain Monte Carlo method
MSC classifications: Bayesian inference (62F15); Monte Carlo methods (65C05); PDE constrained optimization (numerical aspects) (49M41)
Related Items
- Continuum limit and preconditioned Langevin sampling of the path integral molecular dynamics
- Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
- Reduced-order model-based variational inference with normalizing flows for Bayesian elliptic inverse problems
- Multilevel Delayed Acceptance MCMC
- Bayesian spatiotemporal modeling for inverse problems
- A data-driven and model-based accelerated Hamiltonian Monte Carlo method for Bayesian elliptic inverse problems
- MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
Uses Software
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Automated solution of differential equations by the finite element method. The FEniCS book
- Hybrid Monte Carlo on Hilbert spaces
- Emulation of higher-order tensors in manifold Monte Carlo methods for Bayesian inverse problems
- A note on Metropolis-Hastings kernels for general state spaces
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- Geometric MCMC for infinite-dimensional inverse problems
- Hamiltonian Monte Carlo acceleration using surrogate functions with random bases
- On a generalization of the preconditioned Crank-Nicolson Metropolis algorithm
- Proposals which speed up function-space MCMC
- Precomputing strategy for Hamiltonian Monte Carlo method based on regularity in parameter space
- Dimension-independent likelihood-informed MCMC
- A stable manifold MCMC method for high dimensions
- Langevin diffusions and the Metropolis-adjusted Langevin algorithm
- Accelerating Markov Chain Monte Carlo with Active Subspaces
- Inverse problems: A Bayesian perspective
- A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
- Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces
- Likelihood-informed dimension reduction for nonlinear inverse problems
- Active Subspaces
- Randomized algorithms for the low-rank approximation of matrices
- The Geometry of Random Fields
- Sequential Monte Carlo Samplers
- Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
- Algorithms for Kullback--Leibler Approximation of Probability Measures in Infinite Dimensions
- MCMC methods for diffusion bridges
- On the small balls problem for equivalent Gaussian measures
- Turbulent Flows
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Computational Methods for Inverse Problems
- Stochastic Equations in Infinite Dimensions
- Randomized algorithms for generalized Hermitian eigenvalue problems with application to computing Karhunen–Loève expansion
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems
- MCMC methods for functions: modifying old algorithms to make them faster