Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction
DOI: 10.1016/j.jcp.2023.112103 · arXiv: 2106.04170 · MaRDI QID: Q6158090
Olivier Zahm, Sergey V. Dolgov, Tiangang Cui
Publication date: 31 May 2023
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/2106.04170
Keywords: inverse problems · Markov chain Monte Carlo · dimension reduction · approximate Bayesian computation · generative models · tensor train · transport maps
MSC classifications: Parametric inference (62Fxx) · Numerical methods for partial differential equations, boundary value problems (65Nxx) · Probabilistic methods, stochastic differential equations (65Cxx)
Related Items (2)
Cites Work
- Tensor Decompositions and Applications
- The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo
- TT-cross approximation for multidimensional arrays
- CUR matrix decompositions for improved data analysis
- Besov priors for Bayesian inverse problems
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Bayesian inference for differential equations
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- A note on an inequality involving the normal distribution
- Pseudo-skeleton approximations by matrices of maximal volume
- Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality
- A continuous analogue of the tensor-train decomposition
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Discretization-invariant Bayesian inversion and Besov space priors
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- Monte Carlo strategies in scientific computing.
- Sparse-grid, reduced-basis Bayesian inversion
- Data-Driven Optimal Transport
- Spectral Tensor-Train Decomposition
- Sparse deterministic approximation of Bayesian inverse problems
- Inverse problems: A Bayesian perspective
- Alternating Minimal Energy Methods for Linear Systems in Higher Dimensions
- Data-driven model reduction for the Bayesian solution of inverse problems
- Parameter and State Model Reduction for Large-Scale Statistical Inverse Problems
- Transport Map Accelerated Markov Chain Monte Carlo
- Sequential Monte Carlo Samplers
- MCMC METHODS FOR DIFFUSION BRIDGES
- Handbook of Markov Chain Monte Carlo
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- A sequential particle filter method for static models
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- A Family of Nonparametric Density Estimation Algorithms
- Parameter Estimation for Differential Equations: a Generalized Smoothing Approach
- Multifidelity Dimension Reduction via Active Subspaces
- An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems
- A Hybrid Alternating Least Squares--TT-Cross Algorithm for Parametric PDEs
- Non‐linear model reduction for uncertainty quantification in large‐scale inverse problems
- Remarks on a Multivariate Transformation
- Nonlinear Reduced Models for State and Parameter Estimation
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- MCMC methods for functions: modifying old algorithms to make them faster