Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format
DOI: 10.1137/20M1314653
zbMath: 1505.65195
arXiv: 2001.08187
OpenAlex: W3002052860
MaRDI QID: Q5052901
Paul B. Rohrbach, Lars Grasedyck, Sergey V. Dolgov, Robert Scheichl
Publication date: 25 November 2022
Published in: SIAM/ASA Journal on Uncertainty Quantification
Full work available at URL: https://arxiv.org/abs/2001.08187
MSC classification:
- Approximation by polynomials (41A10)
- Algorithms for approximation of functions (65D15)
- Numerical quadrature and cubature formulas (65D32)
- Multilinear algebra, tensor calculus (15A69)
- Numerical linear algebra (65F99)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cites Work
- Fast Sampling of Gaussian Markov Random Fields
- Tensor-Train Decomposition
- TT-cross approximation for multidimensional arrays
- Low-rank tensor structure of linear diffusion operators in the TT and QTT formats
- Iterative numerical methods for sampling from high dimensional Gaussian distributions
- The best \(L^2\)-approximation by finite sums of functions with separable variables
- Structure ranks of matrices
- Greedy approximation of high-dimensional Ornstein-Uhlenbeck operators
- Constructive representation of functions in low-rank tensor formats
- A continuous analogue of the tensor-train decomposition
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Dynamic tensor approximation of high-dimensional nonlinear PDEs
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- A tensor decomposition algorithm for large ODEs with conservation laws
- Adaptive stochastic Galerkin FEM with hierarchical tensor representations
- Accelerated Gibbs sampling of normal distributions using matrix splittings and polynomials
- Stochastic processes and filtering theory
- Spectral Tensor-Train Decomposition
- Two-Level QTT-Tucker Format for Optimized Tensor Calculus
- A literature survey of low-rank tensor approximation techniques
- Inverse problems: A Bayesian perspective
- MCMC-Based Image Reconstruction with Uncertainty Quantification
- An Introduction to Computational Stochastic PDEs
- Hierarchical Singular Value Decomposition of Tensors
- The density-matrix renormalization group
- Tensor-Structured Galerkin Approximation of Parametric and Stochastic Elliptic PDEs
- Tensor Spaces and Numerical Tensor Calculus
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
- Fast Sampling in a Linear-Gaussian Inverse Problem
- The vectorization of ITPACK 2C
- Hierarchical Tensor Approximation of Output Quantities of Parameter-Dependent PDEs
- Polynomial Chaos Expansion of Random Coefficients and the Solution of Stochastic Partial Differential Equations in the Tensor Train Format
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- Fast Solution of Parabolic Problems in the Tensor Train/Quantized Tensor Train Format with Initial Application to the Fokker–Planck Equation
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
- A Hybrid Alternating Least Squares–TT-Cross Algorithm for Parametric PDEs
- Fast Multidimensional Convolution in Low-Rank Tensor Formats via Cross Approximation
- Sparse grids
- Tensor numerical methods for multidimensional PDEs: theoretical analysis and initial applications