Cholesky-Based Experimental Design for Gaussian Process and Kernel-Based Emulation and Calibration
From MaRDI portal
Publication:5163220
DOI: 10.4208/cicp.OA-2020-0060 · zbMath: 1473.62089 · OpenAlex: W3102762050 · MaRDI QID: Q5163220
John D. Jakeman, Peter Zaspel, Helmut Harbrecht
Publication date: 3 November 2021
Published in: Communications in Computational Physics
Full work available at URL: https://doi.org/10.4208/cicp.oa-2020-0060
Keywords: Gaussian process, radial basis function, Bayesian inference, active learning, uncertainty quantification, experimental design
Related Items (1)
Uses Software
Cites Work
- Unnamed Item
- Tensor-Train Decomposition
- On the low-rank approximation by the pivoted Cholesky decomposition
- Bases for kernel-based spaces
- Numerical approach for quantification of epistemic uncertainty
- A Newton basis for kernel spaces
- Pseudo-skeleton approximations by matrices of maximal volume
- Approximation of boundary element matrices
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Bayesian deep convolutional encoder-decoder networks for surrogate modeling and uncertainty quantification
- Near-optimal data-independent point locations for radial basis function interpolation
- Error estimates and condition numbers for radial basis function interpolation
- Goal-oriented adaptive surrogate construction for stochastic inversion
- Polynomial chaos expansions for dependent random variables
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems
- Gradient-based optimization for regression in the functional tensor-train format
- A flexible numerical approach for quantification of epistemic uncertainty
- Linearly constrained reconstruction of functions by kernels with applications to machine learning
- A general multipurpose interpolation procedure: The magic points
- Inverse problems: A Bayesian perspective
- Computing Multivariate Fekete and Leja Points by Numerical Linear Algebra
- Kernel techniques: From machine learning to meshless methods
- A Sparse Grid Stochastic Collocation Method for Partial Differential Equations with Random Input Data
- Local error estimates for radial basis function interpolation of scattered data
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Probabilistic Sensitivity Analysis of Complex Models: A Bayesian Approach
- Comparison of Some Reduced Representation Approximations
- Bayesian Model Calibration with Interpolating Polynomials based on Adaptively Weighted Leja Nodes
- An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems
- High-Order Collocation Methods for Differential Equations with Random Inputs
- Mercer Kernels and Integrated Variance Experimental Design: Connections Between Gaussian Process Regression and Polynomial Approximation