Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions

Publication: 5220403

DOI: 10.1137/18M1221837
zbMath: 1433.41007
arXiv: 1801.07922
MaRDI QID: Q5220403

Olivier Zahm, Youssef M. Marzouk, Paul G. Constantine, Clémentine Prieur

Publication date: 20 March 2020

Published in: SIAM Journal on Scientific Computing

Full work available at URL: https://arxiv.org/abs/1801.07922




Related Items (23)

Preintegration via Active Subspace
Reduced basis methods for time-dependent problems
Prior normalization for certified likelihood-informed subspace detection of Bayesian inverse problems
Certified dimension reduction in nonlinear Bayesian inverse problems
Characterization of flow through random media via Karhunen-Loève expansion: an information theory perspective
Generalized bounds for active subspaces
Model Reduction for Nonlinear Systems by Balanced Truncation of State and Gradient Covariance
Learning high-dimensional parametric maps via reduced basis adaptive residual networks
Kernel-based active subspaces with application to computational fluid dynamics parametric problems using the discontinuous Galerkin method
Multifidelity Dimension Reduction via Active Subspaces
On the Deep Active-Subspace Method
Multi-fidelity data fusion through parameter space reduction with applications to automotive engineering
Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
An efficient dimension reduction for the Gaussian process emulation of two nested codes with functional outputs
Modified Active Subspaces Using the Average of Gradients
A distributed active subspace method for scalable surrogate modeling of function valued outputs
Embedded ridge approximations
Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs
Structure exploiting methods for fast uncertainty quantification in multiphase flow through heterogeneous media
Derivative-Based Global Sensitivity Analysis for Models with High-Dimensional Inputs and Functional Outputs
A Supervised Learning Approach Involving Active Subspaces for an Efficient Genetic Algorithm in High-Dimensional Optimization Problems
Data-free likelihood-informed dimension reduction of Bayesian inverse problems


