Neural and spectral operator surrogates: unified construction and expression rate bounds
From MaRDI portal
Publication: 6601288
DOI: 10.1007/s10444-024-10171-2
zbMATH Open: 1547.35228
MaRDI QID: Q6601288
Jakob Zech, Christoph Schwab, Lukas Herrmann
Publication date: 10 September 2024
Published in: Advances in Computational Mathematics
Classification (MSC):
- Artificial neural networks and deep learning (68T07)
- Spectral, collocation and related methods for boundary value problems involving PDEs (65N35)
- Second-order elliptic equations (35J15)
- Rate of convergence, degree of approximation (41A25)
- Numerical approximation of high-dimensional functions; sparse grids (65D40)
Cites Work
- Further analysis of multilevel Monte Carlo methods for elliptic PDEs with random coefficients
- High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs
- Parabolic molecules
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- A basis theory primer.
- Continuous shearlet tight frames
- Hierarchical Riesz bases for \(H^{s}(\Omega),\;1 < s < \frac{5}{2}\)
- Function spaces and wavelets on domains
- Multilevel frames for sparse tensor product spaces
- Bases in function spaces, sampling, discrepancy, numerical integration
- On a BPX-preconditioner for P1 elements
- On the Lebesgue constant of Leja sequences for the complex unit disk and of their real projection
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Constructive deep ReLU neural network approximation
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- The modern mathematics of deep learning
- Quantifying total uncertainty in physics-informed neural networks for solving forward and inverse stochastic problems
- Solving many-electron Schrödinger equation using deep neural networks
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Sparse Tensor Discretization of Elliptic sPDEs
- Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDEs
- Sparse tensor discretizations of high-dimensional parametric and stochastic PDEs
- Estimates near the boundary for solutions of elliptic partial differential equations satisfying general boundary conditions. I
- Convergence Estimates for Multigrid Algorithms without Regularity Assumptions
- Adaptive wavelet methods for solving operator equations: An overview
- Orthonormal bases of compactly supported wavelets
- Electromagnetic wave scattering by random surfaces: Shape holomorphy
- Solving ill-posed inverse problems using iterative deep neural networks
- Shape Holomorphy of the Stationary Navier--Stokes Equations
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- Element-by-Element Construction of Wavelets Satisfying Stability and Moment Conditions
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- MIONet: Learning Multiple-Input Operators via Tensor Product
- Error estimates for DeepONets: a deep learning framework in infinite dimensions
- Convergence rates of high dimensional Smolyak quadrature
- Deep ReLU networks and high-order finite element methods
- Finite Neuron Method and Convergence Analysis
- Approximation of high-dimensional parametric PDEs
- Fully Discrete Approximation of Parametric and Stochastic Elliptic PDEs
- Estimates near the boundary for solutions of elliptic partial differential equations satisfying general boundary conditions II
- An introduction to frames and Riesz bases
- De Rham compatible deep neural network FEM
- Exponential ReLU neural network approximation rates for point and edge singularities
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations
- Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(L^2(\mathbb{R}^d,\gamma_d)\)
- Convergence rate of DeepONets for learning operators arising from advection-diffusion equations
- Analyticity and sparsity in uncertainty quantification for PDEs with Gaussian random field inputs