Exponential ReLU DNN expression of holomorphic maps in high dimension
From MaRDI portal
Publication: 2117341
DOI: 10.1007/s00365-021-09542-5
zbMath: 1500.41008
OpenAlex: W3161792008
Wikidata: Q115607850 (Scholia: Q115607850)
MaRDI QID: Q2117341
Joost A. A. Opschoor, J. Zech, Christoph Schwab
Publication date: 21 March 2022
Published in: Constructive Approximation
Full work available at URL: https://doi.org/10.1007/s00365-021-09542-5
MSC classifications: Artificial neural networks and deep learning (68T07) ⋮ Multidimensional problems (41A63) ⋮ Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Related Items (23)
Structure probing neural network deflation ⋮ Neural network approximation ⋮ Deep solution operators for variational inequalities via proximal neural networks ⋮ ReLU deep neural networks from the hierarchical basis perspective ⋮ Variational physics informed neural networks: the role of quadratures and test functions ⋮ Physics Informed Neural Networks (PINNs) For Approximating Nonlinear Dispersive PDEs ⋮ Sparse approximation of triangular transports. I: The finite-dimensional case ⋮ De Rham compatible deep neural network FEM ⋮ Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs ⋮ Approximation theory of tree tensor networks: tensorized univariate functions ⋮ Approximation error for neural network operators by an averaged modulus of smoothness ⋮ Exponential ReLU neural network approximation rates for point and edge singularities ⋮ Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations ⋮ Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\) ⋮ Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class ⋮ Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs ⋮ Optimal approximation of infinite-dimensional holomorphic functions ⋮ Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting) ⋮ Convergence rate of DeepONets for learning operators arising from advection-diffusion equations ⋮ Deep ReLU network expression rates for option prices in high-dimensional, exponential Lévy models ⋮ Higher-Order Quasi-Monte Carlo Training of Deep Neural Networks ⋮ Solving parametric partial differential equations with deep rectified quadratic unit neural networks ⋮ Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations
Cites Work
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- An upper estimate of integral points in real simplices with an application to singularity theory
- Gevrey class regularity for the solutions of the Navier-Stokes equations
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- Multilayer feedforward networks are universal approximators
- Approximation properties of a multilayered feedforward artificial neural network
- Exponential convergence of the deep neural network approximation for analytic functions
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Exponential convergence in \(H^1\) of \textit{hp}-FEM for Gevrey regularity with isotropic singularities
- Error bounds for approximations with deep ReLU networks
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- Analysis of quasi-optimal polynomial approximations for parameterized PDEs with deterministic and stochastic coefficients
- Polynomial approximation of anisotropic analytic functions of several variables
- $hp$-DGFEM for Second Order Elliptic Problems in Polyhedra II: Exponential Convergence
- An Anisotropic Sparse Grid Stochastic Collocation Method for Partial Differential Equations with Random Input Data
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- Multilevel approximation of parametric and stochastic PDES
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Convergence rates of high dimensional Smolyak quadrature
- Deep ReLU networks and high-order finite element methods
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
- Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
- Exponential convergence of \textit{hp}-FEM for Maxwell equations with weighted regularization in polygonal domains
This page was built for publication: Exponential ReLU DNN expression of holomorphic maps in high dimension