SelectNet: self-paced learning for high-dimensional partial differential equations
DOI: 10.1016/j.jcp.2021.110444 · OpenAlex: W3000403725 · Wikidata: Q115350072 · Scholia: Q115350072 · MaRDI QID: Q2131038
Yiqi Gu, Chao Zhou, Haizhao Yang
Publication date: 25 April 2022
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/2001.04860
Keywords: convergence; least squares method; high-dimensional PDEs; self-paced learning; deep neural networks; selected sampling
MSC classification: Artificial intelligence (68Txx); Numerical methods for ordinary differential equations (65Lxx); Numerical methods for partial differential equations, initial value and time-dependent initial-boundary value problems (65Mxx)
Cites Work
- Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
- Weak adversarial networks for high-dimensional partial differential equations
- Numerical solution for high order differential equations using a hybrid neural network-optimization method
- Lower bounds for approximation by MLP neural networks
- Multilayer feedforward networks are universal approximators
- The Deep Ritz Method: a deep learning-based numerical algorithm for solving variational problems
- Exponential convergence of the deep neural network approximation for analytic functions
- DGM: a deep learning algorithm for solving partial differential equations
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Nonlinear approximation via compositions
- Overcoming the curse of dimensionality in the approximative pricing of financial derivatives with default risks
- A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
- A priori estimates of the population risk for two-layer neural networks
- Error bounds for approximations with deep ReLU networks
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Solving for high-dimensional committor functions using artificial neural networks
- Sparse grid spaces for the numerical solution of the electronic Schrödinger equation
- Neural algorithm for solving differential equations
- Symmetric positive linear differential equations
- A fast, stable and accurate numerical method for the Black–Scholes equation of American options
- Universal approximation bounds for superpositions of a sigmoidal function
- Neural‐network‐based approximations for solving partial differential equations
- Solving high-dimensional partial differential equations using deep learning
- Adaptive Deep Learning for High-Dimensional Hamilton–Jacobi–Bellman Equations
- Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
- Approximation by superpositions of a sigmoidal function