Convergence analysis for over-parameterized deep learning
Publication: 6608346
DOI: 10.4208/cicp.oa-2023-0264
zbMATH Open: 1545.68112
MaRDI QID: Q6608346
Jerry Zhijian Yang, Yu Ling Jiao, Xiliang Lu, Unnamed Author
Publication date: 19 September 2024
Published in: Communications in Computational Physics
Mathematics Subject Classification (MSC):
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Artificial neural networks and deep learning (68T07)
- Finite element, Rayleigh-Ritz and Galerkin methods for boundary value problems involving PDEs (65N30)
- Rate of convergence, degree of approximation (41A25)
Cites Work
- The Deep Ritz Method: a deep learning-based numerical algorithm for solving variational problems
- Over-parametrized deep neural networks minimizing the empirical risk do not generalize well
- On the rate of convergence of fully connected deep neural network regression estimates
- Approximation rates for neural networks with encodable weights in smoothness spaces
- Loss landscapes and optimization in over-parameterized non-linear systems and neural networks
- Just interpolate: kernel ``ridgeless'' regression can generalize
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Estimates near the boundary for solutions of elliptic partial differential equations satisfying general boundary conditions. I
- Elliptic Partial Differential Equations of Second Order
- Deep Neural Networks for Estimation and Inference
- Benign overfitting in linear regression
- Estimates on the generalization error of physics-informed neural networks for approximating a class of inverse problems for PDEs
- Convergence Rate Analysis for Deep Ritz Method
- A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs
- Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Nonparametric regression on low-dimensional manifolds using deep ReLU networks: function approximation and statistical recovery
- Deep learning: a statistical viewpoint
- Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation