Optimal convergence rates of deep neural networks in a classification setting
From MaRDI portal
Publication: 6184926
DOI: 10.1214/23-ejs2187
arXiv: 2207.12180
MaRDI QID: Q6184926
Publication date: 5 January 2024
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2207.12180
Cites Work
- Fast learning rates for plug-in classifiers
- Approximation and estimation bounds for artificial neural networks
- Smooth discrimination analysis
- Optimal aggregation of classifiers in statistical learning
- On the rate of convergence of fully connected deep neural network regression estimates
- Convergence rates of deep ReLU networks for multiclass classification
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Nonparametric regression using deep neural networks with ReLU activation function
- Metric entropy of some classes of sets with differentiable boundaries
- Error bounds for approximations with deep ReLU networks
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Estimation of a Function of Low Local Dimensionality by Deep Neural Networks
- Approximation by superpositions of a sigmoidal function
- Fast convergence rates of deep neural networks for classification