Statistical theory for image classification using deep convolutional neural network with cross-entropy loss under the hierarchical max-pooling model
Publication: 6616182
DOI: 10.1016/j.jspi.2024.106188
MaRDI QID: Q6616182
Publication date: 8 October 2024
Published in: Journal of Statistical Planning and Inference
Cites Work
- A distribution-free theory of nonparametric regression
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- On the rate of convergence of fully connected deep neural network regression estimates
- On the rate of convergence of image classifiers based on convolutional neural networks
- Universal approximations of invariant maps by neural networks
- Convergence rates of deep ReLU networks for multiclass classification
- Nonparametric regression using deep neural networks with ReLU activation function
- Discussion of: "Nonparametric regression using deep neural networks with ReLU activation function"
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Necessary and sufficient conditions for the pointwise convergence of nearest neighbor regression function estimates
- Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review
- Convexity, Classification, and Risk Bounds
- Fast convergence rates of deep neural networks for classification
- The Elements of Statistical Learning