Discussion of: "Nonparametric regression using deep neural networks with ReLU activation function"
DOI: 10.1214/19-AOS1915 · zbMath: 1455.62087 · OpenAlex: W3049259997 · MaRDI QID: Q2215716
Publication date: 14 December 2020
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1597370656
Keywords: additive models; nonparametric regression; multilayer neural networks; rectified linear unit (ReLU); minimax estimation risk; rectified linear activation function
MSC: 62G08 (Nonparametric regression and quantile regression); 62C20 (Minimax procedures in statistical decision theory); 62M45 (Neural nets and related approaches to inference from stochastic processes)
Cites Work
- Boosting the margin: a new explanation for the effectiveness of voting methods
- The landscape of empirical risk for nonconvex losses
- Stochastic subgradient method converges on tame functions
- Neural Network Learning
- Reconciling modern machine-learning practice and the classical bias–variance trade-off