Layer sparsity in neural networks
Publication: 6616187
DOI: 10.1016/j.jspi.2024.106195
MaRDI QID: Q6616187
Johannes Lederer, Mahsa Taheri, Mohamed Hebiri
Publication date: 8 October 2024
Published in: Journal of Statistical Planning and Inference
Cites Work
- Adaptive estimation of a quadratic functional by model selection.
- On the rate of convergence of fully connected deep neural network regression estimates
- Nonparametric regression using deep neural networks with ReLU activation function
- Prediction error bounds for linear regression with the TREX
- Error bounds for approximations with deep ReLU networks
- On Lasso refitting strategies
- A Practical Scheme and Fast Algorithm to Tune the Lasso With Optimality Guarantees
- DOI: 10.1162/153244303321897690
- Statistical guarantees for regularized neural networks
- Balancing Statistical and Computational Precision: A General Theory and Applications to Sparse Regression