Estimates on compressed neural networks regression
Publication: 889370
DOI: 10.1016/j.neunet.2014.10.008
zbMath: 1325.68205
OpenAlex: W2031104813
Wikidata: Q41739234 (Scholia: Q41739234)
MaRDI QID: Q889370
Youmei Li, Yongquan Zhang, Jiabing Ji, Jianyong Sun
Publication date: 6 November 2015
Published in: Neural Networks
Full work available at URL: http://gala.gre.ac.uk/id/eprint/12516/1/12516_Jianyong_SUN_Neural_Networks_%28AAM%29_%282014%29.pdf
Mathematics Subject Classification: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Analysis of the rate of convergence of least squares neural network regression estimates in case of measurement errors
- Interpolation and rates of convergence for a class of neural networks
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- A note on different covering numbers in learning theory
- Weak convergence and empirical processes. With applications to statistics
- Nonasymptotic bounds on the \(L_{2}\) error of neural network regression estimates
- On the mathematical foundations of learning
- Universal approximation bounds for superpositions of a sigmoidal function
- Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
- Compressed and Privacy-Sensitive Sparse Regression
- Model Selection and Estimation in Regression with Grouped Variables
- Compressed sensing
- Approximation by superpositions of a sigmoidal function
- The elements of statistical learning. Data mining, inference, and prediction