Subsampled Hessian Newton Methods for Supervised Learning
Publication: 5380307
DOI: 10.1162/NECO_a_00751
zbMath: 1472.68162
OpenAlex: W2103346443
Wikidata: Q40830680
Scholia: Q40830680
MaRDI QID: Q5380307
Chun-Heng Huang, Chih-Jen Lin, Chien-Chih Wang
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00751
Mathematics Subject Classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Performance of first-order methods for smooth convex minimization: a novel approach
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- A finite Newton method for classification
- Trust Region Methods
- A simple automatic derivative evaluation program
- Some methods of speeding up the convergence of iteration methods