Stochastic gradient descent with Barzilai-Borwein update step for SVM
From MaRDI portal
Publication: 1749833
DOI: 10.1016/j.ins.2015.03.073 · zbMath: 1390.68555 · OpenAlex: W573246709 · MaRDI QID: Q1749833
Paweł Drozda, Krzysztof Sopyła
Publication date: 17 May 2018
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2015.03.073
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Learning and adaptive systems in artificial intelligence (68T05)
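The paper cataloged here combines stochastic gradient descent with the Barzilai-Borwein (BB) two-point step size for SVM training. As a hedged illustration only (not the authors' exact algorithm), the sketch below runs plain SGD on an L2-regularized hinge loss and recomputes a BB step size once per epoch from full-gradient differences; the function names, the epoch-level placement of the BB update, and the step-size cap are all assumptions of this sketch.

```python
import numpy as np

def hinge_grad(w, X, y, lam):
    """Subgradient of lam/2*||w||^2 + mean(max(0, 1 - y_i * x_i.w))."""
    margins = y * (X @ w)
    mask = margins < 1  # examples violating the margin
    return lam * w - (X[mask].T @ y[mask]) / len(y)

def sgd_bb_svm(X, y, lam=0.01, epochs=30, eta0=0.1, seed=0):
    """SGD on a linear SVM with a Barzilai-Borwein step size per epoch (sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    eta = eta0
    w_prev = w.copy()
    g_prev = hinge_grad(w, X, y, lam)
    for _ in range(epochs):
        for i in rng.permutation(n):
            xi, yi = X[i], y[i]
            # per-sample subgradient of the regularized hinge loss
            gi = lam * w - (yi * xi if yi * (xi @ w) < 1 else 0.0)
            w = w - eta * gi
        # BB1 step size from differences of iterates and full gradients:
        # eta = <s, s> / |<s, g - g_prev>|, with s = w - w_prev
        g = hinge_grad(w, X, y, lam)
        s = w - w_prev
        denom = s @ (g - g_prev)
        if abs(denom) > 1e-12:
            eta = min((s @ s) / abs(denom), 10 * eta0)  # cap for stability (assumption)
        w_prev, g_prev = w.copy(), g
    return w
```

On a linearly separable toy problem this recovers a separating direction; the epoch-level BB recomputation mirrors the common practice of pairing BB steps with cheap stochastic updates, but any production use should follow the paper's own update rule.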
Related Items (7)
- Accelerating mini-batch SARAH by step size rules
- Adaptive step size rules for stochastic optimization in large-scale learning
- Insensitive stochastic gradient twin support vector machines for large scale problems
- Properties of the sign gradient descent algorithms
- Parameter selection method for support vector regression based on adaptive fusion of the mixed kernel function
- Accelerated augmented Lagrangian method for total variation minimization
- New adaptive Barzilai–Borwein step size and its application in solving large-scale optimization problems
Uses Software
Cites Work
- Gradient methods with adaptive step-sizes
- Cutting-plane training of structural SVMs
- R-linear convergence of the Barzilai and Borwein gradient method
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Acceleration of Stochastic Approximation by Averaging
- Gradient Method with Retards and Generalizations
- A Stochastic Approximation Method
- Adaptive two-point stepsize gradient algorithm