Quasi-Newton methods for machine learning: forget the past, just sample
DOI: 10.1080/10556788.2021.1977806
OpenAlex: W3206876740
MaRDI QID: Q5058389
Authors: Martin Takáč, Albert S. Berahas, Peter Richtárik, Majid Jahani
Publication date: 20 December 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1901.09997
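The title summarizes the paper's approach: instead of accumulating curvature pairs from past iterate and gradient differences, as classical L-BFGS and L-SR1 do, fresh (s, y) pairs are sampled around the current iterate at every step. Below is a minimal, illustrative sketch of that sampling idea combined with the standard L-BFGS two-loop recursion; the function names, sampling radius, curvature test, and fixed step size are assumptions made for illustration, not the authors' reference implementation (the paper also considers forming y via Hessian-vector products).

```python
# Illustrative sketch only: "forget the past, just sample" -- discard old
# curvature pairs each iteration and sample new ones at the current point.
import numpy as np

def sample_curvature_pairs(grad, w, num_pairs=5, radius=1e-2, rng=None):
    """Sample fresh (s, y) pairs at w: random directions s and
    gradient differences y ~= Hessian(w) @ s (for small radius)."""
    rng = np.random.default_rng() if rng is None else rng
    g = grad(w)
    S, Y = [], []
    for _ in range(num_pairs):
        s = radius * rng.standard_normal(w.shape)
        y = grad(w + s) - g
        # Keep only pairs with sufficiently positive curvature (BFGS requirement).
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            S.append(s)
            Y.append(y)
    return S, Y

def two_loop_direction(g, S, Y):
    """Classical L-BFGS two-loop recursion: returns -H @ g, where H is the
    inverse-Hessian approximation built from the pairs (S, Y)."""
    rhos = [1.0 / (s @ y) for s, y in zip(S, Y)]
    q = g.copy()
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    gamma = (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])  # initial scaling H0 = gamma * I
    r = gamma * q
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r

# Toy usage: minimize the quadratic 0.5 * w' A w - b' w.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
grad = lambda w: A @ w - b
w = np.zeros(3)
for _ in range(20):
    S, Y = sample_curvature_pairs(grad, w)       # old pairs discarded each step
    w = w + 0.5 * two_loop_direction(grad(w), S, Y)
print(w, np.linalg.solve(A, b))                  # the two should nearly agree
```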
Related Items
- Quasi-Newton methods for machine learning: forget the past, just sample
- Towards explicit superlinear convergence rate for SR1
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Globally Convergent Multilevel Training of Deep Residual Networks
- A robust multi-batch L-BFGS method for machine learning
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- On solving L-SR1 trust-region subproblems
- Minimizing finite sums with the stochastic average gradient
- On the limited memory BFGS method for large scale optimization
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- Representations of quasi-Newton matrices and their use in limited memory methods
- The BFGS method with exact line searches fails for non-convex objective functions
- Sub-sampled Newton methods
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global Convergence of Online Limited Memory BFGS
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Updating Quasi-Newton Matrices with Limited Storage
- Algorithms for nonlinear constraints that use Lagrangian functions
- Trust Region Methods
- Block BFGS Methods
- Optimization Methods for Large-Scale Machine Learning
- A Theoretical and Experimental Study of the Symmetric Rank-One Update
- Analysis of a Symmetric Rank-One Trust Region Method
- Convergence Properties of the BFGS Algoritm
- A robust multi-batch L-BFGS method for machine learning
- Quasi-Newton methods for machine learning: forget the past, just sample
- An investigation of Newton-Sketch and subsampled Newton methods
- Quasi-Newton Methods and their Application to Function Minimisation
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- A Stochastic Approximation Method
- Exact and inexact subsampled Newton methods for optimization
- A modified BFGS method and its global convergence in nonconvex minimization