Block BFGS Methods
From MaRDI portal
Publication:4641646
DOI: 10.1137/16M1092106
zbMath: 1397.90402
arXiv: 1609.00318
OpenAlex: W2963242954
MaRDI QID: Q4641646
Publication date: 18 May 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1609.00318
Related Items (3)
- Quasi-Newton methods for machine learning: forget the past, just sample
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- On the limited memory BFGS method for large scale optimization
- Local convergence analysis for partitioned quasi-Newton updates
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Algorithms for nonlinear constraints that use Lagrangian functions
- A repository of convex quadratic programming problems
- Algorithm 984
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.
This page was built for publication: Block BFGS Methods