Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
Publication: 5853572
DOI: 10.1137/20M1320651 · zbMath: 1461.90167 · arXiv: 2002.00657 · MaRDI QID: Q5853572
Anton Rodomanov, Yu. E. Nesterov
Publication date: 10 March 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2002.00657
Keywords: rate of convergence; quasi-Newton methods; superlinear convergence; Broyden family; BFGS; DFP; SR1; local convergence
MSC: Analysis of algorithms and problem complexity (68Q25); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
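For orientation, the core building block of the paper is a greedy quasi-Newton update for approximating a fixed positive definite matrix A: the update direction u is chosen among the coordinate vectors to maximize the ratio u^T G u / u^T A u, and a Broyden-family correction (SR1, DFP, or BFGS) is then applied. The following is a minimal NumPy sketch of the SR1 variant of this idea; the function name, tolerance, and toy matrices are illustrative assumptions, not the authors' code.

```python
import numpy as np

def greedy_sr1_update(G, A):
    """One greedy SR1 step toward A (sketch of the greedy quasi-Newton update).

    Picks the coordinate vector e_i maximizing e_i^T G e_i / e_i^T A e_i,
    then applies the SR1 update with respect to A. Assumes G >= A in the
    positive-semidefinite order, which the update preserves.
    """
    # Greedy direction: for coordinate vectors the ratio reduces to diagonals.
    i = np.argmax(np.diag(G) / np.diag(A))
    r = (G - A)[:, i]          # (G - A) e_i
    denom = r[i]               # e_i^T (G - A) e_i
    if denom <= 1e-12:         # G already matches A along this direction
        return G
    return G - np.outer(r, r) / denom

# Toy demonstration (hypothetical data): G_k converges to A, and the
# measure tr(A^{-1}(G - A)) used in the analysis shrinks to zero.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)            # target matrix, symmetric positive definite
G = 2.0 * np.trace(A) * np.eye(n)      # initial approximation with G >= A
for k in range(2 * n):
    G = greedy_sr1_update(G, A)
    print(k, np.trace(np.linalg.solve(A, G - A)))
```

With SR1, each greedy step zeroes out the residual (G - A) along the chosen coordinate, so in this fixed-matrix setting the approximation matches A after at most n updates; the paper's explicit superlinear rates come from carrying this mechanism into the full optimization iteration.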
Related Items
- Rates of superlinear convergence for classical quasi-Newton methods
- Towards explicit superlinear convergence rate for SR1
- Non-asymptotic superlinear convergence of standard quasi-Newton methods
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Hessian averaging in stochastic Newton methods achieves superlinear convergence
- Greedy PSB methods with explicit superlinear convergence
- An overview of nonlinear optimization
Uses Software
Cites Work
- Nonsmooth optimization via quasi-Newton methods
- Lectures on convex optimization
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Local convergence analysis for partitioned quasi-Newton updates
- Generalized self-concordant functions: a recipe for Newton-type methods
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Superlinear convergence of Broyden's bounded θ-class of methods
- Variable Metric Method for Minimization
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- Quasi-Newton Methods, Motivation and Theory
- Local and Superlinear Convergence of Structured Quasi-Newton Methods for Nonlinear Optimization
- Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Contracting Proximal Methods for Smooth Convex Optimization
- A Rapidly Convergent Descent Method for Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- Variance algorithm for minimization
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- On the Convergence of the Variable Metric Algorithm
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate