An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
From MaRDI portal
Publication:4639139
DOI: 10.1080/02331934.2017.1399392
zbMath: 1398.90117
OpenAlex: W2767809233
MaRDI QID: Q4639139
Zexian Liu, Xiao Liang Dong, Hong-Wei Liu
Publication date: 3 May 2018
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2017.1399392
Keywords: gradient method; strictly convex quadratic minimization; Barzilai-Borwein (BB) method; BFGS update formula; approximating optimal stepsize
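The keywords name the Barzilai-Borwein (BB) stepsize for strictly convex quadratic minimization. As background only, here is a minimal sketch of the classical BB1 gradient method on a quadratic; it is not the paper's method (which builds an approximately optimal stepsize from a BFGS-type model), and the function name and parameters are illustrative.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Barzilai-Borwein (BB1) gradient method for min 0.5*x'Ax - b'x, A SPD."""
    x = x0.astype(float)
    g = A @ x - b
    # First iteration: use the exact (Cauchy) stepsize, since there is
    # no previous iterate yet to form the BB quotient.
    alpha = (g @ g) / (g @ (A @ g))
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # iterate and gradient differences
        alpha = (s @ s) / (s @ y)        # BB1 stepsize: s's / s'y
        x, g = x_new, g_new
    return x
```

For SPD `A`, `s @ y = s.T @ A @ s > 0` whenever `s != 0`, so the BB quotient is well defined; convergence is R-linear but typically much faster in practice than steepest descent with exact line search.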
Related Items (6)
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- A delayed weighted gradient method for strictly convex quadratic minimization
- Accelerated augmented Lagrangian method for total variation minimization
Cites Work
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- Gradient methods with adaptive step-sizes
- Multi-step quasi-Newton methods for optimization
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- Two-Point Step Size Gradient Methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Benchmarking optimization software with performance profiles.