A second-order gradient method for convex minimization
From MaRDI portal
Publication: Q2236579
DOI: 10.1007/s40590-021-00375-7
OpenAlex: W3194213170
MaRDI QID: Q2236579
Publication date: 25 October 2021
Published in: Boletín de la Sociedad Matemática Mexicana. Third Series
Full work available at URL: https://doi.org/10.1007/s40590-021-00375-7
Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Methods of reduced gradient type (90C52)
Cites Work
- Gradient algorithms for quadratic optimization with fast convergence rates
- An efficient gradient method using the Yuan steplength
- A limited memory steepest descent method
- New adaptive stepsize selections in gradient methods
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- A delayed weighted gradient method for strictly convex quadratic minimization
- On the asymptotic behaviour of some new gradient methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two Novel Gradient Methods with Optimal Step Sizes
- Two-Point Step Size Gradient Methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A hybrid gradient method for strictly convex quadratic programming
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method