A delayed weighted gradient method for strictly convex quadratic minimization
From MaRDI portal
Publication:2282816
DOI: 10.1007/s10589-019-00125-6 · zbMath: 1427.90209 · OpenAlex: W2970926905 · Wikidata: Q127313178 · Scholia: Q127313178 · MaRDI QID: Q2282816
Publication date: 19 December 2019
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-019-00125-6
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Quadratic programming (90C20)
- Iterative numerical methods for linear systems (65F10)
- Methods of reduced gradient type (90C52)
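The paper's delayed weighted gradient method itself is not described on this page, but the problem class it targets is the classical one: minimizing \(f(x) = \tfrac{1}{2}x^{\top}Ax - b^{\top}x\) with \(A\) symmetric positive definite. As background only, here is a minimal sketch of the baseline such gradient methods improve on: steepest descent with the exact (Cauchy) steplength. The function name and example matrix are illustrative, not from the paper.

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for SPD A by steepest descent
    with the exact line-search step alpha_k = (g_k^T g_k)/(g_k^T A g_k).
    This is the classical baseline, NOT the paper's delayed weighted
    gradient method."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:        # stop once the gradient is tiny
            break
        alpha = (g @ g) / (g @ (A @ g))    # exact minimizer along -g
        x = x - alpha * g
    return x, k

# Illustrative 2x2 SPD system; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star, iters = steepest_descent_quadratic(A, b, np.zeros(2))
```

At the minimizer the gradient vanishes, so the iterate approximately solves the linear system \(Ax = b\); methods such as Barzilai-Borwein steps, gradient methods with retards, and the delayed weighted gradient method of this paper all modify the steplength choice above to accelerate convergence on this same problem class.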
Related Items
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- Adaptive parameter alternating direction algorithm for centrosymmetric solutions of a class of generalized coupled Sylvester-transpose matrix equations
- On the Preconditioned Delayed Weighted Gradient Method
- A second-order gradient method for convex minimization
- Properties of the delayed weighted gradient method
Cites Work
- An efficient gradient method using the Yuan steplength
- Gradient methods with adaptive step-sizes
- Linear and nonlinear programming.
- New adaptive stepsize selections in gradient methods
- Multiparameter descent methods
- Hybrid procedures for solving linear systems
- Smooth and adaptive gradient method with retards
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Variations on Richardson's method and acceleration
- On the steplength selection in gradient methods for unconstrained optimization
- On the asymptotic behaviour of some new gradient methods
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
- One class of methods of unconditional minimization of a convex function, having a high rate of convergence
- Two-Point Step Size Gradient Methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- On the Barzilai and Borwein choice of steplength for the gradient method
- Methods of conjugate gradients for solving linear systems