Accelerated multiple step-size methods for solving unconstrained optimization problems
Publication: 5865330
DOI: 10.1080/10556788.2019.1653868
zbMath: 1489.65085
OpenAlex: W2969652412
Wikidata: Q127338982 (Scholia: Q127338982)
MaRDI QID: Q5865330
Gradimir V. Milovanović, Ivona Brajević, Branislav Ivanov, Predrag S. Stanimirović, Snežana S. Djordjević
Publication date: 13 June 2022
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2019.1653868
Related Items (1)
Cites Work
- Two modifications of the method of the multiplicative parameters in descent gradient methods
- Scalar correction method for solving large scale unconstrained minimization problems
- An accelerated double step size model in unconstrained optimization
- A classification of quasi-Newton methods
- Modified two-point stepsize gradient methods for unconstrained optimization
- A transformation of accelerated double step size method for unconstrained optimization
- Accelerated double direction method for solving unconstrained optimization problems
- Hybrid modification of accelerated double direction method
- Analysis of monotone gradient methods
- Convergence of line search methods for unconstrained optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Hybridization of accelerated gradient descent method
- A Picard-Mann hybrid iterative process
- On the asymptotic behaviour of some new gradient methods
- Optimization theory and methods. Nonlinear programming
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- Alternate minimization gradient method
- Alternate step gradient method
- Initial improvement of the hybrid accelerated gradient descent process
- Fixed Points by a New Iteration Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Mean Value Methods in Iteration
- Benchmarking optimization software with performance profiles
- Enriched methods for large-scale unconstrained optimization
- Accelerated gradient descent methods with line search