Scalar correction method for solving large scale unconstrained minimization problems
From MaRDI portal
Publication: 658551
DOI: 10.1007/s10957-011-9864-9 · zbMath: 1229.90087 · OpenAlex: W2044934031 · MaRDI QID: Q658551
Predrag S. Stanimirović, Sladjana Miljković, Marko B. Miladinović
Publication date: 12 January 2012
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-011-9864-9
Keywords: nonlinear programming; convergence rate; quasi-Newton methods; nonmonotone line search; gradient descent methods; BB method
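The keywords above reference the Barzilai–Borwein (BB) gradient method, the family of step-size rules this paper's scalar correction builds on. As a minimal sketch (not the paper's scalar correction method itself), the classic BB1 step size alpha_k = s^T s / s^T y can be illustrated on a convex quadratic; the function name, safeguard, and initial step size below are illustrative assumptions:

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x with BB1 step sizes.

    Sketch of the classic Barzilai-Borwein two-point step size
    gradient method; safeguards and parameters are assumptions,
    not the scalar correction method of the paper.
    """
    x = x0.astype(float)
    g = A @ x - b                 # gradient of the quadratic
    alpha = 1.0                   # initial step size (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g     # plain gradient step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 choice: alpha = s^T s / s^T y, safeguarded when s^T y <= 0
        alpha = (s @ s) / sy if sy > 0 else 1.0
        x, g = x_new, g_new
    return x
```

On strictly convex quadratics the BB iteration is nonmonotone in f but converges R-linearly, which is why practical variants (including gradient methods cited below) pair such step sizes with a nonmonotone line search.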
Related Items (5)
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A survey of gradient methods for solving nonlinear optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Scalar correction method for finding least-squares solutions on Hilbert spaces and its applications
- Accelerated multiple step-size methods for solving unconstrained optimization problems
Cites Work
- A classification of quasi-Newton methods
- Weak sufficient convergence conditions and applications for Newton methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence of descent method without line search
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Benchmarking optimization software with performance profiles.
- Adaptive two-point stepsize gradient algorithm
- On the nonmonotone line search