An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
From MaRDI portal
Publication:1751056
DOI: 10.1007/s11075-017-0365-2 · zbMath: 1397.90270 · OpenAlex: W2669391889 · MaRDI QID: Q1751056
Publication date: 23 May 2018
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-017-0365-2
Keywords: conic model; quadratic model; approximate optimal stepsize; Barzilai-Borwein (BB) method; BFGS update formula
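The keywords reference the Barzilai-Borwein (BB) stepsize, the classical two-point scheme underlying this publication. As a minimal illustrative sketch (not the paper's own method, which combines the BB idea with conic/quadratic models and a BFGS-based approximate optimal stepsize), the BB1 rule chooses the stepsize from the last iterate and gradient differences:

```python
import numpy as np

def bb_gradient_method(grad, x0, max_iter=500, tol=1e-8):
    """Sketch of the classical Barzilai-Borwein (BB1) gradient method.

    Uses the stepsize alpha_k = s^T s / s^T y, where s = x_k - x_{k-1}
    and y = g_k - g_{k-1}. Not the paper's algorithm; a baseline only.
    """
    x = x0.astype(float)
    g = grad(x)
    alpha = 1e-3  # arbitrary initial stepsize for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x          # iterate difference
        y = g_new - g          # gradient difference
        sty = s @ y
        # BB1 stepsize; fall back to a small step if curvature is tiny
        alpha = (s @ s) / sty if sty > 1e-12 else 1e-3
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient_method(lambda x: A @ x - b, np.zeros(3))
```

On a strictly convex quadratic, the BB method converges R-linearly (see the cited work on R-linear convergence below), so `x_star` approaches the solution of `A x = b` despite the nonmonotone behavior of the iterates.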
Related Items (16)
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A hybrid BB-type method for solving large scale unconstrained optimization
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- An effective first order reliability method based on Barzilai-Borwein step
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- Accelerated augmented Lagrangian method for total variation minimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
Cites Work
- Scaling on the spectral gradient method
- Scalar correction method for solving large scale unconstrained minimization problems
- Gradient methods with adaptive step-sizes
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- New quasi-Newton equation and related methods for unconstrained optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- An adaptive conic trust-region method for unconstrained optimization
- Two-Point Step Size Gradient Methods
- Conic Approximations and Collinear Scalings for Optimizers
- The Q-Superlinear Convergence of a Collinear Scaling Algorithm for Unconstrained Optimization
- Gradient Method with Retards and Generalizations
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The cyclic Barzilai-Borwein method for unconstrained optimization
- Benchmarking optimization software with performance profiles.