A modified conjugate gradient method for general convex functions
From MaRDI portal
Publication:2699981
DOI: 10.1007/s11075-022-01349-0 · OpenAlex: W4283258730 · MaRDI QID: Q2699981
Fahimeh Abdollahi, Masoud Fatemi
Publication date: 20 April 2023
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01349-0
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Convergence of some algorithms for convex minimization
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- A trust region method for nonsmooth convex optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- An effective nonsmooth optimization algorithm for locally Lipschitz functions
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A descent algorithm for nonsmooth convex optimization
- Practical Aspects of the Moreau–Yosida Regularization: Theoretical Preliminaries
- Trust Region Methods
- First-Order Methods in Optimization
- A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization
- On Second-Order Properties of the Moreau–Yosida Regularization for Constrained Nonsmooth Convex Programs
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- New limited memory bundle method for large-scale nonsmooth optimization
- CUTEr and SifDec
- A descent family of Dai–Liao conjugate gradient methods
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A new efficient conjugate gradient method for unconstrained optimization
This page was built for publication: A modified conjugate gradient method for general convex functions