An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
From MaRDI portal
Publication:5025110
DOI: 10.22124/jmm.2020.16747.1452
zbMath: 1499.90129
OpenAlex: W3118910617
MaRDI QID: Q5025110
Masoud Fatemi, Fahimeh Abdollahi
Publication date: 1 February 2022
Full work available at URL: https://jmm.guilan.ac.ir/article_4471_6cf479f380e06f0783bdb615ea168c1e.pdf
Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Complexity and performance of numerical algorithms (65Y20)
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A new trust region algorithm for nonsmooth convex minimization
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Convergence of some algorithms for convex minimization
- Globally convergent BFGS method for nonsmooth convex optimization
- Multivariate spectral gradient algorithm for nonsmooth convex optimization problems
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- A trust region method for nonsmooth convex optimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Conjugate gradient type methods for the nondifferentiable convex minimization
- A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- Comparing different nonsmooth minimization methods and software
- A descent algorithm for nonsmooth convex optimization
- Deblurring Images
- Practical Aspects of the Moreau–Yosida Regularization: Theoretical Preliminaries
- Trust Region Methods
- On Second-Order Properties of the Moreau–Yosida Regularization for Constrained Nonsmooth Convex Programs
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- New limited memory bundle method for large-scale nonsmooth optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A new efficient conjugate gradient method for unconstrained optimization