A family of spectral conjugate gradient methods with strong convergence and its applications in image restoration and machine learning
DOI: 10.1016/J.JFRANKLIN.2024.107033 · zbMATH Open: 1543.90305 · MaRDI QID: Q6593741
Ligang Pan, Xian-Zhen Jiang, Meixing Liu, Jin-Bao Jian
Publication date: 27 August 2024
Published in: Journal of the Franklin Institute
Keywords: unconstrained optimization; strong convergence; machine learning; image restoration; spectral conjugate gradient method
MSC: Nonconvex programming, global optimization (90C26); Numerical optimization and variational techniques (65K10); Learning and adaptive systems in artificial intelligence (68T05); Methods of quasi-Newton type (90C53); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
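For orientation, the spectral conjugate gradient family named in the title builds search directions of the form d_k = -θ_k g_k + β_k d_{k-1}, combining a spectral (Barzilai-Borwein-style) scaling θ_k with a conjugate gradient parameter β_k. The sketch below is a generic, illustrative instance of that scheme (a PRP+ choice of β_k and a backtracking Armijo line search), not the specific family or line search proposed in the indexed paper:

```python
import numpy as np

def spectral_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Generic spectral conjugate gradient sketch (illustrative only;
    NOT the paper's specific family):
        d_k = -theta_k * g_k + beta_k * d_{k-1},
    with a Barzilai-Borwein-style spectral parameter theta_k, a PRP+
    conjugacy parameter beta_k, and a backtracking Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral parameter (BB-like); PRP beta clipped at 0 for stability
        theta = s.dot(s) / s.dot(y) if s.dot(y) > 1e-12 else 1.0
        beta = max(g_new.dot(y) / g.dot(g), 0.0)
        d = -theta * g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x, g
```

The restart-on-non-descent safeguard mirrors the restart procedures several of the cited works study; the actual paper's family enforces descent and strong convergence through its own parameter choices.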
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Two modified scaled nonlinear conjugate gradient methods
- On restart procedures for the conjugate gradient method
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new two-parameter family of nonlinear conjugate gradient methods
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- A Two-Term PRP-Based Descent Method
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems