Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
Publication: 508045
DOI: 10.1016/j.cam.2016.12.021
zbMath: 1381.90069
OpenAlex: W2564507326
MaRDI QID: Q508045
Raouf Ziadi, Abdelatif Bencherif-Madani, Rachid Ellaia
Publication date: 9 February 2017
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2016.12.021
Keywords: global optimization, stochastic perturbation, Lennard-Jones clusters problem, Polak-Ribière conjugate gradient method
Mathematics Subject Classification: Nonconvex programming, global optimization (90C26); Derivative-free methods and methods using generalized derivatives (90C56); Numerical optimization and variational techniques (65K10); Methods of quasi-Newton type (90C53)
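The record describes a global optimization method built from a stochastic perturbation of the Polak-Ribière conjugate gradient iteration. The Python sketch below illustrates the general idea only: a PR+ conjugate gradient step combined with an additive Gaussian perturbation whose size decreases over the iterations. The function names, the fixed step size, and the 1/sqrt(k) perturbation schedule are illustrative assumptions and do not reproduce the algorithm published in the article.

```python
import numpy as np

def perturbed_pr_cg(f, grad, x0, n_iter=200, step=1e-2, sigma0=1.0, rng=None):
    """Illustrative sketch: Polak-Ribiere (PR+) conjugate gradient steps with an
    additive stochastic perturbation of the iterate. Hypothetical schedule and
    step size; not the paper's exact method."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    best_x, best_f = x.copy(), f(x)
    for k in range(1, n_iter + 1):
        sigma = sigma0 / np.sqrt(k)          # assumed decreasing perturbation size
        x_new = x + step * d + sigma * rng.standard_normal(x.shape)
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))  # PR+ coefficient
        d = -g_new + beta * d                # conjugate gradient direction update
        x, g = x_new, g_new
        fx = f(x)
        if fx < best_f:                      # keep the best point found so far
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Usage example on the 2-D Rastrigin function, a standard multimodal test problem.
if __name__ == "__main__":
    rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    rastrigin_grad = lambda x: 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)
    x_best, f_best = perturbed_pr_cg(rastrigin, rastrigin_grad, x0=np.array([3.1, -2.7]))
    print(x_best, f_best)
```

The decreasing perturbation lets early iterations escape local minima while later iterations behave close to a plain conjugate gradient descent; the best visited point is returned rather than the final iterate.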
Related Items
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Implementation of reduced gradient with bisection algorithms for non-convex optimization problem via stochastic perturbation
- A deterministic method for continuous global optimization using a dense curve
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Continuous global optimization through the generation of parametric curves
- A conjugate gradient method with descent direction for unconstrained optimization
- A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems
- Global optimization. Scientific and engineering case studies
- Conjugate gradient algorithms in nonconvex optimization
- Speeding up continuous GRASP
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- Global optimization by random perturbation of the gradient method with a fixed parameter
- Differential evolution -- a simple and efficient heuristic for global optimization over continuous spaces
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Convergence of descent method without line search
- Experimental testing of advanced scatter search designs for global optimization of multimodal functions
- A three-parameter family of nonlinear conjugate gradient methods
- Convergence of conjugate gradient methods with constant stepsizes
- A CARTopt method for bound-constrained global optimization
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Nonlinear Programming
- Conjugate Directions without Linear Searches
- Benchmarking optimization software with performance profiles
- Global convergence of conjugate gradient methods without line search