Continuation Newton methods with deflation techniques for global optimization problems

arXiv: 2107.13864
MaRDI QID: Q6374025

Author name not available

Publication date: 29 July 2021

Abstract: The global minimum of an optimization problem is of interest in many engineering fields, yet it is difficult to find, especially for a nonconvex large-scale problem. In this article, we consider a new memetic algorithm for this problem. That is, we use the continuation Newton method with a deflation technique to find multiple stationary points of the objective function, and we use those stationary points as the initial seeds of the evolutionary algorithm, rather than the random initial seeds used by known evolutionary algorithms. Meanwhile, in order to retain the ease of use of a derivative-free method and the fast convergence of a gradient-based method, we use automatic differentiation to compute the gradient and replace the Hessian matrix with its finite-difference approximation. According to our numerical experiments, the new algorithm works well for unconstrained optimization problems and finds their global minima efficiently, in comparison with other representative global optimization methods such as multi-start methods (the built-in subroutine GlobalSearch.m of MATLAB R2021b, GLODS, and VRBBO), a branch-and-bound method (Couenne, a state-of-the-art open-source solver for mixed-integer nonlinear programming problems), and derivative-free algorithms (CMA-ES and MCS).
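The seeding step rests on the deflation idea: once a stationary point has been found, the gradient system is modified so that Newton iterations are repelled from it and can converge to a different stationary point. Below is a minimal one-dimensional sketch of that idea. It is an illustration only, not the paper's continuation Newton method: the deflation factor M(x; r) = 1/(x - r)^2 + 1 is one standard choice, the test objective f(x) = x^4/4 - x^2/2 and the starting seeds are assumed for the example, and the derivative of the deflated residual is approximated by finite differences, in the spirit of the paper's finite-difference Hessian.

```python
def deflated_newton(g, x0, roots, tol=1e-10, max_iter=100, h=1e-7):
    """One-dimensional Newton iteration on the deflated residual G(x)."""
    def G(x):
        val = g(x)
        for r in roots:
            # Deflation factor M(x; r): blows up near an already-found
            # root r, repelling the iteration from it. Assumes the
            # iterates never land exactly on a previous root.
            val *= 1.0 / (x - r) ** 2 + 1.0
        return val

    x = x0
    for _ in range(max_iter):
        if abs(g(x)) < tol:            # converged: test the true residual
            return x
        Gx = G(x)
        dG = (G(x + h) - Gx) / h       # finite-difference derivative of G
        x -= Gx / dG                   # Newton step on the deflated system
    return None                        # this seed did not converge


g = lambda x: x**3 - x                 # gradient of f(x) = x^4/4 - x^2/2

roots = []
for x0 in (2.0, 2.0, -2.0):            # the second run reuses the seed 2.0;
    r = deflated_newton(g, x0, roots)  # deflation steers it to a new root
    if r is not None:
        roots.append(r)
print(sorted(roots))                   # approximately [-1.0, 0.0, 1.0]
```

In the paper's memetic algorithm, stationary points collected this way would then seed the evolutionary search in place of random initial points.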




Has companion code repository: https://github.com/luoxinlongroger/cnmge
