Derivative-free global minimization for a class of multiple minima problems
From MaRDI portal
Publication: 6342919
arXiv: 2006.08181
MaRDI QID: Q6342919
Author name not available
Publication date: 15 June 2020
Abstract: We prove that the finite-difference based derivative-free descent (FD-DFD) methods have the capability to find the global minima for a class of multiple minima problems. Our main result shows that, for a class of multiple minima objectives that is extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x_*$ with the linear convergence $\|x_{k+1}-x_*\|_2^2 \leqslant \rho^{k}\|x_1-x_*\|_2^2$ for a fixed $0<\rho<1$ and any initial iteration $x_1\in\mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point $x$ such that the optimality gap $\|x-x_*\|_2^2$ is less than $\epsilon>0$. Numerical experiments in various dimensions from $5$ to $500$ demonstrate the benefits of the FD-DFD method.
Has companion code repository: https://github.com/xiaopengluo/dfd
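For orientation, the following is a minimal sketch of a generic finite-difference based derivative-free descent loop, not the authors' FD-DFD scheme (their implementation is in the companion repository above). The direction-sampling strategy, the parameter names, and the default values of `step`, `h`, and `n_dirs` are illustrative assumptions; note that the number of function evaluations per iteration (`n_dirs + 1`) is fixed and does not grow with the dimension, mirroring the per-iteration cost discussed in the abstract.

```python
# Illustrative sketch of a finite-difference based derivative-free descent loop.
# This is NOT the paper's FD-DFD method; parameters and sampling are assumptions.
import numpy as np

def fd_descent(f, x0, step=0.1, h=1e-2, n_dirs=10, n_iters=200, seed=0):
    """Minimize f starting from x0 using forward finite differences
    along random unit directions (n_dirs + 1 evaluations per iteration)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    d = x.size
    for _ in range(n_iters):
        # Build a crude descent direction from forward differences
        # along a fixed number of random unit vectors.
        g = np.zeros(d)
        for _ in range(n_dirs):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)
            g += (f(x + h * u) - fx) / h * u
        g /= n_dirs
        x = x - step * g
        fx = f(x)
    return x, fx

if __name__ == "__main__":
    # Hypothetical multi-minima test objective: a convex quadratic plus
    # a bounded oscillatory perturbation (for illustration only).
    def objective(x):
        return np.sum(x**2) + 0.5 * np.sum(1.0 - np.cos(4.0 * x))

    x_min, f_min = fd_descent(objective, x0=np.full(5, 2.0))
    print("approximate minimizer:", x_min, "objective value:", f_min)
```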