Derivative-free global minimization for a class of multiple minima problems

From MaRDI portal
Publication:6342919

arXiv: 2006.08181 · MaRDI QID: Q6342919

Author name not available

Publication date: 15 June 2020

Abstract: We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple minima problems. Our main result shows that, for a class of multiple minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x^*$ with the linear convergence $\|x_{k+1}-x^*\|_2^2 \leqslant \rho^k \|x_1-x^*\|_2^2$ for a fixed $0<\rho<1$ and any initial iterate $x_1 \in \mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point $x$ such that the optimality gap $\|x-x^*\|_2^2$ is less than $\epsilon>0$. Numerical experiments in dimensions from 5 to 500 demonstrate the benefits of the FD-DFD method.
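As an illustration of the general idea, the sketch below implements a generic random-direction finite-difference descent loop. It is not the paper's exact FD-DFD scheme (see the companion repository for that); it only shows how a descent direction can be estimated from a fixed number of function evaluations per iteration, independent of the dimension $d$, as the abstract describes. The function names, step size, and other parameters here are illustrative choices, not taken from the paper.

```python
import numpy as np

def fd_descent(f, x0, m=4, h=1e-6, step=0.05, iters=3000, seed=0):
    """Random-direction finite-difference descent (illustrative sketch).

    Each iteration uses 2*m function evaluations, a cost that does not
    grow with the dimension d.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for _ in range(iters):
        g = np.zeros(d)
        for _ in range(m):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)  # random unit direction
            # Central-difference estimate of the directional derivative f'(x; u).
            slope = (f(x + h * u) - f(x - h * u)) / (2 * h)
            g += slope * u
        # For u uniform on the unit sphere, E[u u^T] = I/d, so rescaling by
        # d/m makes g an (approximately) unbiased gradient estimate.
        g *= d / m
        x -= step * g
    return x

# Example: strongly convex quadratic with minimizer at the all-ones vector, d = 5.
f = lambda x: float(np.dot(x - 1.0, x - 1.0))
x_min = fd_descent(f, np.zeros(5))
```

For a smooth objective, the central difference is accurate to $\mathcal{O}(h^2)$, and because the directional slope vanishes at a stationary point, the estimation noise shrinks as the iterates approach the minimizer, which is consistent with the linear-convergence behaviour claimed in the abstract.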




Has companion code repository: https://github.com/xiaopengluo/dfd









