Curvature-Aware Derivative-Free Optimization

Publication: Q6378733

arXiv: 2109.13391
MaRDI QID: Q6378733

Author name not available.

Publication date: 27 September 2021

Abstract: The paper discusses derivative-free optimization (DFO), in which one minimizes a function using only function evaluations, without access to gradients or directional derivatives. Classical DFO methods such as Nelder-Mead and direct search have limited scalability for high-dimensional problems. Zeroth-order methods, which mimic gradient-based methods via finite-difference approximations, have been gaining popularity due to the demands of large-scale machine learning applications, and the paper focuses on the selection of the step size $\alpha_k$ in these methods. The proposed approach, called Curvature-Aware Random Search (CARS), uses first- and second-order finite difference approximations to compute a candidate $\alpha_+$. We prove that for strongly convex objective functions, CARS converges linearly provided that the search direction is drawn from a distribution satisfying very mild conditions. We also present a Cubic Regularized variant of CARS, named CARS-CR, which converges at a rate of $\mathcal{O}(k^{-1})$ without the assumption of strong convexity. Numerical experiments show that CARS and CARS-CR match or exceed state-of-the-art methods on benchmark problem sets.
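The abstract describes the CARS step only at a high level. The sketch below is one minimal Python reading of it, under stated assumptions: sample a direction $u_k$, estimate the directional derivative $d_k \approx u_k^\top \nabla f(x_k)$ and curvature $H_k \approx u_k^\top \nabla^2 f(x_k)\, u_k$ with central finite differences, and take the Newton-like step $\alpha_+ = d_k / H_k$ along $u_k$. The name cars_sketch, the fixed sampling radius h, the Gaussian direction distribution, and the simple safeguards (skipping non-positive curvature estimates, accepting only descent steps) are illustrative assumptions, not the paper's exact algorithm; the authors' implementation is in the companion repository linked below.

```python
import numpy as np

def cars_sketch(f, x0, n_iters=500, h=1e-3, seed=0):
    """Minimal CARS-style loop (illustrative sketch, not the paper's exact method)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iters):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)                     # random unit search direction
        f_plus, f_minus = f(x + h * u), f(x - h * u)
        d = (f_plus - f_minus) / (2.0 * h)         # directional derivative estimate
        H = (f_plus - 2.0 * fx + f_minus) / h**2   # directional curvature estimate
        if H <= 0.0:
            continue                               # skip non-positive curvature (simplified safeguard)
        cand = x - (d / H) * u                     # Newton-like step alpha_+ = d / H along u
        f_cand = f(cand)
        if f_cand < fx:                            # keep the candidate only if it decreases f
            x, fx = cand, f_cand
    return x, fx

if __name__ == "__main__":
    # hypothetical smoke test on a strongly convex quadratic
    f = lambda z: float(np.sum((z - 1.0) ** 2))
    x, fx = cars_sketch(f, np.zeros(10), n_iters=2000)
    print(fx)  # should be close to 0
```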




Has companion code repository: https://github.com/bumsu-kim/CARS_Refactored
