Second-order Conditional Gradient Sliding

From MaRDI portal
Publication: Q6335251

arXiv: 2002.08907 · MaRDI QID: Q6335251

Sebastian Pokutta, Alejandro Carderera

Publication date: 20 February 2020

Abstract: Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the \emph{Second-Order Conditional Gradient Sliding} (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime the SOCGS algorithm requires $\mathcal{O}(\log(\log 1/\varepsilon))$ first-order and Hessian oracle calls and $\mathcal{O}(\log(1/\varepsilon) \log(\log 1/\varepsilon))$ linear minimization oracle calls to achieve an $\varepsilon$-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
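The core idea of the abstract — take Newton-style steps, but solve each quadratic subproblem inexactly with a projection-free (conditional gradient) method that only touches the feasible region through a linear minimization oracle — can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy, not the SOCGS algorithm from the paper: it fixes the probability simplex as the feasible polytope, uses plain Frank-Wolfe with exact line search as the inner solver, and omits the paper's inexactness criterion and fallback steps. All function names (`lmo_simplex`, `fw_quadratic`, `socgs_sketch`) are hypothetical.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle (LMO) over the probability simplex:
    argmin_v <grad, v> over the simplex is the vertex e_i, i = argmin_i grad_i."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def fw_quadratic(model_grad, H, x0, steps, tol=1e-12):
    """Inexact projection-free solver for a quadratic subproblem:
    Frank-Wolfe with exact line search (the model is quadratic, so the
    optimal step along the direction v - x is available in closed form)."""
    x = x0.copy()
    for _ in range(steps):
        g = model_grad(x)
        v = lmo_simplex(g)
        d = v - x
        gap = -g @ d                      # Frank-Wolfe dual gap
        if gap <= tol:
            break
        curv = d @ H @ d
        gamma = 1.0 if curv <= 0 else min(1.0, gap / curv)
        x = x + gamma * d
    return x

def socgs_sketch(grad, hess, x0, outer_iters=20, inner_iters=100):
    """Toy outer loop: at each iterate build the quadratic (Newton) model
    of f and minimize it inexactly over the simplex via Frank-Wolfe, so
    the feasible region is only ever accessed through the LMO."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        g, H = grad(x), hess(x)
        # Gradient of the model m(y) = f(x) + <g, y-x> + 0.5 (y-x)^T H (y-x).
        # Default args freeze g, H, x against Python's late binding.
        model_grad = lambda y, g=g, H=H, x=x: g + H @ (y - x)
        x = fw_quadratic(model_grad, H, x, inner_iters)
    return x
```

For instance, minimizing $f(x) = \tfrac12\|x - b\|^2$ with $b = (0.2, 0.3, 0.5)$ in the simplex interior, `socgs_sketch` recovers $b$: each inner solve only calls the LMO, matching the setting the abstract targets, where linear optimization over the feasible region is cheap but projections are not.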

Has companion code repository: https://github.com/alejandro-carderera/SOCGS

This page was built for publication: Second-order Conditional Gradient Sliding
