Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization

Publication: 6335013

arXiv: 2002.07290
MaRDI QID: Q6335013

Author name not available

Publication date: 17 February 2020

Abstract: We develop two new stochastic Gauss-Newton algorithms for solving a class of nonconvex stochastic compositional optimization problems frequently arising in practice. We consider both the expectation and finite-sum settings under standard assumptions, and use both classical stochastic and SARAH estimators for approximating function values and Jacobians. In the expectation case, we establish $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity to achieve a stationary point in expectation and estimate the total number of stochastic oracle calls for both the function value and its Jacobian, where $\varepsilon$ is the desired accuracy. In the finite-sum case, we also establish $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity and bound the total oracle calls with high probability. To the best of our knowledge, this is the first time such global stochastic oracle complexity has been established for stochastic Gauss-Newton methods. Finally, we illustrate our theoretical results via two numerical examples on both synthetic and real datasets.
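A minimal NumPy sketch of the kind of prox-regularised stochastic Gauss-Newton iteration with SARAH-style estimators described in the abstract, specialised to a toy finite-sum nonlinear least-squares instance. The inner map, batch sizes, step parameter gamma, and epoch structure are illustrative assumptions and are not taken from the paper or from the companion repository linked below.

```python
# Sketch: stochastic Gauss-Newton with SARAH-style estimators for
#     min_x  phi(g(x)),   g(x) = (1/n) * sum_i G_i(x),
# specialised to phi(u) = 0.5 * ||u||^2 so the Gauss-Newton subproblem
# has a closed form. All problem data and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum inner map: G_i(x) = A_i @ tanh(x) - b_i, each in R^m.
n, m, d = 200, 5, 10
A = rng.normal(size=(n, m, d))
b = rng.normal(size=(n, m))

def G(x, idx):
    """Mini-batch average of the inner map over component indices idx."""
    return np.mean(A[idx] @ np.tanh(x) - b[idx], axis=0)

def JG(x, idx):
    """Mini-batch average of the Jacobian of the inner map over idx."""
    # d/dx [A_i @ tanh(x)] = A_i @ diag(1 - tanh(x)^2)
    return np.mean(A[idx] * (1.0 - np.tanh(x) ** 2), axis=0)

def gn_step(x, g_est, J_est, gamma):
    """Prox-regularised Gauss-Newton step for phi(u) = 0.5 * ||u||^2:
       x + argmin_dd 0.5*||g_est + J_est @ dd||^2 + ||dd||^2 / (2*gamma)."""
    H = J_est.T @ J_est + np.eye(d) / gamma
    return x - np.linalg.solve(H, J_est.T @ g_est)

def sgn_sarah(x0, epochs=10, inner=20, batch=10, gamma=0.5):
    x = x0
    for _ in range(epochs):
        # Snapshot: full-batch estimates of the function value and Jacobian.
        all_idx = np.arange(n)
        g_est, J_est = G(x, all_idx), JG(x, all_idx)
        x_prev, x = x, gn_step(x, g_est, J_est, gamma)
        for _ in range(inner):
            idx = rng.choice(n, size=batch, replace=False)
            # SARAH-style recursive updates of both estimators.
            g_est = g_est + G(x, idx) - G(x_prev, idx)
            J_est = J_est + JG(x, idx) - JG(x_prev, idx)
            x_prev, x = x, gn_step(x, g_est, J_est, gamma)
    return x

x_final = sgn_sarah(np.zeros(d))
print("final objective:", 0.5 * np.linalg.norm(G(x_final, np.arange(n))) ** 2)
```

The closed-form step above is specific to the squared-norm outer function; for a general (possibly nonsmooth) outer function the subproblem would instead be solved with a prox-type solver.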




Has companion code repository: https://github.com/unc-optimization/SGN








