Subsampled inexact Newton methods for minimizing large sums of convex functions
Publication: 5857347
DOI: 10.1093/imanum/drz027
zbMath: 1466.65041
arXiv: 1811.05730
OpenAlex: W2963952517
MaRDI QID: Q5857347
Nataša Krejić, Nataša Krklec Jerinkić, Stefania Bellavia
Publication date: 31 March 2021
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1811.05730
Keywords: global convergence, superlinear convergence, inexact Newton method, mean square convergence, subsampled Hessian
Related Items (8)
Inexact restoration with subsampled trust-region methods for finite-sum minimization
Hessian averaging in stochastic Newton methods achieves superlinear convergence
Spectral projected subgradient method for nonsmooth convex optimization problems
Unnamed Item
A generalized worst-case complexity analysis for non-monotone line searches
Linesearch Newton-CG methods for convex optimization with noise
An inexact restoration-nonsmooth algorithm with variable accuracy for stochastic nonsmooth convex optimization problems in machine learning and stochastic linear complementarity problems
LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums