Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise

DOI: 10.1137/19M1291832 · zbMath: 1470.90129 · arXiv: 1910.04055 · MaRDI QID: Q4997171

Albert S. Berahas, Liyuan Cao, Katya Scheinberg

Publication date: 28 June 2021

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1910.04055
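
The entry carries no abstract, so the sketch below illustrates the kind of method the title refers to: a backtracking (Armijo) line search whose sufficient-decrease test is relaxed to tolerate a bounded noise level eps_f in the function evaluations. This is a minimal illustration under assumed names and constants (noisy_armijo_backtracking, c1, tau, and the 2*eps_f slack are all illustrative assumptions), not the paper's exact algorithm or step-size update.

```python
import numpy as np

def noisy_armijo_backtracking(f_noisy, grad_noisy, x, eps_f,
                              alpha0=1.0, c1=1e-4, tau=0.5, max_backtracks=30):
    """Backtracking line search on a noisy oracle (illustrative sketch).

    The Armijo sufficient-decrease test is relaxed by 2*eps_f so that a
    step is not rejected merely because of bounded function-value noise.
    """
    g = grad_noisy(x)            # noisy gradient estimate
    d = -g                       # steepest-descent search direction
    fx = f_noisy(x)              # noisy function value at the current point
    alpha = alpha0
    for _ in range(max_backtracks):
        # Relaxed Armijo condition: 2*eps_f of slack absorbs the noise.
        if f_noisy(x + alpha * d) <= fx - c1 * alpha * np.dot(g, g) + 2 * eps_f:
            return x + alpha * d, alpha
        alpha *= tau             # shrink the step and retry
    return x, 0.0                # no acceptable step within the budget

# Demo on a noisy quadratic: f(x) = ||x||^2 plus bounded uniform noise.
rng = np.random.default_rng(0)
eps_f = 1e-3
f_noisy = lambda x: x @ x + rng.uniform(-eps_f, eps_f)
grad_noisy = lambda x: 2 * x + rng.uniform(-1e-2, 1e-2, size=x.shape)

x = np.array([1.0, -2.0])
for _ in range(50):
    x, _ = noisy_armijo_backtracking(f_noisy, grad_noisy, x, eps_f)
print(x)  # ends up near the minimizer at the origin, up to the noise level
```

With bounded noise, iterates of such a scheme can only be expected to reach a neighborhood of a stationary point whose size scales with eps_f, which is the regime the paper's convergence-rate analysis addresses.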



Related Items

Full-low evaluation methods for derivative-free optimization
Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
Zeroth-order methods for noisy Hölder-gradient functions
Stochastic Trust-Region Methods with Trust-Region Radius Depending on Probabilistic Models
Inequality constrained stochastic nonlinear optimization via active-set sequential quadratic programming
A trust region method for noisy unconstrained optimization
Convergence Properties of an Objective-Function-Free Optimization Regularization Algorithm, Including an \(\boldsymbol{\mathcal{O}(\epsilon^{-3/2})}\) Complexity Bound
A line search based proximal stochastic gradient algorithm with dynamical variance reduction
Trust-region algorithms: probabilistic complexity and intrinsic noise with applications to subsampling techniques
Unifying framework for accelerated randomized methods in convex optimization
The impact of noise on evaluation complexity: the deterministic trust-region case
Recent Theoretical Advances in Non-Convex Optimization
Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
Expected complexity analysis of stochastic direct-search
A stochastic first-order trust-region method with inexact restoration for finite-sum minimization
A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums

