Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy

From MaRDI portal
Publication:4629334

DOI: 10.1137/16M1106316
zbMath: 1436.90136
arXiv: 1811.07057
Wikidata: Q128268315 · Scholia: Q128268315
MaRDI QID: Q4629334

Coralia Cartis, Nick I. M. Gould, Philippe L. Toint

Publication date: 22 March 2019

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1811.07057



Related Items

- On high-order model regularization for multiobjective optimization
- On the complexity of solving feasibility problems with regularized models
- Tensor methods for finding approximate stationary points of convex functions
- Inexact basic tensor methods for some classes of convex optimization problems
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- An adaptive regularization method in Banach spaces
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Adaptive Third-Order Methods for Composite Convex Optimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- The impact of noise on evaluation complexity: the deterministic trust-region case
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Implementable tensor methods in unconstrained convex optimization
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- On the Complexity of an Inexact Restoration Method for Constrained Optimization
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- On large-scale unconstrained optimization and arbitrary regularization
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- A control-theoretic perspective on optimal high-order optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- An Optimal High-Order Tensor Method for Convex Optimization

