Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
Publication: 4629334
DOI: 10.1137/16M1106316
zbMath: 1436.90136
arXiv: 1811.07057
Wikidata: Q128268315
Scholia: Q128268315
MaRDI QID: Q4629334
Coralia Cartis, Nick I. M. Gould, Philippe L. Toint
Publication date: 22 March 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1811.07057
Related Items
- On high-order model regularization for multiobjective optimization
- On the complexity of solving feasibility problems with regularized models
- Tensor methods for finding approximate stationary points of convex functions
- Inexact basic tensor methods for some classes of convex optimization problems
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- An adaptive regularization method in Banach spaces
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Adaptive Third-Order Methods for Composite Convex Optimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- The impact of noise on evaluation complexity: the deterministic trust-region case
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Implementable tensor methods in unconstrained convex optimization
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- On the Complexity of an Inexact Restoration Method for Constrained Optimization
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- On large-scale unconstrained optimization and arbitrary regularization
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- A control-theoretic perspective on optimal high-order optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- An Optimal High-Order Tensor Method for Convex Optimization
Cites Work
- Universal gradient methods for convex optimization problems
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Introductory lectures on convex optimization. A basic course.
- Regularity results for nonlinear elliptic systems and applications
- On the use of third-order models with fourth-order regularization for unconstrained optimization
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- Cubic regularization of Newton method and its global performance
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- Trust Region Methods
- Accelerated Methods for Nonconvex Optimization
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians