Adaptive Third-Order Methods for Composite Convex Optimization
Publication: 6171322
DOI: 10.1137/22m1480872
zbMath: 1522.90093
arXiv: 2202.12730
OpenAlex: W4385493772
MaRDI QID: Q6171322
Geovani Nunes Grapiglia, Yu. E. Nesterov
Publication date: 11 August 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2202.12730
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Implementable tensor methods in unconstrained convex optimization
- Cubic regularization of Newton method and its global performance
- Superfast second-order methods for unconstrained convex optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- ARCq: a new adaptive regularization by cubics
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Tensor methods for finding approximate stationary points of convex functions
- Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- On inexact solution of auxiliary problems in tensor methods for convex optimization