Inexact basic tensor methods for some classes of convex optimization problems
From MaRDI portal
Publication:5043845
DOI: 10.1080/10556788.2020.1854252
zbMath: 1502.90131
OpenAlex: W3112051302
MaRDI QID: Q5043845
Publication date: 6 October 2022
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2020.1854252
Related Items
- Generalized mirror prox algorithm for monotone variational inequalities: universality and inexact oracle
- Efficiency of higher-order algorithms for minimizing composite functions
- Random coordinate descent methods for nonseparable composite optimization
- Inexact accelerated high-order proximal-point methods
- Inexact high-order proximal-point methods with auxiliary search procedure
Cites Work
- Gradient methods for minimizing composite functions
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Implementable tensor methods in unconstrained convex optimization
- Cubic regularization of Newton method and its global performance
- Evaluating Derivatives
- On High-order Model Regularization for Constrained Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Tensor methods for finding approximate stationary points of convex functions
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- On inexact solution of auxiliary problems in tensor methods for convex optimization