Tensor methods for finding approximate stationary points of convex functions
From MaRDI portal
Publication:5038435
DOI: 10.1080/10556788.2020.1818082 · OpenAlex: W3088573158 · MaRDI QID: Q5038435
Geovani Nunes Grapiglia, Yu. E. Nesterov
Publication date: 30 September 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1907.07053
Convex programming (90C25) · Nonlinear programming (90C30) · Convexity of real functions of several variables, generalizations (26B25)
Related Items
- Inexact basic tensor methods for some classes of convex optimization problems
- Worst-case evaluation complexity of a quadratic penalty method for nonconvex optimization
- Adaptive Third-Order Methods for Composite Convex Optimization
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Accelerating the cubic regularization of Newton's method on convex problems
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Smoothness parameter of power of Euclidean norm
- Lower bounds for finding stationary points II: first-order methods
- Implementable tensor methods in unconstrained convex optimization
- Cubic regularization of Newton method and its global performance
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- Tensor Methods for Large, Sparse Unconstrained Optimization
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- On High-order Model Regularization for Constrained Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
- Contracting Proximal Methods for Smooth Convex Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- On inexact solution of auxiliary problems in tensor methods for convex optimization