Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
From MaRDI portal (Publication Q4971023)
DOI: 10.1137/19M1259432
zbMath: 1451.90119
arXiv: 1904.12559
OpenAlex: W3164331286
Geovani Nunes Grapiglia, Yu. E. Nesterov
Publication date: 8 October 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1904.12559
Keywords: unconstrained minimization; high-order methods; Hölder condition; tensor methods; worst-case global complexity bounds
MSC classification: Convex programming (90C25); Nonlinear programming (90C30); Convexity of real functions of several variables, generalizations (26B25)
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- Inexact basic tensor methods for some classes of convex optimization problems
- Local convergence of tensor methods
- Accelerated meta-algorithm for convex optimization problems
- Super-Universal Regularized Newton Method
- Efficiency of higher-order algorithms for minimizing composite functions
- Adaptive Third-Order Methods for Composite Convex Optimization
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- A control-theoretic perspective on optimal high-order optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- High-Order Optimization Methods for Fully Composite Problems
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- Tensor Methods for Large, Sparse Unconstrained Optimization
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- Über homogene Polynome in ($L^{2}$)
- On inexact solution of auxiliary problems in tensor methods for convex optimization