Unified Acceleration of High-Order Algorithms under General Hölder Continuity
DOI: 10.1137/19M1290243
zbMath: 1483.65100
OpenAlex: W3182238355
MaRDI QID: Q5003214
Authors: Chaobing Song, Yong Jiang, Yi Ma
Publication date: 20 July 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/19m1290243
Classification (MSC):
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Nonlinear programming (90C30)
- Complexity and performance of numerical algorithms (65Y20)
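A note for orientation (a sketch of the standard assumption, not text from the publication itself): in the tensor-method literature cited below, "Hölder continuity" refers to Hölder-continuous higher-order derivatives of the objective f. With p the derivative order, ν the Hölder exponent, and H_ν the Hölder constant (notation assumed here, following works such as "Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives"), the condition reads

\[
\| \nabla^p f(x) - \nabla^p f(y) \| \le H_\nu \, \| x - y \|^{\nu}
\quad \text{for all } x, y, \qquad \nu \in [0, 1].
\]

Setting ν = 1 recovers Lipschitz-continuous p-th derivatives (for p = 2, the setting of cubic regularization of Newton's method), while smaller ν weakens the smoothness requirement; handling all such smoothness levels with one scheme is the sense in which acceleration can be "unified".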
Related Items (3)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Cyclic Coordinate Dual Averaging with Extrapolation
- A control-theoretic perspective on optimal high-order optimization
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Universal gradient methods for convex optimization problems
- Operator norm inequalities between tensor unfoldings on the partition lattice
- Accelerating the cubic regularization of Newton's method on convex problems
- Sharp uniform convexity and smoothness inequalities for trace norms
- Introductory lectures on convex optimization. A basic course.
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Optimal methods of smooth convex minimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- A variational perspective on accelerated methods in optimization
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step