Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
DOI: 10.1080/10556788.2021.2022148
OpenAlex: W3129171642
MaRDI QID: Q5058404
Publication date: 20 December 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/2102.02045
Keywords: convex optimization; superlinear convergence; strongly convex; proximal-point algorithm; accelerated methods; proximal-Newton method; high-order tensor methods; large-step
MSC classifications: Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Numerical optimization and variational techniques (65K10); Monotone operators and generalizations (47H05)
Cites Work
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- Relative-error approximate versions of Douglas-Rachford splitting and special cases of the ADMM
- Minimizing uniformly convex functions by cubic regularization of Newton method
- A control-theoretic perspective on optimal high-order optimization
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Inexact accelerated high-order proximal-point methods
- An Inexact Hybrid Generalized Proximal Point Algorithm and Some New Results on the Theory of Bregman Functions
- A Unified Framework for Some Inexact Proximal Point Algorithms
- A dynamic approach to a proximal-Newton method for monotone inclusions in Hilbert spaces, with complexity O(1/n^2)
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Monotone Operators and the Proximal Point Algorithm
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Iteration-Complexity of a Newton Proximal Extragradient Method for Monotone Variational Inequalities and Inclusion Problems
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
- An Optimal High-Order Tensor Method for Convex Optimization
- Regularized HPE-Type Methods for Solving Monotone Inclusions with Improved Pointwise Iteration-Complexity Bounds