Local convergence of tensor methods
DOI: 10.1007/s10107-020-01606-x
zbMath: 1491.90117
arXiv: 1912.02516
OpenAlex: W3118428130
MaRDI QID: Q2133417
Nikita Doikov, Yu. E. Nesterov
Publication date: 29 April 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1912.02516
Keywords: convex optimization; uniform convexity; high-order methods; local convergence; proximal methods; tensor methods
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06)
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Implementable tensor methods in unconstrained convex optimization
- Cubic regularization of Newton method and its global performance
- A unified framework for some inexact proximal point algorithms
- Proximal Newton-Type Methods for Minimizing Composite Functions
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Iteration-complexity of a Rockafellar's proximal method of multipliers for convex programming based on second-order approximations
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians