Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
Publication: 5737717
DOI: 10.1137/16M1087801
zbMath: 1406.49029
MaRDI QID: Q5737717
Yu. E. Nesterov, Geovani Nunes Grapiglia
Publication date: 30 May 2017
Published in: SIAM Journal on Optimization
Convex programming (90C25) · Nonlinear programming (90C30) · Newton-type methods (49M15) · Numerical methods based on nonlinear programming (49M37) · Implicit function theorems; global Newton methods on manifolds (58C15)
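As context for the title, the following is a minimal sketch of the kind of Hölder-regularized Newton model such methods minimize at each iteration; the notation (regularization parameter \(H\), Hölder exponent \(\nu\)) is an assumption based on the standard formulation in the cited literature and is not quoted from this entry.

% Sketch (assumed standard formulation, not taken from this record):
% if the Hessian of f is Hölder continuous with exponent \nu \in [0,1],
% the next iterate minimizes a regularized second-order model around x_k.
\[
  x_{k+1} \in \operatorname*{arg\,min}_{y}\;
  f(x_k) + \langle \nabla f(x_k),\, y - x_k \rangle
  + \tfrac{1}{2}\,\langle \nabla^2 f(x_k)(y - x_k),\, y - x_k \rangle
  + \tfrac{H}{2+\nu}\,\lVert y - x_k \rVert^{2+\nu}
\]
% For \nu = 1 this reduces to the cubic regularization of Newton's method
% (cf. "Cubic regularization of Newton method and its global performance" under Cites Work).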
Related Items
- On high-order model regularization for multiobjective optimization
- On the complexity of solving feasibility problems with regularized models
- Tensor methods for finding approximate stationary points of convex functions
- Local convergence of tensor methods
- A cubic regularization of Newton's method with finite difference Hessian approximations
- Smoothness parameter of power of Euclidean norm
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Super-Universal Regularized Newton Method
- Adaptive Third-Order Methods for Composite Convex Optimization
- Optimal Transport Approximation of 2-Dimensional Measures
- The impact of noise on evaluation complexity: the deterministic trust-region case
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Implementable tensor methods in unconstrained convex optimization
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Complexity of proximal augmented Lagrangian for nonconvex optimization with nonlinear equality constraints
- A generalized worst-case complexity analysis for non-monotone line searches
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- On large-scale unconstrained optimization and arbitrary regularization
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
- A control-theoretic perspective on optimal high-order optimization
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- High-Order Optimization Methods for Fully Composite Problems
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- Universal gradient methods for convex optimization problems
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Cubic regularization of Newton method and its global performance
- On the Evaluation Complexity of Cubic Regularization Methods for Potentially Rank-Deficient Nonlinear Least-Squares Problems and Its Relevance to Constrained Nonlinear Optimization