Complexity bounds for second-order optimality in unconstrained optimization
Publication: 657654
DOI: 10.1016/j.jco.2011.06.001 · zbMath: 1245.65063 · OpenAlex: W2082877410 · MaRDI QID: Q657654
Publication date: 10 January 2012
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2011.06.001
Keywords: algorithms · global optimization · unconstrained optimization · regularization · nonconvex optimization · worst-case analysis · trust-region · second-order optimality conditions
MSC: Numerical mathematical programming methods (65K05) · Nonconvex programming, global optimization (90C26) · Interior-point methods (90C51)
Related Items (35)
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function
- Unnamed Item
- Cubic overestimation and secant updating for unconstrained optimization of C2,1 functions
- Using improved directions of negative curvature for the solution of bound-constrained nonconvex problems
- Lower bounds for non-convex stochastic optimization
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Detecting negative eigenvalues of exact and approximate Hessian matrices in optimization
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- OFFO minimization algorithms for second-order optimality and their complexity
- Newton-type methods for non-convex optimization under inexact Hessian information
- Lower bounds for finding stationary points I
- Lower bounds for finding stationary points II: first-order methods
- Convergence of Newton-MR under Inexact Hessian Information
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Nonlinear stepsize control algorithms: complexity bounds for first- and second-order optimality
- Complexity of proximal augmented Lagrangian for nonconvex optimization with nonlinear equality constraints
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Updating the regularization parameter in the adaptive cubic regularization algorithm
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- A geometric analysis of phase retrieval
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Adaptive regularization with cubics on manifolds
- On the Complexity of an Inexact Restoration Method for Constrained Optimization
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- A second-order globally convergent direct-search method and its worst-case complexity
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
- On the Evaluation Complexity of Constrained Nonlinear Least-Squares and General Constrained Nonlinear Optimization Using Second-Order Methods
Cites Work
- Unnamed Item
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Approximation algorithms for indefinite quadratic programming
- Worst case complexity of direct search
- Accelerating the cubic regularization of Newton's method on convex problems
- Trust-region and other regularisations of linear least-squares problems
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Trust Region Methods
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Black-Box Complexity of Local Minimization
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics