The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
From MaRDI portal
Publication: 5266534
DOI: 10.1137/16M110280X · zbMath: 1370.90260 · MaRDI QID: Q5266534
Ernesto G. Birgin, José Mario Martínez
Publication date: 16 June 2017
Published in: SIAM Journal on Optimization
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Abstract computational complexity for mathematical programming problems (90C60)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
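The method named in the title combines a quadratically regularized Newton step with a cubic sufficient-decrease acceptance test. A minimal sketch of that scheme, assuming standard notation (the function names, parameter values, and the Rosenbrock test problem below are illustrative choices, not taken from the paper):

```python
import numpy as np

def quad_reg_cubic_descent(f, grad, hess, x0, alpha=1e-4, sigma0=1.0,
                           gamma=2.0, tol=1e-8, max_iter=200):
    """Sketch of quadratic regularization with a cubic descent test.

    Trial step: solve (H + sigma*I) s = -g; accept s only if
        f(x + s) <= f(x) - alpha * ||s||^3,
    otherwise increase sigma and recompute.
    """
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        while True:
            try:
                # H + sigma*I must be positive definite for the step.
                L = np.linalg.cholesky(H + sigma * np.eye(n))
            except np.linalg.LinAlgError:
                sigma *= gamma
                continue
            # Solve (H + sigma*I) s = -g via the Cholesky factor.
            s = np.linalg.solve(L.T, np.linalg.solve(L, -g))
            # Cubic descent condition.
            if f(x + s) <= f(x) - alpha * np.linalg.norm(s) ** 3:
                break
            sigma *= gamma  # insufficient decrease: regularize more
        x = x + s
        sigma = max(sigma / gamma, 1e-8)  # cautiously shrink sigma
    return x

# Illustrative use on the Rosenbrock function (minimizer at (1, 1)).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
rosen_H = lambda x: np.array([
    [2 - 400 * (x[1] - x[0]**2) + 800 * x[0]**2, -400 * x[0]],
    [-400 * x[0], 200.0],
])
x_star = quad_reg_cubic_descent(rosen, rosen_g, rosen_H, [-1.2, 1.0])
```

As sigma grows the step shrinks like -g/sigma, so the cubic test is eventually satisfied; after an accepted step sigma is relaxed so the iteration approaches a pure Newton step near a well-behaved minimizer.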
Related Items
- On high-order model regularization for multiobjective optimization
- A cubic regularization of Newton's method with finite difference Hessian approximations
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- On the use of third-order models with fourth-order regularization for unconstrained optimization
- Worst-Case Complexity of TRACE with Inexact Subproblem Solutions for Nonconvex Smooth Optimization
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- A Newton-CG Based Barrier Method for Finding a Second-Order Stationary Point of Nonconvex Conic Optimization with Complexity Guarantees
- A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
- Regularized Newton Method with Global \({\boldsymbol{\mathcal{O}(1/{k}^2)}}\) Convergence
- A Newton-CG Based Augmented Lagrangian Method for Finding a Second-Order Stationary Point of Nonconvex Equality Constrained Optimization with Complexity Guarantees
- On the worst-case evaluation complexity of non-monotone line search algorithms
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- Regional complexity analysis of algorithms for nonconvex smooth optimization
- A generalized worst-case complexity analysis for non-monotone line searches
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- On the Complexity of an Inexact Restoration Method for Constrained Optimization
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- On large-scale unconstrained optimization and arbitrary regularization
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
Uses Software
Cites Work
- On the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimization
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Evaluating bound-constrained minimization software
- Algebraic rules for quadratic regularization of Newton's method
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Cubic regularization of Newton method and its global performance
- A New Matrix-Free Algorithm for the Large-Scale Trust-Region Subproblem
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Computing a Trust Region Step
- Algorithm 873
- LAPACK Users' Guide
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Trust Region Methods
- A method for the solution of certain non-linear problems in least squares
- Benchmarking optimization software with performance profiles