
The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization

From MaRDI portal
Publication:5266534

DOI: 10.1137/16M110280X
zbMath: 1370.90260
MaRDI QID: Q5266534

Ernesto G. Birgin, José Mario Martínez

Publication date: 16 June 2017

Published in: SIAM Journal on Optimization




Related Items

On high-order model regularization for multiobjective optimization
A cubic regularization of Newton's method with finite difference Hessian approximations
Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
On the use of third-order models with fourth-order regularization for unconstrained optimization
Worst-Case Complexity of TRACE with Inexact Subproblem Solutions for Nonconvex Smooth Optimization
Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
A Newton-CG Based Barrier Method for Finding a Second-Order Stationary Point of Nonconvex Conic Optimization with Complexity Guarantees
A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
Regularized Newton Method with Global \({\boldsymbol{\mathcal{O}(1/{k}^2)}}\) Convergence
A Newton-CG Based Augmented Lagrangian Method for Finding a Second-Order Stationary Point of Nonconvex Equality Constrained Optimization with Complexity Guarantees
On the worst-case evaluation complexity of non-monotone line search algorithms
On High-order Model Regularization for Constrained Optimization
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
On Regularization and Active-set Methods with Complexity for Constrained Optimization
Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
Regional complexity analysis of algorithms for nonconvex smooth optimization
A generalized worst-case complexity analysis for non-monotone line searches
A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
On the Complexity of an Inexact Restoration Method for Constrained Optimization
A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
On large-scale unconstrained optimization and arbitrary regularization
Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary

