A cubic regularization of Newton's method with finite difference Hessian approximations
From MaRDI portal
Publication: 2138398
DOI: 10.1007/s11075-021-01200-y
zbMath: 1492.65166
OpenAlex: W3205980973
Wikidata: Q114224288 (Scholia: Q114224288)
MaRDI QID: Q2138398
Publication date: 11 May 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-021-01200-y
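The method named in the title can be illustrated with a minimal sketch: a cubic-regularized Newton iteration in which the Hessian is replaced by a forward-difference approximation built from gradient evaluations. Everything below (function names, the toy objective, the acceptance rule for the regularization parameter, and the fixed-point solver for the cubic subproblem) is an illustrative assumption, not the paper's actual algorithm or numerical setup.

```python
import numpy as np

def fd_hessian(grad, x, h=1e-6):
    """Forward-difference Hessian approximation: column j is
    (grad(x + h*e_j) - grad(x)) / h, symmetrized afterwards."""
    n = x.size
    g0 = grad(x)
    H = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        H[:, j] = (grad(x + e) - g0) / h
    return 0.5 * (H + H.T)

def cubic_step(g, H, sigma, iters=50):
    """Approximately minimize the cubic model
        g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    via the fixed-point iteration r <- ||(H + sigma*r*I)^{-1} g||,
    which is valid while the shifted matrix stays positive definite."""
    n = g.size
    r = np.linalg.norm(g)
    s = -g
    for _ in range(iters):
        s = np.linalg.solve(H + sigma * r * np.eye(n), -g)
        r_new = np.linalg.norm(s)
        if abs(r_new - r) < 1e-12:
            break
        r = r_new
    return s

def cubic_newton(f, grad, x0, sigma=1.0, tol=1e-8, max_iter=100):
    """Basic cubic-regularized Newton loop with finite-difference
    Hessians; a step is accepted only if it decreases f, otherwise
    the regularization parameter sigma is increased (hypothetical
    update rule, for illustration only)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = fd_hessian(grad, x)
        s = cubic_step(g, H, sigma)
        if f(x + s) < f(x):
            x = x + s
            sigma = max(sigma / 2, 1e-4)
        else:
            sigma *= 2
    return x

# Toy smooth objective with a unique minimizer (illustrative only).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 0.5) ** 2 + 0.1 * x[0] ** 4
grad = lambda x: np.array([2 * (x[0] - 1) + 0.4 * x[0] ** 3,
                           4 * (x[1] + 0.5)])
x_star = cubic_newton(f, grad, np.array([3.0, 3.0]))
```

The appeal of this combination, as the title suggests, is that second-order-style convergence guarantees can be retained while the user supplies only gradients, with Hessian information recovered from n extra gradient calls per iteration.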
Cites Work
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Fast derivatives of likelihood functionals for ODE based models using adjoint-state method
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- Lower bounds for finding stationary points I
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Cubic regularization of Newton method and its global performance
- A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Testing Unconstrained Optimization Software
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- An improvement of adaptive cubic regularization method for unconstrained optimization problems
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians