A fast and simple modification of Newton's method avoiding saddle points
From MaRDI portal
Publication: 6086150
DOI: 10.1007/s10957-023-02270-9 · OpenAlex: W4384662421 · MaRDI QID: Q6086150
Tat Dat Tô, Hoang Phuong Nguyen, Maged Helmy, Hang-Tuan Nguyen, Thu Hang Nguyen, Tuyen Trung Truong
Publication date: 9 November 2023
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-023-02270-9
Keywords: rate of convergence; saddle points; Newton-type method; backtracking line search; roots of univariate meromorphic functions
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Newton-type methods (49M15); Numerical methods based on nonlinear programming (49M37); Dynamical systems in optimization and economics (37N40)
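The title and keywords describe a Newton-type method that avoids saddle points. A minimal, hypothetical sketch of the generic idea behind such methods (not necessarily the authors' exact algorithm): eigendecompose the Hessian, replace its eigenvalues by their absolute values (flooring near-zero ones at a small `delta` of my choosing), so negative-curvature directions repel rather than attract the iterate. All names here are illustrative assumptions.

```python
import numpy as np

def saddle_avoiding_newton_step(grad, hess, delta=1e-6):
    """One Newton-type step that cannot be attracted to a saddle point.

    Illustrative sketch: eigenvalues of the symmetric Hessian are replaced
    by their absolute values (tiny ones floored at `delta`), so directions
    of negative curvature are reflected into descent directions.
    """
    lam, V = np.linalg.eigh(hess)                      # symmetric eigendecomposition
    lam = np.where(np.abs(lam) < delta, delta, np.abs(lam))
    return -V @ ((V.T @ grad) / lam)                   # modified Newton direction

# Demo: f(x, y) = (x^2 - 1)^2 + y^2 has a saddle at the origin and minima
# at (+-1, 0); plain Newton started near the origin is drawn to the saddle.
grad = lambda p: np.array([4 * p[0] ** 3 - 4 * p[0], 2 * p[1]])
hess = lambda p: np.array([[12 * p[0] ** 2 - 4, 0.0], [0.0, 2.0]])

x = np.array([0.1, 0.5])                               # start near the saddle
for _ in range(50):
    x = x + saddle_avoiding_newton_step(grad(x), hess(x))
```

Starting from (0.1, 0.5), the iterates move away from the saddle at the origin and settle at the minimizer (1, 0); the actual paper additionally couples such a step with a backtracking line search (per the keywords), which this sketch omits.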
Cites Work
- A stabilized SQP method: superlinear convergence
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
- A regularized Newton method without line search for unconstrained optimization
- Mathematical problems for the next century
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Perturbation theory for linear operators.
- A regularized Newton method for degenerate unconstrained optimization problems
- Negativity of Lyapunov exponents and convergence of generic random polynomial dynamical systems and random relaxed Newton's methods
- Backtracking gradient descent method and some applications in large scale optimisation. II: Algorithms and experiments
- Local convergence of the Levenberg-Marquardt method under Hölder metric subregularity
- Cubic regularization of Newton method and its global performance
- Minimization of functions having Lipschitz continuous first partial derivatives
- A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
- A Machine Method for Solving Polynomial Equations
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- A stabilized SQP method: global convergence
- Finding zeros of Hölder metrically subregular mappings via globally convergent Levenberg–Marquardt methods
- An Inertial Newton Algorithm for Deep Learning
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- A Numerical Method for Locating the Zeros of an Analytic Function
- A method for the solution of certain non-linear problems in least squares