Using gradient directions to get global convergence of Newton-type methods
From MaRDI portal
Publication:2244123
DOI: 10.1016/j.amc.2020.125612 · OpenAlex: W3080939493 · MaRDI QID: Q2244123
Marco Viola, Gerardo Toraldo, Daniela di Serafino
Publication date: 11 November 2021
Published in: Applied Mathematics and Computation
Full work available at URL: https://arxiv.org/abs/2004.00968
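The title describes globalizing a Newton-type method by switching to gradient directions. A minimal, hypothetical sketch of that general idea (not the paper's exact algorithm): take the Newton step when it is a descent direction, fall back to the negative gradient otherwise, and enforce sufficient decrease with an Armijo backtracking line search.

```python
import numpy as np

def globalized_newton(f, grad, hess, x0, tol=1e-8, max_iter=200):
    """Newton-type iteration with a gradient fall-back and Armijo line search.

    Illustrative sketch only: the switching and line-search rules here are
    generic choices, not those analyzed in the cited publication.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Try the Newton direction; revert to -g if the Hessian is singular
        # or the Newton step fails a descent test.
        try:
            d = np.linalg.solve(hess(x), -g)
            if g @ d >= 0.0:
                d = -g
        except np.linalg.LinAlgError:
            d = -g
        # Armijo backtracking line search (sufficient-decrease condition).
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x

# Example: minimize the Rosenbrock function from the classical start point.
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def hess(x):
    return np.array([[2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
                     [-400 * x[0], 200.0]])

x_star = globalized_newton(f, grad, hess, np.array([-1.2, 1.0]))
```

The gradient fall-back guarantees every search direction is a descent direction, so the Armijo loop always terminates; far from the solution the method behaves like steepest descent, while near a minimizer with a positive definite Hessian it takes pure Newton steps.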
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Newton-type methods (49M15)
Related Items (2)
- Combined Newton-gradient method for constrained root-finding in chemical reaction networks
- LSOS: Line-search second-order stochastic optimization methods for nonconvex finite sums
Uses Software
Cites Work
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- On the application of the spectral projected gradient method in image segmentation
- A new steplength selection for scaled gradient methods with application to image deblurring
- Modified Cholesky algorithms: A catalog with new approaches
- Introductory lectures on convex optimization. A basic course.
- The BFGS method with exact line searches fails for non-convex objective functions
- Globally convergent algorithms for unconstrained optimization
- A globalization procedure for solving nonlinear systems of equations
- The projected Barzilai-Borwein method with fall-back for strictly convex QCQP problems with separable constraints
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- A gradient-based globalization strategy for the Newton method
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- On the steplength selection in gradient methods for unconstrained optimization
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Efficient gradient projection methods for edge-preserving removal of Poisson noise
- Two-Point Step Size Gradient Methods
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Testing Unconstrained Optimization Software
- Trust Region Methods
- A Two-Phase Gradient Method for Quadratic Programming Problems with a Single Linear Constraint and Bounds on the Variables
- Convergence Properties of the BFGS Algorithm
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.