Multi-step quasi-Newton methods for optimization
From MaRDI portal
Publication: 1334773
DOI: 10.1016/0377-0427(94)90309-3 · zbMath: 0807.65062 · OpenAlex: W2054094588 · MaRDI QID: Q1334773
Publication date: 22 September 1994
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/0377-0427(94)90309-3
Related Items (48)
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- A multi-iterate method to solve systems of nonlinear equations
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Estimation of the variance matrix in bivariate classical measurement error models
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
- Minimum curvature multistep quasi-Newton methods
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- A hybrid quasi-Newton method with application in sparse recovery
- Using function-values in multi-step quasi-Newton methods
- A new two-step gradient-type method for large-scale unconstrained optimization
- Extra multistep BFGS updates in quasi-Newton methods
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Two-step conjugate gradient method for unconstrained optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A modified secant equation quasi-Newton method for unconstrained optimization
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- An improved nonlinear conjugate gradient method with an optimal property
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- Global convergence property of scaled two-step BFGS method
- Higher order curvature information and its application in a modified diagonal Secant method
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Two new conjugate gradient methods based on modified secant equations
- A descent family of Dai–Liao conjugate gradient methods
- A new class of memory gradient methods with inexact line searches
- Implicit updates in multistep quasi-Newton methods
- A nonlinear model for function-value multistep methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- A modified BFGS algorithm based on a hybrid secant equation
- Accelerated augmented Lagrangian method for total variation minimization
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A new super-memory gradient method with curve search rule
- Variable metric methods for unconstrained optimization and nonlinear least squares
- The use of alternation and recurrences in two-step quasi-Newton methods
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
Uses Software
Cites Work
- On the use of curvature estimates in quasi-Newton methods
- On the construction of minimization methods of quasi-Newton type
- On the solution of highly structured nonlinear equations
- On the use of function-values in unconstrained optimisation
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- A Quasi-Newton Method Employing Direct Secant Updates of Matrix Factorizations
- Correction to the Paper on Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- Testing Unconstrained Optimization Software
- Direct Secant Updates of Matrix Factorizations
- Inversionsfreie Verfahren zur Lösung nichtlinearer Operatorgleichungen [Inversion-free methods for solving nonlinear operator equations]
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- On Large Scale Nonlinear Least Squares Calculations
- A conjugate direction implementation of the BFGS algorithm with automatic scaling
- A Quadratically Convergent Krawczyk-Like Algorithm
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- Quasi-Newton Methods for Discretized Non-linear Boundary Problems