Multi-step quasi-Newton methods for optimization

From MaRDI portal

DOI: 10.1016/0377-0427(94)90309-3
zbMath: 0807.65062
OpenAlex: W2054094588
MaRDI QID: Q1334773

I. A. Moghrabi, John A. Ford

Publication date: 22 September 1994

Published in: Journal of Computational and Applied Mathematics

Full work available at URL: https://doi.org/10.1016/0377-0427(94)90309-3




Related Items (48)

New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
A multi-iterate method to solve systems of nonlinear equations
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
Estimation of the variance matrix in bivariate classical measurement error models
Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
Minimum curvature multistep quasi-Newton methods
Alternating multi-step quasi-Newton methods for unconstrained optimization
A hybrid quasi-Newton method with application in sparse recovery
Using function-values in multi-step quasi-Newton methods
A new two-step gradient-type method for large-scale unconstrained optimization
Extra multistep BFGS updates in quasi-Newton methods
Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
Improved Hessian approximation with modified secant equations for symmetric rank-one method
Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
A new conjugate gradient algorithm for training neural networks based on a modified secant equation
Two-step conjugate gradient method for unconstrained optimization
Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
Globally convergent modified Perry's conjugate gradient method
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
A modified secant equation quasi-Newton method for unconstrained optimization
Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
An improved nonlinear conjugate gradient method with an optimal property
An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
Global convergence property of scaled two-step BFGS method
Higher order curvature information and its application in a modified diagonal Secant method
A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
Multi-step nonlinear conjugate gradient methods for unconstrained minimization
Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
Two new conjugate gradient methods based on modified secant equations
A descent family of Dai–Liao conjugate gradient methods
A new class of memory gradient methods with inexact line searches
Implicit updates in multistep quasi-Newton methods
A nonlinear model for function-value multistep methods
Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
A modified BFGS algorithm based on a hybrid secant equation
Accelerated augmented Lagrangian method for total variation minimization
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
A new super-memory gradient method with curve search rule
Variable metric methods for unconstrained optimization and nonlinear least squares
The use of alternation and recurrences in two-step quasi-Newton methods
Three-step fixed-point quasi-Newton methods for unconstrained optimisation




This page was built for publication: Multi-step quasi-Newton methods for optimization