The hybrid BFGS-CG method in solving unconstrained optimization problems
From MaRDI portal
Publication: 1724270
DOI: 10.1155/2014/507102 · zbMath: 1470.90159 · OpenAlex: W2068557887 · Wikidata: Q59038891 · Scholia: Q59038891 · MaRDI QID: Q1724270
Mohd Asrul Hery Ibrahim, Mustafa Mamat, Wah June Leong
Publication date: 14 February 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2014/507102
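As background for this record: hybrid BFGS-CG methods combine a quasi-Newton (BFGS) search direction with a conjugate gradient fallback for unconstrained minimization. The sketch below is a generic illustration of that idea, not the exact algorithm of the cited paper; the descent test, the Polak-Ribière fallback, and the Armijo backtracking parameters are all illustrative assumptions.

```python
import numpy as np

def hybrid_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic hybrid BFGS-CG sketch (not the paper's exact method):
    try the BFGS direction; if it fails a descent test, fall back to
    a Polak-Ribiere CG direction, then to steepest descent."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    g_prev, d_prev = g, -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # BFGS trial direction
        if d @ g > -1e-10 * np.linalg.norm(d) * np.linalg.norm(g):
            # insufficient descent: Polak-Ribiere CG fallback (assumed rule)
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))
            d = -g + beta * d_prev
            if d @ g >= 0.0:            # still not descent: steepest descent
                d = -g
        # Armijo backtracking line search
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:                  # curvature holds: safe BFGS update
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g_prev, g, d_prev = x_new, g, g_new, d
    return x

# Illustrative use on the Rosenbrock function
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = hybrid_bfgs_cg(rosen, rosen_grad, [-1.2, 1.0])
```

Benchmarks such as those in "Testing Unconstrained Optimization Software" (cited below) use problems of exactly this kind to compare such hybrids against pure BFGS or pure CG.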
Related Items (2)
- Bayesian inversion with α-stable priors
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
Uses Software
Cites Work
- Convergence of quasi-Newton method with new inexact line search
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- The Gauss-Seidel-quasi-Newton method: a hybrid algorithm for solving dynamic economic models
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Restart procedures for the conjugate gradient method
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On Steepest Descent
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.