A new conjugate gradient algorithm for training neural networks based on a modified secant equation
Publication: Q905328
DOI: 10.1016/j.amc.2013.06.101 · zbMath: 1329.65128 · OpenAlex: W2018704488 · MaRDI QID: Q905328
Ioannis E. Livieris, Panagiotis Pintelas
Publication date: 19 January 2016
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2013.06.101
Keywords: global convergence · artificial neural networks · modified secant equation · descent conjugate gradient algorithm
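For context, the "modified secant equation" of the title generalizes the standard quasi-Newton secant condition by incorporating function values as well as gradients. A minimal LaTeX sketch, assuming the Zhang-Deng-Chen form that appears in the works cited below; the exact variant adopted by Livieris and Pintelas may differ:

% Standard secant condition: B_{k+1} s_k = y_k, with
%   s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
% Modified condition incorporating function values (Zhang-Deng-Chen type):
\[
B_{k+1} s_k = \tilde{y}_k,
\qquad
\tilde{y}_k = y_k + \frac{\vartheta_k}{s_k^{\top} u_k}\, u_k,
\]
\[
\vartheta_k = 6\bigl(f(x_k) - f(x_{k+1})\bigr) + 3\bigl(g_k + g_{k+1}\bigr)^{\top} s_k,
\]
% where u_k is any vector with s_k^{\top} u_k \neq 0 (u_k = s_k is a common choice).

Using \tilde{y}_k in place of y_k lets the update exploit curvature information carried by the function values, which is the ingredient the conjugate gradient method of this paper builds on.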
Related Items (6)
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A new class of nonmonotone conjugate gradient training algorithms
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- Maximizing Downlink Channel Capacity of NOMA System Using Power Allocation Based on Channel Coefficients Using Particle Swarm Optimization and Back Propagation Neural Network
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- An adaptive conjugate gradient learning algorithm for efficient training of neural networks
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Using function-values in multi-step quasi-Newton methods
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- New quasi-Newton methods for unconstrained optimization problems
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- A comparative study of neural network based feature extraction paradigms
- Numerical Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- Methods of conjugate gradients for solving linear systems
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles