Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
DOI: 10.1137/20M1321140
MaRDI QID: Q5094911
Qinmeng Zou, Frédéric Magoulès
Publication date: 5 August 2022
Published in: SIAM Review
Classification (MSC):
- Research exposition (monographs, survey articles) pertaining to numerical analysis (65-02)
- Iterative numerical methods for linear systems (65F10)
- Linear equations (linear algebraic aspects) (15A06)
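As a quick orientation to the record's topic (an illustrative sketch, not material from the record itself): "delayed gradient" methods for a symmetric positive definite system \(Ax = b\) reuse steplength information from earlier iterations. The Barzilai-Borwein (BB1) method is the canonical example: on the quadratic \(f(x) = \tfrac12 x^{T}Ax - b^{T}x\), its steplength equals the exact Cauchy steplength of the *previous* gradient, \(\alpha_k = g_{k-1}^{T}g_{k-1} / (g_{k-1}^{T}Ag_{k-1})\). The function name `bb_gradient` below is a hypothetical illustration under these assumptions.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Barzilai-Borwein (delayed) gradient iteration for SPD A x = b.

    Illustrative sketch: the steplength used at step k is the Cauchy
    steplength of the gradient from step k-1, hence "delayed gradient".
    """
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                       # gradient of 0.5 x^T A x - b^T x
    alpha = (g @ g) / (g @ (A @ g))     # first step: plain Cauchy steplength
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol * np.linalg.norm(b):
            break
        g_prev = g
        x = x - alpha * g
        g = A @ x - b
        # delayed steplength: Cauchy step of the *previous* gradient (BB1)
        alpha = (g_prev @ g_prev) / (g_prev @ (A @ g_prev))
    return x

# usage on a small SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = bb_gradient(A, b, np.zeros(2))
```

Unlike the classical steepest descent method, this iteration is nonmonotone in \(f\), but (as several entries in the list below discuss) it converges R-linearly for SPD quadratics and is often much faster in practice.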
Cites Work
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- On the worst case performance of the steepest descent algorithm for quadratic functions
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- Estimation of spectral bounds in gradient algorithms
- Gradient algorithms for quadratic optimization with fast convergence rates
- The chaotic nature of faster gradient descent methods
- A new analysis on the Barzilai-Borwein gradient method
- An efficient gradient method using the Yuan steplength
- A high-performance, portable implementation of the MPI message passing interface standard
- A limited memory steepest descent method
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Gradient methods with adaptive step-sizes
- A new steplength selection for scaled gradient methods with application to image deblurring
- A short note on the Q-linear convergence of the steepest descent method
- New adaptive stepsize selections in gradient methods
- An affine-scaling interior-point CBB method for box-constrained optimization
- s-step iterative methods for symmetric linear systems
- On the determinants of moment matrices
- A historical overview of iterative methods
- Iterative solution of linear systems in the 20th century
- On the behavior of the gradient norm in the steepest descent method
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Analysis of monotone gradient methods
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- On the asymptotic convergence and acceleration of gradient methods
- On the acceleration of the Barzilai-Borwein method
- Fast gradient methods with alignment for symmetric linear systems without using Cauchy step
- A family of spectral gradient methods for optimization
- Steepest descent method with random step lengths
- On the steplength selection in gradient methods for unconstrained optimization
- On the asymptotic behaviour of some new gradient methods
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- A new gradient method with an optimal stepsize property
- On the asymptotic directions of the s-dimensional optimum gradient method
- Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The University of Florida sparse matrix collection
- Gradient Methods for Large Scale Convex Quadratic Functions
- Gradient-Based Methods for Sparse Recovery
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A New Active Set Algorithm for Box Constrained Optimization
- Two-Point Step Size Gradient Methods
- Some remarks on the method of minimal residues
- An Iterative Solution Method for Linear Systems of Which the Coefficient Matrix is a Symmetric M-Matrix
- Gradient Method with Retards and Generalizations
- Inexact and Preconditioned Uzawa Algorithms for Saddle Point Problems
- Alternate minimization gradient method
- Inexact spectral projected gradient methods on convex sets
- Alternate step gradient method
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- The Riemannian Barzilai–Borwein method with nonmonotone line search and the matrix geometric mean computation
- R-linear convergence of limited memory steepest descent
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Handling nonpositive curvature in a limited memory steepest descent method
- A Nonmonotone Line Search Technique for Newton’s Method
- A Sequential Linear Constraint Programming Algorithm for NLP
- Equipping the Barzilai–Borwein Method with the Two Dimensional Quadratic Termination Property
- Parameter estimation in the Hermitian and skew-Hermitian splitting method using gradient iterations
- Gradient methods exploiting spectral properties
- Gradient descent and fast artificial time integration
- On the Barzilai and Borwein choice of steplength for the gradient method
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- An Iterative Least-Square Method Suitable for Solving Large Sparse Matrices
- The cyclic Barzilai–Borwein method for unconstrained optimization
- The general theory of relaxation methods applied to linear systems
- Methods of conjugate gradients for solving linear systems
- Solving linear algebraic equations can be interesting
- On the nonmonotone line search
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- On the steepest descent algorithm for quadratic functions