An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
Publication: 360475
DOI: 10.1007/s11590-012-0491-7
zbMath: 1276.90047
OpenAlex: W1994949122
MaRDI QID: Q360475
Luc Pronzato, Elena Bukina, Anatoly A. Zhigljavsky
Publication date: 27 August 2013
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-012-0491-7
Keywords: quadratic optimization; Fibonacci numbers; conjugate gradient; gradient algorithms; arcsine distribution; estimation of leading eigenvalues
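For context only (not part of the record): the paper concerns gradient algorithms for minimising a quadratic \(f(x) = \frac{1}{2}x^\top A x - b^\top x\) with \(A\) symmetric positive definite. The sketch below shows plain steepest descent with exact line search, the classical baseline that methods of this kind aim to improve upon asymptotically; it is not the paper's algorithm, and the function name and parameters are illustrative.

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=10_000):
    """Minimise f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A.

    Baseline gradient method with the exact line-search step size;
    illustrative sketch only, not the algorithm of the cited paper.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                # gradient of the quadratic
        if np.linalg.norm(g) < tol:  # stop when the gradient is tiny
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact minimiser along -g
        x = x - alpha * g
    return x

# Usage on a random SPD system
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)        # well-conditioned SPD matrix
b = rng.standard_normal(50)
x = steepest_descent_quadratic(A, b, np.zeros(50))
print(np.linalg.norm(A @ x - b))     # residual norm, ~tol at convergence
```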
Related Items (6)
- Estimation of spectral bounds in gradient algorithms
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- On the steplength selection in gradient methods for unconstrained optimization
- Spectral Properties of Barzilai-Borwein Rules in Solving Singly Linearly Constrained Optimization Problems Subject to Lower and Upper Bounds
- On \(R\)-linear convergence analysis for a class of gradient methods
- Performance analysis of greedy algorithms for minimising a maximum mean discrepancy
Cites Work
- Unnamed Item
- Unnamed Item
- Gradient algorithms for quadratic optimization with fast convergence rates
- The block preconditioned conjugate gradient method on vector computers
- Efficient and reliable iterative methods for linear systems
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb R^{ d }\) and Hilbert spaces
- Studying Convergence of Gradient Algorithms Via Optimal Experimental Design Theory
- A Dynamical-System Analysis of the Optimum s-Gradient Algorithm
- Practical Use of Polynomial Preconditionings for the Conjugate Gradient Method
- Two-Point Step Size Gradient Methods