On the steplength selection in gradient methods for unconstrained optimization
Publication:2422865
DOI: 10.1016/j.amc.2017.07.037 · zbMath: 1426.65082 · OpenAlex: W2739680746 · Wikidata: Q58832693 · Scholia: Q58832693 · MaRDI QID: Q2422865
Daniela di Serafino, Gerardo Toraldo, Valeria Ruggiero, Luca Zanni
Publication date: 21 June 2019
Published in: Applied Mathematics and Computation
Full work available at URL: http://hdl.handle.net/11380/1146848
MSC classification:
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Numerical optimization and variational techniques (65K10)
- Quadratic programming (90C20)
Related Items
- Newton projection method as applied to assembly simulation
- Geometrical inverse matrix approximation for least-squares problems and acceleration strategies
- Ritz-like values in steplength selections for stochastic gradient methods
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- Split Bregman iteration for multi-period mean variance portfolio optimization
- Subsampled nonmonotone spectral gradient methods
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- A homogeneous Rayleigh quotient with applications in gradient methods
- Adaptive \(l_1\)-regularization for short-selling control in portfolio selection
- Analysis of the Barzilai-Borwein step-sizes for problems in Hilbert spaces
- Gradient method with multiple damping for large-scale unconstrained optimization
- A novel heuristic algorithm for solving engineering optimization and real-world problems: people identity attributes-based information-learning search optimization
- Foreword to the special issue "Recent trends in numerical computations: theory and algorithms"
- A gradient method exploiting the two dimensional quadratic termination property
- Linear convergence rate analysis of a class of exact first-order distributed methods for weight-balanced time-varying networks and uncoordinated step sizes
- On the stationarity for nonlinear optimization problems with polyhedral constraints
- A hybrid BB-type method for solving large scale unconstrained optimization
- Spectral Properties of Barzilai-Borwein Rules in Solving Singly Linearly Constrained Optimization Problems Subject to Lower and Upper Bounds
- A collection of efficient retractions for the symplectic Stiefel manifold
- On the Preconditioned Delayed Weighted Gradient Method
- Computation of Sum of Squares Polynomials from Data Points
- A harmonic framework for stepsize selection in gradient methods
- A family of modified spectral projection methods for nonlinear monotone equations with convex constraint
- Gradient methods exploiting spectral properties
- Variable metric techniques for forward-backward methods in imaging
- New stepsizes for the gradient method
- A relaxed interior point method for low-rank semidefinite programming problems with applications to matrix completion
- Fused Lasso approach in portfolio selection
- Using gradient directions to get global convergence of Newton-type methods
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- Properties of the delayed weighted gradient method
- A Two-Phase Gradient Method for Quadratic Programming Problems with a Single Linear Constraint and Bounds on the Variables
- A delayed weighted gradient method for strictly convex quadratic minimization
- Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- On the global convergence of a new spectral residual algorithm for nonlinear systems of equations
- Dirichlet problem for a nonlocal \(p\)-Laplacian elliptic equation
- On the asymptotic convergence and acceleration of gradient methods
- Solving nonlinear systems of equations via spectral residual methods: stepsize selection and applications
- On the inexact scaled gradient projection method
- A family of spectral gradient methods for optimization
- Semi-supervised generalized eigenvalues classification
- A view of computational models for image segmentation
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
- On the acceleration of the Barzilai-Borwein method
Uses Software
Cites Work
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- On the application of the spectral projected gradient method in image segmentation
- On the worst case performance of the steepest descent algorithm for quadratic functions
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- Estimation of spectral bounds in gradient algorithms
- Gradient algorithms for quadratic optimization with fast convergence rates
- An efficient gradient method using the Yuan steplength
- Duality-based algorithms for total-variation-regularized image restoration
- A limited memory steepest descent method
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules
- Gradient methods with adaptive step-sizes
- A new steplength selection for scaled gradient methods with application to image deblurring
- On nonmonotone Chambolle gradient projection algorithms for total variation image restoration
- New adaptive stepsize selections in gradient methods
- Estimation of the optimal constants and the thickness of thin films using unconstrained optimization
- Introductory lectures on convex optimization. A basic course.
- On the behavior of the gradient norm in the steepest descent method
- Analysis of monotone gradient methods
- On the asymptotic behaviour of some new gradient methods
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A scaled gradient projection method for constrained image deblurring
- Two-Point Step Size Gradient Methods
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- $R$-linear convergence of limited memory steepest descent
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Full convergence of the steepest descent method with inexact line searches
- Gradient projection methods for quadratic programs and applications in training support vector machines
- A Rapidly Convergent Descent Method for Minimization
- The cyclic Barzilai-Borwein method for unconstrained optimization
- Distribution of eigenvalues for some sets of random matrices
- Adaptive two-point stepsize gradient algorithm
- On the nonmonotone line search
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- On the steepest descent algorithm for quadratic functions