A limited memory BFGS subspace algorithm for bound constrained nonsmooth problems
Publication: 2069440
DOI: 10.1186/s13660-020-02398-6
zbMath: 1503.90133
OpenAlex: W3031943549
MaRDI QID: Q2069440
Publication date: 20 January 2022
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-020-02398-6
Numerical mathematical programming methods (65K05)
Convex programming (90C25)
Large-scale problems in mathematical programming (90C06)
Nonlinear programming (90C30)
Nonsmooth analysis (49J52)
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A new subspace limited memory BFGS algorithm for large-scale bound constrained optimization
- Proximity control in bundle methods for convex nondifferentiable minimization
- A subgradient projection algorithm
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- Convergence of some algorithms for convex minimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- Non-Euclidean restricted memory level method for large-scale convex optimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A nonsmooth version of Newton's method
- The global convergence of a modified BFGS method for nonconvex functions
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- Comparing different nonsmooth minimization methods and software
- Optimization and nonsmooth analysis
- Projected gradient methods for linearly constrained problems
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- An Active Set Newton Algorithm for Large-Scale Nonlinear Programs with Box Constraints
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Limited Memory Algorithm for Bound Constrained Optimization
- New limited memory bundle method for large-scale nonsmooth optimization
- A Kuhn–Tucker Algorithm
- Benchmarking optimization software with performance profiles