A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
Publication: 4646678
DOI: 10.1080/10556788.2017.1378652 · zbMath: 1406.49032 · arXiv: 1612.07350 · OpenAlex: W2562219230 · MaRDI QID: Q4646678
Nitish Shirish Keskar, Andreas Wächter
Publication date: 14 January 2019
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1612.07350
Numerical mathematical programming methods (65K05) · Nonlinear programming (90C30) · Newton-type methods (49M15) · Methods of quasi-Newton type (90C53) · Decomposition methods (49M27)
Related Items
- Modeling approaches for addressing unrelaxable bound constraints with unconstrained optimization methods
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- A convergence analysis of the method of codifferential descent
- A Smoothing Active Set Method for Linearly Constrained Non-Lipschitz Nonconvex Optimization
- Solving generalized inverse eigenvalue problems via L-BFGS-B method
Uses Software
Cites Work
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
- Nonsmooth optimization via quasi-Newton methods
- On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
- Globally convergent limited memory bundle method for large-scale nonsmooth optimization
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- On the limited memory BFGS method for large scale optimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization
- Nonsmoothness and a variable metric method
- Adaptive limited memory bundle method for bound constrained large-scale nonsmooth optimization
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- An adaptive gradient sampling algorithm for non-smooth optimization
- Comparing different nonsmooth minimization methods and software
- A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization
- Introduction to Nonsmooth Optimization
- A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
- Limited memory bundle method for large bound constrained nonsmooth optimization: convergence analysis
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Survey of Bundle Methods for Nonsmooth Optimization
- A Limited Memory Algorithm for Bound Constrained Optimization
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- New limited memory bundle method for large-scale nonsmooth optimization
- Benchmarking optimization software with performance profiles