A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
Publication: 903922
DOI: 10.1007/s12532-015-0086-2 · zbMath: 1333.49042 · OpenAlex: W902317526 · MaRDI QID: Q903922
Publication date: 15 January 2016
Published in: Mathematical Programming Computation
Full work available at URL: https://doi.org/10.1007/s12532-015-0086-2
Keywords: unconstrained optimization · nonsmooth optimization · nonconvex optimization · quasi-Newton methods · line search methods · gradient sampling
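The keywords name the two ingredients referred to in the title: a quasi-Newton (BFGS-type) model combined with gradient sampling and a line search. As a rough illustration of the gradient-sampling ingredient only, and not of this paper's specific algorithm, the following Python sketch computes a descent direction as the negative of an approximate minimum-norm element of the convex hull of gradients sampled near the current iterate. The function name `grad_sampling_direction` and the parameters `n_samples` and `radius` are illustrative choices, not taken from the publication.

```python
import numpy as np
from scipy.optimize import minimize

def grad_sampling_direction(grad_fn, x, n_samples=20, radius=1e-4, rng=None):
    """Illustrative gradient-sampling step: sample gradients in a small ball
    around x and return the negative of the minimum-norm element of their
    convex hull, which serves as an approximate steepest-descent direction
    for a nonsmooth function."""
    rng = np.random.default_rng() if rng is None else rng
    # Sample points uniformly in a box of half-width `radius` around x.
    pts = x + radius * rng.uniform(-1.0, 1.0, size=(n_samples, x.size))
    G = np.vstack([grad_fn(x)] + [grad_fn(p) for p in pts])  # stacked gradients

    # Minimum-norm element of conv{rows of G}:
    # minimize 0.5 * ||G^T w||^2 over the simplex {w >= 0, sum(w) = 1}.
    m = G.shape[0]
    obj = lambda w: 0.5 * np.dot(G.T @ w, G.T @ w)
    cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
    res = minimize(obj, np.full(m, 1.0 / m),
                   bounds=[(0.0, None)] * m, constraints=cons, method="SLSQP")

    g = G.T @ res.x   # approximate minimum-norm (sub)gradient
    return -g         # descent direction

# Example use on the simple nonsmooth function f(x) = |x[0]| + (x[1] - 1)**2.
f_grad = lambda x: np.array([np.sign(x[0]), 2.0 * (x[1] - 1.0)])
d = grad_sampling_direction(f_grad, np.array([0.3, -0.5]))
```

In a full method of the kind the title describes, such a sampled direction would be combined with a quasi-Newton (e.g., BFGS) scaling and a backtracking line search; this sketch omits both and is only meant to make the "gradient sampling" keyword concrete.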
Related Items (10):
- Limited-memory BFGS with displacement aggregation
- Manifold Sampling for Optimization of Nonconvex Functions That Are Piecewise Linear Compositions of Smooth Components
- A hierarchy of spectral relaxations for polynomial optimization
- Trust-region algorithms for training responses: machine learning methods using indefinite Hessian approximations
- A New Sequential Optimality Condition for Constrained Nonsmooth Optimization
- An SL/QP Algorithm for Minimizing the Spectral Abscissa of Time Delay Systems
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- A fast gradient and function sampling method for finite-max functions
- A geometric integration approach to nonsmooth, nonconvex optimisation
- Solving an inverse heat convection problem with an implicit forward operator by using a projected quasi-Newton method
Uses Software
Cites Work
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- Nonsmooth optimization via quasi-Newton methods
- Smoothing methods for nonsmooth, nonconvex minimization
- Robust optimization - methodology and applications
- Methods of descent for nondifferentiable optimization
- Data Fitting Problems with Bounded Uncertainties in the Data
- Optimality Conditions and a Smoothing Trust Region Newton Method for NonLipschitz Optimization
- An adaptive gradient sampling algorithm for non-smooth optimization
- A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- A nonsmooth optimisation approach for the stabilisation of time-delay systems
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Optimization and nonsmooth analysis
- A Method for Solving Certain Quadratic Programming Problems Arising in Nonsmooth Optimization
- Updating Quasi-Newton Matrices with Limited Storage
- Optimization of Lipschitz continuous functions
- An Algorithm for Constrained Optimization with Semismooth Functions
- Derivative-free optimization methods for finite minimax problems
- The Smoothed Spectral Abscissa for Robust Stability Optimization
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- Minimizing the Condition Number for Small Rank Modifications
- New limited memory bundle method for large-scale nonsmooth optimization
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Approximating Subdifferentials by Random Sampling of Gradients
- Robust Portfolio Selection Problems
- Numerical optimization. Theoretical and practical aspects. Transl. from the French
- Compressed sensing
- Benchmarking optimization software with performance profiles.