Minimization of \(SC^1\) functions and the Maratos effect
Publication: 1905075
DOI: 10.1016/0167-6377(94)00059-F
zbMath: 0843.90108
OpenAlex: W1980428414
MaRDI QID: Q1905075
Publication date: 19 August 1996
Published in: Operations Research Letters
Full work available at URL: https://doi.org/10.1016/0167-6377(94)00059-f
Keywords: unconstrained minimization; line search; continuously differentiable function; Maratos effect; semismooth gradient
Related Items (37)
- Massive data classification via unconstrained support vector machines
- A truncated Newton method in an augmented Lagrangian framework for nonlinear programming
- Constructing a sequence of discrete Hessian matrices of an \(SC^{1}\) function uniformly convergent to the generalized Hessian matrix
- A preconditioning proximal Newton method for nondifferentiable convex optimization
- A nonsmooth inexact Newton method for the solution of large-scale nonlinear complementarity problems
- Robust recursive quadratic programming algorithm model with global and superlinear convergence properties
- A fast eigenvalue approach for solving the trust region subproblem with an additional linear inequality
- Local feasible QP-free algorithms for the constrained minimization of SC\(^1\) functions
- Newton method for \(\ell_0\)-regularized optimization
- A perturbation approach for an inverse quadratic programming problem
- Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials
- Unnamed Item
- Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
- A nonsmooth Newton method for solving the generalized complementarity problem
- Chunking for massive nonlinear kernel classification
- Unconstrained optimization reformulation of the generalized nonlinear complementarity problem and related method
- Exactness conditions for a convex differentiable exterior penalty for linear programming
- Breast tumor susceptibility to chemotherapy via support vector machines
- An infeasible SSLE filter algorithm for general constrained optimization without strict complementarity
- Inverse and implicit function theorems for H-differentiable and semismooth functions
- Globally and superlinearly convergent QP-free algorithm for nonlinear constrained optimization
- An accurate active set Newton algorithm for large scale bound constrained optimization
- A Newton method for linear programming
- Differentiability and semismoothness properties of integral functions and their applications
- Sequential systems of linear equations method for general constrained optimization without strict complementarity
- A finite Newton method for classification
- An accelerated active-set algorithm for a quadratic semidefinite program with general constraints
- An augmented Lagrangian method for a class of inverse quadratic programming problems
- Multicategory proximal support vector machine classifiers
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- Multicategory proximal support vector machine classifiers
- The \(SC^1\) property of an expected residual function arising from stochastic complementarity problems
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
- Globally and superlinearly convergent algorithms for the solution of box-constrained optimization
- Theoretical and numerical investigation of the D-gap function for box constrained variational inequalities
- A Newton-type algorithm for generalized linear complementarity problem over a polyhedral cone
- Newton-type methods for stochastic programming
Cites Work
- Unnamed Item
- Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data
- Superlinearly convergent approximate Newton methods for LC\(^1\) optimization problems
- Robust recursive quadratic programming algorithm model with global and superlinear convergence properties
- Computational schemes for large-scale problems in extended linear-quadratic programming
- On concepts of directional differentiability
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- A nonsmooth version of Newton's method
- Nonsmooth Equations: Motivation and Algorithms
- Generalized Linear-Quadratic Problems of Deterministic and Stochastic Optimal Control in Discrete Time
- Local structure of feasible sets in nonlinear programming, Part III: Stability and sensitivity
- On second-order sufficient optimality conditions for \(C^{1,1}\)-optimization problems
- Semismooth and Semiconvex Functions in Constrained Optimization
- Generalized second-order directional derivatives and optimization with \(C^{1,1}\) functions
- Exact Penalty Functions in Constrained Optimization
- Refinements of necessary optimality conditions in nondifferentiable programming II