Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data

From MaRDI portal
Publication:795323

DOI: 10.1007/BF01442169 · zbMath: 0542.49011 · OpenAlex: W1993613781 · MaRDI QID: Q795323

Jean-Baptiste Hiriart-Urruty, Van Hien Nguyen, Jean Jacques Strodiot

Publication date: 1984

Published in: Applied Mathematics and Optimization

Full work available at URL: https://doi.org/10.1007/bf01442169
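
For orientation, a brief sketch of the central object in its standard formulation (notation \(\partial^2 f\) as commonly used; not quoted from the paper, which also treats constrained problems): for a \(C^{1,1}\) function \(f\), i.e. one whose gradient \(\nabla f\) is locally Lipschitz, the generalized Hessian at \(\bar x\) is the convex hull of limits of ordinary Hessians taken at nearby points where \(f\) happens to be twice differentiable (such points are dense by Rademacher's theorem applied to \(\nabla f\)):
\[
\partial^2 f(\bar x) \;=\; \operatorname{co}\Bigl\{\lim_{k\to\infty} \nabla^2 f(x_k) \;:\; x_k \to \bar x,\ f \text{ twice differentiable at } x_k\Bigr\}.
\]
In the unconstrained case the associated second-order conditions then read: if \(\bar x\) is a local minimizer, then \(\nabla f(\bar x)=0\) and for every direction \(d\) some \(A \in \partial^2 f(\bar x)\) satisfies \(\langle A d, d\rangle \ge 0\); conversely, if \(\nabla f(\bar x)=0\) and every \(A \in \partial^2 f(\bar x)\) is positive definite, then \(\bar x\) is a strict local minimizer.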



Related Items

On optimality conditions for nonsmooth vector problems in normed spaces via generalized Hadamard directional derivatives, Necessary conditions for vector optimization in infinite dimension, A subclass of generating set search with convergence to second-order stationary points, Second-order conditions in \(C^{1,1}\) optimization with applications, Local properties of solutions of nonsmooth variational inequalities, Invexity criteria for a class of vector-valued functions, Generalized second-order derivatives and optimality conditions, Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound, Newton-based approach to solving K-SVCR and twin-KSVC multi-class classification in the primal space, Strong Variational Sufficiency for Nonlinear Semidefinite Programming and Its Implications, New second-order limiting directional derivatives and \(C^1\)-optimization, Global well‐posedness for the one‐phase Muskat problem, On Lagrangian L2-norm pinball twin bounded support vector machine via unconstrained convex minimization, A semismooth Newton based dual proximal point algorithm for maximum eigenvalue problem, On the weak second-order optimality condition for nonlinear semidefinite and second-order cone programming, A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds, A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems, A Decomposition Augmented Lagrangian Method for Low-Rank Semidefinite Programming, Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization, Optimality conditions for nonsmooth vector problems in normed spaces, Fréchet Second-Order Subdifferentials of Lagrangian Functions and Optimality Conditions, Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification, On second-order directional derivatives, Chunking for massive nonlinear kernel classification, Inexact Newton Method for Minimization of Convex Piecewise Quadratic Functions, An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming, Second order approximations and dual necessary optimality conditions, On second-order Fritz John type optimality conditions for a class of differentiable optimization problems, On second-order sufficient optimality conditions for \(C^{1,1}\)-optimization problems, Generalised Hessian, max function and weak convexity, Maximality and first-order criteria of r-monotone operators, Second-order necessary optimality conditions via directional regularity, A filter-trust-region method for \(LC^1\) unconstrained optimization and its global convergence, Limiting subhessians, limiting subjets and their calculus, Exactness conditions for a convex differentiable exterior penalty for linear programming, Second-order conditions for constrained vector optimization problems with ℓ-stable data, On Pseudo-Differentiability, Linear programming twin support vector regression, Minimum norm solution to the positive semidefinite linear complementarity problem, A globally and superlinearly convergent trust region method for \(LC^1\) optimization problems, Sufficient Conditions of Isolated Minimizers for Constrained Programming Problems, A finite Newton method for classification, Optimality conditions for \(C^{1,1}\) vector optimization problems, 
Saddlepoint Problems in Nondifferentiable Programming, Multicategory proximal support vector machine classifiers, Exact penalty functions and Lagrange multipliers, Generalized Second Derivatives of Convex Functions and Saddle Functions, A semismooth Newton-CG based dual PPA for matrix spectral norm approximation problems, Strong second-order Karush–Kuhn–Tucker optimality conditions for vector optimization, Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold, Minimal approximate Hessians for continuously Gâteaux differentiable functions, A Convex Matrix Optimization for the Additive Constant Problem in Multidimensional Scaling with Application to Locally Linear Embedding, Parallel implementation of augmented Lagrangian method within L-shaped method for stochastic linear programs, Stability of inclusions: characterizations via suitable Lipschitz functions and algorithms, Controllability of Some Nonlinear Systems with Drift via Generalized Curvature Properties, Constrained Best Euclidean Distance Embedding on a Sphere: A Matrix Optimization Approach, Necessary and Sufficient Optimality Conditions in DC Semi-infinite Programming, Newton Hard-Thresholding Pursuit for Sparse Linear Complementarity Problem via A New Merit Function, Recursive Finite Newton Algorithm for Support Vector Regression in the Primal, On relations and applications of generalized second-order directional derivatives, Approximate generalized Hessians and Taylor's expansions for continuously Gâteaux differentiable functions, Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs, Support functions of the Clarke generalized Jacobian and of its plenary hull, Second-order conditions for efficiency in nonsmooth multiobjective optimization problems, Generalized derivatives and nonsmooth optimization, a finite dimensional tour (with comments and rejoinder), Computing minimum norm solution of linear systems of equations by the generalized Newton method, Metric regularity and second-order necessary optimality conditions for minimization problems under inclusion constraints, Superlinearly convergent approximate Newton methods for \(LC^1\) optimization problems, A set-valued analysis approach to second order differentiation of nonsmooth functions, Massive data classification via unconstrained support vector machines, Optimality conditions for semi-infinite and generalized semi-infinite programs via lower order exact penalty functions, On the convergence properties of a majorized alternating direction method of multipliers for linearly constrained convex optimization problems with coupled objective functions, Second-order optimality conditions for constrained optimization problems with \(C^1\) data via regular and limiting subdifferentials, An ODE-based trust region method for unconstrained optimization problems, A globally convergent Newton method for convex \(SC^1\) minimization problems, Generalized Hessian for \(C^{1,1}\) functions in infinite dimensional normed spaces, On second-order Fritz John type optimality conditions in nonsmooth multiobjective programming, Properties associated with the epigraph of the \(l_1\) norm function of projection onto the nonnegative orthant, Minimization of \(SC^1\) functions and the Maratos effect, Convex composite minimization with \(C^{1,1}\) functions, Augmented Lagrangian methods for convex matrix optimization problems, A fast 
eigenvalue approach for solving the trust region subproblem with an additional linear inequality, Pseudo-Hessian and Taylor's expansion for vector-valued functions, Optimality conditions for reflecting boundary control problems, First and second order optimality conditions using approximations for nonsmooth vector optimization in Banach spaces, Parametric method for global optimization, Second-order subdifferentials of \(C^{1,1}\) functions and optimality conditions, Local feasible QP-free algorithms for the constrained minimization of SC\(^1\) functions, Second-order optimality conditions for nonlinear programs and mathematical programs, Some applications of variational inequalities in nonsmooth analysis, Second-order optimality conditions for inequality constrained problems with locally Lipschitz data, A perturbation approach for an inverse quadratic programming problem, Optimal control of nonconvex sweeping processes with separable endpoints: nonsmooth maximum principle for local minimizers, Existence of augmented Lagrange multipliers: reduction to exact penalty functions and localization principle, On generalized Fenchel-Moreau theorem and second-order characterization for convex vector functions, An improved robust and sparse twin support vector regression via linear programming, New second-order Karush-Kuhn-Tucker optimality conditions for vector optimization, On relations of vector optimization results with \(C^{1,1}\) data, Characterizing convexity of a function by its Fréchet and limiting second-order subdifferentials, Second order optimality conditions for the extremal problem under inclusion constraints, A nonsmooth maximum principle for a controlled nonconvex sweeping process, Optimality conditions for \(C^{1,1}\) constrained multiobjective problems, Positive definiteness of high-order subdifferential and high-order optimality conditions in vector optimization problems, Second-order optimality conditions in minimax optimization problems, Newton-type method for solving systems of linear equations and inequalities, Feasible perturbations of control systems with pure state constraints and applications to second-order optimality conditions, The augmented Lagrangian method for a type of inverse quadratic programming problems over second-order cones, Second-order necessary optimality conditions for optimization problems involving set-valued maps, A new second order optimality conditions for the extremal problem under inclusion constraints, Second-order characterizations of quasiconvexity and pseudoconvexity for differentiable functions with Lipschitzian derivatives, Optimality conditions based on the Fréchet second-order subdifferential, Augmented Lagrangian method within L-shaped method for stochastic linear programs, QSDPNAL: a two-phase augmented Lagrangian method for convex quadratic semidefinite programming, SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints, Inexact variable metric stochastic block-coordinate descent for regularized optimization, An efficient duality-based approach for PDE-constrained sparse optimization, Breast tumor susceptibility to chemotherapy via support vector machines, Differentiability properties of functions that are \(\ell \)-stable at a point, Differentiability properties of \(\ell\)-stable vector functions in infinite-dimensional normed spaces, From scalar to vector optimization., Parametric proximal-point methods, On second-order conditions in unconstrained optimization, 
Stability in generalized differentiability based on a set convergence principle, A generalized Newton algorithm for quantile regression models, Second-order optimality conditions for nondominated solutions of multiobjective programming with \(C^{1,1}\) data, Second-order optimality conditions using approximations for nonsmooth vector optimization problems under inclusion constraints, Bicovariograms and Euler characteristic of random fields excursions, A Newton method for linear programming, Analysis on Newton projection method for the split feasibility problem, Solving equations via the trust region and its application to a class of stochastic linear complementarity problems, Second-order global optimality conditions for optimization problems, An augmented Lagrangian method for a class of Inverse quadratic programming problems, On necessary optimality conditions for nonsmooth vector optimization problems with mixed constraints in infinite dimensions, New second-order optimality conditions for a class of differentiable optimization problems, Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems, On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope, Projection Methods in Conic Optimization, An enhanced Baillon-Haddad theorem for convex functions defined on convex sets, An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming, On the optimal correction of infeasible systems of linear inequalities, The second order optimality conditions for nonlinear mathematical programming with \(C^{1,1}\) data, First and second-order approximations as derivatives of mappings in optimality conditions for nonsmooth vector optimization, Fréchet approach in second-order optimization, A note on second-order optimality conditions, Second-order optimality conditions for the extremal problem under inclusion constraints, Second order optimality conditions for a bilevel optimization problem in terms of approximate Hessians, Higher order optimality conditions with an arbitrary non-differentiable function, Second-order global optimality conditions for convex composite optimization, Distributed coordination for nonsmooth convex optimization via saddle-point dynamics, On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization, Newton's method and quasi-Newton-SQP method for general \(\text{LC}^1\) constrained optimization, An improvement on parametric \(\nu\)-support vector algorithm for classification, Characterization of strict convexity for locally Lipschitz functions, Second-order mollified derivatives and optimization, Limited-memory common-directions method for large-scale optimization: convergence, parallelization, and distributed optimization, A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems, Second-order conditions in \(C^{1,1}\) constrained vector optimization, Lipschitzian inverse functions, directional derivatives, and applications in \(C^{1,1}\) optimization, Descent algorithm for a class of convex nondifferentiable functions, Limiting behavior of the approximate second-order subdifferential of a convex function, Characterizations of strict local minima and necessary conditions for weak sharp minima, Distributed optimization for uncertain Euler-Lagrange systems with local and relative measurements



Cites Work