Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
Publication: 5020852
DOI: 10.1137/21M1409445 · zbMath: 1483.90126 · arXiv: 2103.14987 · MaRDI QID: Q5020852
Sheng-Long Zhou, Li-Li Pan, Hou-Duo Qi, Nai-Hua Xiu
Publication date: 7 January 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2103.14987
Keywords: Newton's method; optimality conditions; locally quadratic convergence; \(0/1\) loss function; superior numerical performance
MSC classifications: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Related Items
Curved elements in weak Galerkin finite element methods ⋮ A majorization penalty method for SVM with sparse constraint
Uses Software
Cites Work
- Supersparse linear integer models for optimized medical scoring systems
- A smoothing Newton-type algorithm of stronger convergence for the quadratically constrained convex quadratic programming
- Noisy 1-bit compressive sensing: models and algorithms
- Analysis of the consistency of a mixed integer programming-based multi-category constrained discriminant model
- Non-parametric analysis of a generalized regression model. The maximum rank correlation estimator
- Solving mixed integer classification problems by decomposition
- On the difficulty of approximately maximizing agreements
- A smoothing Newton method for general nonlinear complementarity problems
- Support-vector networks
- Regularization networks and support vector machines
- Semi-smooth Newton methods for state-constrained optimal control problems
- On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed $\ell_q$ Minimization
- Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms
- Binarized Support Vector Machines
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Support Vector Machines with the Ramp Loss and the Hard Margin Loss
- Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
- Robust Truncated Hinge Loss Support Vector Machines
- Integer Programming Solution of a Classification Problem
- Global and superlinear convergence of the smoothing Newton method and its application to general box constrained variational inequalities
- The Primal-Dual Active Set Strategy as a Semismooth Newton Method
- Robust 1-bit Compressive Sensing Using Adaptive Outlier Pursuit
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
- Agnostic Learning of Monomials by Halfspaces Is Hard