A globally convergent proximal Newton-type method in nonsmooth convex optimization
From MaRDI portal
Publication: Q2687066
DOI: 10.1007/s10107-022-01797-5
OpenAlex: W3105742279
MaRDI QID: Q2687066
Boris S. Mordukhovich, Shangzhi Zeng, Jin Zhang, Xiao-Ming Yuan
Publication date: 1 March 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2011.08166
Keywords: global and local convergence, machine learning, nonsmooth convex optimization, metric subregularity, proximal Newton methods
Related Items
- COAP 2021 best paper prize
- Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials
- A Regularized Newton Method for \(\ell_q\)-Norm Composite Optimization Problems
- Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
- Minimizing oracle-structured composite functions
- Convergence Rate of Inexact Proximal Point Algorithms for Operator with Hölder Metric Subregularity
- Inexact proximal Newton methods in Hilbert spaces
- A Riemannian Proximal Newton Method
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Metric subregularity of order \(q\) and the solving of inclusions
- Lectures on convex optimization
- Higher-order metric subregularity and its applications
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Local behavior of an iterative framework for generalized equations with nonisolated solutions
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Hölder weak sharp minimizers and Hölder tilt-stability
- Linear convergence of first order methods for non-strongly convex optimization
- Inexact successive quadratic approximation for regularized optimization
- Pathwise coordinate optimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- A generalized proximal point algorithm for certain non-convex minimization problems
- Generalized equations and their solutions, part II: Applications to nonlinear programming
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Monotone Operators and the Proximal Point Algorithm
- Variational Analysis
- Variational Analysis and Applications
- Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
- First-Order Methods in Optimization
- Convergence Properties of the Inexact Levenberg-Marquardt Method under Local Error Bound Conditions
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Hölder Metric Subregularity with Applications to Proximal Point Method
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Newton-Type Methods for Optimization and Variational Problems
- Metric subregularity of the convex subdifferential in Banach spaces
- Implicit Functions and Solution Mappings