Kurdyka-Łojasiewicz property of zero-norm composite functions
From MaRDI portal
Publication: 2026719
DOI: 10.1007/s10957-020-01779-7
zbMath: 1468.90100
arXiv: 1811.04371
OpenAlex: W3107686159
MaRDI QID: Q2026719
Shaohua Pan, Shujun Bi, Yuqia Wu
Publication date: 20 May 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1811.04371
MSC classifications:
- Nonconvex programming, global optimization (90C26)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Fréchet and Gateaux differentiability in optimization (49J50)
Related Items
- A Regularized Newton Method for \({\boldsymbol{\ell}}_{q}\)-Norm Composite Optimization Problems
- Calculus rules of the generalized concave Kurdyka-Łojasiewicz property
- Correction to: ``Kurdyka-Łojasiewicz property of zero-norm composite functions''
Cites Work
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Restricted normal cones and sparsity optimization with affine constraints
- New fractional error bounds for polynomial systems with applications to Hölderian stability in optimization and spectral theory of tensors
- On metric and calmness qualification conditions in subdifferential calculus
- A coordinate gradient descent method for nonsmooth separable minimization
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Generalized subdifferentials of the rank function
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- Error bounds for parametric polynomial systems with applications to higher-order stability analysis and convergence rates
- Generalized power method for sparse principal component analysis
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- Sparse and stable Markowitz portfolios
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Some continuity properties of polyhedral multifunctions
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Variational Analysis
- Variational Analysis and Applications
- A Sparse Completely Positive Relaxation of the Modularity Maximization for Community Detection
- Explicit bounds for the Łojasiewicz exponent in the gradient inequality for polynomials