Kurdyka-Łojasiewicz exponent via Hadamard parametrization
From MaRDI portal
Publication: 6663111
DOI: 10.1137/24m1636186
MaRDI QID: Q6663111
Wenqing Ouyang, Hao Wang, Yuncheng Liu, Ting Kei Pong
Publication date: 14 January 2025
Published in: SIAM Journal on Optimization
Analysis of algorithms and problem complexity (68Q25)
Convex programming (90C25)
Nonconvex programming, global optimization (90C26)
Cites Work
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Lectures on convex optimization
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Semianalytic and subanalytic sets
- Geometry of subanalytic and semialgebraic sets
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Loss landscapes and optimization in over-parameterized non-linear systems and neural networks
- First-order methods almost always avoid strict saddle points
- The equivalence of three types of error bounds for weakly and approximately convex functions
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Decoding by Linear Programming
- Two Models of Double Descent for Weak Features
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- An Introduction to Differential Manifolds
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Stable signal recovery from incomplete and inaccurate measurements
- Convex Analysis
- Penalty Methods for a Class of Non-Lipschitz Optimization Problems
- High-dimensional linear regression via implicit regularization
- Convex analysis and monotone operator theory in Hilbert spaces
- Smooth over-parameterized solvers for non-smooth structured optimization
- More is Less: Inducing Sparsity via Overparameterization
- Gradient descent provably escapes saddle points in the training of shallow ReLU networks