A Regularized Newton Method for \(\ell_q\)-Norm Composite Optimization Problems
Publication: 6116248
DOI: 10.1137/22m1482822
zbMath: 1522.90145
arXiv: 2203.02957
MaRDI QID: Q6116248
Yuqia Wu, Xiao Qi Yang, Shaohua Pan
Publication date: 11 August 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2203.02957
global convergence; local error bound; regularized Newton method; superlinear convergence rate; KL property; \(\ell_q\)-norm regularized composite optimization
Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonsmooth analysis (49J52)
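For context, the keyword "\(\ell_q\)-norm regularized composite optimization" typically refers to the following problem class; this is a minimal sketch inferred from the title and keywords, not a formulation quoted from this record, and the precise assumptions on the loss \(f\) are stated in the paper itself:
\[
\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + \lambda \|x\|_q^q,
\qquad \|x\|_q^q := \sum_{i=1}^{n} |x_i|^{q}, \quad 0 < q < 1,\ \lambda > 0,
\]
where \(f\) is a smooth (possibly nonconvex) loss and the nonsmooth, nonconvex \(\ell_q\)-norm term promotes sparsity.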
Cites Work
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming
- Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
- Convergence of the reweighted \(\ell_1\) minimization algorithm for \(\ell_2-\ell_p\) minimization
- Local behavior of an iterative framework for generalized equations with nonisolated solutions
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Regularized Newton methods for convex minimization problems with singular solutions
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
- Optimization problems involving group sparsity terms
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Dynamic programming and suboptimal control: a survey from ADP to MPC
- Newton method for \(\ell_0\)-regularized optimization
- A globally convergent proximal Newton-type method in nonsmooth convex optimization
- A Nonlinear Lagrangian Approach to Constrained Optimization Problems
- A Krylov--Schur Algorithm for Large Eigenproblems
- On the convergence of the forward–backward splitting method with linesearches
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed \(\ell_q\) Minimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Lower Bound Theory of Nonzero Entries in Solutions of \(\ell_2\)-\(\ell_p\) Minimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Two-Point Step Size Gradient Methods
- Variational Analysis
- Sparse Reconstruction by Separable Approximation
- Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
- Sparse Regularization: Convergence Of Iterative Jumping Thresholding Algorithm
- A Statistical View of Some Chemometrics Regression Tools
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- \(L_{1/2}\) regularization
- Group sparse optimization via \(\ell_{p,q}\) regularization
- The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions
- Efficient Reconstruction of Piecewise Constant Images Using Nonsmooth Nonconvex Minimization
- A Unified Augmented Lagrangian Approach to Duality and Exact Penalization
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- Newton acceleration on manifolds identified by proximal gradient methods