High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
Publication: 2020600
DOI: 10.1007/s10107-020-01470-9
zbMath: 1465.90095
arXiv: 1902.10767
OpenAlex: W3000514161
MaRDI QID: Q2020600
Philippe L. Toint, Xiaojun Chen
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1902.10767
Keywords: nonlinear optimization; isotropic model; complexity theory; group sparsity; non-Lipschitz functions; partially-separable problems
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Optimality conditions and duality in mathematical programming (90C46)
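For orientation, the title and keywords point to a problem class of roughly the following schematic form: a smooth, partially-separable function plus non-Lipschitzian group-sparsity penalties, minimized over a convex set. This is a sketch assembled from the title and keyword list, not a formula quoted from the paper; the feasible set \(\mathcal{F}\), groups \(G_i\), weights \(\lambda_i\), and exponent \(a\) are assumed notation.

% Schematic problem class (assumed notation, for orientation only):
% f smooth and partially separable, \mathcal{F} closed convex,
% the exponent 0 < a < 1 makes each group term non-Lipschitz at x_{G_i} = 0.
\[
  \min_{x \in \mathcal{F}} \; f(x) \;+\; \sum_{i=1}^{m} \lambda_i \,\| x_{G_i} \|^{a},
  \qquad \lambda_i > 0, \quad 0 < a < 1 .
\]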
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- An interior stochastic gradient method for a class of non-Lipschitz optimization problems
- The evaluation complexity of finding high-order minimizers of nonconvex optimization
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- An adaptive high order method for finding third-order critical points of nonconvex optimization
Uses Software
Cites Work
- A Modeling Language for Mathematical Programming
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- The benefit of group sparsity
- Modified partial-update Newton-type algorithms for unary optimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Isotropic sparse regularization for spherical harmonic representations of random fields on the sphere
- Optimization problems involving group sparsity terms
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Error bounds for compressed sensing algorithms with group sparsity: A unified approach
- Support union recovery in high-dimensional multivariate regression
- Complexity of unconstrained \(L_2 - L_p\) minimization
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Optimality Conditions and a Smoothing Trust Region Newton Method for Non-Lipschitz Optimization
- Worst-Case Complexity of Smoothing Quadratic Regularization Methods for Non-Lipschitzian Optimization
- Group-Sparse Model Selection: Hardness and Relaxations
- Lower Bound Theory of Nonzero Entries in Solutions of \(\ell_2\)-\(\ell_p\) Minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- A group bridge approach for variable selection
- Trust Region Methods
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- The Group Lasso for Stable Recovery of Block-Sparse Signal Representations
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Spherical Designs and Nonconvex Minimization for Recovery of Sparse Signals on the Sphere
- Convergence Properties of Minimization Algorithms for Convex Constraints Using a Structured Trust Region
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Sparse optimization for nonconvex group penalized estimation
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- Subspace Methods for Joint Sparse Recovery
- Partial-Update Newton Methods for Unary, Factorable, and Partially Separable Optimization
- Model Selection and Estimation in Regression with Grouped Variables
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors