Nonsmoothness in machine learning: specific structure, proximal identification, and applications
Publication: 829492
DOI: 10.1007/s11228-020-00561-1
zbMath: 1506.90272
arXiv: 2010.00848
OpenAlex: W3090131095
MaRDI QID: Q829492
Franck Iutzeler, Jérôme Malick
Publication date: 6 May 2021
Published in: Set-Valued and Variational Analysis
Full work available at URL: https://arxiv.org/abs/2010.00848
Convex programming (90C25); Derivative-free methods and methods using generalized derivatives (90C56); Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Newton acceleration on manifolds identified by proximal gradient methods
- Local linear convergence of proximal coordinate descent algorithm
- Distributed Learning with Sparse Communications by Identification
- An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint
Uses Software
Cites Work
- Nonlinear total variation based noise removal algorithms
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Computing proximal points of nonconvex functions
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the interplay between acceleration and identification for the proximal gradient algorithm
- "Active-set complexity" of proximal gradient: how long does it take to find the sparsity pattern?
- A \(\mathcal{VU}\)-algorithm for convex minimization
- Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
- Low Complexity Regularization of Linear Inverse Problems
- Proximal Splitting Methods in Signal Processing
- Optimization with Sparsity-Inducing Penalties
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Discrete Total Variation: New Definition and Minimization
- Identifiable Surfaces in Constrained Optimization
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The Proximal Average: Basic Theory
- Monotone Operators and the Proximal Point Algorithm
- Variational Analysis
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- First-Order Methods in Optimization
- Gap Safe screening rules for sparsity enforcing penalties
- The 𝒰-Lagrangian of a convex function
- Safe Feature Elimination in Sparse Supervised Learning
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- A Distributed Flexible Delay-Tolerant Proximal Gradient Algorithm
- The $L^1$-Potts Functional for Robust Jump-Sparse Reconstruction
- Sparse regularization on thin grids I: the Lasso
- Nonconvex Sparse Regularization and Splitting Algorithms
- Understanding Machine Learning
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Compressed sensing