Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
MaRDI QID: Q5869813
DOI: 10.1137/21M140376X
zbMath: 1501.90073
arXiv: 2102.10312
OpenAlex: W3130060520
Puya Latafat, Masoud Ahookhosh, Andreas Themelis, Panagiotis Patrinos
Publication date: 29 September 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2102.10312
Keywords: nonsmooth nonconvex optimization; KL inequality; Bregman-Moreau envelope; incremental aggregated algorithms
MSC classification: Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonsmooth analysis (49J52); Set-valued and variational analysis (49J53)
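As background for the keyword "Bregman-Moreau envelope" above, a minimal sketch of one common convention (the paper may use a different variant): for a differentiable convex kernel $h$, the Bregman distance and the associated (left) Bregman-Moreau envelope and proximal mapping of a function $f$ with stepsize $\gamma > 0$ are

$$D_h(x,y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle, \qquad y \in \operatorname{int}\operatorname{dom} h,$$
$$\operatorname{env}^h_{\gamma f}(y) = \inf_x \Big\{ f(x) + \tfrac{1}{\gamma} D_h(x,y) \Big\}, \qquad \operatorname{prox}^h_{\gamma f}(y) = \operatorname*{argmin}_x \Big\{ f(x) + \tfrac{1}{\gamma} D_h(x,y) \Big\}.$$

For the Euclidean kernel $h(x) = \tfrac{1}{2}\|x\|^2$, the Bregman distance reduces to $\tfrac{1}{2}\|x - y\|^2$ and these objects recover the classical Moreau envelope and proximal mapping.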
Cites Work
- SAGA
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Minimizing finite sums with the stochastic average gradient
- Iteration complexity analysis of block coordinate descent methods
- Incremental proximal methods for large scale convex optimization
- The Moreau envelope function and proximal mapping in the sense of the Bregman distance
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A coordinate gradient descent method for nonsmooth separable minimization
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- A simplified view of first order methods for optimization
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A geometric analysis of phase retrieval
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Fastest rates for stochastic mirror descent methods
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Kurdyka-Łojasiewicz exponent via inf-projection
- Incrementally updated gradient methods for constrained and regularized optimization
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- Non-smooth non-convex Bregman minimization: unification and new algorithms
- An Inexact Hybrid Generalized Proximal Point Algorithm and Some New Results on the Theory of Bregman Functions
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Robust Stochastic Approximation Approach to Stochastic Programming
- Relaxation methods for problems with strictly convex separable costs and linear constraints
- Gradient Convergence in Gradient Methods with Errors
- Essential Smoothness, Essential Strict Convexity, and Legendre Functions in Banach Spaces
- Optical Wavefront Reconstruction: Theory and Numerical Methods
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- First-Order Methods in Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
- On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
- On the Convergence of Block Coordinate Descent Type Methods
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- A Convergent Incremental Gradient Method with a Constant Step Size
- Convex Analysis
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- The nonsmooth landscape of phase retrieval
- The elements of statistical learning. Data mining, inference, and prediction
- Convergence of a block coordinate descent method for nondifferentiable minimization