A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
From MaRDI portal
DOI: 10.1287/moor.2016.0817
zbMath: 1364.90251
OpenAlex: W2551153860
Wikidata: Q106727948 (Scholia: Q106727948)
MaRDI QID: Q5739143
Heinz H. Bauschke, Jérôme Bolte, Marc Teboulle
Publication date: 2 June 2017
Published in: Mathematics of Operations Research
Full work available at URL: http://publications.ut-capitole.fr/25852/1/25852.pdf
Keywords: complexity; Bregman distance; first-order methods; descent lemma; proximal-gradient algorithms; composite nonsmooth convex minimization; multiplicative Poisson linear inverse problems
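The keywords above center on the paper's main tool: a descent lemma for functions that are smooth relative to a Bregman kernel h rather than Lipschitz-smooth, which supports a Bregman (NoLips-style) gradient step defined by ∇h(x⁺) = ∇h(x) − λ∇f(x). A minimal one-dimensional sketch, assuming a Poisson-type objective and the Burg entropy kernel h(x) = −log x; the objective, constants, and function names here are illustrative and not taken from the record:

```python
import math

def bregman_gradient_step(x, grad_f, step):
    """One Bregman (NoLips-style) gradient step with Burg entropy h(x) = -log(x).

    Solves h'(x_new) = h'(x) - step * f'(x), i.e. -1/x_new = -1/x - step * f'(x),
    which has the closed form below for this kernel.
    """
    return 1.0 / (1.0 / x + step * grad_f(x))

# Illustrative Poisson-type objective f(x) = x - b*log(x), minimized at x* = b.
# Its curvature f''(x) = b/x^2 is unbounded near 0, so f has no global Lipschitz
# gradient -- but f''(x) <= b * h''(x), i.e. f is b-smooth relative to h.
b = 4.0
grad_f = lambda x: 1.0 - b / x

x = 10.0          # any positive starting point
step = 0.5 / b    # step <= 1/L with relative-smoothness constant L = b
for _ in range(50):
    x = bregman_gradient_step(x, grad_f, step)
# x approaches the minimizer x* = b
```

With L = b as the relative-smoothness constant, any step λ ≤ 1/L yields monotone descent by the generalized descent lemma; the classical (Euclidean) descent lemma does not apply to this objective, since its gradient is not Lipschitz on the positive half-line.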
Related Items
Constrained composite optimization and augmented Lagrangian methods
Smooth over-parameterized solvers for non-smooth structured optimization
Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
No-regret algorithms in on-line learning, games and convex optimization
Radial duality. II: Applications and algorithms
Conditional gradient method for vector optimization
Super-Universal Regularized Newton Method
Bregman-Golden ratio algorithms for variational inequalities
Convergence rate analysis of the multiplicative gradient method for PET-type problems
Smoothing accelerated proximal gradient method with fast convergence rate for nonsmooth convex optimization beyond differentiability
An alternating structure-adapted Bregman proximal gradient descent algorithm for constrained nonconvex nonsmooth optimization problems and its inertial variant
Provable Phase Retrieval with Mirror Descent
Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
First-order methods for convex optimization
Adaptive Third-Order Methods for Composite Convex Optimization
Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
Convergence Analysis for Bregman Iterations in Minimizing a Class of Landau Free Energy Functionals
Block Bregman Majorization Minimization with Extrapolation
Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization
Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
Bregman proximal gradient algorithms for deep matrix factorization
Inexact basic tensor methods for some classes of convex optimization problems
Gradient methods with memory
Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces
Exact gradient methods with memory
Revisiting linearized Bregman iterations under Lipschitz-like convexity condition
Optimal complexity and certification of Bregman first-order methods
Global convergence of model function based Bregman proximal minimization algorithms
A simplified view of first order methods for optimization
Monotone operator theory in convex optimization
Extragradient and extrapolation methods with generalized Bregman distances for saddle point problems
Regularizing with Bregman--Moreau Envelopes
Screening for a reweighted penalized conditional gradient method
New convergence results for the inexact variable metric forward-backward method
Superfast second-order methods for unconstrained convex optimization
Gradient Flows, Second-Order Gradient Systems and Convexity
First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant
Quadratic growth conditions and uniqueness of optimal solution to Lasso
Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
New Bregman proximal type algorithms for solving DC optimization problems
Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
Global convergence of the gradient method for functions definable in o-minimal structures
A simple convergence analysis of Bregman proximal gradient algorithm
Convergence Analysis of the Proximal Gradient Method in the Presence of the Kurdyka–Łojasiewicz Property Without Global Lipschitz Assumptions
Non-smooth non-convex Bregman minimization: unification and new algorithms
Convergence of the exponentiated gradient method with Armijo line search
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Affine Invariant Convergence Rates of the Conditional Gradient Method
Dualities for Non-Euclidean Smoothness and Strong Convexity under the Light of Generalized Conjugacy
A Bregman stochastic method for nonconvex nonsmooth problem beyond global Lipschitz gradient continuity
Numerical methods for some classes of variational inequalities with relatively strongly monotone operators
Stochastic composition optimization of functions without Lipschitz continuous gradient
Novel Proximal Gradient Methods for Nonnegative Matrix Factorization with Sparsity Constraints
On the strong concavity of the dual function of an optimization problem
Affine-invariant contracting-point methods for convex optimization
Perturbed Fenchel duality and first-order methods
Resolvent of the parallel composition and the proximity operator of the infimal postcomposition
Inexact accelerated high-order proximal-point methods
Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
Relatively Smooth Convex Optimization by First-Order Methods, and Applications
A simple nearly optimal restart scheme for speeding up first-order methods
Golden ratio algorithms for variational inequalities
Bregman three-operator splitting methods
An incremental mirror descent subgradient algorithm with random sweeping and proximal step
Re-examination of Bregman functions and new properties of their divergences
Contracting Proximal Methods for Smooth Convex Optimization
Implementable tensor methods in unconstrained convex optimization
Efficient Numerical Methods for Computing the Stationary States of Phase Field Crystal Models
Nonlinear Forward-Backward Splitting with Projection Correction
Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
Point process estimation with Mirror Prox algorithms
New characterizations of Hoffman constants for systems of linear constraints
Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
Inertial Variable Metric Techniques for the Inexact Forward--Backward Algorithm
Unnamed Item
Algorithms for nonnegative matrix factorization with the Kullback-Leibler divergence
A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
On the linear convergence of forward-backward splitting method. I: Convergence analysis
Quartic first-order methods for low-rank minimization
A block coordinate variable metric linesearch based proximal gradient method
The condition number of a function relative to a set
An enhanced Baillon-Haddad theorem for convex functions defined on convex sets
Sharpness, Restart, and Acceleration
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
A Laplacian approach to \(\ell_1\)-norm minimization
Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
Fastest rates for stochastic mirror descent methods
Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
Bregman forward-backward operator splitting
Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Composite Optimization by Nonconvex Majorization-Minimization
Modern regularization methods for inverse problems
Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
A telescopic Bregmanian proximal gradient method without the global Lipschitz continuity assumption
On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
The forward-backward splitting method for non-Lipschitz continuous minimization problems in Banach spaces
Cyclic coordinate descent in the Hölder smooth setting
Curiosities and counterexamples in smooth convex optimization
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
A dual Bregman proximal gradient method for relatively-strongly convex optimization
Convergence properties of monotone and nonmonotone proximal gradient methods revisited
Dual Space Preconditioning for Gradient Descent
On inexact solution of auxiliary problems in tensor methods for convex optimization
Choose Your Path Wisely: Gradient Descent in a Bregman Distance Framework
Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
Inexact model: a framework for optimization and variational inequalities
Self-Adaptive Inertial Projection and Contraction Algorithm for Monotone Variational Inequality
Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Uses Software