Projected subgradient methods with non-Euclidean distances for non-differentiable convex minimization and variational inequalities

From MaRDI portal
Publication: 1016351

DOI: 10.1007/s10107-007-0147-z
zbMath: 1190.90118
OpenAlex: W2161209207
MaRDI QID: Q1016351

Alfred Auslender, Marc Teboulle

Publication date: 5 May 2009

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-007-0147-z
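The paper concerns projected subgradient methods in which the usual Euclidean projection is replaced by a non-Euclidean (Bregman) proximal step. As a rough illustration of that general idea, not of the authors' specific algorithm, the sketch below uses the entropy (Kullback-Leibler) distance on the unit simplex, where the Bregman step has a closed multiplicative form; the test problem, step-size rule, and all names are invented for the example.

```python
import numpy as np

def entropy_subgradient_step(x, g, t):
    """One non-Euclidean projected subgradient step on the unit simplex.

    With the entropy distance, the Bregman "projection" is available in
    closed form as a normalized multiplicative update x <- x*exp(-t*g)/Z.
    """
    y = x * np.exp(-t * g)
    return y / y.sum()

def minimize_on_simplex(subgrad, f, x0, n_iter=500):
    """Diminishing steps t_k = 1/sqrt(k+1); return the best iterate seen,
    as is standard for (non-smooth) subgradient schemes."""
    x = x0
    best_x, best_f = x0, f(x0)
    for k in range(n_iter):
        x = entropy_subgradient_step(x, subgrad(x), 1.0 / np.sqrt(k + 1))
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Invented test problem: f(x) = ||x - c||_1 with c the uniform point,
# minimized over the simplex at x = c with optimal value 0.
n = 5
c = np.full(n, 1.0 / n)
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)        # a valid subgradient of the l1 term
x0 = np.arange(1, n + 1, dtype=float)     # interior starting point
x0 /= x0.sum()                            # (zeros would be fixed points
                                          #  of the multiplicative update)
best_x, best_f = minimize_on_simplex(subgrad, f, x0)
```

The multiplicative form shows the practical appeal of non-Euclidean distances: iterates stay feasible (positive, summing to one) without an explicit projection subroutine.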



Related Items

Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces
Iterative Methods for the Elastography Inverse Problem of Locating Tumors
A simplified view of first order methods for optimization
A fast dual proximal gradient algorithm for convex minimization and applications
A penalty algorithm for solving convex separable knapsack problems
Stochastic composition optimization of functions without Lipschitz continuous gradient
Self-adaptive gradient projection algorithms for variational inequalities involving non-Lipschitz continuous operators
No-regret algorithms in on-line learning, games and convex optimization
Explainable bilevel optimization: an application to the Helsinki Deblur Challenge
First-order methods for convex optimization
Evolution differential inclusion with projection for solving constrained nonsmooth convex optimization in Hilbert space
An alternating extragradient method with non Euclidean projections for saddle point problems
Incremental quasi-subgradient methods for minimizing the sum of quasi-convex functions
Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization
Scaling Techniques for $\epsilon$-Subgradient Methods
Abstract convergence theorem for quasi-convex optimization problems with applications
Subgradient methods for saddle-point problems
On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
Convergence rates of subgradient methods for quasi-convex optimization problems
Asymptotic behavior analysis on multivalued evolution inclusion with projection in Hilbert space
Two-step inertial Bregman alternating minimization algorithm for nonconvex and nonsmooth problems
An interior projected-like subgradient method for mixed variational inequalities



Cites Work