Conditional Gradient Sliding for Convex Optimization

From MaRDI portal
Publication: Q2816241

DOI: 10.1137/140992382 · zbMath: 1342.90132 · OpenAlex: W2188195356

Authors: Guanghui Lan, Yi Zhou

Publication date: 4 July 2016

Published in: SIAM Journal on Optimization

Full work available at URL: https://semanticscholar.org/paper/16b3f9790d37035faf5837ac68661c6df13a9dcb



Related Items

- Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators
- On the Frank–Wolfe algorithm for non-compact constrained optimization problems
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Projection-free accelerated method for convex optimization
- Block coordinate type methods for optimization and learning
- Subgradient method with feasible inexact projections for constrained convex optimization problems
- A distributed Frank-Wolfe framework for learning low-rank matrices with the trace norm
- Oracle complexity separation in convex optimization
- A Newton Frank-Wolfe method for constrained self-concordant minimization
- Frank–Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- An inexact Newton-like conditional gradient method for constrained nonlinear systems
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- Universal Conditional Gradient Sliding for Convex Optimization
- Inexact Newton method with feasible inexact projections for solving constrained smooth and nonsmooth equations
- Secant-inexact projection algorithms for solving a new class of constrained mixed generalized equations problems
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- First-order methods for convex optimization
- Approximate Douglas-Rachford algorithm for two-sets convex feasibility problems
- Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points
- Inexact gradient projection method with relative error tolerance
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Newton's method with feasible inexact projections for solving constrained generalized equations
- Stochastic Conditional Gradient++: (Non)Convex Minimization and Continuous Submodular Maximization
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- Conditional Gradient Methods for Convex Optimization with General Affine and Nonlinear Constraints
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute gradient" for structured convex optimization
- Conditional gradient method for multiobjective optimization
- A Newton conditional gradient method for constrained nonlinear systems
- Alternating conditional gradient method for convex feasibility problems
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- Complexity of linear minimization and projection on some sets
- Network manipulation algorithm based on inexact alternating minimization
- Avoiding bad steps in Frank-Wolfe variants
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
