Conditional Gradient Sliding for Convex Optimization
DOI: 10.1137/140992382
zbMATH: 1342.90132
OpenAlex: W2188195356
MaRDI QID: Q2816241
Authors: Guanghui Lan, Yi Zhou
Publication date: 4 July 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/16b3f9790d37035faf5837ac68661c6df13a9dcb
Mathematics Subject Classification:
- Semidefinite programming (90C22)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Numerical methods based on nonlinear programming (49M37)
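The entry records only metadata, so as orientation for the method the title names, here is a minimal Python sketch of the conditional-gradient-sliding pattern: an accelerated outer loop whose prox subproblems are solved inexactly by inner Frank-Wolfe (linear-minimization) steps. The simplex feasible set, the step-size and tolerance schedules, and the quadratic test problem are illustrative assumptions, not the paper's exact scheme or guarantees.

```python
import numpy as np

def lmo_simplex(g):
    # Linear minimization oracle over the probability simplex:
    # argmin_{u in simplex} <g, u> is the vertex at the smallest coordinate of g.
    u = np.zeros_like(g)
    u[np.argmin(g)] = 1.0
    return u

def cnd_g(g, x, beta, eta, max_inner=1000):
    # Inner Frank-Wolfe loop: approximately minimize the prox model
    # phi(u) = <g, u> + (beta / 2) * ||u - x||^2 over the simplex,
    # stopping once the Frank-Wolfe gap falls below the tolerance eta.
    u = x.copy()
    for _ in range(max_inner):
        grad_phi = g + beta * (u - x)
        v = lmo_simplex(grad_phi)
        gap = grad_phi @ (u - v)   # Frank-Wolfe (Wolfe) gap
        if gap <= eta:
            break
        d = v - u
        # exact line search for the quadratic model, clipped to [0, 1]
        step = min(1.0, gap / (beta * (d @ d)))
        u = u + step * d
    return u

def cgs(f_grad, x0, lipschitz, diameter, n_iter):
    # Outer accelerated loop in the conditional-gradient-sliding style;
    # the schedules for gamma, beta, eta below are illustrative assumptions.
    x = x0.copy()
    y = x0.copy()
    for k in range(1, n_iter + 1):
        gamma = 3.0 / (k + 2)
        beta = 3.0 * lipschitz / (k + 1)
        eta = lipschitz * diameter ** 2 / (n_iter * k)
        z = (1.0 - gamma) * y + gamma * x
        x = cnd_g(f_grad(z), x, beta, eta)   # inexact prox via Frank-Wolfe
        y = (1.0 - gamma) * y + gamma * x
    return y

# Usage: minimize a convex quadratic f(x) = 0.5 x'Qx - b'x over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
Q = A.T @ A                                  # positive semidefinite Hessian
b = rng.standard_normal(10)
L = float(np.linalg.eigvalsh(Q).max())       # smoothness constant of f
x0 = np.full(10, 0.1)                        # simplex barycenter
sol = cgs(lambda x: Q @ x - b, x0, L, np.sqrt(2.0), n_iter=50)
```

The design point of the method, as reflected in the sketch, is that the inner loop touches the feasible set only through a linear-minimization oracle, so exact projections are never computed.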
Related Items
- Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators
- On the Frank-Wolfe algorithm for non-compact constrained optimization problems
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Projection-free accelerated method for convex optimization
- Block coordinate type methods for optimization and learning
- Subgradient method with feasible inexact projections for constrained convex optimization problems
- A distributed Frank-Wolfe framework for learning low-rank matrices with the trace norm
- Oracle complexity separation in convex optimization
- A Newton Frank-Wolfe method for constrained self-concordant minimization
- Frank-Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- An inexact Newton-like conditional gradient method for constrained nonlinear systems
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- Universal Conditional Gradient Sliding for Convex Optimization
- Inexact Newton method with feasible inexact projections for solving constrained smooth and nonsmooth equations
- Secant-inexact projection algorithms for solving a new class of constrained mixed generalized equations problems
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- First-order methods for convex optimization
- Approximate Douglas-Rachford algorithm for two-sets convex feasibility problems
- Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points
- Inexact gradient projection method with relative error tolerance
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Newton's method with feasible inexact projections for solving constrained generalized equations
- Stochastic Conditional Gradient++: (Non)Convex Minimization and Continuous Submodular Maximization
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- Conditional Gradient Methods for Convex Optimization with General Affine and Nonlinear Constraints
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
- Conditional gradient method for multiobjective optimization
- A Newton conditional gradient method for constrained nonlinear systems
- Alternating conditional gradient method for convex feasibility problems
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- Complexity of linear minimization and projection on some sets
- Network manipulation algorithm based on inexact alternating minimization
- Avoiding bad steps in Frank-Wolfe variants
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Cites Work
- Smooth minimization of non-smooth functions
- An optimal method for stochastic composite optimization
- On lower complexity bounds for large-scale smooth convex optimization
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- A generalized conditional gradient method and its connection to an iterative shrinkage method
- Introductory lectures on convex optimization. A basic course.
- A conditional gradient method with linear rate of convergence for solving convex linear systems
- A modified Frank-Wolfe algorithm for computing minimum-area enclosing ellipsoidal cylinders: theory and algorithms
- Accelerated and Inexact Forward-Backward Algorithms
- Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm
- Learning Kernel-Based Halfspaces with the 0-1 Loss
- Iterated Hard Shrinkage for Minimization Problems with Sparsity Constraints
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Conditional Gradient Algorithms for Rank-One Matrix Approximations with a Sparsity Constraint
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Sparse Approximate Solutions to Semidefinite Programs
- Signal Recovery by Proximal Forward-Backward Splitting
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- New analysis and results for the Frank-Wolfe method
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization