Conditional gradient algorithms for norm-regularized smooth convex optimization

Publication: 494314

DOI: 10.1007/s10107-014-0778-9
zbMath: 1336.90069
arXiv: 1302.2325
OpenAlex: W2003146967
Wikidata: Q57392870
Scholia: Q57392870
MaRDI QID: Q494314

Zaid Harchaoui, Arkadi Nemirovski, Anatoli B. Juditsky

Publication date: 31 August 2015

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1302.2325



Related Items

Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators
On the Frank-Wolfe algorithm for non-compact constrained optimization problems
Projection-free accelerated method for convex optimization
Efficient numerical methods to solve sparse linear equations with application to PageRank
A distributed Frank-Wolfe framework for learning low-rank matrices with the trace norm
The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
Frank-Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
New results on subgradient methods for strongly convex optimization problems with a unified analysis
A Level-Set Method for Convex Optimization with a Feasible Solution Path
Screening for a reweighted penalized conditional gradient method
Unnamed Item
Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
Noisy Euclidean Distance Realization: Robust Facial Reduction and the Pareto Frontier
Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
A new low-cost feasible projection algorithm for pseudomonotone variational inequalities
Universal Conditional Gradient Sliding for Convex Optimization
Affine Invariant Convergence Rates of the Conditional Gradient Method
A unified analysis of stochastic gradient-free Frank-Wolfe methods
The Frank-Wolfe algorithm: a short introduction
Secant-inexact projection algorithms for solving a new class of constrained mixed generalized equations problems
No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
Generalized self-concordant analysis of Frank-Wolfe algorithms
An Extended Frank-Wolfe Method with “In-Face” Directions, and Its Application to Low-Rank Matrix Completion
Conditional gradient method for double-convex fractional programming matrix problems
Conditional gradient type methods for composite nonlinear and stochastic optimization
Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
Conditional gradient algorithms for norm-regularized smooth convex optimization
Level-set methods for convex optimization
Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
Conditional Gradient Methods for Convex Optimization with General Affine and Nonlinear Constraints
The Alternating Descent Conditional Gradient Method for Sparse Inverse Problems
A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
Generalized stochastic Frank-Wolfe algorithm with stochastic “substitute” gradient for structured convex optimization
Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of function at the requested point
Complexity bounds for primal-dual methods minimizing the model of objective function
Conditional gradient method for multiobjective optimization
Cut Pursuit: Fast Algorithms to Learn Piecewise Constant Functions on General Weighted Graphs
New analysis and results for the Frank-Wolfe method
Low Complexity Regularization of Linear Inverse Problems
The sliding Frank-Wolfe algorithm and its application to super-resolution microscopy
Conditional Gradient Sliding for Convex Optimization
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
Scalable Robust Matrix Recovery: Frank-Wolfe Meets Proximal Methods
Generalized Conditional Gradient for Sparse Estimation
Inexact model: a framework for optimization and variational inequalities
Solving variational inequalities with monotone operators on domains given by linear minimization oracles

