An Extended Frank--Wolfe Method with “In-Face” Directions, and Its Application to Low-Rank Matrix Completion
From MaRDI portal
Publication:2968175
DOI: 10.1137/15M104726X
zbMath: 1357.90115
arXiv: 1511.02204
MaRDI QID: Q2968175
Robert M. Freund, Paul Grigas, Rahul Mazumder
Publication date: 10 March 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1511.02204
convex optimization; computational guarantees; Frank-Wolfe method; low-rank matrix completion; nuclear norm regularization
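The paper's setting is nuclear-norm-regularized matrix completion solved with Frank-Wolfe-type steps. As background for the record above, a minimal sketch of one classic Frank-Wolfe step over the nuclear-norm ball follows (not the paper's extended "in-face" method; all names and the step-size rule are illustrative assumptions):

```python
import numpy as np

def frank_wolfe_step(Z, X_obs, mask, delta, k):
    """One classic Frank-Wolfe step for min f(Z) s.t. ||Z||_* <= delta,
    where f(Z) = 0.5 * ||P_Omega(Z - X_obs)||_F^2 and mask encodes Omega.
    Illustrative sketch only; not the extended in-face variant of the paper."""
    # Gradient of the least-squares fit on the observed entries.
    grad = mask * (Z - X_obs)
    # Linear minimization oracle over the nuclear-norm ball:
    # the minimizer is -delta * u1 v1^T for the top singular pair of the gradient.
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    S = -delta * np.outer(U[:, 0], Vt[0, :])
    # Standard open-loop step size 2/(k+2).
    step = 2.0 / (k + 2.0)
    return Z + step * (S - Z)
```

Because each iterate is a convex combination of rank-one atoms of nuclear norm at most delta, the iterates stay feasible and have rank at most k + 1 after k steps, which is the low-rank structure the cited works exploit.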
Related Items
- On the Frank–Wolfe algorithm for non-compact constrained optimization problems
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- Frank--Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- Fast Cluster Detection in Networks by First Order Optimization
- Screening for a reweighted penalized conditional gradient method
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- Linear convergence of Frank-Wolfe for rank-one matrix recovery without strong convexity
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- A new low-cost feasible projection algorithm for pseudomonotone variational inequalities
- The Frank-Wolfe algorithm: a short introduction
- Robust matrix estimations meet Frank-Wolfe algorithm
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- Generalized stochastic Frank-Wolfe algorithm with stochastic ``substitute gradient for structured convex optimization
- Conditional gradient method for multiobjective optimization
- Flexible low-rank statistical modeling with missing data and side information
- Low-Rank Spectral Optimization via Gauge Duality
- Complexity of linear minimization and projection on some sets
- Avoiding bad steps in Frank-Wolfe variants
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Uses Software
Cites Work
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Fixed point and Bregman iterative methods for matrix rank minimization
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- Fast low-rank modifications of the thin singular value decomposition
- Exact matrix completion via convex optimization
- On the von Neumann and Frank--Wolfe Algorithms with Away Steps
- Scalable Robust Matrix Recovery: Frank--Wolfe Meets Proximal Methods
- A Singular Value Thresholding Algorithm for Matrix Completion
- Facial structures of Schatten p-norms
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Some comments on Wolfe's ‘away step’
- Forward–Backward Greedy Algorithms for Atomic Norm Regularization
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Low-Rank Optimization with Trace Norm Penalty
- New analysis and results for the Frank-Wolfe method