A conditional gradient method with linear rate of convergence for solving convex linear systems
Publication: 1762672
DOI: 10.1007/s001860300327
zbMath: 1138.90440
OpenAlex: W1969244384
MaRDI QID: Q1762672
Publication date: 11 February 2005
Published in: Mathematical Methods of Operations Research
Full work available at URL: https://doi.org/10.1007/s001860300327
Keywords: Slater's condition, conditional gradient, conic linear systems, efficiency and rate of convergence analysis
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Quadratic programming (90C20); Linear programming (90C05)
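The title and keywords refer to the conditional gradient (Frank–Wolfe) method. As a hedged illustration only, the sketch below shows a generic conditional gradient iteration for a least-squares objective over the probability simplex; it is not the paper's specific algorithm for conic linear systems or its rate analysis, and the problem data and step-size rule are assumptions chosen for the example.

```python
# Hedged sketch: a generic conditional gradient (Frank-Wolfe) step for
# min_x 0.5*||A x - b||^2 over the probability simplex.  Illustrative only;
# not the algorithm analyzed in the cited paper.
import numpy as np

def conditional_gradient(A, b, n_iters=200):
    m, n = A.shape
    x = np.ones(n) / n                     # feasible start: simplex barycenter
    for k in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of 0.5*||Ax - b||^2
        i = int(np.argmin(grad))           # linear minimization oracle on the simplex
        s = np.zeros(n)
        s[i] = 1.0                         # vertex minimizing <grad, s>
        gamma = 2.0 / (k + 2)              # classical open-loop step size
        x = (1 - gamma) * x + gamma * s    # convex combination stays feasible
    return x

# Usage example with assumed random data (purely illustrative)
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = A @ (np.ones(10) / 10) + 0.01 * rng.standard_normal(30)
x_hat = conditional_gradient(A, b)
print(float(np.linalg.norm(A @ x_hat - b)))
```

Each iteration only calls a linear minimization oracle over the feasible set (here, picking a simplex vertex), which is what makes conditional gradient methods projection-free.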
Related Items
On the Frank–Wolfe algorithm for non-compact constrained optimization problems
Projection-free accelerated method for convex optimization
Linearly convergent away-step conditional gradient for non-strongly convex functions
Subgradient method with feasible inexact projections for constrained convex optimization problems
The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
A Newton Frank-Wolfe method for constrained self-concordant minimization
On the global convergence of an inexact quasi-Newton conditional gradient method for constrained nonlinear systems
Inexact Newton method with feasible inexact projections for solving constrained smooth and nonsmooth equations
The generalized conditional gradient method for composite multiobjective optimization problems on Riemannian manifolds
Generalized self-concordant analysis of Frank-Wolfe algorithms
First-order methods for convex optimization
Approximate Douglas-Rachford algorithm for two-sets convex feasibility problems
Bayesian Quadrature, Energy Minimization, and Space-Filling Design
A novel Frank-Wolfe algorithm. Analysis and applications to large-scale SVM training
Conditional gradient method without line-search
Simplified versions of the conditional gradient method
Conditional Gradient Methods for Convex Optimization with General Affine and Nonlinear Constraints
Primal and dual predicted decrease approximation methods
A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
Conditional gradient method for multiobjective optimization
Proximal algorithms in statistics and machine learning
The condition number of a function relative to a set
Alternating conditional gradient method for convex feasibility problems
Conditional Gradient Sliding for Convex Optimization
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
Polytope Conditioning and Linear Convergence of the Frank–Wolfe Algorithm
On the inexact scaled gradient projection method