Complexity bounds for primal-dual methods minimizing the model of objective function

From MaRDI portal
Publication: 1785201

DOI: 10.1007/s10107-017-1188-6  zbMath: 1397.90351  OpenAlex: W2140180727  MaRDI QID: Q1785201

Yurii Nesterov

Publication date: 28 September 2018

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-017-1188-6




Related Items

Technical Note—Dynamic Data-Driven Estimation of Nonparametric Choice Models

Efficient numerical methods to solve sparse linear equations with application to PageRank

Gradient methods with memory

Exact gradient methods with memory

Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints

Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier

Universal Conditional Gradient Sliding for Convex Optimization

Affine Invariant Convergence Rates of the Conditional Gradient Method

A unified analysis of stochastic gradient-free Frank-Wolfe methods

Short paper -- A note on the Frank-Wolfe algorithm for a class of nonconvex and nonsmooth optimization problems

Affine-invariant contracting-point methods for convex optimization

Generalized self-concordant analysis of Frank-Wolfe algorithms

Perturbed Fenchel duality and first-order methods

A generalized Frank-Wolfe method with ``dual averaging'' for strongly convex composite optimization

First-order methods for convex optimization

PCA Sparsified

Inexact proximal stochastic second-order methods for nonconvex composite optimization

The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods

Dual methods for finding equilibriums in mixed models of flow distribution in large transportation networks

Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point

Adaptive conditional gradient method

Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization

On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming

Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity

Unified Acceleration of High-Order Algorithms under General Hölder Continuity

Duality gap estimates for weak Chebyshev greedy algorithms in Banach spaces

Inexact model: a framework for optimization and variational inequalities

High-Order Optimization Methods for Fully Composite Problems

Duality gap estimates for a class of greedy optimization algorithms in Banach spaces
