Complexity bounds for primal-dual methods minimizing the model of objective function
DOI: 10.1007/s10107-017-1188-6
zbMath: 1397.90351
OpenAlex: W2140180727
MaRDI QID: Q1785201
Publication date: 28 September 2018
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-017-1188-6
Keywords: convex optimization, complexity bounds, trust-region method, conditional gradient method, linear optimization oracle
Related Items
- Technical Note—Dynamic Data-Driven Estimation of Nonparametric Choice Models
- Efficient numerical methods to solve sparse linear equations with application to PageRank
- Gradient methods with memory
- Exact gradient methods with memory
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- Universal Conditional Gradient Sliding for Convex Optimization
- Affine Invariant Convergence Rates of the Conditional Gradient Method
- A unified analysis of stochastic gradient-free Frank–Wolfe methods
- Short paper -- A note on the Frank-Wolfe algorithm for a class of nonconvex and nonsmooth optimization problems
- Affine-invariant contracting-point methods for convex optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Perturbed Fenchel duality and first-order methods
- A generalized Frank-Wolfe method with "dual averaging" for strongly convex composite optimization
- First-order methods for convex optimization
- PCA Sparsified
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- Dual methods for finding equilibriums in mixed models of flow distribution in large transportation networks
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Adaptive conditional gradient method
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- Duality gap estimates for weak Chebyshev greedy algorithms in Banach spaces
- Inexact model: a framework for optimization and variational inequalities
- High-Order Optimization Methods for Fully Composite Problems
- Duality gap estimates for a class of greedy optimization algorithms in Banach spaces
Cites Work
- Primal-dual subgradient methods for convex problems
- Solving variational inequalities with monotone operators on domains given by linear minimization oracles
- Gradient methods for minimizing composite functions
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Universal gradient methods for convex optimization problems
- A regularization of the Frank-Wolfe method and unification of certain nonlinear programming methods
- Introductory lectures on convex optimization. A basic course.
- Quasi-monotone subgradient methods for nonsmooth convex minimization
- Cubic regularization of Newton method and its global performance
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Trust Region Methods
- Generalized Conditional Gradient for Sparse Estimation
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- New analysis and results for the Frank-Wolfe method