Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice

From MaRDI portal
Publication: 4558545

zbMath: 1469.68101
arXiv: 1712.05654
MaRDI QID: Q4558545

Zaid Harchaoui, Hongzhou Lin, Julien Mairal

Publication date: 22 November 2018

Full work available at URL: https://arxiv.org/abs/1712.05654
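The Catalyst scheme of the publication is a generic acceleration wrapper: each outer iteration approximately minimizes a proximal subproblem f(x) + (κ/2)‖x − y‖² with any linearly convergent inner solver, then extrapolates. A minimal sketch of that outer loop, assuming plain gradient descent as the inner solver and hypothetical parameter choices (`n_outer`, `inner_steps`, `inner_lr`, the starting value of `alpha`):

```python
import numpy as np

def catalyst(grad_f, x0, kappa, n_outer=50, inner_steps=100, inner_lr=0.05):
    """Sketch of the Catalyst outer loop (non-strongly-convex case, q = 0).

    Each outer iteration approximately minimizes the regularized subproblem
        G_k(x) = f(x) + (kappa / 2) * ||x - y_k||^2
    with a generic inner solver (plain gradient descent here), then applies
    a Nesterov-style extrapolation step.
    """
    x_prev = x0.copy()
    y = x0.copy()
    alpha = 0.9  # hypothetical initialization of the estimate-sequence parameter
    for _ in range(n_outer):
        # Inner loop: gradient descent on the kappa-regularized subproblem.
        x = y.copy()
        for _ in range(inner_steps):
            x = x - inner_lr * (grad_f(x) + kappa * (x - y))
        # Update alpha via alpha_{k+1}^2 = (1 - alpha_{k+1}) * alpha_k^2.
        alpha_next = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
        # Extrapolation: y_{k+1} = x_{k+1} + beta_{k+1} * (x_{k+1} - x_k).
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_next)
        y = x + beta * (x - x_prev)
        x_prev, alpha = x, alpha_next
    return x_prev
```

In the paper the inner solver is a stochastic method such as SVRG or SAGA rather than full gradient descent; the wrapper above only illustrates the outer accelerated proximal-point structure.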




Related Items (40)

- Accelerated proximal algorithms with a correction term for monotone inclusions
- Oracle complexity separation in convex optimization
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Unnamed Item
- Accelerated meta-algorithm for convex optimization problems
- Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant
- Inexact first-order primal-dual algorithms
- One-step optimization method for equilibrium problems
- Inexact successive quadratic approximation for regularized optimization
- Accelerated variance-reduced methods for saddle-point problems
- An accelerated variance reducing stochastic method with Douglas-Rachford splitting
- Principled analyses and design of first-order methods with inexact proximal operators
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- Accelerated methods for saddle-point problem
- Contracting Proximal Methods for Smooth Convex Optimization
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- Revisiting EXTRA for Smooth Distributed Optimization
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Accelerated proximal point method for maximally monotone operators
- Unnamed Item
- Fast convergence of generalized forward-backward algorithms for structured monotone inclusions
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- An optimal randomized incremental gradient method
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
- Distributed Learning with Sparse Communications by Identification
- Provable accelerated gradient method for nonconvex low rank optimization
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- Unnamed Item
- Unnamed Item
- Understanding the acceleration phenomenon via high-resolution differential equations
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
- Unnamed Item
- Unnamed Item
- A Proximal Bundle Variant with Optimal Iteration-Complexity for a Large Range of Prox Stepsizes
- Unnamed Item
- Convergence of Recursive Stochastic Algorithms Using Wasserstein Divergence
- Accelerated proximal envelopes: application to componentwise methods
- On the computational efficiency of catalyst accelerated coordinate descent
- Accelerating variance-reduced stochastic gradient methods



