Pages that link to "Item:Q4978059"
The following pages link to "Katyusha: the first direct acceleration of stochastic gradient methods" (Q4978059):
Displaying 48 items.
- On variance reduction for stochastic smooth convex optimization with multiplicative noise (Q1739038)
- Stochastic variance reduced gradient methods using a trust-region-like scheme (Q1995995)
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization (Q2020608)
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods (Q2023684)
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235)
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (Q2042418)
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization (Q2062324)
- A stochastic Nesterov's smoothing accelerated method for general nonsmooth constrained stochastic composite convex optimization (Q2103421)
- On stochastic accelerated gradient with convergence rate (Q2111814)
- Accelerating variance-reduced stochastic gradient methods (Q2118092)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551)
- A regularization interpretation of the proximal point method for weakly convex functions (Q2179443)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128)
- Provable accelerated gradient method for nonconvex low rank optimization (Q2303662)
- Nesterov-aided stochastic gradient methods using Laplace approximation for Bayesian design optimization (Q2309392)
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization (Q2420797)
- An accelerated variance reducing stochastic method with Douglas-Rachford splitting (Q2425236)
- Cocoercivity, smoothness and bias in variance-reduced stochastic gradient methods (Q2674579)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545)
- (Q4558562)
- Stochastic Model-Based Minimization of Weakly Convex Functions (Q4620418)
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization (Q4636997)
- (Q4637040)
- Random Gradient Extrapolation for Distributed and Stochastic Optimization (Q4687240)
- (Q4969167)
- (Q4969178)
- (Q4969259)
- Matrix completion and related problems via strong duality (Q4993268)
- Optimal Transport-Based Distributionally Robust Optimization: Structural Properties and Iterative Schemes (Q5085150)
- Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent (Q5094616)
- On the Adaptivity of Stochastic Gradient-Based Optimization (Q5114394)
- Accelerated dual-averaging primal–dual method for composite convex minimization (Q5135253)
- (Q5148937)
- (Q5149230)
- A Smooth Inexact Penalty Reformulation of Convex Problems with Linear Constraints (Q5152474)
- An Optimal Algorithm for Decentralized Finite-Sum Optimization (Q5162661)
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization (Q5244401)
- Factor-√2 acceleration of accelerated gradient methods (Q6073850)
- Unified analysis of stochastic gradient methods for composite convex and smooth optimization (Q6086133)
- SVRG meets AdaGrad: painless variance reduction (Q6097116)
- Linearly-convergent FISTA variant for composite optimization with duality (Q6101606)
- Optimal analysis of method with batching for monotone stochastic finite-sum variational inequalities (Q6124397)
- First-order methods for convex optimization (Q6169988)
- Gradient complexity and non-stationary views of differentially private empirical risk minimization (Q6199392)
- Random-reshuffled SARAH does not need full gradient computations (Q6204201)
- Stochastic Steffensen method (Q6606849)