Pages that link to "Item:Q6095736"
From MaRDI portal
The following pages link to Parallel and distributed asynchronous adaptive stochastic gradient methods (Q6095736):
Displaying 15 items.
- A sharp convergence rate for a model equation of the asynchronous stochastic gradient descent (Q2057038) (← links)
- Parallel stochastic gradient algorithms for large-scale matrix completion (Q2392935) (← links)
- On the parallelization upper bound for asynchronous stochastic gradients descent in non-convex optimization (Q2696976) (← links)
- An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization (Q2979326) (← links)
- Multitask Diffusion Adaptation Over Asynchronous Networks (Q4619582) (← links)
- Asynchronous Distributed ADMM for Large-Scale Optimization—Part II: Linear Convergence Analysis and Numerical Performance (Q4619616) (← links)
- Group stochastic gradient descent: a tradeoff between straggler and staleness (Q5017316) (← links)
- Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes (Q5038180) (← links)
- (Q5149233) (← links)
- The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory (Q5197680) (← links)
- An Uncertainty-Weighted Asynchronous ADMM Method for Parallel PDE Parameter Estimation (Q5241247) (← links)
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm (Q5502117) (← links)
- Asynchronous Gradient Push (Q5853993) (← links)
- Distributed Stochastic Inertial-Accelerated Methods with Delayed Derivatives for Nonconvex Problems (Q5863523) (← links)
- Improving the Transient Times for Distributed Stochastic Gradient Methods (Q6080216) (← links)