scientific article; zbMATH DE number 7306895
From MaRDI portal
Publication:5148997
Publication date: 5 February 2021
Full work available at URL: https://arxiv.org/abs/1908.02246
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Keywords: global convergence; approximate Newton method; communication-efficient distributed learning; heavy-ball acceleration
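The heavy-ball acceleration named in the keywords refers to Polyak's momentum iteration (cf. "Some methods of speeding up the convergence of iteration methods" in the citation list below). A minimal illustrative sketch on a one-dimensional quadratic follows; the step size `alpha` and momentum `beta` are hypothetical illustration values, not parameters from the paper, and this is not the paper's distributed algorithm:

```python
# Polyak heavy-ball iteration:
#   x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
# Illustrative sketch on f(x) = 0.5 * x^2 (gradient: x). The names
# heavy_ball, alpha, beta are hypothetical, chosen for this example only.

def heavy_ball(grad, x0, alpha=0.1, beta=0.9, iters=200):
    x_prev, x = x0, x0
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Minimizing f(x) = 0.5 * x^2 from x0 = 5.0 drives the iterate toward 0.
x_star = heavy_ball(lambda x: x, x0=5.0)
```

The momentum term `beta * (x_k - x_{k-1})` reuses the previous displacement, which is what yields the accelerated convergence rate on strongly convex quadratics.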
Related Items (2)
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Compression and data similarity: combination of two techniques for communication-efficient solving of distributed variational inequalities
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- User-friendly tail bounds for sums of random matrices
- The landscape of empirical risk for nonconvex losses
- Distributed Coordinate Descent Method for Learning with Big Data
- Learning Kernel-Based Halfspaces with the 0-1 Loss
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- Distributed optimization with arbitrary local solvers
- DSCOVR: Randomized Primal-Dual Block Coordinate Algorithms for Asynchronous Distributed Optimization
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- Communication-Efficient Distributed Statistical Inference
- Some methods of speeding up the convergence of iteration methods