Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
DOI: 10.1016/j.ejco.2022.100045 · zbMath: 1530.90061 · arXiv: 2102.08246 · MaRDI QID: Q6114954
Dmitry Kamzolov, Aleksandr Lukashevich, César A. Uribe, Erik Ordentlich, Soomin Lee, A. V. Gasnikov, Pavel Dvurechensky
Publication date: 12 July 2023
Published in: EURO Journal on Computational Optimization
Full work available at URL: https://arxiv.org/abs/2102.08246
Keywords: empirical risk minimization, distributed optimization, statistical preconditioning, tensor optimization methods
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Stochastic programming (90C15)
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Lectures on convex optimization
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Universal method for stochastic composite optimization problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Optimal complexity and certification of Bregman first-order methods
- Lower bounds for finding stationary points I
- First-order and stochastic optimization methods for machine learning
- Superfast second-order methods for unconstrained convex optimization
- Optimal combination of tensor optimization methods
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
- Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
- Contracting Proximal Methods for Smooth Convex Optimization
- An Optimal Algorithm for Decentralized Finite-Sum Optimization
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Inexact model: a framework for optimization and variational inequalities
- First-order methods for convex optimization