The following pages link to Peter Richtárik (Q263210):
Displaying 50 items.
- Parallel coordinate descent methods for big data optimization (Q263212)
- Inexact coordinate descent: complexity and preconditioning (Q306308)
- On optimal probabilities in stochastic coordinate descent methods (Q315487)
- Approximate level method for nonsmooth convex minimization (Q415377)
- The complexity of primal-dual fixed point methods for ridge regression (Q1669015)
- Matrix completion under interval uncertainty (Q1752160)
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods (Q2023684)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235)
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (Q2044481)
- Fastest rates for stochastic mirror descent methods (Q2044496)
- Dualize, split, randomize: toward fast nonsmooth optimization algorithms (Q2082232)
- Alternating maximization: unifying framework for 8 sparse PCA formulations and efficient parallel codes (Q2129204)
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods (Q2325237)
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function (Q2452370)
- On the convergence analysis of asynchronous SGD for solving consistent linear systems (Q2685380)
- Distributed coordinate descent method for learning with big data (Q2810888)
- Coordinate descent with arbitrary sampling I: algorithms and complexity (Q2829565)
- Coordinate descent with arbitrary sampling II: expected separable overapproximation (Q2829566)
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent (Q2832112)
- Generalized power method for sparse principal component analysis (Q2896039)
- Separable approximations and decomposition methods for the augmented Lagrangian (Q2943840)
- Improved Algorithms for Convex Minimization in Relative Scale (Q3105793)
- A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments (Q3304857)
- Accelerated, Parallel, and Proximal Coordinate Descent (Q3449571)
- Randomized Iterative Methods for Linear Systems (Q3456879)
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions (Q3462314)
- (Q4558169)
- Distributed optimization with arbitrary local solvers (Q4594835)
- Semi-stochastic coordinate descent (Q4594842)
- Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms (Q4598334)
- Coordinate Descent Face-Off: Primal or Dual? (Q4617605)
- On the complexity of parallel coordinate descent (Q4638927)
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications (Q4687241)
- Parallel Stochastic Newton Method (Q4688182)
- Stochastic Three Points Method for Unconstrained Smooth Minimization (Q4971022)
- (Q4999081)
- Revisiting Randomized Gossip Algorithms: General Framework, Convergence Rates and Novel Block and Accelerated Protocols (Q5030260)
- Quasi-Newton methods for machine learning: forget the past, just sample (Q5058389)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Q5076671)
- Uncertainty principle for communication compression in distributed and federated learning and the search for an optimal compressor (Q5095263)
- Best Pair Formulation & Accelerated Scheme for Non-Convex Principal Component Pursuit (Q5103177)
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory (Q5112239)
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design (Q5176277)
- New Convergence Aspects of Stochastic Gradient Algorithms (Q5214284)
- Randomized Projection Methods for Convex Feasibility: Conditioning and Convergence Rates (Q5242933)
- Convergence Analysis of Inexact Randomized Iterative Methods (Q5856678)
- Stochastic distributed learning with gradient quantization and double-variance reduction (Q5882226)
- Unified analysis of stochastic gradient methods for composite convex and smooth optimization (Q6086133)
- Direct nonlinear acceleration (Q6114957)
- Faster Rates for Compressed Federated Learning with Client-Variance Reduction (Q6202285)