Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization
From MaRDI portal
Publication: Q4580706
DOI: 10.1109/TSP.2015.2436357
zbMath: 1394.94146
OpenAlex: W1580723439
MaRDI QID: Q4580706
Vyacheslav Kungurtsev, Amir Daneshmand, Francisco Facchinei, Gesualdo Scutari
Publication date: 22 August 2018
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://doi.org/10.1109/tsp.2015.2436357
Convex programming (90C25)
Nonconvex programming, global optimization (90C26)
Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Parallel numerical computation (65Y05)
Related Items (10)
Decentralized Dictionary Learning Over Time-Varying Digraphs
A flexible coordinate descent method
A primal Douglas-Rachford splitting method for the constrained minimization problem in compressive sensing
\(\mathrm{L_1RIP}\)-based robust compressed sensing
A stochastic averaging gradient algorithm with multi-step communication for distributed optimization
A framework for parallel second order incremental optimization algorithms for solving partially separable problems
Asynchronous parallel algorithms for nonconvex optimization
A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares
Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity
Newton-like Method with Diagonal Correction for Distributed Optimization