Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
DOI: 10.1137/16M1082767
zbMath: 1375.49040
arXiv: 1607.00101
OpenAlex: W2963930582
MaRDI QID: Q5355205
Publication date: 7 September 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1607.00101
Keywords: convergence; convex optimization; iteration complexity; damped Newton method; composite self-concordant minimization; proximal damped Newton method; randomized block proximal damped Newton method
Numerical mathematical programming methods (65K05) Convex programming (90C25) Large-scale problems in mathematical programming (90C06) Learning and adaptive systems in artificial intelligence (68T05) Newton-type methods (49M15) Interior-point methods (90C51) Methods involving semicontinuity and convergence; relaxation (49J45)
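For orientation, the title and keywords refer to the following problem class; what follows is a generic sketch drawn from the composite self-concordant literature (e.g. the cited "Composite Self-Concordant Minimization"), not a restatement of this paper's exact algorithm. Composite self-concordant minimization seeks

\[
\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x),
\]

where $f$ is a standard self-concordant function and $g$ is a closed, proper, convex (possibly nonsmooth) regularizer such as $\|x\|_1$. A proximal damped Newton step, in its generic form, computes

\[
d_k \in \operatorname*{arg\,min}_{d} \; \nabla f(x_k)^\top d + \tfrac12\, d^\top \nabla^2 f(x_k)\, d + g(x_k + d),
\qquad
x_{k+1} = x_k + \frac{1}{1 + \lambda_k}\, d_k,
\]

where $\lambda_k = \big(d_k^\top \nabla^2 f(x_k)\, d_k\big)^{1/2}$ is the proximal Newton decrement; the damping factor $1/(1+\lambda_k)$ yields global convergence for self-concordant $f$ without a line search. A randomized block variant, as the title indicates, restricts $d_k$ to a randomly sampled block of coordinates at each iteration.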
Cites Work
- Unnamed Item
- Unnamed Item
- Sparse inverse covariance estimation with the graphical lasso
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Introductory lectures on convex optimization. A basic course.
- Communication-efficient distributed optimization of self-concordant empirical loss
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Block Coordinate Descent Methods for Semidefinite Programming
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Fixed-Point Continuation Applied to Compressed Sensing: Implementation and Numerical Experiments
- Alternating Direction Algorithms for $\ell_1$-Problems in Compressive Sensing
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
- Model selection and estimation in the Gaussian graphical model
- Probing the Pareto Frontier for Basis Pursuit Solutions
- First-Order Methods for Sparse Covariance Selection
- Sparse Reconstruction by Separable Approximation
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- A Semismooth Newton Method with Multidimensional Filter Globalization for $l_1$-Optimization
- An Inexact Proximal Path-Following Algorithm for Constrained Convex Minimization
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Composite Self-Concordant Minimization