An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
DOI: 10.1137/17M1125157
zbMath: 1421.90117
arXiv: 1610.00960
OpenAlex: W2883309838
MaRDI QID: Q5231671
Hongzhou Lin, Julien Mairal, Zaid Harchaoui
Publication date: 27 August 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1610.00960
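The title names the paper's core mechanism: quasi-Newton (L-BFGS-type) steps applied to the Moreau-Yosida envelope of the objective, where each gradient of the envelope comes from a proximal point computed only inexactly by an inner solver. Below is a minimal illustrative Python sketch of that idea for a smooth, strongly convex function. All names (`inexact_prox`, `qn_prox_point`), the fixed-step inner solver, the unit outer step, and the parameter choices are assumptions for illustration, not the authors' algorithm, which additionally controls the inner accuracy and the acceptance of each step.

```python
# Illustrative sketch only: L-BFGS on the Moreau-Yosida envelope with an
# inexact proximal subproblem solver. Not the paper's implementation.
import numpy as np

def inexact_prox(grad_f, x, kappa, inner_iters, step):
    """Approximate prox_{f/kappa}(x): minimize f(z) + (kappa/2)||z - x||^2
    with a fixed number of gradient steps (hence 'inexact')."""
    z = x.copy()
    for _ in range(inner_iters):
        z = z - step * (grad_f(z) + kappa * (z - x))
    return z

def envelope_grad(grad_f, x, kappa, inner_iters, step):
    """Gradient of the Moreau-Yosida envelope: kappa * (x - prox(x))."""
    return kappa * (x - inexact_prox(grad_f, x, kappa, inner_iters, step))

def two_loop(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: applies the limited-memory
    inverse-Hessian approximation to the vector g."""
    q = g.copy()
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):
        a = s.dot(q) / y.dot(s)
        alphas.append(a)
        q -= a * y
    if s_list:  # scale by the usual initial-Hessian estimate
        s, y = s_list[-1], y_list[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = y.dot(q) / y.dot(s)
        q += (a - b) * s
    return q

def qn_prox_point(grad_f, x0, kappa, outer, mem, inner_iters, step):
    """Outer loop: L-BFGS steps on the envelope, fed by inexact prox gradients."""
    x = x0.copy()
    g = envelope_grad(grad_f, x, kappa, inner_iters, step)
    s_list, y_list = [], []
    for _ in range(outer):
        x_new = x - two_loop(g, s_list, y_list)  # unit step for simplicity
        g_new = envelope_grad(grad_f, x_new, kappa, inner_iters, step)
        s, y = x_new - x, g_new - g
        if y.dot(s) > 1e-12:  # keep the pair only if curvature is positive
            s_list.append(s); y_list.append(y)
            if len(s_list) > mem:
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
    return x

# Demo on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
G = rng.standard_normal((20, 20))
A = G.T @ G + np.eye(20)
b = rng.standard_normal(20)
L_f = np.linalg.eigvalsh(A).max()  # smoothness constant of f
x_hat = qn_prox_point(lambda x: A @ x - b, np.zeros(20), kappa=1.0,
                      outer=50, mem=5, inner_iters=100, step=1.0 / (L_f + 1.0))
print("distance to minimizer:", np.linalg.norm(x_hat - np.linalg.solve(A, b)))
```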
Related Items
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- The developments of proximal point algorithms
- A modified conjugate gradient method for general convex functions
- Distributed Learning with Sparse Communications by Identification
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Gradient methods for minimizing composite functions
- Minimizing finite sums with the stochastic average gradient
- Sample size selection in optimization methods for machine learning
- On the limited memory BFGS method for large scale optimization
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- A quasi-second-order proximal bundle algorithm
- Descentwise inexact proximal algorithms for smooth optimization
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Global Convergence of Online Limited Memory BFGS
- Sparse Modeling for Image and Vision Processing
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Smoothing and First Order Methods: A Unified Framework
- Randomized Smoothing for Stochastic Optimization
- Sparse and Redundant Representations
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Updating Quasi-Newton Matrices with Limited Storage
- New Proximal Point Algorithms for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Regularization and Variable Selection Via the Elastic Net
- Numerical optimization. Theoretical and practical aspects. Transl. from the French
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization