Convergence rates of accelerated proximal gradient algorithms under independent noise
Publication: 2420162
DOI: 10.1007/s11075-018-0565-4
zbMATH: 1420.90069
OpenAlex: W2811382114
Wikidata: Q129647654 (Scholia: Q129647654)
MaRDI QID: Q2420162
Authors: Li-Zhi Cheng, Tao Sun, Hao Jiang, Roberto Barrio
Publication date: 5 June 2019
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-018-0565-4
Mathematics Subject Classification:
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Applications of operator theory in optimization, convex analysis, mathematical programming, economics (47N10)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- First-order methods of smooth convex optimization with inexact oracle
- Minimizing finite sums with the stochastic average gradient
- Coupling the proximal point algorithm with approximation methods
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- Coordinate descent algorithms
- Statistical inverse problems: discretization, model reduction and inverse crimes
- Exact matrix completion via convex optimization
- Sparse Wavelet Representations of Spatially Varying Blurring Operators
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Accelerated and Inexact Forward-Backward Algorithms
- Convergence of a Proximal Point Method in the Presence of Computational Errors in Hilbert Spaces
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- An EM algorithm for wavelet-based image restoration
- Nonlinear wavelet image processing: variational problems, compression, and noise removal through wavelet shrinkage
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Numerical methods for nondifferentiable convex optimization
- An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Compressed sensing
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization