A fast dual proximal gradient algorithm for convex minimization and applications
Publication: 1667162
DOI: 10.1016/j.orl.2013.10.007
zbMath: 1408.90232
OpenAlex: W2088624648
MaRDI QID: Q1667162
Publication date: 27 August 2018
Published in: Operations Research Letters
Full work available at URL: https://doi.org/10.1016/j.orl.2013.10.007
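The record itself carries no abstract, but the title refers to the fast dual proximal gradient (FDPG) idea: apply an accelerated (FISTA-type) proximal gradient step to the dual of a composite problem min_x f(x) + g(Ax) with f strongly convex. The Python sketch below only illustrates that general idea on an assumed 1-D total-variation denoising instance (f(x) = ½‖x − b‖², g = λ‖·‖₁, A the forward-difference operator); the function name, step-size bound, and test signal are illustrative choices and are not taken from the paper or this record.

```python
import numpy as np

def fdpg_tv_denoise(b, lam, n_iter=200):
    """Minimal sketch of a FISTA-on-the-dual scheme for
        min_x 0.5*||x - b||^2 + lam*||D x||_1,
    with D the 1-D forward-difference operator (illustrative instance only)."""
    n = len(b)

    def D(x):                        # forward differences: (D x)_i = x[i+1] - x[i]
        return x[1:] - x[:-1]

    def Dt(y):                       # adjoint D^T
        out = np.zeros(n)
        out[:n - 1] -= y
        out[1:] += y
        return out

    L = 4.0                          # ||D||^2 <= 4; f(x) = 0.5||x - b||^2 is 1-strongly convex
    y = np.zeros(n - 1)              # dual variable
    w = y.copy()                     # extrapolated dual point
    t = 1.0
    for _ in range(n_iter):
        x = b - Dt(w)                # primal point associated with w: x = grad f*(-D^T w)
        grad = -D(x)                 # gradient of the smooth dual term at w
        y_new = np.clip(w - grad / L, -lam, lam)   # prox of g* = projection onto the box
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        w = y_new + ((t - 1.0) / t_new) * (y_new - y)
        y, t = y_new, t_new
    return b - Dt(y)                 # recovered primal solution

# Usage on a synthetic piecewise-constant signal with noise:
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(50), np.ones(50), 0.3 * np.ones(50)])
noisy = signal + 0.1 * rng.standard_normal(signal.size)
denoised = fdpg_tv_denoise(noisy, lam=0.5)
```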
Related Items
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Decomposition Methods for Sparse Matrix Nearness Problems
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- Convergence Analysis of the Proximal Gradient Method in the Presence of the Kurdyka–Łojasiewicz Property Without Global Lipschitz Assumptions
- Composite optimization with coupling constraints via dual proximal gradient method with applications to asynchronous networks
- Deautoconvolution in the two-dimensional case
- Dual gradient method for ill-posed problems using multiple repeated measurement data
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- A Newton-type proximal gradient method for nonlinear multi-objective optimization problems
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
- CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
- Acceleration of the PDHGM on partially strongly convex functions
- The Glowinski-Le Tallec splitting method revisited: a general convergence and convergence rate analysis
- FOM – a MATLAB toolbox of first-order methods for solving convex optimization problems
- Rate of convergence analysis of dual-based variables decomposition methods for strongly convex problems
- The Douglas-Rachford algorithm in the affine-convex case
- Proximal algorithms in statistics and machine learning
- On convergence analysis of dual proximal-gradient methods with approximate gradient for a class of nonsmooth convex minimization problems
- Iteration complexity analysis of dual first-order methods for conic convex programming
- An Alternating Semiproximal Method for Nonconvex Regularized Structured Total Least Squares Problems
- Finding best approximation pairs for two intersections of closed convex sets
- A general double-proximal gradient algorithm for d.c. programming
- Accelerated Iterative Regularization via Dual Diagonal Descent
- A dual Bregman proximal gradient method for relatively-strongly convex optimization
- Dual Space Preconditioning for Gradient Descent
- Stochastic proximal linear method for structured non-convex problems
- A dual approach for optimal algorithms in distributed optimization over networks
- Implicit regularization with strongly convex bias: Stability and acceleration
- Proximal Gradient Methods for Machine Learning and Imaging
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Dualization of signal recovery problems
- Projected subgradient methods with non-Euclidean distances for non-differentiable convex minimization and variational inequalities
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- An algorithm for total variation minimization and applications
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Proximal Splitting Methods in Signal Processing
- Efficient Schemes for Total Variation Minimization Under Constraints in Image Processing
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Monotone Operators and the Proximal Point Algorithm
- Variational Analysis
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Total Variation Projection With First Order Schemes
- Proximité et dualité dans un espace hilbertien [Proximity and duality in a Hilbert space]