Inexact proximal \(\epsilon\)-subgradient methods for composite convex optimization problems
From MaRDI portal
Publication: 2010107
DOI: 10.1007/s10898-019-00808-8
zbMath: 1461.65176
arXiv: 1805.10120
OpenAlex: W2964186284
MaRDI QID: Q2010107
R. Díaz Millán, M. Pentón Machado
Publication date: 3 December 2019
Published in: Journal of Global Optimization
Full work available at URL: https://arxiv.org/abs/1805.10120
Keywords: Hilbert space, splitting methods, optimization problem, inexact methods, \(\epsilon\)-subdifferential, accelerated methods
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Methods involving semicontinuity and convergence; relaxation (49J45)
Related Items
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
- Primal-dual \(\varepsilon\)-subgradient method for distributed optimization
- Principled analyses and design of first-order methods with inexact proximal operators
- On FISTA with a relative error rule
- The proximal methods for solving absolute value equation
- Optimization and learning with nonlocal calculus
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Primal recovery from consensus-based dual decomposition for distributed convex optimization
- An inertial forward-backward-forward primal-dual splitting algorithm for solving monotone inclusion problems
- A relaxed-projection splitting algorithm for variational inequalities in Hilbert spaces
- Gradient methods for minimizing composite functions
- A direct splitting method for nonsmooth variational inequalities
- On generalized \(\epsilon \)-subdifferential and radial epiderivative of set-valued mappings
- Incremental proximal methods for large scale convex optimization
- An additive subfamily of enlargements of a maximally monotone operator
- Monotone (nonlinear) operators in Hilbert space
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- On the projected subgradient method for nonsmooth convex optimization in a Hilbert space
- Introductory lectures on convex optimization. A basic course.
- An algorithm for total variation minimization and applications
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- A simplified view of first order methods for optimization
- Two algorithms for solving systems of inclusion problems
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- Incremental Subgradient Methods for Nondifferentiable Optimization
- A unified framework for some inexact proximal point algorithms
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Accelerated and Inexact Forward-Backward Algorithms
- Global Convergence of Splitting Methods for Nonconvex Composite Optimization
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Monotone Operators and the Proximal Point Algorithm
- Inexact spectral projected gradient methods on convex sets
- ϵ-subgradient algorithms for bilevel convex optimization
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Proximité et dualité dans un espace hilbertien
- On the Subdifferentiability of Convex Functions
- Local linear convergence analysis of Primal–Dual splitting methods