A telescopic Bregmanian proximal gradient method without the global Lipschitz continuity assumption
DOI: 10.1007/s10957-019-01509-8
zbMATH: 1427.90222
arXiv: 1804.10273
OpenAlex: W2916900753
Wikidata: Q128174547 (Scholia: Q128174547)
MaRDI QID: Q2322358
Daniel Reem, Simeon Reich, Alvaro Rodolfo de Pierro
Publication date: 4 September 2019
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1804.10273
Keywords: minimization; Lipschitz continuity; Bregman divergence; strongly convex; telescopic proximal gradient method; TEPROG
MSC classification: Convex programming (90C25); Nonlinear programming (90C30); Iterative procedures involving nonlinear operators (47J25); Decomposition methods (49M27); Convexity of real functions of several variables, generalizations (26B25); Real-valued functions in general topology (54C30)
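For orientation, the keywords above refer to the Bregman proximal gradient framework. The following is a minimal sketch of the generic Bregman proximal gradient update, given here only as illustrative background; it is not the paper's exact TEPROG scheme, whose telescopic choice of parameters (see the full text) is what removes the global Lipschitz continuity assumption on the gradient:

% Generic Bregman proximal gradient step (illustrative sketch):
% minimize F = f + g, with f convex and differentiable, g convex and
% possibly nonsmooth, h a strongly convex Bregman function, t_k > 0 a
% step size, and D_h the Bregman divergence induced by h.
\[
  x_{k+1} \in \operatorname*{arg\,min}_{x}
  \Bigl\{ g(x) + \langle \nabla f(x_k),\, x - x_k \rangle
          + \tfrac{1}{t_k}\, D_h(x, x_k) \Bigr\},
  \qquad
  D_h(x, y) := h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle.
\]

Choosing \(h(x) = \tfrac{1}{2}\|x\|^2\) gives \(D_h(x, y) = \tfrac{1}{2}\|x - y\|^2\), and the step reduces to the classical proximal gradient (forward-backward) update.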
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Convex functions, monotone operators and differentiability.
- Auxiliary problem principle and decomposition of optimization problems
- Elastic-net regularization in learning theory
- Functional analysis, Sobolev spaces and partial differential equations
- A relaxed version of Bregman's method for convex programming
- Strong convergence of contraction semigroups and of iterative methods for accretive operators in Banach spaces
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- An iterative row-action method for interval convex programming
- Infinite products of resolvents [Produits infinis de résolvantes]
- Proximal minimization algorithm with \(D\)-functions
- Introductory lectures on convex optimization. A basic course.
- Solutions to inexact resolvent inclusion problems with applications to nonlinear analysis and optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- A telescopic Bregmanian proximal gradient method without the global Lipschitz continuity assumption
- Forward-backward splitting with Bregman distances
- On the convergence of the forward–backward splitting method with linesearches
- Optimization with Sparsity-Inducing Penalties
- The Bregman Distance without the Bregman Function II
- Convergence of a Proximal Point Method in the Presence of Computational Errors in Hilbert Spaces
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Image deblurring with Poisson data: from cells to galaxies
- Monotone Operators and the Proximal Point Algorithm
- Iterations of paracontractions and firmly nonexpansive operators with applications to feasibility and optimization
- Essential smoothness, essential strict convexity, and Legendre functions in Banach spaces
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Re-examination of Bregman functions and new properties of their divergences
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
- Signal Recovery by Proximal Forward-Backward Splitting
- Convex Analysis
- An Iterative Regularization Method for Total Variation-Based Image Restoration
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Stability of the optimal values under small perturbations of the constraint set
- Convex analysis and monotone operator theory in Hilbert spaces