Optimal complexity and certification of Bregman first-order methods
DOI: 10.1007/s10107-021-01618-1 · zbMath: 1494.90076 · arXiv: 1911.08510 · OpenAlex: W3156012899 · MaRDI QID: Q2149545
Radu-Alexandru Dragomir, Jérôme Bolte, Alexandre d'Aspremont, Adrien B. Taylor
Publication date: 29 June 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1911.08510
MSC classification: Analysis of algorithms and problem complexity (68Q25); Semidefinite programming (90C22); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Abstract computational complexity for mathematical programming problems (90C60)
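The subject codes above point to Bregman (non-Euclidean) first-order methods and their worst-case certification via semidefinite programming. As background only, the following is a minimal sketch of the basic Bregman (mirror) gradient step with a Shannon-entropy kernel; the kernel choice, objective, and step size are illustrative assumptions and are not taken from the paper's record.

```python
import numpy as np

def bregman_gradient_step(x, grad, step):
    """One Bregman (mirror) gradient step with the Shannon-entropy kernel
    h(x) = sum_i x_i * log(x_i) on the positive orthant.

    The step solves  x+ = argmin_u  <grad, u> + (1/step) * D_h(u, x),
    whose closed form is the multiplicative update x+ = x * exp(-step * grad).
    """
    return x * np.exp(-step * grad)

# Illustrative usage (hypothetical problem, not from the paper):
# minimize f(x) = 0.5 * ||A x - b||^2 over the positive orthant.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x = np.ones(5)                      # strictly positive starting point
for _ in range(500):
    g = A.T @ (A @ x - b)           # gradient of f at the current iterate
    x = bregman_gradient_step(x, g, step=1e-3)
```

With the Euclidean kernel h(x) = 0.5 * ||x||^2, the same template reduces to ordinary gradient descent; the entropy kernel is shown because it keeps iterates strictly positive without a projection.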
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- An optimal variant of Kelley's cutting-plane method
- On lower complexity bounds for large-scale smooth convex optimization
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The exact information-based complexity of smooth convex minimization
- Proximal minimization algorithm with \(D\)-functions
- Introductory lectures on convex optimization. A basic course.
- A simplified view of first order methods for optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Quartic first-order methods for low-rank minimization
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Bregman forward-backward operator splitting
- Efficient first-order methods for convex minimization: a constructive approach
- Implementable tensor methods in unconstrained convex optimization
- Performance of first-order methods for smooth convex minimization: a novel approach
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- Duality Between Subgradient and Conditional Gradient Methods
- Image deblurring with Poisson data: from cells to galaxies
- Entropic Proximal Mappings with Applications to Nonlinear Programming
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Nonlinear Proximal Point Algorithms Using Bregman Functions, with Applications to Convex Programming
- Semidefinite Programming
- Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Proximité et dualité dans un espace hilbertien
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Convex analysis and monotone operator theory in Hilbert spaces