Interpolation conditions for linear operators and applications to performance estimation problems
DOI: 10.1137/23M1575391 · MaRDI QID: Q6601207
François Glineur, Julien M. Hendrickx, Nizar Bousselmi
Publication date: 10 September 2024
Published in: SIAM Journal on Optimization
Keywords: linear operators; quadratic functions; first-order algorithms; primal-dual algorithms; performance estimation; exact convergence rates; interpolation conditions
MSC classification: Analysis of algorithms and problem complexity (68Q25); Semidefinite programming (90C22); Convex programming (90C25); Numerical methods involving duality (49M29); Quadratic programming (90C20)
Cites Work
- Nonlinear total variation based noise removal algorithms
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A primal-dual fixed point algorithm for minimization of the sum of three convex separable functions
- Optimized first-order methods for smooth convex minimization
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Initialization-free distributed algorithms for optimal resource allocation with feasibility constraints and application to economic dispatch of power systems
- An optimal variant of Kelley's cutting-plane method
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- On the convergence analysis of the optimized gradient method
- On the convergence rate of the Halpern-iteration
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- A new primal-dual algorithm for minimizing the sum of three functions with a linear operator
- A simple algorithm for a class of nonsmooth convex-concave saddle-point problems
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Optimal complexity and certification of Bregman first-order methods
- Efficient first-order methods for convex minimization: a constructive approach
- Accelerated proximal point method for maximally monotone operators
- Performance of first-order methods for smooth convex minimization: a novel approach
- Linear convergence of first order methods for non-strongly convex optimization
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Perturbation Theory for the Least Squares Problem with Linear Equality Constraints
- Norm-Preserving Dilations and Their Applications to Optimal Error Bounds
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- An optimal gradient method for smooth strongly convex minimization
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Conditions for linear convergence of the gradient method for non-convex optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Automatic Performance Estimation for Decentralized Optimization
- Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities
- On the rate of convergence of the difference-of-convex algorithm (DCA)
- The exact worst-case convergence rate of the alternating direction method of multipliers
- PEPIT: computer-assisted worst-case analyses of first-order optimization methods in python