Tight ergodic sublinear convergence rate of the relaxed proximal point algorithm for monotone variational inequalities
DOI: 10.1007/s10957-022-02058-3 · zbMATH Open: 1545.65268 · MaRDI QID: Q6596341
Publication date: 2 September 2024
Published in: Journal of Optimization Theory and Applications
Keywords: variational inequality; performance estimation; sublinear convergence rate; relaxed proximal point algorithm; tight complexity bound
MSC: Convex programming (90C25); Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33); Numerical methods for variational inequalities and related problems (65K15)
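As background to the classification above, the relaxed proximal point algorithm this publication studies iterates x_{k+1} = (1 - ρ)x_k + ρ·J(x_k), where J = (I + λA)^{-1} is the resolvent of a maximal monotone operator A and ρ ∈ (0, 2) is the relaxation parameter. The following is a minimal sketch, not drawn from the paper itself: the operator A(x) = Mx with M skew-symmetric is a hypothetical example, chosen because it is monotone (⟨Mx, x⟩ = 0) and its unique zero is x = 0, and the parameter values λ = 1, ρ = 1.5 are illustrative assumptions.

```python
import numpy as np

# Hypothetical monotone operator A(x) = M x with M skew-symmetric:
# <M x, x> = 0 for all x, so A is monotone, and its only zero is x = 0.
M = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

lam, rho = 1.0, 1.5                       # step size and relaxation parameter (assumed values)
J = np.linalg.inv(np.eye(2) + lam * M)    # resolvent J = (I + lam*A)^{-1}, linear here

x = np.array([1.0, 0.0])                  # arbitrary starting point
for _ in range(50):
    x = (1.0 - rho) * x + rho * (J @ x)   # relaxed proximal point step

print(np.linalg.norm(x))                  # iterates approach the zero of A
```

For this skew-symmetric example the relaxed step is a strict contraction, so the iterates converge linearly to the operator's zero; the paper's sublinear-rate analysis concerns the general monotone case, where no such contraction is available.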
Cites Work
- Primal-dual subgradient methods for convex problems
- Optimized first-order methods for smooth convex minimization
- An optimal variant of Kelley's cutting-plane method
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- On the convergence analysis of the optimized gradient method
- Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a unified approach
- On the convergence rate of the Halpern-iteration
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Accelerated proximal point method for maximally monotone operators
- Performance of first-order methods for smooth convex minimization: a novel approach
- The proximal point algorithm with genuine superlinear convergence for the monotone complementarity problem
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- New Proximal Point Algorithms for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Using SeDuMi 1.02, A Matlab toolbox for optimization over symmetric cones
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Quadratic Matrix Programming
- Proximité et dualité dans un espace hilbertien
- Convex analysis and monotone operator theory in Hilbert spaces
Related Items (1)