Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
From MaRDI portal
Publication: 5093649
DOI: 10.1137/21M1395302 · zbMath: 1496.90055 · arXiv: 2101.12101 · MaRDI QID: Q5093649
Puqian Wang, Jelena Diakonikolas
Publication date: 29 July 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2101.12101
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Minimax problems in mathematical programming (90C47)
Cites Work
- Gradient methods for minimizing composite functions
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The exact information-based complexity of smooth convex minimization
- Lectures on convex optimization
- On the convergence rate of the Halpern-iteration
- Dual extrapolation and its applications to solving variational inequalities and related problems
- A modification of the Arrow-Hurwicz method for search of saddle points
- On optimality of Krylov's information when solving linear operator equations
- Information-based complexity of linear operator equations
- Produits infinis de résolvantes [Infinite products of resolvents]
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Lower bounds for finding stationary points I
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Performance of first-order methods for smooth convex minimization: a novel approach
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- On Lower and Upper Bounds for Smooth and Strongly Convex Optimization Problems
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- An Optimal First Order Method Based on Optimal Quadratic Averaging
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- A variational perspective on accelerated methods in optimization
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Fixed points of nonexpanding maps
- A First Order Method for Solving Convex Bilevel Optimization Problems
- Mean Value Methods in Iteration
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
- Convex analysis and monotone operator theory in Hilbert spaces