A projected extrapolated gradient method with larger step size for monotone variational inequalities
From MaRDI portal
Publication: 2046702
DOI: 10.1007/s10957-021-01902-2 · OpenAlex: W3185398511 · MaRDI QID: Q2046702
Publication date: 18 August 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01902-2
- Variational and other types of inequalities involving nonlinear operators (general) (47J20)
- Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33)
- Random number generation in numerical analysis (65C10)
Related Items (2)
- A self-adaptive extragradient algorithm for solving quasimonotone variational inequalities
- A fully adaptive method for variational inequalities with quasi-monotonicity
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- An inertial forward-backward-forward primal-dual splitting algorithm for solving monotone inclusion problems
- New properties of forward-backward splitting and a practical proximal-descent algorithm
- Low-cost modification of Korpelevich's methods for monotone equilibrium problems
- The subgradient extragradient method for solving variational inequalities in Hilbert space
- NE/SQP: A robust algorithm for the nonlinear complementarity problem
- A class of ADMM-based algorithms for three-block separable convex programming
- Convergence of one-step projected gradient methods for variational inequalities
- Forward-backward and Tseng's type penalty schemes for monotone inclusion problems
- Linearized symmetric multi-block ADMM with indefinite proximal regularization and optimal proximal parameter
- An inertial forward-backward algorithm for monotone inclusions
- Convergence of the modified extragradient method for variational inequalities with non-Lipschitz operators
- Introductory lectures on convex optimization. A basic course.
- A modified projected gradient method for monotone variational inequalities
- Convergent prediction-correction-based ADMM for multi-block separable convex programming
- Modified projection method for pseudomonotone variational inequalities
- An extragradient algorithm for monotone variational inequalities
- A generalization of linearized alternating direction method of multipliers for solving two-block separable convex programming
- An extragradient-type algorithm for non-smooth variational inequalities
- Activity Identification and Local Linear Convergence of Forward-Backward-type Methods
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Modification of the extra-gradient method for solving variational inequalities and certain optimization problems
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Generalized Directional Derivatives and Subgradients of Nonconvex Functions
- Projection methods for variational inequalities with application to the traffic assignment problem
- A New Projection Method for Variational Inequality Problems
- A variant of Korpelevich's method for variational inequalities with a new search strategy
- Proximal extrapolated gradient methods for variational inequalities
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- Projected Reflected Gradient Methods for Monotone Variational Inequalities
- An Outer Approximation Method for the Variational Inequality Problem
- Variable metric forward–backward splitting with applications to monotone inclusions in duality
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convex analysis and monotone operator theory in Hilbert spaces