A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities
DOI: 10.1137/130931862
zbMath: 1326.90088
OpenAlex: W1844274806
MaRDI QID: Q3449570
Renato D. C. Monteiro, Benar Fux Svaiter, Mauricio Romero Sicre
Publication date: 4 November 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/4dc87550ac81040a06ee427572bbece9a2d16654
Keywords: complexity; interior-point methods; Newton method; self-concordant barriers; monotone variational inequality; hybrid proximal extragradient
MSC classifications:
- Numerical mathematical programming methods (65K05)
- Abstract computational complexity for mathematical programming problems (90C60)
- Nonlinear programming (90C30)
- Variational and other types of inequalities involving nonlinear operators (general) (47J20)
- Numerical optimization and variational techniques (65K10)
- Newton-type methods (49M15)
- Monotone operators and generalizations (47H05)
- Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33)
- Interior-point methods (90C51)
Cites Work
- Dual extrapolation and its applications to solving variational inequalities and related problems
- Accelerating the cubic regularization of Newton's method on convex problems
- Enlargement of monotone operators with applications to variational inequalities
- Introductory lectures on convex optimization. A basic course.
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- \(\varepsilon\)-enlargements of maximal monotone operators in Banach spaces
- Cubic regularization of Newton method and its global performance
- A Mathematical View of Interior-Point Methods in Convex Optimization
- An Inexact Hybrid Generalized Proximal Point Algorithm and Some New Results on the Theory of Bregman Functions
- A unified framework for some inexact proximal point algorithms
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Interior-point methods for optimization
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Iteration-Complexity of a Newton Proximal Extragradient Method for Monotone Variational Inequalities and Inclusion Problems
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers