A Regularized Hybrid Steepest Descent Method for Variational Inclusions
DOI: 10.1080/01630563.2011.619676 · zbMath: 1237.49025 · OpenAlex: W2010359383 · MaRDI QID: Q2881892
Publication date: 3 May 2012
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2011.619676
Keywords: variational inequalities; monotone operator; hybrid steepest descent method; convex constrained minimization; Yosida approximate
MSC classifications: Convex programming (90C25); Numerical optimization and variational techniques (65K10); Set-valued and variational analysis (49J53); Iterative procedures involving nonlinear operators (47J25); Numerical methods based on nonlinear programming (49M37); Applications of operator theory in optimization, convex analysis, mathematical programming, economics (47N10)
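For context on the keywords above: a variational inclusion asks for a zero of a maximal monotone set-valued operator, and the Yosida approximate is its standard single-valued regularization built from the resolvent. The sketch below records only these textbook definitions (the notation A, J_lambda, A_lambda, H is generic and assumed here, not taken from this record); it is general background, not the paper's specific regularized hybrid steepest descent scheme.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard background for the keywords: variational inclusion,
% resolvent, and Yosida approximate of a maximal monotone operator
% A on a real Hilbert space H (generic notation, assumed here).
\[
  \text{Variational inclusion:}\quad \text{find } x \in H \text{ such that } 0 \in A(x).
\]
\[
  J_\lambda^{A} = (I + \lambda A)^{-1}
  \qquad \text{(resolvent; single-valued and firmly nonexpansive for } \lambda > 0\text{)}
\]
\[
  A_\lambda = \tfrac{1}{\lambda}\,\bigl(I - J_\lambda^{A}\bigr)
  \qquad \text{(Yosida approximate; Lipschitz continuous with constant } 1/\lambda\text{)}
\]
\end{document}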
Related Items (2)
Cites Work
- Convergence of hybrid steepest-descent methods for variational inequalities
- Nonstrictly Convex Minimization over the Bounded Fixed Point Set of a Nonexpansive Mapping
- Hybrid Steepest Descent Method for Variational Inequality Problem over the Fixed Point Set of Certain Quasi-nonexpansive Mappings