Weak and strong convergence of generalized proximal point algorithms with relaxed parameters
From MaRDI portal
Publication: 2694524
DOI: 10.1007/s10898-022-01241-0
OpenAlex: W4303045027
MaRDI QID: Q2694524
Publication date: 3 April 2023
Published in: Journal of Global Optimization
Full work available at URL: https://arxiv.org/abs/2110.07015
Keywords: resolvent; strong convergence; weak convergence; proximal point algorithm; maximally monotone operators; firm nonexpansiveness
Mathematics Subject Classification: Convex programming (90C25); Nonlinear programming (90C30); Monotone operators and generalizations (47H05); Iterative procedures involving nonlinear operators (47J25); Numerical solutions to equations with nonlinear operators (65J15)
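The keywords above name the relaxed proximal point iteration for a maximally monotone operator A, built from its (firmly nonexpansive) resolvent J_{cA} = (I + cA)^{-1}. As a minimal illustrative sketch (not taken from the paper), the iteration x_{k+1} = (1 - λ_k) x_k + λ_k J_{c_k A}(x_k) can be run on the toy operator A(x) = 2x, the gradient of f(x) = x², whose resolvent has the closed form x / (1 + 2c); the choice of f, the step constants, and the function names are assumptions for illustration only:

```python
def resolvent(x, c):
    # Resolvent J_{cA}(x) of A(y) = 2y: solve y + 2*c*y = x for y.
    return x / (1.0 + 2.0 * c)

def relaxed_ppa(x0, steps=50, c=1.0, lam=1.0):
    # Relaxed proximal point iteration:
    #   x_{k+1} = (1 - lam) * x_k + lam * J_{cA}(x_k)
    # with constant relaxation lam and regularization c (a sketch;
    # the paper allows varying parameters lam_k, c_k).
    x = x0
    for _ in range(steps):
        x = (1.0 - lam) * x + lam * resolvent(x, c)
    return x

print(relaxed_ppa(10.0))  # iterates shrink toward 0, the unique zero of A
```

With lam = 1.0 and c = 1.0 each step multiplies the iterate by 1/3, so the sequence converges (here even linearly) to the zero of A, matching the strong-convergence behavior the title refers to in this simple finite-dimensional case.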
Cites Work
- A note on the regularized proximal point algorithm
- A proximal point algorithm converging strongly for general errors
- Strong convergence of a proximal point algorithm with general errors
- A regularization method for the proximal point algorithm
- On convergence criteria of generalized proximal point algorithms
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the optimal linear convergence rate of a generalized proximal point algorithm
- Convergence of generalized proximal point algorithms
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Strong convergence theorems for infinite families of nonexpansive mappings in general Banach spaces
- Iterative Algorithms for Nonlinear Operators
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Modified Lagrangians in convex programming and their generalizations
- A Generalized Proximal Point Algorithm and Its Convergence Rate
- Combining The Proximal Algorithm And Tikhonov Regularization
- Convex analysis and monotone operator theory in Hilbert spaces