A strongly convergent proximal point method for vector optimization
From MaRDI portal
Publication:2046562
DOI: 10.1007/s10957-021-01877-0
zbMath: 1475.90091
OpenAlex: W3174448913
MaRDI QID: Q2046562
Jefferson G. Melo, Ray G. Serra, Alfredo Noel Iusem
Publication date: 18 August 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01877-0
Mathematics Subject Classification:
- Convex programming (90C25)
- Multi-objective and goal programming (90C29)
- Nonlinear programming (90C30)
- Nonsmooth analysis (49J52)
- Programming in abstract spaces (90C48)
Cites Work
- Strong convergence in Hilbert spaces via \(\varGamma \)-duality
- A subgradient-like algorithm for solving vector convex inequalities
- Convergence of the projected gradient method for quasiconvex multiobjective optimization
- Hybrid approximate proximal method with auxiliary variational inequality for vector optimization
- A proximal point-type method for multicriteria optimization
- Produits infinis de résolvantes [Infinite products of resolvents]
- Une méthode itérative de résolution d'une inéquation variationnelle [An iterative method for solving a variational inequality]
- Steepest descent methods for multicriteria optimization
- On the need for hybrid steps in hybrid proximal point methods
- Proximal point method for locally Lipschitz functions in multiobjective optimization of Hadamard manifolds
- Proximal point method for a special class of nonconvex multiobjective optimization functions
- Unconstrained steepest descent method for multicriteria optimization on Riemannian manifolds
- A steepest descent method for vector optimization
- A projected gradient method for vector optimization problems
- Forcing strong convergence of proximal point iterations in a Hilbert space
- A subgradient method for multiobjective optimization
- Proximal point method for vector optimization on Hadamard manifolds
- Proximal gradient methods for multiobjective optimization and their applications
- On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces
- Newton-like methods for solving vector optimization problems
- A new duality theory for mathematical programming
- A Strongly Convergent Method for Nonsmooth Convex Minimization in Hilbert Spaces
- Inertial forward–backward methods for solving vector optimization problems
- Newton's Method for Multiobjective Optimization
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- The Proximal Point Method for Locally Lipschitz Functions in Multiobjective Optimization with Application to the Compromise Problem
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- A dual approach to solving nonlinear programming problems by unconstrained optimization
- Conditional extragradient algorithms for solving variational inequalities
- A proximal gradient splitting method for solving convex vector optimization problems
- A quadratically convergent Newton method for vector optimization
- Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature
- Proximal Methods in Vector Optimization
- A Subgradient Method for Vector Optimization Problems
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- Convex analysis and monotone operator theory in Hilbert spaces