Improved front steepest descent for multi-objective optimization
From MaRDI portal
Publication: 6106527
DOI: 10.1016/j.orl.2023.03.001
zbMath: 1525.90388
arXiv: 2301.03310
OpenAlex: W4323349933
MaRDI QID: Q6106527
Matteo Lapucci, Pierluigi Mansueto
Publication date: 3 July 2023
Published in: Operations Research Letters
Full work available at URL: https://arxiv.org/abs/2301.03310
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Multi-objective and goal programming (90C29)
- Methods of reduced gradient type (90C52)
Cites Work
- On the convergence of steepest descent methods for multiobjective optimization
- Steepest descent methods for multicriteria optimization
- Scalarizing vector optimization problems
- Globally convergent Newton-type methods for multiobjective optimization
- Proximal gradient methods for multiobjective optimization and their applications
- On the choice of parameters for the weighting method in vector optimization
- Direct Multisearch for Multiobjective Optimization
- A Derivative-Free Approach to Constrained Multiobjective Nonsmooth Optimization
- Newton's Method for Multiobjective Optimization
- An Adaptive Scalarization Method in Multiobjective Optimization
- Benchmarking optimization software with performance profiles
- Pareto front approximation through a multi-objective augmented Lagrangian method
- Twenty years of continuous multiobjective optimization in the twenty-first century
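The method this record concerns builds on the steepest descent framework for multicriteria optimization cited above, whose core ingredient is a common descent direction obtained from the minimum-norm convex combination of the objectives' gradients. The sketch below illustrates that ingredient for two objectives, where the min-norm subproblem has a closed form; the toy biobjective problem, fixed step size, and stopping tolerance are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def common_descent_direction(g1, g2):
    # Steepest common descent direction for two objectives:
    # d = -(lam*g1 + (1-lam)*g2), with lam minimizing
    # ||lam*g1 + (1-lam)*g2||^2 over lam in [0, 1] (closed form for m = 2).
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.0 if denom == 0 else np.clip(-(diff @ g2) / denom, 0.0, 1.0)
    return -(lam * g1 + (1 - lam) * g2)

# Toy biobjective problem: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2;
# its Pareto set is the segment between a and b.
a, b = np.array([0.0, 0.0]), np.array([1.0, 1.0])
grad1 = lambda x: 2 * (x - a)
grad2 = lambda x: 2 * (x - b)

x = np.array([2.0, -1.0])
for _ in range(200):
    d = common_descent_direction(grad1(x), grad2(x))
    if np.linalg.norm(d) < 1e-8:  # Pareto-critical: no common descent direction
        break
    x = x + 0.1 * d               # fixed step for the sketch; Armijo search in practice

# x converges to a Pareto-critical point on the segment [a, b]
```

When the min-norm multiplier is strictly between 0 and 1, the resulting direction decreases both objectives simultaneously; at a Pareto-critical point it vanishes, which serves as the stopping test.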