A class of derivative-free trust-region methods with interior backtracking technique for nonlinear optimization problems subject to linear inequality constraints
DOI: 10.1186/s13660-018-1698-7
zbMATH: 1497.49043
OpenAlex: W2801537889
Wikidata: Q55334201 (Scholia: Q55334201)
MaRDI QID: Q824557
Publication date: 15 December 2021
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-018-1698-7
Keywords: inequality constraints; derivative-free optimization; trust-region method; affine scaling; interior backtracking technique
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Numerical methods based on nonlinear programming (49M37); Interior-point methods (90C51)
Related Items (1)
Cites Work
- Unnamed Item
- On the local convergence of a derivative-free algorithm for least-squares minimization
- An affine scaling derivative-free trust region method with interior backtracking technique for bounded-constrained nonlinear programming
- On affine-scaling interior-point Newton methods for nonlinear minimization with bound constraints
- More test examples for nonlinear programming codes
- Test examples for nonlinear programming codes
- A new affine scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality constraints
- A trust region and affine scaling interior point method for nonconvex minimization with linear inequality constraints
- Superlinear and quadratic convergence of affine-scaling interior-point Newton methods for problems with simple bounds without strict complementarity assumption
- An interior-point affine-scaling trust-region method for semismooth equations with box constraints
- Sequential Penalty Derivative-Free Methods for Nonlinear Constrained Optimization
- A Derivative-Free Algorithm for Least-Squares Minimization
- A Derivative-Free Approach to Constrained Multiobjective Nonsmooth Optimization
- Two simple relaxed perturbed extragradient methods for solving variational inequalities in Euclidean spaces
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Benchmarking optimization software with performance profiles