A derivative-free affine scaling trust region methods based on probabilistic models with new nonmonotone line search technique for linear inequality constrained minimization without strict complementarity
DOI: 10.1080/00207160.2018.1517208 · zbMath: 1499.90231 · OpenAlex: W2889509307 · MaRDI QID: Q5031804
Publication date: 16 February 2022
Published in: International Journal of Computer Mathematics
Full work available at URL: https://doi.org/10.1080/00207160.2018.1517208
global convergence · nonlinear programming · derivative-free optimization · probabilistic models · nonmonotone line search · affine matrix
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Derivative-free methods and methods using generalized derivatives (90C56)
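Among the keywords above, the nonmonotone line search refers to acceptance rules in the style of the cited work "A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization" (Grippo, Lampariello, Lucidi): a trial step is accepted against the maximum of the last few function values rather than the current one, which allows temporary increases. The sketch below illustrates that classical rule only; it is not the algorithm of the indexed paper, and the function and parameter names (`nonmonotone_armijo`, memory length `M`, parameters `gamma`, `beta`) are illustrative assumptions.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, M=5, gamma=1e-4, beta=0.5, history=None):
    """Nonmonotone Armijo backtracking (GLL-style sketch, not the paper's method):
    accept step size t when f(x + t*d) <= max of the last M recorded values
    plus the usual sufficient-decrease term gamma * t * grad_f(x)^T d."""
    if history is None:
        history = [f(x)]               # fall back to the monotone rule
    f_ref = max(history[-M:])          # reference value over recent history
    slope = gamma * (grad_f(x) @ d)    # sufficient-decrease slope (negative for descent d)
    t = 1.0
    while f(x + t * d) > f_ref + t * slope:
        t *= beta                      # backtrack until acceptance
    return t

# Usage on a simple quadratic with the steepest-descent direction
f = lambda x: float(x @ x)
g = lambda x: 2.0 * x
x0 = np.array([1.0, -2.0])
t = nonmonotone_armijo(f, g, x0, -g(x0))
print(t)  # → 0.5
```

With a memory of `M` past values the rule reduces to the ordinary Armijo condition when `M = 1`, so the monotone search is the special case.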
Related Items (1)
Cites Work
- Unnamed Item
- Unnamed Item
- Analysis of direct searches for discontinuous functions
- Superlinear convergence of affine scaling interior point Newton method for linear inequality constrained minimization without strict complementarity
- On affine-scaling interior-point Newton methods for nonlinear minimization with bound constraints
- More test examples for nonlinear programming codes
- Test examples for nonlinear programming codes
- A class of methods for solving large, convex quadratic programs subject to box constraints
- On trust region methods for unconstrained minimization without derivatives
- A new affine scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality constraints.
- A trust region and affine scaling interior point method for nonconvex minimization with linear inequality constraints
- Geometry of interpolation sets in derivative free optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Convergence of Trust-Region Methods Based on Probabilistic Models
- A Derivative-Free Algorithm for Least-Squares Minimization
- Newton Methods For Large-Scale Linear Inequality-Constrained Minimization
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Introduction to Derivative-Free Optimization
- Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- Newton-type methods for unconstrained and linearly constrained optimization
- A Method of Conjugate Directions for Linearly Constrained Nonlinear Programming Problems
- An alternate implementation of Goldfarb's minimization algorithm
- Large-scale linearly constrained optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An affine-scaling derivative-free trust-region method for solving nonlinear systems subject to linear inequality constraints
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- A superlinearly convergent method for minimization problems with linear inequality constraints
- Probability
- Benchmarking optimization software with performance profiles.