Model-Based Derivative-Free Methods for Convex-Constrained Optimization
From MaRDI portal
Publication:5043286
DOI: 10.1137/21M1460971
zbMath: 1506.65076
arXiv: 2111.05443
OpenAlex: W3214407917
MaRDI QID: Q5043286
Publication date: 21 October 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2111.05443
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Derivative-free methods and methods using generalized derivatives (90C56)
Related Items (1)
Uses Software
Cites Work
- A derivative-free algorithm for linearly constrained optimization problems
- Analysis of direct searches for discontinuous functions
- Global convergence of trust-region algorithms for convex constrained minimization without derivatives
- On the convergence of an inexact Gauss-Newton trust-region method for nonlinear least-squares problems with simple bounds
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- A derivative-free Gauss-Newton method
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Geometry of interpolation sets in derivative free optimization
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case
- Algorithm 909
- A Derivative-Free Algorithm for Least-Squares Minimization
- An active-set trust-region method for derivative-free nonlinear bound-constrained optimization
- An Algorithm for Restricted Least Squares Regression
- OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions
- Introduction to Derivative-Free Optimization
- Testing Unconstrained Optimization Software
- Trust Region Methods
- Derivative-Free and Blackbox Optimization
- First-Order Methods in Optimization
- Approximate norm descent methods for constrained nonlinear systems
- The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Global Convergence of a Class of Trust Region Algorithms for Optimization Using Inexact Projections on Convex Constraints
- Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Derivative-free optimization methods
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- Robust Stopping Criteria for Dykstra's Algorithm
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Benchmarking optimization software with performance profiles
This page was built for publication: Model-Based Derivative-Free Methods for Convex-Constrained Optimization