On the numerical performance of finite-difference-based methods for derivative-free optimization
Publication: 5882235
DOI: 10.1080/10556788.2022.2121832
OpenAlex: W4297093107
MaRDI QID: Q5882235
Figen Oztoprak, Hao-Jun Michael Shi, Jorge Nocedal, Melody Qiming Xuan
Publication date: 15 March 2023
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/2102.09762
Keywords: nonlinear optimization; finite differences; derivative-free optimization; noisy optimization; zeroth-order optimization
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Methods of quasi-Newton type (90C53); Methods of successive quadratic programming type (90C55)
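The paper benchmarks optimization methods built on finite-difference gradient estimates of noisy functions. As a minimal sketch of that core idea (not the authors' implementation), the following assumes the noise level `eps_f` in the function values is known and uses the classical step-size rule h ≈ 2√eps_f for a forward difference; all names here are illustrative.

```python
import numpy as np

def forward_difference_gradient(f, x, eps_f=1e-8):
    """Approximate the gradient of f at x by forward differences.

    The step h ~ 2*sqrt(eps_f) balances truncation error against the
    noise eps_f in the function values (curvature taken to be O(1)).
    """
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        h = 2.0 * np.sqrt(eps_f) * max(1.0, abs(x[i]))  # noise-aware step
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Usage: gradient of a noisy quadratic at x = (1, -2); the true gradient is x.
rng = np.random.default_rng(0)
eps_f = 1e-8
noisy_quadratic = lambda x: 0.5 * float(x @ x) + eps_f * rng.standard_normal()
print(forward_difference_gradient(noisy_quadratic, np.array([1.0, -2.0]), eps_f))
```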
Related Items
- Latent Gaussian Count Time Series
- Quadratic regularization methods with finite-difference gradient approximations
- Constrained Optimization in the Presence of Noise
Cites Work
- On fast trust region methods for quadratic models with linear constraints
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- A numerical study of limited memory BFGS methods
- A derivative-free Gauss-Newton method
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Random gradient-free minimization of convex functions
- Geometry of interpolation sets in derivative free optimization
- Estimating Derivatives of Noisy Simulations
- A Derivative-Free Algorithm for Least-Squares Minimization
- Estimating Computational Noise
- Implicit Filtering
- Algorithm 856
- Introduction to Derivative-Free Optimization
- Computing Forward-Difference Intervals for Numerical Optimization
- Numerical Optimization
- Superlinear Convergence and Implicit Filtering
- Derivative-Free and Blackbox Optimization
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems
- Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers
- A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Analysis of the BFGS Method with Errors
- Derivative-free optimization methods
- Complete search in continuous global optimization and constraint satisfaction
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- A Simplex Method for Function Minimization