Quadratic regularization methods with finite-difference gradient approximations
MaRDI QID: Q6175465
DOI: 10.1007/s10589-022-00373-z
OpenAlex: W4280633477
Wikidata: Q114227018 (Scholia: Q114227018)
Publication date: 24 July 2023
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00373-z
MSC classification:
- Nonconvex programming, global optimization (90C26)
- Derivative-free methods and methods using generalized derivatives (90C56)
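For orientation, the sketch below illustrates the general kind of method the title names: a quadratic regularization iteration in which the gradient is replaced by a forward-difference approximation. It is a minimal illustrative sketch under assumed parameter choices (sigma0, eta, h, tol are all invented for the example), not the algorithm analyzed in the publication.

```python
# Minimal sketch, NOT the paper's algorithm: a plausible quadratic
# regularization loop driven by forward-difference gradients.
# All parameter names and default values here are illustrative assumptions.
import numpy as np

def fd_gradient(f, x, h=1e-8):
    """Forward-difference approximation of grad f at x (costs n extra evaluations)."""
    fx = f(x)
    g = np.empty(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def quad_reg_minimize(f, x0, sigma0=1.0, eta=0.1, h=1e-8, tol=1e-6, max_iter=500):
    """Each step s = -g/sigma minimizes the regularized model
    f(x) + g.s + (sigma/2)||s||^2; sigma is adapted by accept/reject."""
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g = fd_gradient(f, x, h)
        if np.linalg.norm(g) <= tol:
            break
        s = -g / sigma                                     # model minimizer
        pred = np.dot(g, s) + 0.5 * sigma * np.dot(s, s)   # model decrease (negative)
        if f(x + s) <= f(x) + eta * pred:                  # sufficient decrease test
            x = x + s
            sigma = max(sigma / 2, 1e-8)                   # success: relax regularization
        else:
            sigma *= 2                                     # failure: regularize more
    return x

# Usage: minimize a simple quadratic without analytic derivatives.
x_star = quad_reg_minimize(lambda x: np.sum((x - 1.0) ** 2), np.zeros(3))
```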
Related Items (2)
- Preface to the 5th Brazil-China symposium on applied and computational mathematics
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
Cites Work
- On the optimal order of worst case complexity of direct search
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Worst case complexity of direct search
- On the worst-case evaluation complexity of non-monotone line search algorithms
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- A cubic regularization of Newton's method with finite difference Hessian approximations
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Efficient unconstrained black box optimization
- Random gradient-free minimization of convex functions
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- Testing Unconstrained Optimization Software
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- Benchmarking Derivative-Free Optimization Algorithms
- Derivative-free optimization methods
- Direct Search Based on Probabilistic Descent
- A Simplex Method for Function Minimization
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Full-low evaluation methods for derivative-free optimization
- Scalable subspace methods for derivative-free nonlinear least-squares optimization