Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
Publication: 4634094
DOI: 10.1137/18M1177718
zbMath: 1411.90359
arXiv: 1803.10173
OpenAlex: W2963477800
Wikidata: Q128120528 (Scholia: Q128120528)
MaRDI QID: Q4634094
Authors: Albert S. Berahas, Richard H. Byrd, Jorge Nocedal
Publication date: 7 May 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1803.10173
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Methods of quasi-Newton type (90C53)
Related Items
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Full-low evaluation methods for derivative-free optimization
- A discussion on variational analysis in derivative-free optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization
- Zeroth-order optimization with orthogonal random directions
- A trust region method for noisy unconstrained optimization
- Latent Gaussian Count Time Series
- Limiting behaviour of the generalized simplex gradient as the number of points tends to infinity on a fixed shape in \(\mathbb{R}^n\)
- Quadratic regularization methods with finite-difference gradient approximations
- Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
- Constrained Optimization in the Presence of Noise
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
- How to catch a lion in the desert: on the solution of the coverage directed generation (CDG) problem
- An accelerated directional derivative method for smooth stochastic convex optimization
- Analysis of the BFGS Method with Errors
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Derivative-free optimization methods
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Optimization of Stochastic Blackboxes with Adaptive Precision
- A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
Cites Work
- Optimization by Simulated Annealing
- Nonsmooth optimization via quasi-Newton methods
- More test examples for nonlinear programming codes
- Lipschitzian optimization without the Lipschitz constant
- Direct search methods: Then and now
- UOBYQA: unconstrained optimization by quadratic approximation
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Random gradient-free minimization of convex functions
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- Non-intrusive termination of noisy optimization
- Estimating Derivatives of Noisy Simulations
- Estimating Computational Noise
- Implicit Filtering
- A derivative-free comirror algorithm for convex optimization
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Algorithm 856
- Introduction to Derivative-Free Optimization
- ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions
- “Direct Search” Solution of Numerical and Statistical Problems
- Direct Search Methods on Parallel Machines
- Numerical Optimization
- Using Complex Variables to Estimate Derivatives of Real Functions
- Superlinear Convergence and Implicit Filtering
- A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling
- Optimization Methods for Large-Scale Machine Learning
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- A Simplex Method for Function Minimization
- Benchmarking optimization software with performance profiles
- Wedge trust region method for derivative free optimization