A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
From MaRDI portal
Publication:5026838
DOI: 10.1137/20M1373190
zbMath: 1484.90117
arXiv: 2010.04352
OpenAlex: W4225873238
MaRDI QID: Q5026838
Shi, Hao-Jun Michael; Xie, Yuchen; Byrd, Richard H.; Nocedal, Jorge
Publication date: 8 February 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2010.04352
unconstrained optimization; stochastic optimization; nonlinear optimization; quasi-Newton method; derivative-free optimization; noisy optimization
Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Methods of quasi-Newton type (90C53)
Related Items
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- On the limited memory BFGS method for large scale optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Random gradient-free minimization of convex functions
- Remark on “Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound constrained optimization”
- Estimating Derivatives of Noisy Simulations
- Estimating Computational Noise
- Implicit Filtering
- Computing Forward-Difference Intervals for Numerical Optimization
- The Effect of Rounding Errors on Newton-like Methods
- Inaccuracy in quasi-Newton methods: Local improvement theorems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Numerical Optimization
- Superlinear Convergence and Implicit Filtering
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Analysis of the BFGS Method with Errors
- CUTEr and SifDec