A trust region method for noisy unconstrained optimization
DOI: 10.1007/s10107-023-01941-9 · zbMath: 1526.65023 · arXiv: 2201.00973 · OpenAlex: W4360829331 · MaRDI QID: Q6052069
Publication date: 23 October 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2201.00973
MSC classification:
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Complexity and performance of numerical algorithms (65Y20)
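For context on the method family named in the title, below is a minimal sketch of a classical trust-region iteration in Python. This is a generic textbook scheme (Cauchy-point subproblem solver, standard radius-update constants), not the algorithm analyzed in the paper, which additionally treats noise in the function and gradient evaluations; all names and constants here are illustrative assumptions.

```python
# Generic trust-region iteration (illustrative sketch only; not the paper's
# noise-aware algorithm). Assumes exact gradient and Hessian oracles.
import numpy as np

def cauchy_point(g, B, delta):
    """Minimize the quadratic model m(p) = g.p + 0.5 p.B.p along -g,
    restricted to the trust region ||p|| <= delta."""
    gBg = g @ B @ g
    gnorm = np.linalg.norm(g)
    tau = 1.0 if gBg <= 0 else min(1.0, gnorm**3 / (delta * gBg))
    return -tau * (delta / gnorm) * g

def trust_region(f, grad, hess, x0, delta0=1.0, delta_max=10.0,
                 eta=0.15, tol=1e-8, max_iter=2000):
    x, delta = np.asarray(x0, float), delta0
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        p = cauchy_point(g, B, delta)
        ared = f(x) - f(x + p)                     # actual reduction
        pred = -(g @ p + 0.5 * p @ B @ p)          # predicted reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho < 0.25:                             # poor model: shrink region
            delta *= 0.25
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2.0 * delta, delta_max)    # good step at boundary: expand
        if rho > eta:                              # accept the step
            x = x + p
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                           [-400*x[0], 200]])
print(trust_region(f, grad, hess, [-1.2, 1.0]))
```

The acceptance ratio rho compares the actual to the predicted decrease; in the noisy setting the paper studies, both quantities are only observed with error, which is what motivates the modified analysis.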
Cites Work
- Sample size selection in optimization methods for machine learning
- More test examples for nonlinear programming codes
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Stochastic optimization using a trust-region method and random models
- Numerical experiments with the LANCELOT package (Release A) for large-scale nonlinear optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Random gradient-free minimization of convex functions
- The impact of noise on evaluation complexity: the deterministic trust-region case
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Multifidelity approaches for optimization under uncertainty
- Estimating Computational Noise
- Numerical Optimization
- Adaptive Sampling Strategies for Stochastic Optimization
- Survey of Multifidelity Methods in Uncertainty Propagation, Inference, and Optimization
- On Sampling Rates in Simulation-Based Recursions
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Optimization Methods for Large-Scale Machine Learning
- Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization
- On the Global Convergence of Trust Region Algorithms Using Inexact Gradient Information
- Analysis of the BFGS Method with Errors
- A Stochastic Line Search Method with Expected Complexity Analysis
- Exact and inexact subsampled Newton methods for optimization
- Constrained Optimization in the Presence of Noise