The impact of noise on evaluation complexity: the deterministic trust-region case
Publication: 2696963
DOI: 10.1007/s10957-022-02153-5
OpenAlex: W3145743572
MaRDI QID: Q2696963
Gianmarco Gurioli, Benedetta Morini, Stefania Bellavia, Philippe Louis Toint
Publication date: 17 April 2023
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2104.02519
Cites Work
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- Stochastic optimization using a trust-region method and random models
- An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity
- Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
- A note on solving nonlinear optimization problems in variable precision
- Newton-type methods for non-convex optimization under inexact Hessian information
- Recent advances in trust region algorithms
- Convergence of Trust-Region Methods Based on Probabilistic Models
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Testing Unconstrained Optimization Software
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- Worst-Case Examples for Lasserre’s Measure–Based Hierarchy for Polynomial Optimization on the Hypercube
- On the Global Convergence of Trust Region Algorithms Using Inexact Gradient Information
- A Stochastic Line Search Method with Expected Complexity Analysis
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians