No dimension-free deterministic algorithm computes approximate stationarities of Lipschitzians
DOI: 10.1007/s10107-023-02031-6
MaRDI QID: Q6634520
Publication date: 7 November 2024
Published in: Mathematical Programming. Series A. Series B
MSC classifications: Analysis of algorithms and problem complexity (68Q25); Abstract computational complexity for mathematical programming problems (90C60); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Lectures on convex optimization
- The string guessing problem as a method to prove lower bounds on the advice complexity
- On the limited memory BFGS method for large scale optimization
- Perturbed iterate SGD for Lipschitz continuous loss functions
- Lower bounds for finding stationary points I
- Lower bounds for finding stationary points II: first-order methods
- Stochastic subgradient method converges on tame functions
- Cubic regularization of Newton method and its global performance
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Optimization of Lipschitz continuous functions
- A random polynomial-time algorithm for approximating the volume of convex bodies
- Trust Region Methods
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Black-Box Complexity of Local Minimization
- Modern Nonconvex Nondifferentiable Optimization
- On Nonconvex Optimization for Machine Learning
- Pathological Subgradient Dynamics
- Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
- Stochastic Approximations and Differential Inclusions
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- Lower Bounds on the Oracle Complexity of Nonsmooth Convex Optimization via Information Theory
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming