Stochastic Three Points Method for Unconstrained Smooth Minimization
Publication: 4971022
DOI: 10.1137/19M1244378
zbMath: 1451.90150
arXiv: 1902.03591
OpenAlex: W3090853514
MaRDI QID: Q4971022
Peter Richtárik, Eduard Gorbunov, El Houcine Bergou
Publication date: 8 October 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1902.03591
MSC classifications: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Randomized algorithms (68W20)
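The paper studies the Stochastic Three Points (STP) method: at each iteration a random direction is sampled, the objective is evaluated at the current point and at a forward and backward step along that direction, and the best of the three points is kept. The sketch below illustrates this idea; the decaying step size, the Gaussian direction sampling, and the quadratic test function are illustrative assumptions, not the paper's analyzed parameter choices.

```python
# Minimal sketch of a Stochastic Three Points (STP) iteration.
# Assumptions (not from the paper's tuned setup): step size alpha0/k,
# directions sampled uniformly on the unit sphere via normalized Gaussians.
import math
import random

def stp(f, x0, n_iters=2000, alpha0=1.0, seed=0):
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    d = len(x)
    for k in range(1, n_iters + 1):
        alpha = alpha0 / k  # simple decaying step size (illustrative choice)
        # Sample a direction uniformly on the unit sphere.
        s = [rng.gauss(0.0, 1.0) for _ in range(d)]
        norm = math.sqrt(sum(si * si for si in s))
        s = [si / norm for si in s]
        x_plus = [xi + alpha * si for xi, si in zip(x, s)]
        x_minus = [xi - alpha * si for xi, si in zip(x, s)]
        f_plus, f_minus = f(x_plus), f(x_minus)
        # Keep the best of the three candidate points.
        if f_plus <= fx and f_plus <= f_minus:
            x, fx = x_plus, f_plus
        elif f_minus < fx:
            x, fx = x_minus, f_minus
    return x, fx

# Hypothetical quadratic test problem: f(x) = ||x||^2, minimized at 0.
x_star, f_star = stp(lambda x: sum(xi * xi for xi in x), [1.0, -2.0, 3.0])
```

Because the update only ever accepts a point with a smaller objective value, the iterates are monotone in f; this derivative-free structure is what the complexity results in the paper analyze.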
Related Items (10)
- Efficient unconstrained black box optimization
- Zeroth-order algorithms for stochastic distributed nonconvex optimization
- Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- Global optimization using random embeddings
- Quadratic regularization methods with finite-difference gradient approximations
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- Recent Theoretical Advances in Non-Convex Optimization
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization
Uses Software
Cites Work
- Worst case complexity of direct search
- A derivative-free nonmonotone line-search technique for unconstrained optimization
- Convergence of a random optimization method for constrained optimization problems
- On the convergence of the Baba and Dorea random optimization methods
- Random gradient-free minimization of convex functions
- Random optimization
- Optimization of Convex Functions with Random Pursuit
- Introduction to Derivative-Free Optimization
- Testing Unconstrained Optimization Software
- Convergence estimates for iterative minimization methods
- On Convergence of a Random Search Method in Convex Minimization Problems
- Introduction to Shape Optimization
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Direct Search Based on Probabilistic Descent
- Expected number of steps of a random optimization method
- Shape optimization by the homogenization method
- Benchmarking optimization software with performance profiles.
- Worst case complexity of direct search under convexity