The following pages link to Albert S. Berahas (Q2143219):
Displaying 21 items.
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization (Q2143221)
- Limited-memory BFGS with displacement aggregation (Q2149548)
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods (Q4634094)
- A robust multi-batch L-BFGS method for machine learning (Q4972551)
- Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization (Q4989938)
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise (Q4997171)
- Quasi-Newton methods for machine learning: forget the past, just sample (Q5058389)
- An investigation of Newton-Sketch and subsampled Newton methods (Q5135249)
- Balancing Communication and Computation in Distributed Optimization (Q5228298)
- On the Convergence of Nested Decentralized Gradient Methods With Multiple Consensus and Gradient Steps (Q5868671)
- Full-low evaluation methods for derivative-free optimization (Q5882241)
- Accelerating stochastic sequential quadratic programming for equality constrained optimization using predictive variance reduction (Q6166650)
- Balancing Communication and Computation in Gradient Tracking Algorithms for Decentralized Optimization (Q6430795)
- A Flexible Gradient Tracking Algorithmic Framework for Decentralized Optimization (Q6463116)
- Gradient descent in the absence of global Lipschitz continuity of the gradients (Q6583712)
- First- and second-order high probability complexity bounds for trust-region methods with noisy oracles (Q6608030)
- Adaptive consensus: a network pruning approach for decentralized optimization (Q6644846)
- Balancing communication and computation in gradient tracking algorithms for decentralized optimization (Q6655816)
- A sequential quadratic programming method with high-probability complexity bounds for nonlinear equality-constrained stochastic optimization (Q6663117)
- Modified Line Search Sequential Quadratic Methods for Equality-Constrained Optimization with Unified Global and Local Convergence Guarantees (Q6732847)
- Exploiting Negative Curvature in Conjunction with Adaptive Sampling: Theoretical Results and a Practical Algorithm (Q6753681)