An investigation of Newton-Sketch and subsampled Newton methods
Publication: 5135249
DOI: 10.1080/10556788.2020.1725751
zbMath: 1454.90112
arXiv: 1705.06211
OpenAlex: W3005768517
MaRDI QID: Q5135249
Albert S. Berahas, Raghu Bollapragada, Jorge Nocedal
Publication date: 19 November 2020
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1705.06211
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
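As a rough illustration of the subject of this record, the following is a minimal sketch of one subsampled Newton iteration for a finite-sum objective f(w) = (1/n) Σᵢ fᵢ(w). It is not the paper's code: the helper names `grad` and `hess_i` and all parameter choices are hypothetical placeholders.

```python
# Illustrative sketch only, not the paper's implementation.
# One Newton-type step where the Hessian is estimated on a random subsample.
import numpy as np

def subsampled_newton_step(w, grad, hess_i, n, sample_size, step=1.0, reg=1e-8):
    """Take one subsampled Newton step.

    grad(w)     -> full (or mini-batch) gradient, shape (d,)   [assumed helper]
    hess_i(w,i) -> Hessian of component f_i at w, shape (d, d) [assumed helper]
    """
    idx = np.random.choice(n, size=sample_size, replace=False)
    # Average Hessian contributions over the subsample only.
    H = sum(hess_i(w, i) for i in idx) / sample_size
    g = grad(w)
    # Solve the (regularized) Newton system H d = g.
    d = np.linalg.solve(H + reg * np.eye(len(w)), g)
    return w - step * d
```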
Related Items
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- Quasi-Newton methods for machine learning: forget the past, just sample
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- Subsampled nonmonotone spectral gradient methods
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
- Convergence analysis of a subsampled Levenberg-Marquardt algorithm
- M-IHS: an accelerated randomized preconditioning method avoiding costly matrix decompositions
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Inexact restoration with subsampled trust-region methods for finite-sum minimization
- Newton-MR: inexact Newton method with minimum residual sub-problem solver
- Global optimization using random embeddings
- Hessian averaging in stochastic Newton methods achieves superlinear convergence
- Random projections of linear and semidefinite problems with linear inequalities
- Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Convergence of Newton-MR under Inexact Hessian Information
- Sub-sampled Newton methods
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- A robust multi-batch L-BFGS method for machine learning
- An Inertial Newton Algorithm for Deep Learning
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- Linesearch Newton-CG methods for convex optimization with noise
- Convergence Analysis of Inexact Randomized Iterative Methods
Cites Work
- Faster least squares approximation
- Sketching meets random projection in the dual: a provable recovery algorithm for big and high-dimensional data
- Sub-sampled Newton methods
- Coordinate descent algorithms
- Computational Advertising: Techniques for Targeting Relevant Ads
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
- Numerical Optimization
- Optimization Methods for Large-Scale Machine Learning
- A Stochastic Approximation Method
- Exact and inexact subsampled Newton methods for optimization