Sub-sampled Newton methods
Publication: 1739039
DOI: 10.1007/s10107-018-1346-5 · zbMath: 1412.49059 · OpenAlex: W2900789157 · MaRDI QID: Q1739039
Authors: Michael W. Mahoney, Farbod Roosta-Khorasani
Publication date: 24 April 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-018-1346-5
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Newton-type methods (49M15)
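The paper indexed here studies Newton-type methods for finite-sum minimization in which the Hessian (and, in some variants, the gradient) is estimated from a random subsample of the terms. As a rough illustration only (not the authors' algorithm), the sketch below shows one possible sub-sampled Newton iteration for regularized logistic regression; the uniform sampling scheme, the exact gradient, the fixed sample size, and the direct linear solve are assumptions made for brevity.

```python
import numpy as np

# Illustrative sketch only: finite-sum objective f(w) = (1/n) * sum_i f_i(w),
# here a ridge-regularized logistic regression. The Hessian is estimated from
# a uniform subsample S; the gradient is computed exactly. Sample size, step
# size, and the direct solve of the Newton system are assumptions for brevity.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def full_gradient(w, X, y, lam):
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y) + lam * w

def subsampled_hessian(w, X, lam, sample_size, rng):
    # Uniform sub-sampling of the per-example Hessian terms.
    idx = rng.choice(len(X), size=sample_size, replace=False)
    Xs = X[idx]
    p = sigmoid(Xs @ w)
    d = p * (1.0 - p)                       # per-sample curvature weights
    H = (Xs.T * d) @ Xs / sample_size       # (1/|S|) * sum_{i in S} d_i x_i x_i^T
    return H + lam * np.eye(X.shape[1])

def subsampled_newton(X, y, lam=1e-3, sample_size=200, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        g = full_gradient(w, X, y, lam)
        H = subsampled_hessian(w, X, lam, sample_size, rng)
        w -= np.linalg.solve(H, g)          # Newton step with sub-sampled Hessian
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 10))
    w_true = rng.standard_normal(10)
    y = (sigmoid(X @ w_true) > rng.random(2000)).astype(float)
    w_hat = subsampled_newton(X, y)
    print("gradient norm at solution:", np.linalg.norm(full_gradient(w_hat, X, y, 1e-3)))
```

Sample sizes, inexact inner solves (e.g. conjugate gradient instead of a direct solve), and step-size rules are the kind of design choices treated rigorously in this literature.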
Related Items
- Quasi-Newton methods for machine learning: forget the past, just sample
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- Subsampled nonmonotone spectral gradient methods
- Sketched Newton–Raphson
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
- Convergence analysis of a subsampled Levenberg-Marquardt algorithm
- SCORE: approximating curvature information under self-concordant regularization
- An adaptive sampling augmented Lagrangian method for stochastic optimization with deterministic constraints
- Statistically equivalent surrogate material models: impact of random imperfections on the elasto-plastic response
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Inexact restoration with subsampled trust-region methods for finite-sum minimization
- Newton-MR: inexact Newton method with minimum residual sub-problem solver
- Generalized linear models for massive data via doubly-sketching
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
- Global optimization using random embeddings
- Hessian averaging in stochastic Newton methods achieves superlinear convergence
- Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
- Differentially private inference via noisy optimization
- Riemannian Natural Gradient Methods
- Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
- Stable architectures for deep neural networks
- Newton-type methods for non-convex optimization under inexact Hessian information
- An investigation of Newton-Sketch and subsampled Newton methods
- Randomized Approach to Nonlinear Inversion Combining Random and Optimized Simultaneous Sources and Detectors
- Convergence of Newton-MR under Inexact Hessian Information
- Optimization Methods for Large-Scale Machine Learning
- An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity
- Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
- Linesearch Newton-CG methods for convex optimization with noise
- Generalized self-concordant functions: a recipe for Newton-type methods
- A hybrid stochastic optimization framework for composite nonconvex optimization
Cites Work
- An inexact successive quadratic approximation method for L-1 regularized optimization
- User-friendly tail bounds for sums of random matrices
- Sample size selection in optimization methods for machine learning
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Stochastic Algorithms for Inverse Problems Involving PDEs and Many Measurements
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Uniform Sampling for Matrix Approximation
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Improved Analysis of the Subsampled Randomized Hadamard Transform
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Randomized Algorithms for Matrices and Data
- Assessing Stochastic Algorithms for Large Scale Nonlinear Least Squares Problems Using Extremal Probabilities of Linear Combinations of Gamma Random Variables
- Inexact Newton Methods
- Pseudoinverses and conjugate gradients
- Trust Region Methods
- Choosing the Forcing Terms in an Inexact Newton Method
- An investigation of Newton-Sketch and subsampled Newton methods
- Randomized Approximation of the Gram Matrix: Exact Computation and Probabilistic Bounds
- Subsampled Hessian Newton Methods for Supervised Learning
- Fast Monte Carlo Algorithms for Matrices I: Approximating Matrix Multiplication
- Exact and inexact subsampled Newton methods for optimization