Sub-sampled Newton methods


DOI: 10.1007/s10107-018-1346-5
zbMath: 1412.49059
OpenAlex: W2900789157
MaRDI QID: Q1739039

Farbod Roosta-Khorasani, Michael W. Mahoney

Publication date: 24 April 2019

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-018-1346-5
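The entry above is purely bibliographic, so as an orientation aid, here is a minimal sketch of the kind of iteration the title refers to: a Newton-type step for a finite-sum objective in which the Hessian is estimated from a uniformly drawn subsample while the full gradient is retained. The concrete objective (L2-regularized logistic regression), the unit step size, and all names and parameters (subsampled_newton, sample_frac, etc.) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def logistic_loss(w, X, y, reg):
    """f(w) = (1/n) * sum_i [log(1 + exp(x_i^T w)) - y_i * x_i^T w] + (reg/2) ||w||^2."""
    z = X @ w
    return float(np.mean(np.logaddexp(0.0, z) - y * z) + 0.5 * reg * (w @ w))

def subsampled_newton(X, y, reg=1e-3, sample_frac=0.1, iters=20, seed=0):
    """Illustrative sub-sampled Newton loop (assumed variant): exact gradient,
    Hessian built from a uniform subsample S with |S| = sample_frac * n."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    s = max(1, int(sample_frac * n))                    # Hessian sample size |S|
    for _ in range(iters):
        sig = 1.0 / (1.0 + np.exp(-(X @ w)))            # sigmoid(x_i^T w) for all i
        grad = X.T @ (sig - y) / n + reg * w            # full gradient of f
        S = rng.choice(n, size=s, replace=False)        # uniform subsample for the Hessian
        dS = sig[S] * (1.0 - sig[S])                    # per-sample curvature weights
        H = (X[S].T * dS) @ X[S] / s + reg * np.eye(d)  # sub-sampled Hessian estimate
        w = w - np.linalg.solve(H, grad)                # unit-step Newton update
    return w

# Tiny synthetic check: the loss should drop markedly over the iterations.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 10))
y = (X @ rng.standard_normal(10) + 0.5 * rng.standard_normal(2000) > 0).astype(float)
print(logistic_loss(np.zeros(10), X, y, 1e-3), "->",
      logistic_loss(subsampled_newton(X, y), X, y, 1e-3))
```

The motivation for subsampling, in general terms, is cost: forming the d-by-d curvature matrix from |S| samples costs O(|S| d^2) rather than the O(n d^2) of a full Newton step, while a sufficiently large sample keeps the estimate close enough to the true Hessian for fast local convergence.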



Related Items

Quasi-Newton methods for machine learning: forget the past, just sample
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
Subsampled nonmonotone spectral gradient methods
Sketched Newton–Raphson
Scalable subspace methods for derivative-free nonlinear least-squares optimization
An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
Convergence analysis of a subsampled Levenberg-Marquardt algorithm
SCORE: approximating curvature information under self-concordant regularization
An adaptive sampling augmented Lagrangian method for stochastic optimization with deterministic constraints
Statistically equivalent surrogate material models: impact of random imperfections on the elasto-plastic response
An overview of stochastic quasi-Newton methods for large-scale machine learning
Inexact restoration with subsampled trust-region methods for finite-sum minimization
Newton-MR: inexact Newton method with minimum residual sub-problem solver
Generalized linear models for massive data via doubly-sketching
Faster Riemannian Newton-type optimization by subsampling and cubic regularization
Global optimization using random embeddings
Hessian averaging in stochastic Newton methods achieves superlinear convergence
Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
Differentially private inference via noisy optimization
Riemannian Natural Gradient Methods
Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
Stable architectures for deep neural networks
Newton-type methods for non-convex optimization under inexact Hessian information
An investigation of Newton-Sketch and subsampled Newton methods
Randomized Approach to Nonlinear Inversion Combining Random and Optimized Simultaneous Sources and Detectors
Convergence of Newton-MR under Inexact Hessian Information
Optimization Methods for Large-Scale Machine Learning
An algorithm for the minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity
Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization
Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Linesearch Newton-CG methods for convex optimization with noise
Generalized self-concordant functions: a recipe for Newton-type methods
A hybrid stochastic optimization framework for composite nonconvex optimization


