Fast Approximation of the Gauss--Newton Hessian Matrix for the Multilayer Perceptron
DOI: 10.1137/19M129961X
zbMath: 1459.65037
arXiv: 1910.12184
Wikidata: Q114074223 · Scholia: Q114074223
MaRDI QID: Q5150836
Severin Reiz, Chenhan D. Yu, George Biros, Chao Chen, Hans-Joachim Bungartz
Publication date: 15 February 2021
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/1910.12184
Keywords: multilayer perceptron; hierarchical matrix; second-order optimization; Gauss-Newton Hessian; fast Monte Carlo sampling
MSC classification: Computational methods for problems pertaining to statistics (62-08); Sampling theory, sample surveys (62D05); Preconditioners for iterative methods (65F08)
Related Items (2)
Uses Software
- GitHub
- GOFMM

Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Hierarchical matrices. A means to efficiently solve elliptic boundary value problems
- Kernel methods in machine learning
- Jacobian-free Newton-Krylov methods: a survey of approaches and applications
- A fast direct solver for boundary integral equations in two dimensions
- A fast block low-rank dense solver with applications to finite-element matrices
- An Efficient Multicore Implementation of a Novel HSS-Structured Multifrontal Solver Using Randomized Sampling
- ASKIT: An Efficient, Parallel Library for High-Dimensional Kernel Summations
- Hierarchical Matrices: Algorithms and Analysis
- Fast algorithms for hierarchically semiseparable matrices
- Fast Algorithms for Classical Physics
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Iterative Solution Methods
- Optimization Methods for Large-Scale Machine Learning