Visualizing high-dimensional loss landscapes with Hessian directions
From MaRDI portal
Publication: 6625316
DOI: 10.1088/1742-5468/ad13fc · MaRDI QID: Q6625316
Gregory Wheeler, Lucas Böttcher
Publication date: 28 October 2024
Published in: Journal of Statistical Mechanics: Theory and Experiment
Cites Work
- Multilayer feedforward networks are universal approximators
- Adaptive estimation of a quadratic functional by model selection
- Numerical evaluation of methods approximating the distribution of a large quadratic form in normal variables
- Some large-scale matrix computation problems
- On the efficient calculation of a linear combination of chi-square random variables with an application in counting string vacua
- Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- A Stochastic Estimator of the Trace of the Influence Matrix for Laplacian Smoothing Splines
- On variants of the Johnson–Lindenstrauss lemma
- Algorithm AS 155: The Distribution of a Linear Combination of χ² Random Variables
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- ARPACK Users' Guide
- The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
- Global Minima of Overparameterized Neural Networks
- Entropic gradient descent algorithms and wide flat minima
- Distribution of eigenvalues for some sets of random matrices
- Shaping the learning landscape in neural networks around wide flat minima
- Universal statistics of Fisher information in deep neural networks: mean field approach
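Several of the works cited above concern randomized trace estimation for matrices that are only accessible through matrix-vector products, as is the case for the Hessian of a neural-network loss. The following is a minimal sketch of Hutchinson's estimator with Rademacher probe vectors; the function names and the example matrix are illustrative, not taken from the publication.

```python
import numpy as np

def hutchinson_trace(matvec, dim, num_samples=1000, seed=None):
    """Estimate tr(A) via Hutchinson's randomized estimator.

    `matvec` computes A @ v without materializing A, mirroring how
    Hessian-vector products are used for large models.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_samples):
        # Rademacher probe: entries +/-1 with equal probability,
        # so E[v v^T] = I and E[v^T A v] = tr(A).
        v = rng.choice([-1.0, 1.0], size=dim)
        total += v @ matvec(v)
    return total / num_samples

# Illustrative check against a small symmetric matrix with a known trace.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B @ B.T  # symmetric positive semi-definite
estimate = hutchinson_trace(lambda v: A @ v, dim=50, num_samples=2000, seed=1)
print(estimate, np.trace(A))
```

The estimator is unbiased, and its variance shrinks with the number of probes, which is why it appears in the implicit-matrix trace-estimation literature cited here.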
This page was built for publication: Visualizing high-dimensional loss landscapes with Hessian directions