A Simple Algorithm For Scaling Up Kernel Methods
arXiv: 2301.11414
MaRDI QID: Q6424522
Author name not available
Publication date: 26 January 2023
Abstract: The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods. However, conventional wisdom suggests kernel methods are unsuitable for large samples due to their computational complexity and memory requirements. We introduce a novel random feature regression algorithm that allows us (when necessary) to scale to virtually infinite numbers of random features. We illustrate the performance of our method on the CIFAR-10 dataset.
Companion code repository: https://github.com/tengandreaxu/fabr
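The paper's own implementation is in the repository above. As illustration only, the following minimal sketch shows the standard random-feature regression setting the abstract refers to: random Fourier features (Rahimi and Recht, 2007) approximating an RBF kernel, followed by ridge regression in feature space. This is not the paper's algorithm; the function names and parameters (random_fourier_features, ridge_fit, num_features, gamma, lam) are hypothetical choices made for the sketch.

```python
import numpy as np

def random_fourier_features(X, num_features, gamma, rng):
    """Map X to random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2), via
    phi(x) = sqrt(2/D) * cos(W^T x + b), W ~ N(0, 2*gamma*I), b ~ U[0, 2*pi]."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

def ridge_fit(Phi, y, lam):
    """Solve the ridge normal equations (Phi^T Phi + lam*I) beta = Phi^T y."""
    D = Phi.shape[1]
    A = Phi.T @ Phi + lam * np.eye(D)
    return np.linalg.solve(A, Phi.T @ y)

# Toy usage on synthetic data; illustrative only, not the paper's method
# (for CIFAR-10-scale experiments, see the linked repository).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 32))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=1000)

Phi = random_fourier_features(X_train, num_features=2048, gamma=0.1, rng=rng)
beta = ridge_fit(Phi, y_train, lam=1.0)
pred = Phi @ beta
print("train MSE:", np.mean((pred - y_train) ** 2))
```

A direct solve like this costs memory and time that grow with the number of random features, which is the bottleneck the abstract says the proposed algorithm avoids when scaling to virtually infinite numbers of features.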