Does learning require memorization? a short tale about a long tail
From MaRDI portal
Publication:5144980
DOI: 10.1145/3357713.3384290
OpenAlex: W3035261884
MaRDI QID: Q5144980
Publication date: 19 January 2021
Published in: Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing
Full work available at URL: https://arxiv.org/abs/1906.05271
Related Items (9)
Mehler's Formula, Branching Process, and Compositional Kernels of Deep Neural Networks
Deep learning: a statistical viewpoint
Nyström landmark sampling and regularized Christoffel functions
Switching: understanding the class-reversed sampling in tail sample memorization
On differential privacy and adaptive data analysis with bounded space
A Universal Trade-off Between the Model Size, Test Loss, and Training Loss of Linear Predictors
Unnamed Item
Unnamed Item
Diversity Sampling is an Implicit Regularization for Kernel Methods
This page was built for publication: Does learning require memorization? a short tale about a long tail