Are Gaussian data all you need? Extents and limits of universality in high-dimensional generalized linear estimation

From MaRDI portal
Publication: 6426809

arXiv: 2302.08923
MaRDI QID: Q6426809

Author name not available

Publication date: 17 February 2023

Abstract: In this manuscript we consider the problem of generalized linear estimation on Gaussian mixture data with labels given by a single-index model. Our first result is a sharp asymptotic expression for the test and training errors in the high-dimensional regime. Motivated by the recent stream of results on the Gaussian universality of the test and training errors in generalized linear estimation, we ask ourselves the question: "when is a single Gaussian enough to characterize the error?". Our formulas allow us to give sharp answers to this question, both in the positive and negative directions. More precisely, we show that the sufficient conditions for Gaussian universality (or lack thereof) crucially depend on the alignment between the target weights and the means and covariances of the mixture clusters, which we precisely quantify. In the particular case of least-squares interpolation, we prove a strong universality property of the training error, and show that it follows a simple, closed-form expression. Finally, we apply our results to real datasets, clarifying some recent discussions in the literature about Gaussian universality of the errors in this context.
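The setting described in the abstract can be illustrated with a minimal numerical sketch: data drawn from a two-cluster Gaussian mixture, labels produced by a single-index teacher, and a ridge-regularized least-squares estimator. All names and parameter choices below (cluster mean `mu`, teacher vector `theta_star`, regularization `lam`) are illustrative assumptions, not taken from the paper or its companion repository.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 50  # samples, dimension (illustrative sizes, not the asymptotic regime)

# Two-cluster Gaussian mixture: x = c * mu + z, with z ~ N(0, I_d)
# and cluster sign c uniform on {-1, +1} (assumed symmetric clusters).
mu = rng.standard_normal(d) / np.sqrt(d)
c = rng.choice([-1.0, 1.0], size=n)
X = c[:, None] * mu + rng.standard_normal((n, d))

# Single-index teacher: labels depend on x only through one projection.
theta_star = rng.standard_normal(d) / np.sqrt(d)
y = np.sign(X @ theta_star)

# Generalized linear estimation with square loss and ridge penalty:
# w_hat = argmin_w ||X w - y||^2 + lam * ||w||^2
lam = 1e-2
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Training error of the fitted linear predictor.
train_err = np.mean((X @ w_hat - y) ** 2)
print(round(float(train_err), 4))
```

The Gaussian-universality question the paper studies is whether errors computed on such mixture data match those obtained when `X` is replaced by a single Gaussian with matched moments; this sketch only sets up the estimation problem itself.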

Has companion code repository: https://github.com/lucpoisson/gaussianmixtureuniversality

