A New Look at an Old Problem: A Universal Learning Approach to Linear Regression

From MaRDI portal
Publication: Q6318606

arXiv: 1905.04708 · MaRDI QID: Q6318606

Author name not available

Publication date: 12 May 2019

Abstract: Linear regression is a classical paradigm in statistics. A new look at it is provided via the lens of universal learning. In applying universal learning to linear regression, the hypotheses class represents the label $y \in \mathcal{R}$ as a linear combination of the feature vector, $x^\top \theta$, where $x \in \mathcal{R}^M$, within a Gaussian error. The Predictive Normalized Maximum Likelihood (pNML) solution for universal learning of individual data can be expressed analytically in this case, as well as its associated learnability measure. Interestingly, the situation where the number of parameters $M$ may even be larger than the number of training samples $N$ can be examined. As expected, in this case learnability cannot be attained in every situation; nevertheless, if the test vector resides mostly in a subspace spanned by the eigenvectors associated with the large eigenvalues of the empirical correlation matrix of the training data, linear regression can generalize despite the fact that it uses an ``over-parametrized'' model. We demonstrate the results with a simulation of fitting a polynomial to data with a possibly large polynomial degree.
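The abstract's key claim can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's exact pNML derivation: it uses the quantity $x^\top (X^\top X)^{+} x$ (with the pseudo-inverse handling the over-parametrized case) as a standard proxy for how much a test feature vector lies outside the subspace spanned by the large-eigenvalue eigenvectors of the training data's empirical correlation matrix, in the polynomial-fitting setting the paper simulates. All names (`poly_features`, `coverage`) and the specific numbers are illustrative choices, not taken from the paper or its companion repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_features(t, degree):
    """Map scalar inputs t to polynomial features [1, t, t^2, ..., t^degree]."""
    return np.vander(np.asarray(t, dtype=float), N=degree + 1, increasing=True)

# Over-parametrized setting: M = degree + 1 = 13 parameters, N = 8 samples.
N, degree = 8, 12
t_train = rng.uniform(-1.0, 1.0, size=N)
X = poly_features(t_train, degree)            # N x M design matrix

# Pseudo-inverse of the empirical correlation matrix; it keeps only the
# directions (eigenvectors) actually spanned by the training data.
C_pinv = np.linalg.pinv(X.T @ X)

def coverage(t_test):
    """Proxy for learnability at t_test: small when the test feature vector
    lies mostly in the training data's large-eigenvalue subspace."""
    x = poly_features([t_test], degree)[0]
    return float(x @ C_pinv @ x)

inside = coverage(0.0)    # within the training range [-1, 1]
outside = coverage(2.0)   # far outside the training range
print(inside, outside)
```

Under this proxy, a test point inside the training range yields a far smaller value than one outside it, mirroring the abstract's statement that generalization is possible exactly when the test vector resides mostly in the subspace of the dominant eigenvectors.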




Has companion code repository: https://github.com/kobybibas/pnml_linear_regression_simulation








