On principal components regression, random projections, and column subsampling
From MaRDI portal
Publication: 1616329
DOI: 10.1214/18-EJS1486
zbMath: 1414.62219
arXiv: 1709.08104
OpenAlex: W2963022876
MaRDI QID: Q1616329
Publication date: 1 November 2018
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1709.08104
MSC classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Linear regression; mixed models (62J05)
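The title names three ways of reducing the design matrix before fitting least squares: principal components regression, random projections, and column subsampling. A minimal sketch contrasting the three on synthetic data (all variable names and the Gaussian sketch are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 30, 5          # samples, features, reduced dimension

X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + 0.1 * rng.standard_normal(n)

def ols(Z, y):
    """Least-squares coefficients for a reduced design Z."""
    return np.linalg.lstsq(Z, y, rcond=None)[0]

# 1. Principal components regression: project X onto its top-k
#    right singular vectors (the leading principal directions).
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Z_pcr = X @ Vt[:k].T

# 2. Random projection: compress the columns with a random
#    Gaussian sketch S of shape (p, k).
S = rng.standard_normal((p, k)) / np.sqrt(k)
Z_rp = X @ S

# 3. Column subsampling: keep k columns chosen uniformly at random.
cols = rng.choice(p, size=k, replace=False)
Z_cs = X[:, cols]

# Each scheme yields an n x k design; regress y on it.
for name, Z in [("pcr", Z_pcr), ("rp", Z_rp), ("cs", Z_cs)]:
    resid = y - Z @ ols(Z, y)
    print(name, Z.shape, round(float(resid @ resid), 2))
```

All three reductions produce a design of the same shape, so the downstream regression is identical; the schemes differ only in how the k-dimensional subspace is chosen.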
Related Items (5)
- High-dimensional clustering via random projections
- Dimensionality Reduction, Regularization, and Generalization in Overparameterized Regressions
- Sketching for Principal Component Regression
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
- Thin-shell theory for rotationally invariant random simplices
Cites Work
- Bagging predictors
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
- Statistics for high-dimensional data. Methods, theory and applications.
- Optimal selection of reduced rank estimators of high-dimensional matrices
- A tail inequality for quadratic forms of subgaussian random vectors
- On regularization algorithms in learning theory
- A simple proof of the restricted isometry property for random matrices
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins.
- Adaptive estimation of a quadratic functional by model selection.
- Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Computational Advertising: Techniques for Targeting Relevant Ads
- Randomized Sketches of Convex Programs With Sharp Guarantees
- New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
- IMPROVED ANALYSIS OF THE SUBSAMPLED RANDOMIZED HADAMARD TRANSFORM
- Extensions of Lipschitz mappings into a Hilbert space
- On variants of the Johnson–Lindenstrauss lemma
- Nearest-neighbor-preserving embeddings
- On b-bit min-wise hashing for large-scale regression and classification with sparse data
- Random Projections for Large-Scale Regression
- Optimization Methods for Large-Scale Machine Learning
- Compressed and Privacy-Sensitive Sparse Regression
- A Random Matrix-Theoretic Approach to Handling Singular Covariance Estimates
- Random-projection Ensemble Classification
- Normal Multivariate Analysis and the Orthogonal Group