Preconditioning for feature selection and regression in high-dimensional problems
From MaRDI portal
Publication: 939656
DOI: 10.1214/009053607000000578
zbMath: 1142.62022
arXiv: math/0703858
OpenAlex: W1997840761
Wikidata: Q60691592 (Scholia: Q60691592)
MaRDI QID: Q939656
Debashis Paul, Eric Bair, Robert Tibshirani, Trevor Hastie
Publication date: 28 August 2008
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0703858
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Factor analysis and principal components; correspondence analysis (62H25)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic properties of nonparametric inference (62G20)
Related Items
- Canonical thresholding for nonsparse high-dimensional linear regression
- On data preconditioning for regularized loss minimization
- A variable selection proposal for multiple linear regression analysis
- Projective inference in high-dimensional problems: prediction and feature selection
- Using reference models in variable selection
- High-dimensional Cox regression analysis in genetic studies with censored survival outcomes
- Bayesian projection approaches to variable selection in generalized linear models
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Adaptive index models for marker-based risk stratification
- Dimensionality reduction by feature clustering for regression problems
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Statistical and Knowledge Supported Visualization of Multivariate Data
- Nonparametric estimation of a latent variable model
- Robust high-dimensional factor models with applications to statistical machine learning
- Augmented sparse reconstruction of protein signaling networks
- Optimal EMG placement for a robotic prosthesis controller with sequential, adaptive functional estimation (SAFE)
- Variable Selection for Kernel Classification
- Compressed and Penalized Linear Regression
- Tournament screening cum EBIC for feature selection with high-dimensional feature spaces
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Preconditioning the Lasso for sign consistency
- Estimation of a regression function corresponding to latent variables
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Asymptotics for Lasso-type estimators
- Nonconcave penalized likelihood with a diverging number of parameters
- Least angle regression (with discussion)
- High-dimensional graphs and variable selection with the Lasso
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Adaptive Model Selection
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- Prediction by Supervised Principal Components