Feature selection for high-dimensional data
DOI: 10.1007/s10287-008-0070-7 · zbMath: 1168.62301 · OpenAlex: W2071792840 · MaRDI QID: Q2271790
Christine De Mol, Francesca Odone, Sofia Mosci, Alessandro Verri, Augusto Destrero
Publication date: 4 August 2009
Published in: Computational Management Science
Full work available at URL: https://doi.org/10.1007/s10287-008-0070-7
Keywords: computer vision; iterative solutions; computational biology; feature selection; regularized methods; \(L_1\)-\(L_2\) penalties
MSC classification: Ridge regression; shrinkage estimators (Lasso) (62J07) · Applications of statistics to biology and medical sciences; meta analysis (62P10) · Linear inference, regression (62J99) · Computing methodologies for image processing (68U10)
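The keywords above point to iterative solutions of \(L_1\)-\(L_2\)-penalized (elastic-net-type) regression for feature selection. As a minimal illustrative sketch (not the paper's exact algorithm), the penalized least-squares problem can be solved by proximal-gradient iteration with componentwise soft-thresholding; all names and parameter choices below are assumptions for illustration:

```python
import numpy as np

def soft_threshold(v, t):
    # Componentwise soft-thresholding: the proximal map of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def elastic_net_ista(X, y, lam1=0.1, lam2=0.1, n_iter=1000):
    """Proximal-gradient (iterative soft-thresholding) sketch for
       min_b 0.5*||y - X b||^2 + 0.5*lam2*||b||^2 + lam1*||b||_1.
    The L1 term enforces sparsity (feature selection); the L2 term
    stabilizes the estimate, as in elastic-net regularization."""
    d = X.shape[1]
    # Step size from the Lipschitz constant of the smooth part.
    tau = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam2)
    b = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) + lam2 * b  # gradient of the smooth part
        b = soft_threshold(b - tau * grad, tau * lam1)
    return b
```

Coordinates whose signal falls below the threshold are driven exactly to zero, so the support of the returned vector serves directly as the selected feature set.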
Related Items (2)
Uses Software
Cites Work
- Wrappers for feature subset selection
- Least angle regression. (With discussion)
- Atomic Decomposition by Basis Pursuit
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- DOI: 10.1162/153244303322753616
- DOI: 10.1162/153244303322753670
- DOI: 10.1162/153244303322753751
- Regularization and Variable Selection Via the Elastic Net
- A Sparsity-Enforcing Method for Learning Face Features
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- The elements of statistical learning. Data mining, inference, and prediction