Feature elimination in kernel machines in moderately high dimensions
Publication: 1731769
DOI: 10.1214/18-AOS1696 · zbMath: 1420.68167 · arXiv: 1304.5245 · Wikidata: Q90597532 · Scholia: Q90597532 · MaRDI QID: Q1731769
Yair Goldberg, Michael R. Kosorok, Sayan Dasgupta
Publication date: 14 March 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1304.5245
Classification and discrimination; cluster analysis (statistical aspects) (62H30) ⋮ Learning and adaptive systems in artificial intelligence (68T05)
Related Items (6)
Efficient kernel-based variable selection with sparsistency ⋮ Receiver operating characteristic curves and confidence bands for support vector machines ⋮ Net benefit index: Assessing the influence of a biomarker for individualized treatment rules ⋮ Structure learning via unstructured kernel-based M-estimation ⋮ Feature elimination in kernel machines in moderately high dimensions ⋮ Kernel variable selection for multicategory support vector machines
Cites Work
- Support vector regression for right censored data
- Fast rates for support vector machines using Gaussian kernels
- Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
- Support vector machines with adaptive \(L_q\) penalty
- Feature elimination in kernel machines in moderately high dimensions
- Principal component analysis.
- Optimal aggregation of classifiers in statistical learning.
- Support Vector Machines
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- DOI: 10.1162/153244303322753706
- DOI: 10.1162/153244303322753751
- Gene selection for cancer classification using support vector machines