Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods
DOI: 10.1080/01621459.2013.858630 · zbMATH: 1367.62219 · OpenAlex: W2090524995 · Wikidata: Q40993588 · Scholia: Q40993588 · MaRDI QID: Q4975400
Leonard A. Stefanski, Yichao Wu, Kyle White
Publication date: 4 August 2017
Published in: Journal of the American Statistical Association
Full work available at URL: http://europepmc.org/articles/pmc4066561
Keywords: attenuation; convolution; model selection; discriminant analysis; ridge regression; linear regression; binary regression; Lasso; Bayes rule; kernel discriminant analysis; maximum likelihood rule
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- Nonparametric estimation of regression functions with both categorical and continuous data
- Sparse linear discriminant analysis by thresholding for high dimensional data
- High-dimensional classification using features annealed independence rules
- A decision-theoretic generalization of on-line learning and an application to boosting
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Covariance-Regularized Regression and Classification for High Dimensional Problems
- Median-Based Classifiers for High-Dimensional Data
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- Measurement Error
- Classification via kernel product estimators
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Measurement Error in Nonlinear Models