Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
Publication: 5468702
DOI: 10.1162/neco.2006.18.4.961
zbMath: 1095.68632
OpenAlex: W4244156276
Wikidata: Q48456413
Scholia: Q48456413
MaRDI QID: Q5468702
Ling Wang, Liefeng Bo, Li-Cheng Jiao
Publication date: 12 May 2006
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2006.18.4.961
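The record itself carries no algorithmic detail, but the technique named in the title can be illustrated. The sketch below is an assumption-laden toy, not the paper's method: it uses a least-squares kernel classifier (closely related to kernel Fisher discriminant analysis) with an anisotropic Gaussian kernel whose per-feature scale vector `theta` plays the role of the feature scaling, and scores each candidate `theta` by leave-one-out error computed with the standard closed-form LOO residual for ridge-type solutions. The names `loo_error` and `theta` are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy binary problem: feature 0 is informative, feature 1 is pure noise.
n = 40
X = rng.normal(size=(n, 2))
y = np.where(X[:, 0] + 0.3 * rng.normal(size=n) > 0, 1.0, -1.0)

def loo_error(theta, lam=1e-2):
    """LOO misclassification rate of a least-squares kernel classifier
    with kernel K(x, z) = exp(-sum_d theta_d * (x_d - z_d)^2)."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2 * theta).sum(-1)
    K = np.exp(-sq_dists)
    A = np.linalg.inv(K + lam * np.eye(n))   # (K + lam*I)^{-1}
    alpha = A @ y                            # dual coefficients
    # Closed-form LOO residual for ridge-type solutions: r_i = alpha_i / A_ii,
    # so the LOO prediction for point i is y_i - r_i (no n refits needed).
    loo_pred = y - alpha / np.diag(A)
    return float(np.mean(np.sign(loo_pred) != y))

# Candidate scalings: isotropic, noise feature removed, informative feature boosted.
for theta in ([1.0, 1.0], [1.0, 0.0], [5.0, 0.1]):
    print(theta, loo_error(np.array(theta)))
```

In the paper's setting the scaling vector would be optimized against the LOO criterion rather than picked from a grid, but the grid version above shows why the criterion is cheap: one matrix inverse per candidate `theta` yields all n leave-one-out predictions at once.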
Related Items
- Properties of the sample estimators used for statistical normalization of feature vectors
- Sparse multinomial kernel discriminant analysis (sMKDA)
- Feature scaling via second-order cone programming
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Gaussian kernel optimization for pattern classification
- Kernel learning at the first level of inference
Cites Work
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- DOI: 10.1162/15324430152748236
- Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis
- Theory of Reproducing Kernels
- Choosing multiple parameters for support vector machines