Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
Publication: 705597
DOI: 10.1016/j.neunet.2004.07.002
zbMath: 1073.68072
OpenAlex: W2069659771
Wikidata: Q80997718 (Scholia: Q80997718)
MaRDI QID: Q705597
Gavin C. Cawley, Nicola L. C. Talbot
Publication date: 31 January 2005
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2004.07.002
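The title's "fast exact leave-one-out cross-validation" refers to computing all n LOO errors without n refits. A minimal NumPy sketch of the classical hat-matrix (PRESS) identity for a ridge-type kernel model illustrates the general idea; this is a generic shortcut, not the authors' sparse LS-SVM algorithm, and the function name and toy data below are purely illustrative:

```python
import numpy as np

def loo_residuals_krr(K, y, lam):
    """Exact LOO residuals for kernel ridge regression via the hat matrix.

    Uses the PRESS identity e_loo[i] = e[i] / (1 - H[i, i]), where
    H = K (K + lam*I)^{-1}, so one O(n^3) solve replaces n refits.
    """
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))  # hat matrix: y_hat = H @ y
    resid = y - H @ y                           # full-sample residuals
    return resid / (1.0 - np.diag(H))

# Toy check against brute-force refitting with each point held out.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))  # RBF Gram matrix
y = rng.normal(size=20)
fast = loo_residuals_krr(K, y, lam=0.1)

slow = np.empty(20)
for i in range(20):
    keep = np.arange(20) != i
    a = np.linalg.solve(K[np.ix_(keep, keep)] + 0.1 * np.eye(19), y[keep])
    slow[i] = y[i] - K[i, keep] @ a
assert np.allclose(fast, slow)
```

The rank-one inverse-update results cited below (Sherman's and Bartlett's adjustment papers, "Updating the Inverse of a Matrix") are the standard machinery behind identities of this kind.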
Related Items (14)
- A Least-Squares Method for Sparse Low Rank Approximation of Multivariate Functions
- StreaMRAK: a streaming multi-resolution adaptive kernel algorithm
- Efficient sparse least squares support vector machines for pattern classification
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Training sparse least squares support vector machines by the QR decomposition
- Low rank updated LS-SVM classifiers for fast variable selection
- FEATURE SELECTION VIA LEAST SQUARES SUPPORT FEATURE MACHINE
- Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
- Optimized fixed-size kernel models for large data sets
- A multiscale method for semi-linear elliptic equations with localized uncertainties and non-linearities
- Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
- Model selection for the LS-SVM. Application to handwriting recognition
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Nuclear discrepancy for single-shot batch active learning
Uses Software
Cites Work
- Interpolation of scattered data: distance matrices and conditionally positive definite functions
- Hedonic housing prices and the demand for clean air
- Weighted least squares support vector machines: robustness and sparse approximation
- Improved sparse least-squares support vector machines
- Some results on Tchebycheffian spline functions and stochastic processes
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- DOI: 10.1162/15324430260185637
- Updating the Inverse of a Matrix
- Chaos control using least-squares support vector machines
- The Relationship between Variable Selection and Data Augmentation and a Method for Prediction
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A Simplex Method for Function Minimization
- Adjustment of an Inverse Matrix Corresponding to a Change in One Element of a Given Matrix
- An Inverse Matrix Adjustment Arising in Discriminant Analysis
- The elements of statistical learning. Data mining, inference, and prediction
- Choosing multiple parameters for support vector machines