Additive regularization trade-off: fusion of training and validation levels in kernel methods
From MaRDI portal
Publication: 2491372
DOI: 10.1007/s10994-005-5315-x · zbMath: 1470.68157 · OpenAlex: W2062112423 · MaRDI QID: Q2491372
Bart De Moor, Johan A. K. Suykens, Kristiaan Pelckmans
Publication date: 29 May 2006
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-005-5315-x
Convex programming (90C25); Applications of mathematical programming (90C90); Learning and adaptive systems in artificial intelligence (68T05)
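The paper's subject, trading off training fit against validation performance when setting the regularization constant in kernel methods, can be illustrated with a minimal sketch. This is not the authors' algorithm; it is plain kernel ridge regression with the regularization constant selected on a held-out validation set, and all data and names below are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(K, y, lam):
    """Solve (K + lam*I) alpha = y for the dual coefficients."""
    return np.linalg.solve(K + lam * np.eye(K.shape[0]), y)

# Synthetic 1-D regression data, split into training and validation parts.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

Ktr = rbf_kernel(Xtr, Xtr)   # train-train kernel
Kva = rbf_kernel(Xva, Xtr)   # validation-train kernel

# Sweep the regularization constant; keep the value with lowest validation MSE.
lams = 10.0 ** np.arange(-6, 2)
errs = [np.mean((Kva @ fit_kernel_ridge(Ktr, ytr, lam) - yva) ** 2)
        for lam in lams]
best = lams[int(np.argmin(errs))]
print(best)
```

The paper's contribution is to make this trade-off convex by fusing the training and validation levels into one optimization problem, rather than the grid search sketched here.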
Related Items
- Optimally regularised kernel Fisher discriminant classification
- SVD-LSSVM and its application in chemical pattern classification
- Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
- Classifier learning with a new locality regularization method
- Four encounters with system identification
- Implementation of algorithms for tuning parameters in regularized least squares problems in system identification
Cites Work
- Bagging predictors
- Modeling by shortest data description
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- Weighted least squares support vector machines: robustness and sparse approximation
- Best choices for regularization parameters in learning theory: on the bias-variance problem
- Statistical predictor identification
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- A comparative study of ordinary cross-validation, v-fold cross-validation and the repeated learning-testing methods
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Analysis of Discrete Ill-Posed Problems by Means of the L-Curve
- DOI 10.1162/153244302760200704
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Some Comments on Cp
- The elements of statistical learning. Data mining, inference, and prediction
- Choosing multiple parameters for support vector machines