Recursive Finite Newton Algorithm for Support Vector Regression in the Primal
Publication: 3440430
DOI: 10.1162/neco.2007.19.4.1082 · zbMath: 1118.68114 · OpenAlex: W2109534047 · MaRDI QID: Q3440430
Liefeng Bo, Ling Wang, Li-Cheng Jiao
Publication date: 22 May 2007
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.4.1082
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Learning and adaptive systems in artificial intelligence (68T05)
Neural nets and related approaches to inference from stochastic processes (62M45)
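The paper's topic, training support vector regression in the primal with a finite Newton method, can be illustrated with a minimal sketch. The code below assumes a linear model without a bias term and the squared ε-insensitive loss, and alternates active-set detection with an exact quadratic solve; it is not the authors' exact recursive algorithm, and the function name, parameters, and toy data are illustrative assumptions only.

```python
import numpy as np

def primal_svr_newton(X, y, C=1.0, eps=0.1, max_iter=50):
    """Illustrative linear SVR trained in the primal.

    Assumed objective: 0.5*||w||^2 + C * sum_i max(|y_i - x_i.w| - eps, 0)^2.
    On a fixed active set S = {i : |residual_i| > eps} this is quadratic,
    so each Newton-type step solves a regularized least-squares system;
    we stop when the active set no longer changes.  A robust implementation
    would add a line search; the full step is taken here for brevity.
    """
    n, d = X.shape
    w = np.zeros(d)
    prev_active = None
    for _ in range(max_iter):
        r = y - X @ w                     # residuals of the current model
        active = np.abs(r) > eps          # points outside the eps-tube
        if prev_active is not None and np.array_equal(active, prev_active):
            break                         # active set stable: piecewise-quadratic optimum
        prev_active = active.copy()
        Xa = X[active]
        # Shift active targets to the nearest edge of the eps-tube.
        ya = y[active] - eps * np.sign(r[active])
        # Solve (I + 2C Xa^T Xa) w = 2C Xa^T ya for the current active set.
        H = np.eye(d) + 2.0 * C * Xa.T @ Xa
        g = 2.0 * C * Xa.T @ ya
        w = np.linalg.solve(H, g)
    return w

# Toy usage (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.05 * rng.normal(size=200)
w_hat = primal_svr_newton(X, y, C=10.0, eps=0.1)
print(np.round(w_hat - w_true, 3))
```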
Related Items
Training robust support vector regression with smooth non-convex loss function ⋮ Robust support vector regression in the primal ⋮ Least absolute deviation support vector regression
Uses Software
Cites Work
- Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data
- Finite algorithms for robust linear regression
- A finite Newton method for classification
- Improvements to Platt's SMO Algorithm for SVM Classifier Design
- A Correspondence Between Bayesian Estimation on Stochastic Processes and Smoothing by Splines