Feasible direction decomposition algorithms for training support vector machines

From MaRDI portal
Publication:5959965

DOI: 10.1023/A:1012479116909
zbMath: 1050.68125
OpenAlex: W1512068198
MaRDI QID: Q5959965

Pavel Laskov

Publication date: 11 April 2002

Published in: Machine Learning

Full work available at URL: https://doi.org/10.1023/a:1012479116909

zbMATH Keywords

working set selection


Mathematics Subject Classification ID

Learning and adaptive systems in artificial intelligence (68T05)


Related Items

  • A Note on the Decomposition Methods for Support Vector Regression
  • On the complexity of working set selection
  • A faster gradient ascent learning algorithm for nonlinear SVM
  • Parameter selection of support vector regression based on hybrid optimization algorithm and its application
  • Neighborhood Property–Based Pattern Selection for Support Vector Machines
  • An efficient support vector machine learning method with second-order cone programming for large-scale problems
  • Training v-Support Vector Regression: Theory and Algorithms


Uses Software

  • SVMlight

