Combining multiple classifiers for wrapper feature selection (Q1046572)
From MaRDI portal
scientific article; zbMATH DE number 5651377
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Combining multiple classifiers for wrapper feature selection | scientific article; zbMATH DE number 5651377 | |
Statements
Combining multiple classifiers for wrapper feature selection (English)
22 December 2009
Summary: Wrapper feature selection methods are widely used to select relevant features. However, a wrapper uses only a single classifier, and each classifier has its own biases, so different classifiers select very different features. To overcome the biases of individual classifiers, this study introduces a new data mining method called wrapper-based decision trees (WDT), which combines different classifiers and uses decision trees to classify the selected features. Because the WDT method combines multiple classifiers, choosing which classifiers to include in the combination is an important issue; we therefore investigate how the number and nature of the classifiers influence the results of feature selection. Regarding the number of classifiers, results showed that combinations of a few classifiers selected more relevant features, whereas combinations of many classifiers selected only a few features.

Regarding the nature of the classifiers, decision tree classifiers selected more features, and the features they selected yielded much higher accuracies than the features selected by other classifiers.
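The abstract rests on the general wrapper idea: evaluate candidate feature subsets by the accuracy of a classifier trained on them, then combine the selections of several classifiers to offset their individual biases. The sketch below is a minimal illustration of that idea, not the authors' WDT method: it runs greedy forward selection under two deliberately simple classifiers (1-NN and nearest-centroid, both assumptions for illustration, as is the toy dataset) and keeps the features chosen by a majority of them.

```python
# Toy dataset: features 0 and 1 separate the two classes; 2 and 3 are noise.
data = [
    ([0, 0, 5, 1], 0), ([1, 0, 0, 7], 0), ([0, 1, 5, 0], 0), ([1, 1, 0, 3], 0),
    ([5, 5, 5, 2], 1), ([6, 5, 0, 6], 1), ([5, 6, 5, 1], 1), ([6, 6, 0, 4], 1),
]

def accuracy(classify_fn, feats):
    """Leave-one-out accuracy using only the candidate feature subset."""
    hits = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        if classify_fn(train, x, feats) == y:
            hits += 1
    return hits / len(data)

def nearest_neighbor(train, x, feats):
    # 1-NN restricted to the selected features.
    return min(train, key=lambda s: sum((s[0][f] - x[f]) ** 2 for f in feats))[1]

def nearest_centroid(train, x, feats):
    # Distance to the per-class mean of the selected features.
    def dist(label):
        pts = [s for s, y in train if y == label]
        return sum((sum(p[f] for p in pts) / len(pts) - x[f]) ** 2 for f in feats)
    return min((0, 1), key=dist)

def forward_wrapper(classify_fn, n_feats=4):
    # Greedy forward selection: keep adding the feature that most
    # improves leave-one-out accuracy; stop when nothing improves.
    selected, best = [], 0.0
    while len(selected) < n_feats:
        f_best = max((f for f in range(n_feats) if f not in selected),
                     key=lambda f: accuracy(classify_fn, selected + [f]))
        score = accuracy(classify_fn, selected + [f_best])
        if score <= best:
            break
        selected.append(f_best)
        best = score
    return selected

# Run the wrapper once per classifier, then keep features chosen by a
# majority of classifiers -- one simple way to offset individual biases.
votes = {}
for clf in (nearest_neighbor, nearest_centroid):
    for f in forward_wrapper(clf):
        votes[f] = votes.get(f, 0) + 1
combined = sorted(f for f, v in votes.items() if v >= 2)
print("combined features:", combined)
```

On this toy data a single informative feature already separates the classes, so both wrappers stop after one feature and the majority vote keeps it; with noisier data, the vote is what filters out features that only one biased classifier favors.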
feature selection
wrappers
decision tree
DT
support vector machine
SVM
Bayesian networks
BN