ASME Press Select Proceedings
Intelligent Engineering Systems through Artificial Neural Networks Volume 18
Editor: Cihan H. Dagli
ISBN-10: 0791802823
ISBN-13: 9780791802823
No. of Pages: 700
Publisher: ASME Press
Publication date: 2008

The proposed method uses different input features to partition the sample space into subspaces in a two-level, decision-tree-like structure, thereby enhancing the performance of a classifier. The support vector machine (SVM) is used as the classifier in this paper. Each input feature is associated with a threshold that sends an input vector to either the left or the right child of a parent node. Given a feature, the best threshold is found by minimizing an impurity measure such as the Gini index or information entropy. For each pair consisting of a feature and its threshold, the data is thus partitioned into two groups, and a specialized SVM is trained on each group. During testing, each data point is routed to one of the SVMs according to the chosen feature and its threshold. The method is further generalized by selecting a subset of rank-ordered features, again ranked by the impurity measure, so that a number of subspace classifiers are generated. The final classification is obtained by consensus among the subspace classifiers, which usually yields better accuracy than a single SVM.
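
The abstract contains no code; the following is a minimal, hypothetical sketch of the two-level subspace idea it describes, assuming scikit-learn's SVC as the base classifier, non-negative integer class labels, and that both groups produced by a split contain more than one class. All names (gini_impurity, best_threshold, SubspacePair, fit_consensual_subspaces, predict_by_consensus) are illustrative and not taken from the paper.

import numpy as np
from sklearn.svm import SVC

def gini_impurity(y):
    """Gini index of a label vector: 1 - sum_k p_k^2."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_threshold(x, y):
    """Threshold on a single feature minimizing the weighted Gini
    impurity of the two resulting groups."""
    best_t, best_imp = None, np.inf
    for t in np.unique(x)[:-1]:          # keep both groups non-empty
        left, right = y[x <= t], y[x > t]
        imp = (len(left) * gini_impurity(left)
               + len(right) * gini_impurity(right)) / len(y)
        if imp < best_imp:
            best_t, best_imp = t, imp
    return best_t, best_imp

class SubspacePair:
    """One feature/threshold split with a specialized SVM per group."""
    def __init__(self, feature, threshold):
        self.feature, self.threshold = feature, threshold
        self.left_svm, self.right_svm = SVC(), SVC()

    def fit(self, X, y):
        # Assumes each group still contains at least two classes.
        mask = X[:, self.feature] <= self.threshold
        self.left_svm.fit(X[mask], y[mask])
        self.right_svm.fit(X[~mask], y[~mask])
        return self

    def predict(self, X):
        # Route each point to the SVM of its subspace.
        mask = X[:, self.feature] <= self.threshold
        out = np.empty(len(X), dtype=int)
        out[mask] = self.left_svm.predict(X[mask])
        out[~mask] = self.right_svm.predict(X[~mask])
        return out

def fit_consensual_subspaces(X, y, n_features=3):
    """Rank features by the impurity of their best split and build one
    subspace classifier for each of the top-ranked features."""
    splits = [(f, *best_threshold(X[:, f], y)) for f in range(X.shape[1])]
    splits.sort(key=lambda s: s[2])      # lowest impurity first
    return [SubspacePair(f, t).fit(X, y) for f, t, _ in splits[:n_features]]

def predict_by_consensus(classifiers, X):
    """Majority vote over the subspace classifiers (integer labels)."""
    votes = np.stack([clf.predict(X) for clf in classifiers])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

As a usage illustration, fit_consensual_subspaces(X_train, y_train, n_features=3) would build three subspace classifiers from the three lowest-impurity splits, and predict_by_consensus(clfs, X_test) would combine their predictions by simple majority vote.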

Abstract
Introduction
Support Vector Machine
Impurity Measure
Consensual Subspace Method
Training Algorithm
Experimental Results
Discussions
Conclusions
Acknowledgement
References