TY - GEN
T1 - An empirical study of reducing multiclass classification methodologies
AU - Eichelberger, R. Kyle
AU - Sheng, Victor S.
N1 - Copyright:
Copyright 2013 Elsevier B.V., All rights reserved.
PY - 2013
Y1 - 2013
N2 - One-against-all and one-against-one are two popular methodologies for reducing multiclass classification problems into a set of binary classifications. In this paper, we are interested in the performance of both one-against-all and one-against-one for basic classification algorithms, such as decision tree, naïve Bayes, support vector machine, and logistic regression. Since both one-against-all and one-against-one effectively create a committee of classifiers, they are expected to improve the performance of classification algorithms. However, our experimental results surprisingly show that one-against-all worsens the performance of the algorithms on most datasets. One-against-one helps, but performs worse than bagging these algorithms for the same number of iterations. Thus, we conclude that neither one-against-all nor one-against-one should be used for algorithms that can perform multiclass classification directly. Bagging is a better approach for improving their performance.
AB - One-against-all and one-against-one are two popular methodologies for reducing multiclass classification problems into a set of binary classifications. In this paper, we are interested in the performance of both one-against-all and one-against-one for basic classification algorithms, such as decision tree, naïve Bayes, support vector machine, and logistic regression. Since both one-against-all and one-against-one effectively create a committee of classifiers, they are expected to improve the performance of classification algorithms. However, our experimental results surprisingly show that one-against-all worsens the performance of the algorithms on most datasets. One-against-one helps, but performs worse than bagging these algorithms for the same number of iterations. Thus, we conclude that neither one-against-all nor one-against-one should be used for algorithms that can perform multiclass classification directly. Bagging is a better approach for improving their performance.
KW - All-at-Once
KW - C4.5
KW - Logistic Regression
KW - Naive Bayes
KW - One-Against-All
KW - One-Against-One
KW - SVM
KW - multiclass classification
UR - http://www.scopus.com/inward/record.url?scp=84881236873&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-39712-7_39
DO - 10.1007/978-3-642-39712-7_39
M3 - Conference contribution
AN - SCOPUS:84881236873
SN - 9783642397110
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 505
EP - 519
BT - Machine Learning and Data Mining in Pattern Recognition - 9th International Conference, MLDM 2013, Proceedings
Y2 - 19 July 2013 through 25 July 2013
ER -