Abstract
We present an improved version of the One-Against-All (OAA) method for multiclass SVM classification based on a decision tree approach. The proposed decision tree based OAA (DT-OAA) aims to increase the classification speed of OAA by using posterior probability estimates of binary SVM outputs. DT-OAA reduces the average number of binary SVM tests required in the testing phase to a greater extent than OAA and other multiclass SVM methods. For a balanced multiclass dataset with K classes, DT-OAA requires, in the best case, only (K + 1)/2 binary tests on average as opposed to K binary tests in OAA; on imbalanced multiclass datasets we observed DT-OAA to be much faster still, given proper selection of the order in which the binary SVMs are arranged in the decision tree. Computational comparisons on publicly available datasets indicate that the proposed method achieves almost the same classification accuracy as OAA, but is much faster in decision making.
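The abstract does not spell out the decision procedure, but the core idea of arranging the K one-against-all SVMs in a decision tree with early stopping can be sketched as below. This is a minimal illustrative sketch, not the authors' exact formulation: the scikit-learn-style `decision_function` interface, the `dt_oaa_predict` name, the frequency-based class ordering heuristic, and the simplified sigmoid posterior estimate are all assumptions.

```python
import numpy as np

def dt_oaa_predict(x, svms, class_order):
    """Sketch of decision-tree-based one-against-all (DT-OAA) prediction.

    svms[k] is a trained binary one-vs-all classifier for class k exposing a
    scikit-learn-style decision_function; class_order lists the classes in the
    order they appear in the tree (e.g. most frequent first on imbalanced data,
    an assumed heuristic). Evaluation stops at the first SVM whose posterior
    probability estimate exceeds 0.5, so on average fewer than K binary tests
    are needed.
    """
    for k in class_order[:-1]:
        margin = svms[k].decision_function(x.reshape(1, -1))[0]
        # Simplified Platt-style sigmoid posterior estimate of P(class k | x).
        posterior = 1.0 / (1.0 + np.exp(-margin))
        if posterior > 0.5:
            return k              # early exit: remaining SVMs are never evaluated
    return class_order[-1]        # every earlier SVM rejected x -> last class
```

Under this kind of sequential evaluation, if the K classes are equally likely and evaluation stops at the first accepting SVM, roughly (K + 1)/2 of the K binary tests are needed on average, which matches the best-case figure quoted in the abstract; skewed class distributions allow an ordering that stops even earlier on most test points.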
Cite this article
Arun Kumar, M., Gopal, M. Fast Multiclass SVM Classification Using Decision Tree Based One-Against-All Method. Neural Process Lett 32, 311–323 (2010). https://doi.org/10.1007/s11063-010-9160-y