Multi-class LSTMSVM based on optimal directed acyclic graph and shuffled frog leaping algorithm

  • Original Article
  • Published:
International Journal of Machine Learning and Cybernetics

Abstract

Although TWSVM generally achieves good performance for data classification, it does not take full advantage of the statistical information in the training data. The recently proposed twin Mahalanobis distance-based support vector machine (TMSVM) modifies the standard TWSVM by constructing a pair of Mahalanobis distance-based kernels from the covariance matrices of the two classes of training data, which improves the generalization ability. However, TMSVM solves two dual quadratic programming problems. Moreover, it was proposed for binary classification, while most pattern recognition problems involve multi-class classification. To enhance the performance of TMSVM, in this paper we formulate a fast least squares version of TMSVM that solves two modified primal problems instead of two dual problems. The solutions of the two modified primal problems are easily obtained by solving a set of linear equations in the primal space. We then propose a new multi-class classification algorithm, named DAG-LSTMSVM, by combining the least squares TMSVM with a directed acyclic graph (DAG). A Mahalanobis distance-based measure is designed as the class separability criterion used to construct the optimal DAG structure. A modified shuffled frog leaping algorithm-based model selection is suggested for choosing the parameters of DAG-LSTMSVM. Experimental results on an artificial dataset and UCI datasets show that the proposed algorithm achieves high classification accuracy and good generalization ability.
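As a rough illustration of two of the computational ingredients mentioned above, the sketch below shows (a) the closed-form, linear-equation solution of a plain least squares twin SVM and (b) a Mahalanobis distance between class means as a pairwise separability score of the kind used to order the DAG. The function names, the use of the plain LS-TWSVM formulation (without the Mahalanobis distance-based kernels), and the pooled-covariance distance are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def ls_twin_planes(A, B, c1=1.0, c2=1.0):
    # Closed-form LS-TWSVM: each non-parallel hyperplane is obtained from one
    # regularised linear system, so no quadratic programme has to be solved.
    # A (m1 x n) holds class-1 samples, B (m2 x n) holds class-2 samples.
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])   # augmented class-1 data [A  e]
    F = np.hstack([B, e2])   # augmented class-2 data [B  e]
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * (E.T @ E), F.T @ e2)
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * (F.T @ F), E.T @ e1)
    w1, b1 = z1[:-1].ravel(), float(z1[-1, 0])
    w2, b2 = z2[:-1].ravel(), float(z2[-1, 0])
    return (w1, b1), (w2, b2)

def mahalanobis_separability(Xi, Xj):
    # Mahalanobis distance between the two class means under the pooled
    # covariance matrix; a larger value suggests the class pair is easier
    # to separate and can be placed nearer the root of the DAG.
    mu_i, mu_j = Xi.mean(axis=0), Xj.mean(axis=0)
    ni, nj = Xi.shape[0], Xj.shape[0]
    pooled = ((ni - 1) * np.cov(Xi, rowvar=False)
              + (nj - 1) * np.cov(Xj, rowvar=False)) / (ni + nj - 2)
    diff = mu_i - mu_j
    return float(np.sqrt(diff @ np.linalg.solve(pooled, diff)))

In this kind of scheme a test point would typically be assigned to the class whose hyperplane it lies closest to, measured by |w·x + b| / ||w||, with the DAG deciding which binary classifier is evaluated at each node.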



Acknowledgments

This work is supported by the National Natural Science Foundation of China (No. 61379101), and the National Key Basic Research Program of China (No. 2013CB329502).

Author information

Corresponding author

Correspondence to Shifei Ding.


About this article


Cite this article

Zhang, X., Ding, S. & Sun, T. Multi-class LSTMSVM based on optimal directed acyclic graph and shuffled frog leaping algorithm. Int. J. Mach. Learn. & Cyber. 7, 241–251 (2016). https://doi.org/10.1007/s13042-015-0435-5

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s13042-015-0435-5
