Improved Performance of EK-NNClus by Selecting Appropriate Parameter

  • Conference paper
  • First Online:
Belief Functions: Theory and Applications (BELIEF 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11069)


Abstract

EK-NNclus is an evidential clustering method based on the evidential K-nearest neighbors classification rule. One of its significant merits is that it does not require any prior knowledge of the number of clusters. However, the performance of EK-NNclus is sensitive to the number of neighbors K: if K is too small, a natural cluster may be split into two or more clusters; if K is too large, two or more natural clusters may be merged into one. In this paper, we show that tuning the parameters (such as \(\alpha \) in the discounting function) makes fuller use of the distances between an object and its nearest neighbors, which helps prevent natural clusters from being merged. Numerical experiments suggest that the performance of EK-NNclus can be improved if an appropriate \(\alpha \) is selected.
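
To make the role of \(\alpha \) concrete, the following minimal Python sketch illustrates the kind of distance-based discounting used by evidential K-NN style methods such as EK-NNclus. The functional form \(\alpha \exp (-\gamma d^{\beta })\), the default values, and the function name below are assumptions chosen for illustration; they are not the authors' implementation.

# Minimal sketch (not the authors' code) of distance-based discounting for
# evidential K-NN methods. Assumption: each neighbor j at distance d_j commits
# mass alpha * exp(-gamma * d_j**beta) to "same cluster", and the complement
# 1 - mass is assigned to the whole frame of discernment (ignorance).

import numpy as np

def knn_masses(distances, alpha=0.95, gamma=1.0, beta=2.0):
    """Mass committed by each of the K nearest neighbors, given their
    distances to the object. Names and defaults are illustrative only."""
    d = np.asarray(distances, dtype=float)
    return alpha * np.exp(-gamma * d ** beta)

# Toy usage: a close neighbor receives a strong mass while a distant one
# receives almost none; a larger alpha sharpens this contrast, which is the
# effect the paper argues helps keep well-separated clusters from merging.
if __name__ == "__main__":
    d_close, d_far = 0.2, 2.0
    for alpha in (0.5, 0.9, 0.99):
        m = knn_masses([d_close, d_far], alpha=alpha)
        print(f"alpha={alpha}: mass(close)={m[0]:.3f}, mass(far)={m[1]:.4f}")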


Author information

Corresponding author

Correspondence to Qian Wang.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Wang, Q., Su, Zg. (2018). Improved Performance of EK-NNClus by Selecting Appropriate Parameter. In: Destercke, S., Denoeux, T., Cuzzolin, F., Martin, A. (eds) Belief Functions: Theory and Applications. BELIEF 2018. Lecture Notes in Computer Science, vol 11069. Springer, Cham. https://doi.org/10.1007/978-3-319-99383-6_31

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-99383-6_31

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-99382-9

  • Online ISBN: 978-3-319-99383-6

  • eBook Packages: Computer Science, Computer Science (R0)
