
Joint Feature Selection with Dynamic Spectral Clustering

Published in: Neural Processing Letters

Abstract

Existing clustering algorithms each address only some of the challenges in clustering, such as similarity-measure learning or cluster-number estimation. For instance, some algorithms can learn the data similarity matrix, but only if the cluster number is known beforehand; others estimate the cluster number, but require the similarity matrix as an input. Moreover, real-world data often contain redundant features and outliers, to which many algorithms are susceptible. No current clustering algorithm learns the data similarity measure and the cluster number simultaneously while also reducing the influence of outliers and redundant features. Here we propose a joint feature selection with dynamic spectral clustering (FSDS) algorithm that not only learns the cluster number k and the data similarity measure simultaneously, but also employs the \( L_{2,1} \)-norm to reduce the influence of outliers and redundant features. Combining these separate stages in a unified formulation allows the optimal performance to be reached. Experimental results on eight real-world benchmark datasets show that FSDS outperforms the comparison clustering algorithms on two standard evaluation metrics for clustering, accuracy (ACC) and Purity.
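For intuition, the two ingredients named in the abstract can be sketched outside the FSDS formulation. This is a generic illustration, not the authors' method: the \( L_{2,1} \)-norm of a matrix is the sum of the Euclidean norms of its rows (penalizing it drives whole rows to zero, hence its use for feature selection and outlier robustness), and a cluster number can be read off the eigenvalue spectrum of a graph Laplacian via the standard eigengap heuristic. A minimal Python sketch, where the Gaussian-kernel bandwidth `sigma` is an assumed illustrative parameter:

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def l21_norm(W):
    """L2,1-norm: the sum of the Euclidean norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def estimate_cluster_number(X, sigma=1.0):
    """Estimate k via the eigengap heuristic: build a Gaussian similarity
    graph, form its normalized Laplacian, and take k as the position of
    the largest gap among the smallest eigenvalues."""
    S = np.exp(-cdist(X, X, "sqeuclidean") / (2 * sigma ** 2))  # similarity matrix
    d = S.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(X)) - D_inv_sqrt @ S @ D_inv_sqrt            # normalized Laplacian
    eigvals = eigh(L, eigvals_only=True)                        # ascending order
    gaps = np.diff(eigvals[: len(X) // 2])                      # small end of the spectrum
    return int(np.argmax(gaps)) + 1

# Three well-separated blobs: the eigengap heuristic recovers k = 3.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.1, size=(30, 2)) for c in (0.0, 5.0, 10.0)])
print(estimate_cluster_number(X))                         # 3
print(l21_norm(np.array([[3.0, 4.0], [0.0, 0.0]])))       # 5.0
```

FSDS differs from this sketch in that it optimizes the similarity matrix, the cluster number, and the \( L_{2,1} \)-regularized feature weights jointly rather than in separate stages.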



Acknowledgements

This work was partially supported by the Research Fund of the Guangxi Key Lab of Multi-source Information Mining & Security (MIMS18-M-01); the National Natural Science Foundation of China (Grant Nos. 61876046 and 61573270); the Guangxi High Institutions Program of Introducing 100 High-Level Overseas Talents; the Strategic Research Excellence Fund at Massey University; and the Marsden Fund of New Zealand (Grant No. MAU1721).

Author information


Corresponding author

Correspondence to Tong Liu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Liu, T., Martin, G. Joint Feature Selection with Dynamic Spectral Clustering. Neural Process Lett 52, 1745–1763 (2020). https://doi.org/10.1007/s11063-020-10216-9
