Augmentation of Densest Subgraph Finding Unsupervised Feature Selection Using Shared Nearest Neighbor Clustering
Figure 1. Block diagram of the proposed method.
Figure 2. Bar chart for the number of times the best value is reported by the considered methods on ACC and MCC.
Figure 3. Comparison of the considered feature-selection methods with different classifiers on the number of datasets for which the best value is attained on (a) the ACC parameter; and (b) the MCC parameter.
Figure 4. Comparison of the number of features selected by each considered method.
Abstract
1. Introduction
2. Preliminaries
2.1. Maximal Information Compression Index (MICI)
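The MICI between two features is the smallest eigenvalue of their 2 × 2 covariance matrix: it is zero when the features are perfectly linearly correlated and grows as the linear dependence weakens, which makes it a natural measure of (non-)redundancy. A minimal sketch of this quantity follows; the function name `mici` and the use of NumPy are illustrative, but the closed-form expression via the variances and the correlation coefficient is the standard one for the smallest eigenvalue of a 2 × 2 covariance matrix.

```python
import numpy as np

def mici(x, y):
    """Maximal Information Compression Index (lambda_2): the smallest
    eigenvalue of the 2x2 covariance matrix of (x, y). It equals 0 when
    x and y are perfectly linearly correlated and grows as the linear
    dependence between the two features weakens."""
    var_x, var_y = np.var(x), np.var(y)
    rho = np.corrcoef(x, y)[0, 1]          # Pearson correlation coefficient
    s = var_x + var_y
    # Closed form for the smaller root of the characteristic polynomial
    return 0.5 * (s - np.sqrt(s * s - 4.0 * var_x * var_y * (1.0 - rho ** 2)))
```

For a perfectly correlated pair such as `y = 3x + 2` the index is (numerically) zero, while for two independent noise vectors it is close to the smaller of the two variances.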
2.2. Graph Density
2.3. Edge-Weighted Degree
2.4. Shared Nearest Neighbors
2.5. Nearest Neighbor Threshold Factor (β)
3. Proposed Method
- First Phase: Finding the maximally non-redundant feature subset.
- Second Phase: Maintaining the cluster structure of the original subspace at the cost of including some redundant features.
Algorithm 1: Densest Feature Graph Augmentation with Disjoint Feature Clusters
Input: Graph G = (V, E, W); parameters 0 < β ≤ 1, k ≥ 0
Output: Resultant feature subset S
(1) Set S = V
(2) Find S′ = S \ {v}, where v is the vertex of S with the minimum edge-weighted degree
(3) if d(S′) ≥ d(S) then
(4) Set S = S′
(5) go to 2
(6) end if
(7) Find N, the nearest-neighbor list size, from the nearest neighbor threshold factor β
(8) Generate, for each feature i ∈ V, the nearest-neighbor list containing the N nearest neighbors of i
(9) Initialize C as an empty list of clusters
(10) Let C′ = C
(11) for each i ∈ V do
(12) Generate Ri = {j : iRj}, the set of features that share nearest neighbors with i
(13) if |Ci| ≠ 0, where Ci is the set of clusters in C′ intersecting Ri ∪ {i}, then
(14) Merge the clusters in Ci with Ri ∪ {i}
(15) Add the merged cluster to C′
(16) else
(17) Add {i} to C′
(18) end if
(19) end for
(20) if |C′| < k then
(21) Set C = C′
(22) if |C| = 0 then
(23) Set N = N − 1
(24) go to 10
(25) end if
(26) end if
(27) if C′ ≠ C then
(28) Set C = C′
(29) go to 10
(30) end if
(31) for each cluster Ci ∈ C do
(32) if Ci ∩ S = ∅ then
(33) Add i to S, where i is the feature of Ci with the maximum edge-weighted degree
(34) end if
(35) end for
(36) Output the set S
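The first phase (steps 1–6) is a greedy peeling loop: as long as the graph density does not decrease, keep dropping a vertex, and one natural choice, suggested by the edge-weighted degree of Section 2.3, is the vertex with the minimum edge-weighted degree. The sketch below is a minimal illustration under that assumption, with `W` a symmetric non-negative weight matrix over the features (edge weights encoding pairwise dissimilarity, so a dense subgraph corresponds to a mutually non-redundant subset) and density d(S) taken as total edge weight divided by |S|.

```python
import numpy as np

def densest_subset(W):
    """First phase (steps 1-6): greedy peeling of the feature graph.
    W is a symmetric non-negative weight matrix with a zero diagonal.
    Repeatedly remove the vertex of minimum edge-weighted degree while
    the density d(S) = (total edge weight in S) / |S| does not decrease."""
    S = list(range(W.shape[0]))

    def density(nodes):
        sub = W[np.ix_(nodes, nodes)]
        return sub.sum() / (2 * len(nodes))  # sum counts each edge twice

    while len(S) > 1:
        degrees = W[np.ix_(S, S)].sum(axis=1)             # edge-weighted degrees within S
        candidate = [v for i, v in enumerate(S) if i != np.argmin(degrees)]
        if density(candidate) >= density(S):
            S = candidate                                  # density maintained: peel
        else:
            break                                          # density would drop: stop
    return S
```

On a toy graph where three features are tightly interconnected and a fourth is only weakly attached, the loop peels away the weakly attached feature and then stops, since any further removal lowers the density.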
4. Experimental Setup
4.1. Considered Datasets
4.2. Performance Evaluation
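The reported metrics are classification accuracy (ACC) and the Matthews correlation coefficient (MCC). For the binary datasets both follow directly from the confusion-matrix counts; a minimal sketch (function names are illustrative):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient for binary classification.
    Ranges from -1 (total disagreement) through 0 (random guessing)
    to +1 (perfect prediction)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0  # common convention when any marginal count is zero
    return (tp * tn - fp * fn) / denom

def acc(tp, tn, fp, fn):
    """Classification accuracy: fraction of correctly classified samples."""
    return (tp + tn) / (tp + tn + fp + fn)
```

Unlike ACC, MCC stays informative on imbalanced classes, which is why both are reported side by side in the tables below.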
5. Performance Analysis
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Bolón-Canedo, V.; Sánchez-Maroño, N.; Alonso-Betanzos, A. Recent advances and emerging challenges of feature selection in the context of big data. Knowl.-Based Syst. 2015, 86, 33–45.
- Bellman, R. Dynamic Programming; Princeton University Press: Princeton, NJ, USA, 1957; p. 18.
- Keogh, E.; Mueen, A. Curse of Dimensionality. In Encyclopedia of Machine Learning and Data Mining; Springer: Boston, MA, USA, 2017; pp. 314–315.
- Kohavi, R.; John, G.H. Wrappers for feature subset selection. Artif. Intell. 1997, 97, 273–324.
- Guyon, I.; Elisseeff, A. An Introduction to Variable and Feature Selection. J. Mach. Learn. Res. 2003, 3, 1157–1182.
- Bolón-Canedo, V.; Sánchez-Maroño, N.; Alonso-Betanzos, A.; Benítez, J.; Herrera, F. A review of microarray datasets and applied feature selection methods. Inf. Sci. 2014, 282, 111–135.
- Forman, G. An extensive empirical study of feature selection metrics for text classification. J. Mach. Learn. Res. 2003, 3, 1289–1305.
- Setia, L.; Burkhardt, H. Feature Selection for Automatic Image Annotation. Lect. Notes Comput. Sci. 2006, 2, 294–303.
- Lin, C.-T.; Prasad, M.; Saxena, A. An Improved Polynomial Neural Network Classifier Using Real-Coded Genetic Algorithm. IEEE Trans. Syst. Man Cybern. Syst. 2015, 45, 1389–1401.
- Pal, N.; Eluri, V.; Mandal, G. Fuzzy logic approaches to structure preserving dimensionality reduction. IEEE Trans. Fuzzy Syst. 2002, 10, 277–286.
- Zhang, G. Neural networks for classification: A survey. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2000, 30, 451–462.
- Bandyopadhyay, S.; Bhadra, T.; Mitra, P.; Maulik, U. Integration of dense subgraph finding with feature clustering for unsupervised feature selection. Pattern Recognit. Lett. 2014, 40, 104–112.
- Peng, H.; Long, F.; Ding, C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226–1238.
- Mittal, H.; Saraswat, M.; Bansal, J.; Nagar, A. Fake-Face Image Classification using Improved Quantum-Inspired Evolutionary-based Feature Selection Method. In Proceedings of the IEEE Symposium Series on Computational Intelligence, Canberra, Australia, 1–4 December 2020; pp. 989–995.
- Guyon, I.; Gunn, S.; Nikravesh, M.; Zadeh, L. Feature Extraction: Foundations and Applications; Springer: Berlin/Heidelberg, Germany, 2006.
- Bennasar, M.; Hicks, Y.; Setchi, R. Feature selection using Joint Mutual Information Maximisation. Expert Syst. Appl. 2015, 42, 8520–8532.
- Mandal, M.; Mukhopadhyay, A. Unsupervised Non-redundant Feature Selection: A Graph-Theoretic Approach. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2013; pp. 373–380.
- Lim, H.; Kim, D.-W. Pairwise dependence-based unsupervised feature selection. Pattern Recognit. 2020, 111, 107663.
- Cai, D.; Zhang, C.; He, X. Unsupervised feature selection for multi-cluster data. In Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’10), Washington, DC, USA, 24–28 July 2010.
- Liu, Y.; Liu, K.; Zhang, C.; Wang, J.; Wang, X. Unsupervised feature selection via Diversity-induced Self-representation. Neurocomputing 2016, 219, 350–363.
- Zhu, P.; Zuo, W.; Zhang, L.; Hu, Q.; Shiu, S.C. Unsupervised feature selection by regularized self-representation. Pattern Recognit. 2015, 48, 438–446.
- Mittal, H.; Saraswat, M. A New Fuzzy Cluster Validity Index for Hyperellipsoid or Hyperspherical Shape Close Clusters with Distant Centroids. IEEE Trans. Fuzzy Syst. 2020, 29, 3249–3258.
- Lee, J.; Seo, W.; Kim, D. Efficient information-theoretic unsupervised feature selection. Electron. Lett. 2018, 54, 76–77.
- Han, D.; Kim, J. Unsupervised Simultaneous Orthogonal Basis Clustering Feature Selection. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015.
- Das, A.; Kumar, S.; Jain, S.; Goswami, S.; Chakrabarti, A.; Chakraborty, B. An information-theoretic graph-based approach for feature selection. Sādhanā 2019, 45, 1.
- He, X.; Cai, D.; Niyogi, P. Laplacian Score for Feature Selection. In Proceedings of the 18th International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 5–8 December 2005; pp. 507–514.
- Mitra, P.; Murthy, C.; Pal, S. Unsupervised feature selection using feature similarity. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 301–312.
- Dua, D.; Graff, C. UCI Machine Learning Repository; University of California, School of Information and Computer Science: Irvine, CA, USA, 2019; Available online: http://archive.ics.uci.edu/ml (accessed on 1 June 2022).
- Gakii, C.; Mireji, P.O.; Rimiru, R. Graph Based Feature Selection for Reduction of Dimensionality in Next-Generation RNA Sequencing Datasets. Algorithms 2022, 15, 21.
- Das, A.K.; Goswami, S.; Chakrabarti, A.; Chakraborty, B. A new hybrid feature selection approach using feature association map for supervised and unsupervised classification. Expert Syst. Appl. 2017, 88, 81–94.
- Yan, X.; Nazmi, S.; Erol, B.A.; Homaifar, A.; Gebru, B.; Tunstel, E. An efficient unsupervised feature selection procedure through feature clustering. Pattern Recognit. Lett. 2020, 131, 277–284.
- Bhadra, T.; Bandyopadhyay, S. Supervised feature selection using integration of densest subgraph finding with floating forward–backward search. Inf. Sci. 2021, 566, 1–18.
- Goswami, S.; Chakrabarti, A.; Chakraborty, B. An efficient feature selection technique for clustering based on a new measure of feature importance. J. Intell. Fuzzy Syst. 2017, 32, 3847–3858.
- Kumar, G.; Jain, G.; Panday, M.; Das, A.; Goswami, S. Graph-based supervised feature selection using correlation exponential. In Emerging Technology in Modelling and Graphics; Springer: Singapore, 2020; pp. 29–38.
- Peralta, D.; Saeys, Y. Robust unsupervised dimensionality reduction based on feature clustering for single-cell imaging data. Appl. Soft Comput. 2020, 93, 10–42.
- Das, A.; Pati, S.; Ghosh, A. Relevant feature selection and ensemble classifier design using bi-objective genetic algorithm. Knowl. Inf. Syst. 2020, 62, 423–455.
- Saxena, A.; Prasad, M.; Gupta, A.; Bharill, N.; Patel, O.P.; Tiwari, A.; Er, M.J.; Ding, W.; Lin, C.T. A review of clustering techniques and developments. Neurocomputing 2017, 267, 664–681.
- Saxena, A.; Chugh, D.; Mittal, H.; Sajid, M.; Chauhan, R.; Yafi, E.; Cao, J.; Prasad, M. A Novel Unsupervised Feature Selection Approach Using Genetic Algorithm on Partitioned Data. Adv. Artif. Intell. Mach. Learn. 2022, 2, 500–515.
Description of the considered datasets:

Dataset Name | No. of Features | No. of Classes | No. of Samples |
---|---|---|---|
Colon | 6000 | 2 | 62 |
Multiple Features | 649 | 10 | 2000 |
Isolet | 617 | 26 | 6238 |
Spambase | 57 | 2 | 4601 |
Ionosphere | 33 * | 2 | 351 |
WDBC | 30 | 2 | 569 |
Connectionist Bench | 60 | 2 | 208 |
SPECTF | 44 | 2 | 80 |
ACC (%) of the considered feature-selection methods with different classifiers:

Dataset | Algorithm | SVM | Naive Bayes | KNN | Adaboost |
---|---|---|---|---|---|
Colon | FSFS | 81.45(0.85) | 73.39(3.16) | 74.84(1.56) | 76.29(3.57) |
LSFS | 71.62(2.04) | 51.29(1.83) | 73.55(1.56) | 60.97(4.35) | |
MCFS | 79.52(1.09) | 67.96(3.41) | 78.06(1.13) | 77.10(2.72) | |
DSFFC | 82.10(1.19) | 73.87(1.67) | 77.42(1.32) | 79.03(3.48) | |
DFG-A-DFC | 86.90(1.44) | 61.42(1.57) | 81.90(1.57) | 56.42(1.81) | |
None | 80.71(1.92) | 58.57(2.64) | 75.71(1.92) | 50.47(1.65) | |
Multiple Features | FSFS | 97.91(0.11) | 95.51(0.16) | 94.49(0.21) | 96.54(0.29) |
LSFS | 97.74(0.11) | 94.32(0.2) | 93.02(0.2) | 96.15(0.20) | |
MCFS | 98.13(0.13) | 95.59(0.13) | 95.58(0.13) | 97.06(0.19) | |
DSFFC | 98.35(0.13) | 94.43(0.12) | 95.61(0.12) | 96.22(0.17) | |
DFG-A-DFC | 98.60(0.07) | 96.00(0.12) | 96.24(0.14) | 95.55(0.13) | |
None | 98.45(0.07) | 95.49(0.16) | 96.10(0.11) | 96.10(0.09) | |
Isolet | FSFS | 88.17(0.23) | 65.82(0.21) | 71.42(0.25) | 65.78(0.19) |
LSFS | 92.95(0.11) | 75.49(0.27) | 82.6(0.19) | 75.53(0.31) | |
MCFS | 95.75(0.12) | 82.09(0.33) | 87.99(0.13) | 81.99(0.21) | |
DSFFC | 95.26(0.08) | 83.61(0.22) | 86.19(0.14) | 84.82(0.38) | |
DFG-A-DFC | 97.06(0.05) | 79.99(0.13) | 88.40(0.12) | 80.33(0.13) | |
None | 97.38(0.06) | 81.45(0.13) | 88.69(0.10) | 81.29(0.18) | |
Spambase | FSFS | 78.95(0.11) | 66.68(0.10) | 80.81(0.18) | 66.85(0.15) |
LSFS | 83.84(0.16) | 69.26(0.11) | 82.68(0.16) | 69.28(0.22) | |
MCFS | 80(0.09) | 65.27(0.09) | 82.27(0.14) | 65.24(0.12) | |
DSFFC | 86.69(0.07) | 75.63(0.12) | 84.31(0.11) | 75.71(0.15) | |
DFG-A-DFC | 93.65(0.06) | 80.06(0.13) | 86.22(0.12) | 82.67(0.244) | |
None | 93.63(0.11) | 81.65(0.19) | 85.95(0.15) | 83.50(0.25) | |
Ionosphere | FSFS | 91.77(0.49) | 73.73(0.61) | 75.41(0.64) | 85.93(1.36) |
LSFS | 91.37(0.43) | 76.84(0.71) | 84.67(0.6) | 88.83(1.18) | |
MCFS | 94.22(0.7) | 87.89(0.73) | 82.11(0.6) | 90.46(0.91) | |
DSFFC | 94.07(0.29) | 89.06(0.57) | 82.54(0.72) | 90.85(0.81) | |
DFG-A-DFC | 95.73(0.36) | 89.47(0.60) | 84.02(0.99) | 90.31(0.38) | |
None | 94.02(0.19) | 88.88(0.50) | 84.61(0.62) | 92.03(0.45) | |
WDBC | FSFS | 94.41(0.18) | 91.11(0.22) | 93.22(0.43) | 94.22(0.63) |
LSFS | 96.87(0.2) | 93.71(0.16) | 95.87(0.21) | 95.85(0.50) | |
MCFS | 96.68(0.24) | 93.39(0.24) | 96.22(0.24) | 95.11(0.44) | |
DSFFC | 96.82(0.15) | 94.34(0.16) | 95.73(0.17) | 96.22(0.31) | |
DFG-A-DFC | 97.77(0.01) | 91.73(0.43) | 95.95(0.02) | 95.78(0.27) | |
None | 97.36(0.21) | 93.14(0.56) | 96.13(0.21) | 97.01(0.22) | |
Sonar | FSFS | 80.24(1.35) | 70.82(2.41) | 68.51(1.62) | 77.16(1.97) |
LSFS | 81.01(1.27) | 71.88(1.98) | 67.98(1.20) | 75.67(1.64) | |
MCFS | 82.45(1.04) | 67.36(1.37) | 70.14(1.12) | 77.21(2.07) | |
DSFFC | 82.21(1.38) | 69.42(0.94) | 71.83(1.09) | 79.09(1.94) | |
DFG-A-DFC | 83.59(1.00) | 71.21(0.84) | 72.04(0.99) | 81.85(0.91) | |
None | 83.52(0.99) | 67.35(0.97) | 68.64(1.08) | 79.30(0.819) | |
SPECTF | FSFS | 73.38(2.13) | 73.63(1.61) | 66(1.94) | 65.50(2.78) |
LSFS | 74(1.42) | 72.75(1.42) | 69.63(2.50) | 69(3.48) | |
MCFS | 71.88(2.14) | 72.13(1.45) | 66.38(2.32) | 72.75(3.16) | |
DSFFC | 76.88(1.79) | 79.75(1.84) | 68.13(1.59) | 76.88(1.79) | |
DFG-A-DFC | 80.00(1.39) | 80.00(1.39) | 67.50(0.16) | 82.50(0.99) | |
None | 78.75(1.25) | 78.75(1.68) | 65.00(2.07) | 77.5(1.22) |
MCC of the considered feature-selection methods with different classifiers:

Dataset | Algorithm | SVM | Naive Bayes | KNN | Adaboost |
---|---|---|---|---|---|
Colon | FSFS | 0.585(0.02) | 0.439(0.07) | 0.439(0.044) | 0.465(0.092) |
LSFS | 0.336(0.061) | 0.16(0.042) | 0.406(0.052) | 0.232(0.088) | |
MCFS | 0.54(0.026) | 0.347(0.076) | 0.528(0.025) | 0.495(0.056) | |
DSFFC | 0.600(0.028) | 0.461(0.039) | 0.512(0.037) | 0.537(0.084) | |
DFG-A-DFC | 0.639(0.034) | 0.202(0.036) | 0.559(0.040) | 0.242(0.033) | |
None | 0.399(0.394) | 0.257(0.051) | 0.434(0.038) | 0.170(0.034) | |
Spambase | FSFS | 0.554(0.002) | 0.456(0.002) | 0.613(0.004) | 0.459(0.003) |
LSFS | 0.659(0.004) | 0.497(0.002) | 0.633(0.003) | 0.497(0.004) | |
MCFS | 0.586(0.002) | 0.451(0.002) | 0.624(0.003) | 0.449(0.002) | |
DSFFC | 0.719(0.002) | 0.585(0.002) | 0.668(0.002) | 0.586(0.002) | |
DFG-A-DFC | 0.866(0.012) | 0.643(0.022) | 0.709(0.027) | 0.675(0.036) | |
None | 0.865(0.023) | 0.668(0.035) | 0.703(0.031) | 0.688(0.045) | |
Ionosphere | FSFS | 0.823(0.011) | 0.435(0.013) | 0.462(0.016) | 0.689(0.030) |
LSFS | 0.814(0.01) | 0.521(0.01) | 0.669(0.013) | 0.755(0.026) | |
MCFS | 0.874(0.015) | 0.746(0.013) | 0.615(0.013) | 0.792(0.020) | |
DSFFC | 0.873(0.006) | 0.766(0.011) | 0.627(0.017) | 0.822(0.015) | |
DFG-A-DFC | 0.908(0.07) | 0.770(0.138) | 0.672(0.160) | 0.788(0.101) | |
None | 0.859(0.058) | 0.761(0.096) | 0.673(0.117) | 0.827(0.094) | |
WDBC | FSFS | 0.88(0.004) | 0.809(0.005) | 0.854(0.01) | 0.876(0.014) |
LSFS | 0.933(0.004) | 0.866(0.003) | 0.912(0.004) | 0.911(0.011) | |
MCFS | 0.929(0.005) | 0.859(0.005) | 0.92(0.005) | 0.895(0.009) | |
DSFFC | 0.932(0.003) | 0.879(0.003) | 0.909(0.004) | 0.919(0.007) | |
DFG-A-DFC | 0.950(0.02) | 0.823(0.094) | 0.915(0.041) | 0.910(0.059) | |
None | 0.944(0.04) | 0.855(0.114) | 0.918(0.044) | 0.938(0.045) | |
Sonar | FSFS | 0.606(0.026) | 0.415(0.048) | 0.374(0.036) | 0.541(0.039) |
LSFS | 0.620(0.026) | 0.438(0.039) | 0.360(0.027) | 0.511(0.033) | |
MCFS | 0.650(0.021) | 0.379(0.026) | 0.408(0.024) | 0.543(0.042) | |
DSFFC | 0.642(0.028) | 0.409(0.020) | 0.440(0.022) | 0.580(0.039) | |
DFG-A-DFC | 0.691(0.184) | 0.438(0.170) | 0.458(0.146) | 0.647(0.162) | |
None | 0.679(0.200) | 0.353(0.172) | 0.390(0.236) | 0.589(0.166) | |
SPECTF | FSFS | 0.493(0.039) | 0.480(0.032) | 0.424(0.037) | 0.312(0.055) |
LSFS | 0.513(0.030) | 0.474(0.029) | 0.472(0.060) | 0.381(0.069) | |
MCFS | 0.479(0.039) | 0.468(0.028) | 0.383(0.047) | 0.458(0.066) | |
DSFFC | 0.540(0.033) | 0.600(0.038) | 0.468(0.027) | 0.540(0.033) | |
DFG-A-DFC | 0.529(0.327) | 0.600(0.283) | 0.433(0.029) | 0.666(0.221) | |
None | 0.489(0.282) | 0.591(0.326) | 0.433(0.029) | 0.590(0.242) |
ACC (%) of the proposed method after its first and second phases:

Dataset | Algorithm | SVM | Naive Bayes | KNN | Adaboost |
---|---|---|---|---|---|
Colon | First Phase | 74.04(0.82) | 57.61(1.65) | 78.33(2.13) | 52.14(1.53) |
Second Phase | 86.90(1.44) | 61.42(1.57) | 81.90(1.57) | 56.42(1.81) | |
Multiple Features | First Phase | 97.44(0.05) | 87.79(0.23) | 92.15(0.16) | 83.44(0.26) |
Second Phase | 98.60(0.07) | 96.00(0.12) | 96.24(0.14) | 95.55(0.13) | |
Isolet | First Phase | 33.50(0.21) | 21.00(0.13) | 31.66(0.14) | 18.78(0.19) |
Second Phase | 97.06(0.05) | 79.99(0.13) | 88.40(0.12) | 80.33(0.13) | |
Spambase | First Phase | 79.24(0.13) | 55.05(0.24) | 55.98(0.24) | 79.04(0.11) |
Second Phase | 93.65(0.06) | 80.06(0.13) | 86.22(0.12) | 82.67(0.244) | |
Ionosphere | First Phase | 93.72(0.25) | 87.15(0.72) | 82.61(0.53) | 89.15(0.51) |
Second Phase | 95.73(0.36) | 89.47(0.60) | 84.02(0.99) | 90.31(0.38) | |
WDBC | First Phase | 82.08(0.62) | 77.32(0.43) | 80.67(0.37) | 81.18(0.26) |
Second Phase | 97.77(0.01) | 91.73(0.43) | 95.95(0.02) | 95.78(0.27) | |
Sonar | First Phase | 64.45(0.57) | 65.30(1.00) | 62.50(0.84) | 63.50(0.86) |
Second Phase | 83.59(1.00) | 71.21(0.84) | 72.04(0.99) | 81.85(0.91) | |
SPECTF | First Phase | 70.00(1.39) | 78.75(0.80) | 64.50(1.39) | 70.00(1.39) |
Second Phase | 80.00(1.39) | 80.00(1.39) | 67.50(0.16) | 82.50(0.99) |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chugh, D.; Mittal, H.; Saxena, A.; Chauhan, R.; Yafi, E.; Prasad, M. Augmentation of Densest Subgraph Finding Unsupervised Feature Selection Using Shared Nearest Neighbor Clustering. Algorithms 2023, 16, 28. https://doi.org/10.3390/a16010028