
Recognition of Cancer Mediating Genes using the Novel Restricted Boltzmann Machines

  • Published in: Wireless Personal Communications

Abstract

Restricted Boltzmann machines (RBMs) are widely used as building blocks of deep neural networks, largely because of their feature-engineering capability. One of the main difficulties in deep neural networks is weight training; because the training process is complex, weight initialization and learning are central concerns in deep networks. In this paper we modify the initial weights of the Restricted Boltzmann Machine based on the differences between the mean of each feature and the mean of all feature values across the training vectors. The resulting models are named Modified Restricted Boltzmann Machines (MRBM1 and MRBM2). As a consequence, the probability that the model reconstructs a training vector is increased at the start of training, and the error of the deep belief network during training is reduced. The rationale for this approach is that it accounts for the typical values of a feature relative to the values of all features. Our proposed models incorporate a grouping of correlations, from which candidate genes are selected. The usefulness of the proposed techniques is demonstrated on four microarray gene expression data sets: human leukemia, lung, colon, and breast cancer. Their superiority over earlier procedures, namely the Boltzmann Machine (BM), Restricted Boltzmann Machine (RBM), Auto-Encoder (AE), and Denoising Autoencoder (DAE), is verified using the GO attributes of the Top-10, Top-20, and Top-50 genes of each data set, and significant genes are identified based on p-values. The overall results are further validated against existing techniques, gene expression profiles, biochemical pathways, the t-test, and the F-test.
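The abstract describes initializing the RBM weights from the difference between each feature's mean and the mean of all feature values in the training vectors, but does not spell out the MRBM1/MRBM2 formulas. The following Python sketch shows one plausible reading of that idea; the function name `mean_shifted_init`, the additive combination with small Gaussian noise, and the scaling constant are assumptions for illustration, not the authors' method.

```python
import numpy as np

def mean_shifted_init(X, n_hidden, scale=0.01, seed=0):
    """Sketch of a feature-mean-based RBM weight initialization.

    X        : (n_samples, n_visible) training matrix (e.g. gene expression).
    n_hidden : number of hidden units.
    scale    : magnitude of the random component.

    For each visible unit j, the offset (mean of feature j minus the mean of
    all feature values) is added to small Gaussian noise, so features whose
    typical values deviate from the overall level start with shifted weights.
    """
    rng = np.random.default_rng(seed)
    feature_means = X.mean(axis=0)        # mean of each feature over the training vectors
    global_mean = X.mean()                # mean over all feature values
    offset = feature_means - global_mean  # per-feature deviation from the global mean
    W = scale * rng.standard_normal((X.shape[1], n_hidden))
    W += offset[:, None]                  # shift every weight attached to visible unit j
    return W

# Example: 200 training vectors of 500 expression features, 64 hidden units
X = np.random.rand(200, 500)
W0 = mean_shifted_init(X, n_hidden=64)
print(W0.shape)  # (500, 64)
```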




Data Availability

Data will be made available on request.

References

  1. Jemal, A., Siegel, R., Ward, E., Murray, T., Xu, J., Smigal, C., & Thun, M. J. (2006). Cancer Statistics, 2006. CA: A Cancer Journal for Clinicians, 56(2), 106–130. https://doi.org/10.3322/canjclin.56.2.106


  2. Salto-Tellez, M., & Cree, I. A. (2019). Cancer taxonomy: pathology beyond pathology. European Journal of Cancer, 115, 57–60. https://doi.org/10.1016/j.ejca.2019.03.026


  3. Sheet, S., Ghosh, A., & Mandal, S. B. (2018). Cancer Mediating Genes Recognition using Multilayer Perceptron Model- An Application on Human Leukemia. Advances in Science, Technology and Engineering Systems Journal, 3(2), 08–20. https://doi.org/10.25046/aj030202


  4. Roggli, V. L., Vollmer, R. T., Greenberg, S. D., McGavran, M. H., Spjut, H. J., & Yesner, R. (1985). Lung cancer heterogeneity: A blinded and randomized study of 100 consecutive cases. Human Pathology, 16(6), 569–579. https://doi.org/10.1016/S0046-8177(85)80106-4


  5. Kaisermann, M., Trajman, A., & Madi, K. (2001). Evolving features of lung adenocarcinoma in Rio de Janeiro, Brazil. Oncology Reports, 8(1), 189–192. https://doi.org/10.3892/or.8.1.189

  6. Cawley, G., & Talbot, N. (2006). Gene selection in cancer classification using sparse logistic regression with Bayesian regularization. Bioinformatics, 22(1), 2348–2355. https://doi.org/10.1093/bioinformatics/btl386

  7. Jo, H. S., Park, K., & Jung, S. M. (2019). A scoping review of consumer needs for cancer information. Patient Education and Counseling, 102(7), 1237–1250. https://doi.org/10.1016/j.pec.2019.02.004


  8. Hinton, G. E., Osindero, S., & Teh, Y.-W. (2006). A Fast Learning Algorithm for Deep Belief Nets. Neural Computation, 18(7), 1527–1554. https://doi.org/10.1162/neco.2006.18.7.1527


  9. Krizhevsky, A., Sutskever, I., & Hinton, G. (2012). Imagenet classification with deep convolutional neural networks. Neural Information Processing Systems., 25, 1–9. https://doi.org/10.1145/3065386


  10. Pacheco, A., Krohling, R., & Silva, C. (2017). Restricted boltzmann machine to determine the input weights for extreme learning machines. Expert Systems with Applications., 96, 77–85. https://doi.org/10.1016/j.eswa.2017.11.054


  11. Görgel, P., & Simsek, A. (2019). Face recognition via Deep Stacked Denoising Sparse Autoencoders (DSDSA). Applied Mathematics and Computation, 355, 325–342. https://doi.org/10.1016/j.amc.2019.02.071


  12. Ghahabi, O., & Hernando, J. (2018). Restricted Boltzmann machines for vector representation of speech in speaker recognition. Computer Speech & Language, 47, 16–29. https://doi.org/10.1016/j.csl.2017.06.007


  13. Tomczak, J. M., & Zięba, M. (2015). Classification Restricted Boltzmann Machine for comprehensible credit scoring model. Expert Systems with Applications, 42(4), 1789–1796. https://doi.org/10.1016/j.eswa.2014.10.016


  14. Elfwing, S., Uchibe, E., & Doya, K. (2015). Expected energy-based restricted Boltzmann machine for classification. Neural Networks, 64, 29–38. https://doi.org/10.1016/j.neunet.2014.09.006


  15. Papa, J. P., Rosa, G. H., Marana, A. N., Scheirer, W., & Cox, D. D. (2015). Model selection for Discriminative Restricted Boltzmann Machines through meta-heuristic techniques. Journal of Computational Science, 9, 14–18. https://doi.org/10.1016/j.jocs.2015.04.014


  16. Taherkhani, A., Cosma, G., & McGinnity, T. M. (2018). Deep-FS: A feature selection algorithm for Deep Boltzmann Machines. Neurocomputing, 322, 22–37. https://doi.org/10.1016/j.neucom.2018.09.040


  17. Sheet, S., Ghosh, A., & Mandal, S. B. (2018). Selection of genes mediating human leukemia, using boltzmann machine. In R. K. Choudhary, J. K. Mandal, & D. Bhattacharyya (Eds.), Advanced Computing and Communication Technologies (pp. 83–90). Singapore: Springer.


  18. Zeng, X., Chen, F., & Wang, M. (2018). Shape group Boltzmann machine for simultaneous object segmentation and action classification. Pattern Recognition Letters, 111, 43–50. https://doi.org/10.1016/j.patrec.2018.04.014


  19. Wu, J., Mazur, T. R., Ruan, S., Lian, C., Daniel, N., Lashmett, H., Ochoa, L., Zoberi, I., Anastasio, M. A., Gach, H. M., Mutic, S., Thomas, M., & Li, H. (2018). A deep Boltzmann machine-driven level set method for heart motion tracking using cine MRI images. Medical Image Analysis, 47, 68–80. https://doi.org/10.1016/j.media.2018.03.015


  20. Lü, X., Long, L., Deng, R., & Meng, R. (2022). Image feature extraction based on fuzzy restricted Boltzmann machine. Measurement, 204, 1–13. https://doi.org/10.1016/j.measurement.2022.112063

  21. Nie, S., Wang, Z., & Ji, Q. (2015). A generative restricted Boltzmann machine based method for high-dimensional motion data modeling. Computer Vision and Image Understanding, 136, 14–22. https://doi.org/10.1016/j.cviu.2014.12.005


  22. Feng, F., Li, R., & Wang, X. (2015). Deep correspondence restricted Boltzmann machine for cross-modal retrieval. Neurocomputing, 154, 50–60. https://doi.org/10.1016/j.neucom.2014.12.020

  23. Shen, H., & Li, H. (2019). A gradient approximation algorithm based weight momentum for restricted Boltzmann machine. Neurocomputing, 361, 40–49. https://doi.org/10.1016/j.neucom.2019.07.074


  24. Pujahari, A., & Sisodia, D. S. (2019). Modeling Side Information in Preference Relation based Restricted Boltzmann Machine for recommender systems. Information Sciences, 490, 126–145. https://doi.org/10.1016/j.ins.2019.03.064


  25. Zhang, J., Wang, H., Chu, J., Huang, S., Li, T., & Zhao, Q. (2019). Improved Gaussian–Bernoulli restricted Boltzmann machine for learning discriminative representations. Knowledge-Based Systems, 185, 104911. https://doi.org/10.1016/j.knosys.2019.104911


  26. Luo, L., Zhang, S., Wang, Y., & Peng, H. (2018). An alternate method between generative objective and discriminative objective in training classification Restricted Boltzmann Machine. Knowledge-Based Systems, 144, 144–152. https://doi.org/10.1016/j.knosys.2017.12.032


  27. Harrington, P. B. (2018). Feature expansion by a continuous restricted Boltzmann machine for near-infrared spectrometric calibration. Analytica Chimica Acta, 1010, 20–28. https://doi.org/10.1016/j.aca.2018.01.026


  28. Sheri, A. M., Rafique, A., Pedrycz, W., & Jeon, M. (2015). Contrastive divergence for memristor-based restricted Boltzmann machine. Engineering Applications of Artificial Intelligence, 37, 336–342. https://doi.org/10.1016/j.engappai.2014.09.013

  29. Xie, C., Lv, J., Li, Y., & Sang, Y. (2018). Cross-correlation conditional restricted Boltzmann machines for modeling motion style. Knowledge-Based Systems, 159, 259–269. https://doi.org/10.1016/j.knosys.2018.06.026


  30. Fischer, A., & Igel, C. (2015). A bound for the convergence rate of parallel tempering for sampling restricted Boltzmann machines. Theoretical Computer Science, 598, 102–117. https://doi.org/10.1016/j.tcs.2015.05.019


  31. Fiore, U., Palmieri, F., Castiglione, A., & De Santis, A. (2013). Network anomaly detection with the restricted Boltzmann machine. Neurocomputing, 122, 13–23. https://doi.org/10.1016/j.neucom.2012.11.050

  32. Leng, B., Zhang, X., Yao, M., & Xiong, Z. (2015). A 3D model recognition mechanism based on deep Boltzmann machines. Neurocomputing, 151, 593–602. https://doi.org/10.1016/j.neucom.2014.06.084


  33. Welling, M., & Teh, Y. W. (2003). Approximate inference in Boltzmann machines. Artificial Intelligence, 143(1), 19–50. https://doi.org/10.1016/S0004-3702(02)00361-2


  34. Okuhara, K., & Osaki, S. (1995). A study on the characteristics in a symmetry Boltzmann machine composed of two Boltzmann machines. Mathematical and Computer Modelling, 22(10–12), 273–278. https://doi.org/10.1016/0895-7177(95)00204-F


  35. Karakida, R., Okada, M., & Amari, S. (2016). Dynamical analysis of contrastive divergence learning: Restricted Boltzmann machines with Gaussian visible units. Neural Networks, 79, 78–87. https://doi.org/10.1016/j.neunet.2016.03.013


  36. Leisink, M. A. R., & Kappen, H. J. (2000). Learning in higher order Boltzmann machines using linear response. Neural Networks, 13(3), 329–335. https://doi.org/10.1016/S0893-6080(00)00011-3


  37. Shim, V. A., Tan, K. C., Cheong, C. Y., & Chia, J. Y. (2013). Enhancing the scalability of multi-objective optimization via restricted Boltzmann machine-based estimation of distribution algorithm. Information Sciences, 248, 191–213. https://doi.org/10.1016/j.ins.2013.06.037


  38. Balzer, W., Takahashi, M., Ohta, J., & Kyuma, K. (1991). Weight quantization in Boltzmann machines. Neural Networks, 4(3), 405–409. https://doi.org/10.1016/0893-6080(91)90077-I


  39. Parra, L., & Deco, G. (1995). Continuous Boltzmann machine with rotor neurons. Neural Networks, 8(3), 375–385. https://doi.org/10.1016/0893-6080(94)00074-V


  40. Moser, E., & Kameda, T. (1992). Bounds on the number of hidden units of boltzmann machines. Neural Networks, 5(6), 911–921. https://doi.org/10.1016/S0893-6080(05)80087-5


  41. Zhang, N., Ding, S., Zhang, J., & Xue, Y. (2018). An overview on restricted Boltzmann machines. Neurocomputing, 275, 1186–1199. https://doi.org/10.1016/j.neucom.2017.09.065

  42. Sokolovska, N., Clément, K., & Zucker, J.-D. (2019). Revealing causality between heterogeneous data sources with deep restricted Boltzmann machines. Information Fusion, 50, 139–147. https://doi.org/10.1016/j.inffus.2018.11.016


  43. Sheet, S., Ghosh, R., & Ghosh, A. (2024). Recognition of cancer mediating genes using MLP-SDAE model. Systems and Soft Computing, 6, 1–15. https://doi.org/10.1016/j.sasc.2024.200079

  44. Sheet, S., Ghosh, A., Ghosh, R., & Chakrabarti, A. (2020). Identification of Cancer Mediating Biomarkers using Stacked Denoising Autoencoder Model - An Application on Human Lung Data. Procedia Computer Science, 167, 686–695. https://doi.org/10.1016/j.procs.2020.03.341


  45. Hu, J., Zhang, J., Ji, N., & Zhang, C. (2017). A new regularized restricted Boltzmann machine based on class preserving. Knowledge-Based Systems, 123, 1–12. https://doi.org/10.1016/j.knosys.2017.02.012


  46. Aldwairi, T., Perera, D., & Novotny, M. A. (2018). An evaluation of the performance of Restricted Boltzmann Machines as a model for anomaly network intrusion detection. Computer Networks, 144, 111–119. https://doi.org/10.1016/j.comnet.2018.07.025


  47. Sokolova, M., Japkowicz, N., & Szpakowicz, S. (2006). Beyond accuracy, f-score and roc: A family of discriminant measures for performance evaluation. In A. Sattar & B.-H. Kang (Eds.), AI 2006: Advances in Artificial Intelligence (pp. 1015–1021). Berlin, Heidelberg: Springer.



Funding

No funding was received for this article.

Author information


Contributions

All authors have contributed to the conception, design of the work, the acquisition, analysis, and interpretation of data, and have edited and approved the submitted version. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sougata Sheet.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A

The F-score is defined as

$$\begin{aligned} F\text{-}score = (1+\beta ^{2}) \cdot \frac{Precision \cdot Recall}{(\beta ^{2} \cdot Precision) + Recall} \end{aligned}$$
(A1)

where

$$\begin{aligned} Precision = \frac{tp}{tp + fp} \end{aligned}$$
(A2)
$$\begin{aligned} Recall = \frac{tp}{tp + fn} \end{aligned}$$
(A3)

Here tp, fp, fn, and \(\beta\) denote the numbers of true positives, false positives, and false negatives, and the balance factor, respectively. These quantities distinguish correctly classified labels across the different classes: recall is computed from the true positives and false negatives, while precision is computed from the true positives and false positives. With \(\beta =1\) the F-score [47] weights precision and recall equally; \(\beta >1\) gives more weight to recall, and \(\beta <1\) gives more weight to precision. We use \(\beta =1\) in our experiments.
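As a concrete check of Eqs. (A1)–(A3), the short Python snippet below computes the F-score directly from confusion-matrix counts. It is an illustration only, not code from the paper, and the counts in the example are arbitrary.

```python
def f_score(tp, fp, fn, beta=1.0):
    """F-score from true positives (tp), false positives (fp),
    false negatives (fn) and balance factor beta, per Eqs. (A1)-(A3)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

# beta = 1 (as used in the experiments) weights precision and recall equally
print(f_score(tp=90, fp=10, fn=20))          # F1 ≈ 0.857
print(f_score(tp=90, fp=10, fn=20, beta=2))  # recall-weighted F2
```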

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Sheet, S., Ghosh, A., Ghosh, R. et al. Recognition of Cancer Mediating Genes using the Novel Restricted Boltzmann Machines. Wireless Pers Commun 138, 2275–2298 (2024). https://doi.org/10.1007/s11277-024-11600-7

