Novel hybrid success history intelligent optimizer with Gaussian transformation: application in CNN hyperparameter tuning

Published in Cluster Computing

Abstract

This research proposes a novel Hybrid Success History Intelligent Optimizer with Gaussian Transformation (SHIOGT) for solving optimization problems of different complexity levels and for convolutional neural network (CNN) hyperparameter tuning. The SHIOGT algorithm is designed to balance the exploration and exploitation phases through the addition of a Gaussian transformation to the original Success History Intelligent Optimizer (SHIO). The inclusion of the Gaussian transformation enhances solution diversity and enables SHIO to avoid local optima. SHIOGT also demonstrates robustness and adaptability by dynamically adjusting its search strategy based on problem characteristics. Furthermore, the combination of the Gaussian transformation and SHIO facilitates faster convergence, accelerating the discovery of optimal or near-optimal solutions. Moreover, the hybridization of these two techniques brings a synergistic effect, enabling SHIOGT to overcome their individual limitations and achieve superior performance in hyperparameter optimization tasks. SHIOGT was thoroughly assessed against an array of benchmark functions of varying complexities, demonstrating its ability to efficiently locate optimal or near-optimal solutions across different problem categories. Its robustness in tackling multimodal and deceptive landscapes and high-dimensional search spaces was particularly notable. SHIOGT was benchmarked on 43 challenging optimization problems and compared with state-of-the-art algorithms. Further, the SHIOGT algorithm is applied to the domain of deep learning, with a case study focusing on hyperparameter tuning of CNNs. Given the intelligent exploration–exploitation balance of SHIOGT, we hypothesized that it could effectively optimize CNN hyperparameters. We evaluated the performance of SHIOGT across a variety of datasets, including MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100, with the aim of optimizing CNN model hyperparameters. The results show an impressive accuracy rate of 98% on the MNIST dataset. Similarly, the algorithm achieved a 92% accuracy rate on Fashion-MNIST, 76% on CIFAR-10, and 70% on CIFAR-100, underscoring its effectiveness across diverse datasets.
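
The article itself provides no code; the following is a minimal sketch, under stated assumptions, of the idea the abstract describes: a success-history-guided population update combined with a Gaussian transformation of candidate solutions to preserve diversity and escape local optima. All function names, parameter values, and the exact update rule here are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal sketch (not the authors' implementation) of a success-history-guided
# search step combined with a Gaussian transformation of candidates, as the
# abstract describes. Names, parameters, and the update rule are assumptions.

def sphere(x):
    """Simple benchmark objective (minimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def shiogt_sketch(obj, dim=10, pop_size=30, iters=500,
                  lower=-5.0, upper=5.0, sigma0=0.3, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fit = np.array([obj(x) for x in pop])
    best = pop[fit.argmin()].copy()
    best_fit = fit.min()
    success_history = []                      # memory of improving step sizes

    for t in range(iters):
        # Step size guided by the success history (mean of recent successes),
        # falling back to a default before any success has been recorded.
        step = np.mean(success_history[-20:]) if success_history else 0.5
        # Gaussian transformation: anneal the mutation strength over time so
        # the search shifts from exploration toward exploitation.
        sigma = sigma0 * (1.0 - t / iters)

        for i in range(pop_size):
            # Move toward the best-so-far solution (exploitation) ...
            trial = pop[i] + step * rng.random(dim) * (best - pop[i])
            # ... and add a Gaussian perturbation for diversity (exploration).
            trial += rng.normal(0.0, sigma, size=dim)
            trial = np.clip(trial, lower, upper)

            f = obj(trial)
            if f < fit[i]:                    # greedy selection
                success_history.append(step)  # remember what worked
                pop[i], fit[i] = trial, f
                if f < best_fit:
                    best, best_fit = trial.copy(), f

    return best, best_fit

if __name__ == "__main__":
    x_best, f_best = shiogt_sketch(sphere)
    print(f"best fitness found: {f_best:.3e}")

For the CNN case study, the same loop would operate on an encoded hyperparameter vector (for example, learning rate, filter counts, and dropout rate), with the objective returning the validation error of a CNN trained with those settings.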



Data availability

Enquiries about data availability should be directed to the authors.


Funding

The authors have not disclosed any funding.

Author information

Contributions

Author Contributions Statement: All authors contributed significantly to this research and the development of the manuscript. HNF, SA and FH contributed to the conception and design of the research, developed the novel Hybrid Success History Intelligent Optimizer with Gaussian Transformation (SHIOGT), and performed the primary analysis and interpretation of the data. They were also instrumental in benchmarking and analyzing the SHIOGT algorithm against an array of optimization challenges and in extending its application to the deep learning domain, specifically the hyperparameter tuning of convolutional neural networks (CNNs). All authors participated in drafting the manuscript, revised it critically for important intellectual content, and approved the final version for submission. All authors agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Corresponding author

Correspondence to Hussam N. Fakhouri.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A

See Tables 5 and 6.

Table 5 Benchmark function characteristics
Table 6 Benchmark function descriptions

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Cite this article

Fakhouri, H.N., Alawadi, S., Awaysheh, F.M. et al. Novel hybrid success history intelligent optimizer with Gaussian transformation: application in CNN hyperparameter tuning. Cluster Comput 27, 3717–3739 (2024). https://doi.org/10.1007/s10586-023-04161-0
