
Prediction model of hot metal temperature for blast furnace based on improved multi-layer extreme learning machine

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

At the blast furnace production site, disposable thermocouples are used to measure the hot metal temperature. This method is not only inconvenient for continuous data acquisition but also costly, because each thermocouple can be used only once. Hence, this paper establishes a model to predict the hot metal temperature. Before the model is built, the factors influencing the hot metal temperature are selected and the noise in the production data is removed. The multi-layer extreme learning machine (ML-ELM) is used as the prediction algorithm. However, the input weights, hidden-layer weights and hidden biases of ML-ELM are selected randomly, and the output weights are solved from them, so ML-ELM inevitably carries a set of non-optimal or unnecessary weights and biases. In addition, ML-ELM may suffer from over-fitting. Hence, this paper improves ML-ELM with adaptive particle swarm optimization (APSO) and an ensemble model; the improved algorithm is named EAPSO-ML-ELM. APSO optimizes the selection of the input weights, hidden-layer weights and hidden biases, while the ensemble model alleviates over-fitting by combining several optimized ML-ELMs that have different input weights, hidden-layer weights and hidden biases. Finally, prediction models based on other algorithms are also established for comparison, and simulation results demonstrate that the model based on EAPSO-ML-ELM achieves better prediction accuracy and generalization performance.
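To make the idea above concrete, the sketch below illustrates the main building blocks: a basic single-hidden-layer ELM whose output weights are solved with a Moore-Penrose pseudoinverse, a plain particle swarm search over the randomly initialized weights and biases, and an ensemble that averages several such networks. This is a minimal sketch rather than the authors' implementation: the stacking of ELM autoencoders that makes ML-ELM multi-layer and the adaptive parameter control of APSO are omitted, and all function names, the tanh activation and the PSO constants are illustrative assumptions.

```python
# Minimal sketch of the EAPSO-ML-ELM idea (not the authors' code):
# single-hidden-layer ELM + plain PSO weight selection + ensemble averaging.
import numpy as np

def fit_elm(X, y, W, b):
    """Solve the ELM output weights by least squares for the given
    random input weights W and hidden biases b."""
    H = np.tanh(X @ W + b)            # hidden-layer output matrix
    return np.linalg.pinv(H) @ y      # output weights via Moore-Penrose pseudoinverse

def predict_elm(X, W, b, beta):
    """Forward pass of a single-hidden-layer ELM."""
    return np.tanh(X @ W + b) @ beta

def pso_select_weights(X, y, X_val, y_val, n_hidden, n_particles=20, iters=50, seed=None):
    """Plain PSO (not the adaptive variant of the paper) over the flattened
    (W, b); the fitness is the validation RMSE of the resulting ELM."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * n_hidden + n_hidden
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)

    def unpack(p):
        return p[:-n_hidden].reshape(X.shape[1], n_hidden), p[-n_hidden:]

    def fitness(p):
        W, b = unpack(p)
        beta = fit_elm(X, y, W, b)
        err = y_val - predict_elm(X_val, W, b, beta)
        return float(np.sqrt(np.mean(err ** 2)))

    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # illustrative inertia and acceleration constants
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -1.0, 1.0)
        fit = np.array([fitness(p) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return unpack(gbest)

def ensemble_predict(X, members):
    """Average the predictions of several independently optimized ELMs,
    mirroring the ensemble step that alleviates over-fitting."""
    return np.mean([predict_elm(X, W, b, beta) for W, b, beta in members], axis=0)
```

Averaging several networks that were optimized from different random starting points reduces the variance of any single model, which is the intuition behind combining several optimized ML-ELMs with different input weights, hidden-layer weights and hidden biases.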





Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant nos. 61673056 and 61673055, the Beijing Natural Science Foundation under Grant no. 4182039, the Key Program of the National Natural Science Foundation of China under Grant no. 61333002, and the Beijing Key Discipline Co-construction Project (XK100080537).

Author information


Corresponding author

Correspondence to Sen Zhang.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Su, X., Zhang, S., Yin, Y. et al. Prediction model of hot metal temperature for blast furnace based on improved multi-layer extreme learning machine. Int. J. Mach. Learn. & Cyber. 10, 2739–2752 (2019). https://doi.org/10.1007/s13042-018-0897-3


