Prediction of Concrete Compressive Strength Based on ISSA-BPNN-AdaBoost
Figure 1. Images of test functions.
Figure 2. Convergence curves of five optimization algorithms on test functions.
Figure 3. Boxplots of 10 test functions.
Figure 4. Structure of the BPNN.
Figure 5. Structure of the ISSA-BPNN.
Figure 6. Structure of the ISSA-BPNN-AdaBoost.
Figure 7. Training errors for different hidden layer neurons.
Figure 8. Fitted plots of predicted and actual values in the test set for each ensemble model.
Figure 9. Fitted plots of predicted and actual values in the test set for each single model.
Figure 10. Fitted plots of predicted and actual values in the training set and test set for each model.
Abstract
1. Introduction
2. Optimization Algorithm and Improvements
2.1. Sparrow Search Algorithm
2.2. Improvement of the SSA
2.2.1. Adding Piecewise Chaotic Mapping
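The piecewise chaotic mapping named in this subsection is commonly the piecewise linear chaotic map (PWLCM), used to generate a more uniformly spread initial population than plain random sampling. The sketch below is an illustration under that assumption; the control parameter p = 0.4 and the function names are not taken from the paper.

```python
import numpy as np

def piecewise_chaotic_map(x, p=0.4):
    """One step of the piecewise linear chaotic map (PWLCM) on [0, 1)."""
    if 0 <= x < p:
        return x / p
    if p <= x < 0.5:
        return (x - p) / (0.5 - p)
    if 0.5 <= x < 1 - p:
        return (1 - p - x) / (0.5 - p)
    return (1 - x) / p

def chaotic_init(pop_size, dim, lb, ub, seed=0):
    """Initialize a population by iterating the chaotic map per dimension,
    then scaling the chaotic sequence into the search bounds [lb, ub]."""
    rng = np.random.default_rng(seed)
    pop = np.empty((pop_size, dim))
    x = rng.random(dim)  # random chaotic seeds, one per dimension
    for i in range(pop_size):
        x = np.array([piecewise_chaotic_map(v) for v in x])
        pop[i] = lb + x * (ub - lb)
    return pop
```

Because the PWLCM is ergodic on [0, 1), the scaled sequence covers the search space more evenly, which is the usual motivation for chaotic initialization in SSA variants.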
2.2.2. Improving the Discoverer Strategy
2.2.3. Adding Adaptive t-Distribution Variation
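Adaptive t-distribution variation in improved-SSA literature typically perturbs a solution as x' = x + x·t(df = current iteration): with few degrees of freedom the t-distribution is heavy-tailed and Cauchy-like (wide exploratory jumps early on), and as the iteration count grows it approaches a Gaussian (fine local search late on). A minimal numpy sketch under that assumption (the function name is illustrative):

```python
import numpy as np

def t_distribution_mutation(x, iteration, rng=None):
    """Adaptive t-distribution mutation: x' = x + x * t(df=iteration).
    Small df (early iterations) -> heavy-tailed, Cauchy-like jumps;
    large df (late iterations) -> near-Gaussian, fine-grained steps."""
    rng = rng if rng is not None else np.random.default_rng()
    return x + x * rng.standard_t(df=iteration, size=x.shape)
```

In practice the mutated individual is kept only if it improves the fitness, so the mutation can help escape local optima without discarding good solutions.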
2.3. Optimization Algorithm Performance Testing
3. Model Construction
3.1. ISSA-BPNN
3.2. ISSA-BPNN-AdaBoost
- Preprocess the dataset and divide it into a training set and a test set. Set the maximum number of iterations (i.e., the number of base learners) and initialize the weight of each training sample.
- Using the ISSA-BPNN model as the base learner, train the current base learner under the current distribution of training-sample weights.
- Calculate the error rate and the weight of the current weak learner.
- Update the weight of each training sample.
- Check whether the maximum number of iterations has been reached. If so, stop iterating and combine all base learners obtained during training into the final strong learner; otherwise, return to step 2 to train a new base learner.
- Use the strong learner to predict on the test set and output the final result.
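The steps above follow the AdaBoost.R2 scheme for regression. The sketch below implements that loop in numpy; a weighted linear least-squares regressor stands in for the paper's ISSA-BPNN base learner, and a weighted average replaces the canonical weighted-median combination for brevity. Function names are illustrative, not the authors' implementation.

```python
import numpy as np

def fit_base(X, y, w):
    """Stand-in base learner: weighted linear least squares.
    (In the paper, the base learner is an ISSA-optimized BPNN.)"""
    A = np.c_[X, np.ones(len(X))] * np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A, y * np.sqrt(w), rcond=None)
    return lambda Xn: np.c_[Xn, np.ones(len(Xn))] @ coef

def adaboost_r2(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)              # step 1: uniform sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        model = fit_base(X, y, w)        # step 2: train on current weights
        err = np.abs(model(X) - y)
        err_max = err.max()
        if err_max == 0:                 # perfect fit: keep learner, stop
            learners.append(model); alphas.append(1.0); break
        L = err / err_max                # linear loss, scaled to [0, 1]
        eps = np.sum(w * L)              # step 3: weighted error rate
        if eps >= 0.5:                   # learner no better than chance
            break
        beta = eps / (1 - eps)
        learners.append(model)
        alphas.append(np.log(1 / beta))  # learner (weak-learner) weight
        w *= beta ** (1 - L)             # step 4: re-weight samples
        w /= w.sum()
    def strong(Xn):                      # step 5: combine base learners
        preds = np.array([m(Xn) for m in learners])
        a = np.array(alphas)
        return (a[:, None] * preds).sum(axis=0) / a.sum()
    return strong
```

Samples with large residuals keep more weight after each round (beta < 1 shrinks the weights of well-fit samples), so later base learners concentrate on the hard examples.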
3.3. Model Evaluation Indicators
- RMSE measures the average error produced by the model in making predictions and is the square root of the mean squared error (MSE), the average squared difference between the actual values and the model predictions. Typically, the lower the RMSE, the better the model. RMSE is calculated as
$$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_{i}-\hat{y}_{i}\right)^{2}}$$
- MAE, like RMSE, is an evaluation metric for measuring prediction error. It is the average absolute difference between actual values and predicted results and is less sensitive to outliers than RMSE. MAE is calculated as
$$\mathrm{MAE}=\frac{1}{n}\sum_{i=1}^{n}\left|y_{i}-\hat{y}_{i}\right|$$
- R2, also known as the goodness of fit, compares the variability of the model's predicted values with that of the actual values and expresses the degree of fit of the model. R2 ranges between 0 and 1, and the closer the value is to 1, the better the fit. R2 is calculated as
$$R^{2}=1-\frac{\sum_{i=1}^{n}\left(y_{i}-\hat{y}_{i}\right)^{2}}{\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^{2}}$$

where $y_{i}$ denotes the actual value, $\hat{y}_{i}$ the predicted value, $\bar{y}$ the mean of the actual values, and $n$ the number of samples.
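The three metrics above translate directly into numpy; a minimal sketch (function names are illustrative):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error: sqrt of the mean squared residual."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error: average absolute residual."""
    return float(np.mean(np.abs(y_true - y_pred)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1 - ss_res / ss_tot)
```

For example, with y_true = [1, 2, 3] and y_pred = [1, 2, 4], MAE is 1/3, RMSE is sqrt(1/3), and R2 is 0.5.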
4. Case Study Analysis
4.1. Data Analysis and Pre-Processing
4.2. Hyperparameter Setting
4.3. Comparison with Ensemble Models
4.4. Comparison with Single Model
5. Model Performance Verification
6. Conclusions
- The improvements to SSA address its poor initial population quality, weak ability to escape local optima, and dimensionality shrinkage without increasing its complexity. On the 10 general benchmark test functions, ISSA outperforms the four baseline intelligent optimization algorithms (DBO, NGO, SSA, and GWO) on both unimodal and multimodal test functions, with faster convergence and higher optimization accuracy.
- On Dataset 1, the ISSA-BPNN-AdaBoost model achieves a goodness of fit of 0.964 on the test set, which satisfies practical prediction-accuracy requirements. Its test-set R2 is 6.64% higher than that of the AdaBoost model, 6.64% higher than that of the XGBoost model, and 8.80% higher than that of the RF model, and more than 10% higher than those of the comparative single models. The RMSE, MAE, and R2 of the ISSA-BPNN-AdaBoost model are optimal on both the training set and the test set, indicating that its predictions fit the real data best and that its accuracy and reliability exceed those of the other models.
- The generalization ability of the ISSA-BPNN-AdaBoost model is validated on Dataset 2, where it achieves an R2 of 0.970 on the test set, meaning the model accounts for most of the data variability and delivers very high prediction accuracy. Its RMSE and MAE are also very low, further confirming its excellent generalization performance. On small datasets, the ensemble strategy helps the model avoid overfitting the training set while remaining sensitive to the relationships among variables.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Test Function | n | Minimum
---|---|---
F1 | 30 | 0
F2 | 30 | 0
F3 | 30 | 0
F4 | 30 | 0
F5 | 30 | 0
F6 | 30 | 0
F7 | 30 | −12,569.5
F8 | 30 | 0
F9 | 30 | 0
F10 | 30 | 0
Function | Algorithm | Optimal | Worst | Median | Average | SD
---|---|---|---|---|---|---
F1 | ISSA | 0 | 0 | 0 | 0 | 0 |
DBO | 0 | 1.81 × 10−218 | 2.61 × 10−276 | 6.04 × 10−220 | 0 | |
NGO | 5.71 × 10−183 | 6.59 × 10−178 | 1.91 × 10−180 | 3.71 × 10−179 | 0 | |
SSA | 0 | 1.44 × 10−81 | 9.68 × 10−99 | 4.81 × 10−83 | 2.63 × 10−82 | |
GWO | 8.14 × 10−62 | 2.27 × 10−58 | 7.99 × 10−60 | 2.77 × 10−59 | 4.68 × 10−59 | |
F2 | ISSA | 0 | 1.12 × 10−268 | 1.91 × 10−300 | 3.73 × 10−270 | 0 |
DBO | 1.66 × 10−152 | 4.32 × 10−118 | 1.53 × 10−139 | 1.44 × 10−119 | 7.88 × 10−119 | |
NGO | 7.76 × 10−94 | 1.80 × 10−91 | 6.56 × 10−93 | 1.60 × 10−92 | 3.55 × 10−92 | |
SSA | 0 | 1.04 × 10−39 | 8.60 × 10−49 | 4.28 × 10−41 | 1.92 × 10−40 | |
GWO | 1.17 × 10−35 | 2.99 × 10−34 | 6.98 × 10−35 | 9.87 × 10−35 | 7.37 × 10−35 | |
F3 | ISSA | 0 | 0 | 0 | 0 | 0 |
DBO | 1.99 × 10−257 | 2.17 × 10−116 | 1.53 × 10−209 | 7.24 × 10−118 | 3.97 × 10−117 | |
NGO | 8.29 × 10−58 | 8.07 × 10−46 | 9.97 × 10−54 | 2.70 × 10−47 | 1.47 × 10−46 | |
SSA | 0 | 2.51 × 10−32 | 8.55 × 10−50 | 8.37 × 10−34 | 4.58 × 10−33 | |
GWO | 4.27 × 10−20 | 1.75 × 10−13 | 2.03 × 10−16 | 1.14 × 10−14 | 3.38 × 10−14 | |
F4 | ISSA | 8.38 × 10−308 | 9.44 × 10−267 | 4.93 × 10−281 | 3.96 × 10−268 | 0 |
DBO | 3.89 × 10−157 | 7.15 × 10−104 | 1.07 × 10−124 | 2.38 × 10−105 | 1.31 × 10−104 | |
NGO | 5.74 × 10−78 | 1.33 × 10−75 | 7.11 × 10−77 | 1.99 × 10−76 | 3.01 × 10−76 | |
SSA | 5.49 × 10−130 | 1.55 × 10−38 | 1.07 × 10−50 | 9.14 × 10−40 | 3.44 × 10−39 | |
GWO | 2.80 × 10−16 | 1.12 × 10−13 | 7.58 × 10−15 | 1.91 × 10−14 | 3.09 × 10−14 | |
F5 | ISSA | 0 | 1.34 × 10−25 | 2.87 × 10−31 | 5.04 × 10−27 | 2.44 × 10−26 |
DBO | 4.25 × 10−11 | 2.30 × 10−6 | 2.47 × 10−9 | 1.25 × 10−7 | 4.34 × 10−7 | |
NGO | 7.12 × 10−9 | 2.46 × 10−7 | 4.14 × 10−8 | 6.55 × 10−8 | 6.50 × 10−8 | |
SSA | 2.76 × 10−24 | 9.21 × 10−18 | 1.90 × 10−20 | 7.07 × 10−19 | 2.01 × 10−18 | |
GWO | 1.43 × 10−5 | 1.75 | 6.42 × 10−1 | 6.58 × 10−1 | 3.42 × 10−1 | |
F6 | ISSA | 3.35 × 10−5 | 1.30 × 10−3 | 3.21 × 10−4 | 3.98 × 10−4 | 3.13 × 10−4 |
DBO | 9.02 × 10−5 | 1.51 × 10−3 | 6.54 × 10−4 | 6.70 × 10−4 | 4.25 × 10−4 | |
NGO | 1.27 × 10−5 | 6.71 × 10−4 | 2.67 × 10−4 | 3.02 × 10−4 | 1.29 × 10−4 | |
SSA | 9.76 × 10−6 | 3.65 × 10−3 | 7.41 × 10−4 | 1.00 × 10−3 | 9.39 × 10−4 | |
GWO | 2.11 × 10−4 | 2.19 × 10−3 | 8.12 × 10−4 | 9.05 × 10−4 | 5.41 × 10−4 | |
F7 | ISSA | −12,569.49 | −8974.77 | −12,569.49 | −11,794.43 | 1146.83 |
DBO | −12,550.31 | −5996.56 | −8472.92 | −9252.01 | 2303.60 | |
NGO | −9143.78 | −6988.91 | −7837.66 | −7958.94 | 546.14 | |
SSA | −9558.42 | −6607.98 | −8286.62 | −8310.38 | 666.62 | |
GWO | −7555.06 | −3700.65 | −612.65 | −6063.92 | 858.72 | |
F8 | ISSA | 0 | 0 | 0 | 0 | 0 |
DBO | 0 | 33.83 | 0 | 2.72 | 8.58 | |
NGO | 0 | 0 | 0 | 0 | 0 | |
SSA | 0 | 0 | 0 | 0 | 0 | |
GWO | 0 | 1.01 | 0 | 3.34 × 10−2 | 1.83 × 10−1 | |
F9 | ISSA | 4.44 × 10−16 | 4.44 × 10−16 | 4.44 × 10−16 | 4.44 × 10−16 | 0 |
DBO | 4.44 × 10−16 | 4.00 × 10−15 | 4.44 × 10−16 | 6.81 × 10−16 | 9.01 × 10−16 | |
NGO | 4.00 × 10−15 | 7.55 × 10−15 | 7.55 × 10−15 | 5.89 × 10−15 | 1.80 × 10−15 | |
SSA | 4.44 × 10−16 | 4.44 × 10−16 | 4.44 × 10−16 | 4.44 × 10−16 | 0 | |
GWO | 1.11 × 10−14 | 2.18 × 10−14 | 1.47 × 10−14 | 1.60 × 10−14 | 2.87 × 10−15 | |
F10 | ISSA | 1.57 × 10−32 | 3.63 × 10−32 | 1.70 × 10−32 | 1.81 × 10−32 | 4.20 × 10−33 |
DBO | 1.04 × 10−13 | 1.06 × 10−3 | 4.42 × 10−11 | 6.52 × 10−5 | 2.27 × 10−4 | |
NGO | 4.83 × 10−10 | 1.75 × 10−8 | 3.38 × 10−9 | 4.83 × 10−9 | 4.32 × 10−9 | |
SSA | 5.30 × 10−24 | 6.82 × 10−18 | 2.52 × 10−21 | 2.51 × 10−20 | 1.24 × 10−18 | |
GWO | 1.32 × 10−2 | 9.00 × 10−2 | 3.70 × 10−2 | 4.16 × 10−2 | 1.96 × 10−2 |
Parameters | Mean | Median | SD | Variance | Min | Max | Skewness |
---|---|---|---|---|---|---|---|
Water (kg/m3) | 181.57 | 185.00 | 21.34 | 455.56 | 121.80 | 247.00 | 0.07 |
Cement (kg/m3) | 281.17 | 272.90 | 104.46 | 10,910.98 | 102 | 540 | 0.51 |
Fine aggregate (kg/m3) | 773.58 | 779.50 | 80.14 | 6421.95 | 594.00 | 992.60 | −0.25 |
Coarse aggregate (kg/m3) | 972.92 | 968.00 | 77.72 | 6039.81 | 801.00 | 1145.00 | −0.04 |
Fly ash (kg/m3) | 54.19 | 0.00 | 63.97 | 4091.64 | 0.00 | 200.10 | 0.54 |
Slag (kg/m3) | 73.90 | 22.00 | 86.24 | 7436.90 | 0.00 | 359.40 | 0.80 |
Superplastic (kg/m3) | 6.20 | 6.40 | 5.97 | 35.65 | 0.00 | 32.20 | 0.91 |
Age (days) | 45.66 | 28.00 | 63.14 | 3986.56 | 1.00 | 365.00 | 3.26 |
Strength (MPa) | 35.82 | 34.45 | 16.70 | 278.81 | 2.33 | 82.60 | 0.42 |
Cement | Slag | Fly Ash | Water | Superplasticizer | Coarse Aggregate | Fine Aggregate | Age | Strength | |
---|---|---|---|---|---|---|---|---|---|
CEMENT | 1.000 | ||||||||
SLAG | −0.275 *** | 1.000 | |||||||
FLY ASH | −0.397 *** | −0.324 *** | 1.000 | ||||||
WATER | −0.082 *** | 0.107 *** | −0.257 *** | 1.000 | |||||
SUPERPLASTICIZER | 0.093 *** | 0.043 | 0.377 *** | −0.657 *** | 1.000 | ||||
COARSE AGGREGATE | −0.109 *** | −0.284 *** | −0.010 | −0.182 *** | −0.266 *** | 1.000 | |||
FINE AGGREGATE | −0.223 *** | −0.282 *** | 0.079 *** | −0.451 *** | 0.223 *** | −0.179 *** | 1.000 | ||
AGE | 0.082 *** | −0.044 | −0.154 *** | 0.278 *** | −0.193 *** | −0.003 | −0.156 *** | 1.000 | |
STRENGTH | 0.498 *** | 0.135 *** | −0.106 *** | −0.290 *** | 0.366 *** | −0.165 *** | −0.167 *** | 0.329 *** | 1.000 |
Hyperparameter Name | Hyperparameter Value |
---|---|
Learning rate | 0.01 |
Epochs | 100 |
Max fail | 6 |
Activation function | ReLU |
Optimization algorithm | trainlm |
Batch size | 64 |
Model | Ratio | Training RMSE | Training MAE | Training R2 | Test RMSE | Test MAE | Test R2
---|---|---|---|---|---|---|---
ISSA-BPNN-AdaBoost | 7:3 | 3.634 | 2.785 | 0.957 | 4.565 | 3.213 | 0.945
8:2 | 3.524 | 2.582 | 0.971 | 3.548 | 2.954 | 0.964 | |
9:1 | 3.855 | 2.767 | 0.953 | 4.675 | 3.332 | 0.937 | |
AdaBoost | 7:3 | 4.332 | 3.238 | 0.932 | 5.196 | 4.003 | 0.906 |
8:2 | 3.985 | 2.962 | 0.945 | 4.746 | 3.450 | 0.904 | |
9:1 | 4.399 | 3.293 | 0.932 | 4.475 | 3.391 | 0.915 | |
XGBoost | 7:3 | 4.096 | 3.095 | 0.941 | 5.868 | 4.606 | 0.895 |
8:2 | 3.939 | 3.106 | 0.942 | 4.706 | 3.584 | 0.904 | |
9:1 | 4.286 | 3.306 | 0.934 | 6.075 | 4.726 | 0.867 | |
RF | 7:3 | 4.044 | 3.115 | 0.923 | 6.198 | 4.673 | 0.871 |
8:2 | 3.884 | 3.080 | 0.927 | 5.436 | 3.970 | 0.886 | |
9:1 | 3.827 | 2.830 | 0.939 | 5.472 | 3.857 | 0.869 |
Model | Ratio | Training RMSE | Training MAE | Training R2 | Test RMSE | Test MAE | Test R2
---|---|---|---|---|---|---|---
ISSA-BPNN | 7:3 | 4.916 | 3.475 | 0.912 | 6.223 | 4.810 | 0.886 |
8:2 | 5.015 | 3.787 | 0.921 | 4.741 | 3.590 | 0.912 | |
9:1 | 5.876 | 3.564 | 0.923 | 5.985 | 4.541 | 0.895 | |
BPNN | 7:3 | 6.083 | 4.406 | 0.864 | 6.623 | 5.010 | 0.845 |
8:2 | 5.523 | 4.220 | 0.891 | 5.830 | 4.286 | 0.874 | |
9:1 | 5.294 | 3.922 | 0.898 | 5.938 | 4.654 | 0.887 | |
SVM | 7:3 | 5.619 | 3.939 | 0.885 | 6.635 | 4.627 | 0.848 |
8:2 | 5.602 | 3.895 | 0.886 | 6.766 | 4.778 | 0.843 | |
9:1 | 5.331 | 3.739 | 0.898 | 8.655 | 5.639 | 0.726 | |
CNN | 7:3 | 5.933 | 4.531 | 0.878 | 6.530 | 4.865 | 0.834 |
8:2 | 5.360 | 4.089 | 0.899 | 6.092 | 4.595 | 0.857 | |
9:1 | 5.655 | 4.225 | 0.885 | 6.585 | 5.193 | 0.854 | |
ELM | 7:3 | 7.324 | 5.698 | 0.816 | 7.197 | 5.504 | 0.792 |
8:2 | 7.129 | 5.497 | 0.810 | 8.363 | 6.384 | 0.782 | |
9:1 | 7.231 | 5.554 | 0.815 | 7.719 | 5.830 | 0.753 | |
LSTM | 7:3 | 7.315 | 5.694 | 0.813 | 7.139 | 5.434 | 0.805 |
8:2 | 6.400 | 4.969 | 0.851 | 6.895 | 5.358 | 0.837 | |
9:1 | 7.105 | 5.595 | 0.821 | 7.035 | 5.724 | 0.789 |
Parameters | Mean | Median | SD | Variance | Min | Max | Skewness |
---|---|---|---|---|---|---|---|
Water (kg/m3) | 202.81 | 199.75 | 12.82 | 164.37 | 178.50 | 229.5 | 0.11 |
Cement (kg/m3) | 433.88 | 450.00 | 34.81 | 1211.73 | 350.00 | 475 | −0.67 |
Fine aggregate (kg/m3) | 524.31 | 526.50 | 69.38 | 4813.32 | 175.95 | 641.75 | −1.66 |
Coarse aggregate (kg/m3) | 1050.88 | 1096.50 | 134.51 | 18,094.07 | 798.00 | 1253.75 | −0.57 |
Fly ash (kg/m3) | 24.03 | 0.00 | 32.64 | 1065.44 | 0.00 | 71.25 | 0.63 |
Strength (MPa) | 44.37 | 44.08 | 5.21 | 27.17 | 31.66 | 54.49 | 0.07 |
Water | Cement | Fine Aggregate | Coarse Aggregate | Fly Ash | Strength | |
---|---|---|---|---|---|---|
WATER | 1.000 | |||||
CEMENT | 0.503 *** | 1.000 | ||||
FINE AGGREGATE | 0.510 *** | 0.008 | 1.000 | |||
COARSE AGGREGATE | −0.289 ** | −0.351 *** | 0.193 * | 1.000 | ||
FLY ASH | 0.089 | 0.386 *** | −0.027 | −0.140 | 1.000 | |
STRENGTH | −0.173 | 0.505 *** | −0.317 *** | −0.073 | −0.361 *** | 1.000 |
Model | Training RMSE | Training MAE | Training R2 | Test RMSE | Test MAE | Test R2
---|---|---|---|---|---|---
ISSA-BPNN-AdaBoost | 0.752 | 0.551 | 0.982 | 0.968 | 0.756 | 0.970 |
ISSA-BPNN | 1.007 | 0.649 | 0.963 | 1.275 | 1.041 | 0.932 |
BPNN | 1.718 | 1.348 | 0.883 | 1.867 | 1.587 | 0.882 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Li, P.; Zhang, Z.; Gu, J. Prediction of Concrete Compressive Strength Based on ISSA-BPNN-AdaBoost. Materials 2024, 17, 5727. https://doi.org/10.3390/ma17235727