Prediction of Useful Eggplant Seedling Transplants Using Multi-View Images
Figure 1. Flow diagram of the information processing process.
Figure 2. Schematic diagram of the image acquisition device comprising (A) computer (processes image data and performs 3D reconstruction), (B) camera (image collection), (C) rotation platform (rotates while carrying a seedling), and (D) eggplant seedling (experimental material).
Figure 3. Point cloud reconstruction and preprocessing results: A(1)–D(1) primary seedlings; A(2)–D(2) secondary seedlings; A(3)–D(3) unhealthy seedlings. A(1)–A(3) point cloud plants after 3D reconstruction; B(1)–B(3) results of fast Euclidean clustering; C(1)–C(3) results of colour-threshold filtering; D(1)–D(3) results of voxel filtering.
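Two of the preprocessing steps named in this caption, colour-threshold filtering and voxel filtering, can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the `green_margin` threshold and `voxel_size` values are assumptions chosen for the example.

```python
import numpy as np

def colour_threshold_filter(points, colours, green_margin=0.05):
    """Keep points whose green channel dominates red and blue.
    `green_margin` is an illustrative threshold, not the paper's value."""
    r, g, b = colours[:, 0], colours[:, 1], colours[:, 2]
    mask = (g > r + green_margin) & (g > b + green_margin)
    return points[mask], colours[mask]

def voxel_downsample(points, voxel_size=0.01):
    """Replace all points falling inside one voxel by their centroid."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel index and average each group.
    _, inverse, counts = np.unique(idx, axis=0,
                                   return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]
```

In practice a point-cloud library (e.g. Open3D's built-in voxel downsampling) would replace the hand-rolled version, but the grouping-by-voxel-index idea is the same.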
Figure 4. Completion of missing point clouds: (A) plant containing missing leaves; (B) segmented incomplete leaves; (C) missing leaves with the RGB data removed; (D) missing section (purple) generated by PF-Net prediction; (E) completed leaves; (F) entire plant after point cloud completion.
Figure 5. Fitting performance of phenotype values extracted from the 3D point cloud against manual measurements (actual vs. predicted): (A) primary seedling plant height; (B) primary seedling stem diameter; (C) primary seedling number of leaves (random deviation applied in the x- and y-axis directions for identical values); (D) secondary seedling plant height; (E) secondary seedling stem diameter; (F) secondary seedling number of leaves (random deviation applied in the x- and y-axis directions for identical values).
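The caption describes actual-vs-predicted scatter plots, with random x/y deviation ("jitter") applied so that identical integer leaf counts do not overplot. A minimal sketch of both ingredients, the coefficient of determination and the jitter, is below; the `scale` value is an illustrative assumption, not the paper's setting.

```python
import numpy as np

def r_squared(actual, predicted):
    """Coefficient of determination for an actual-vs-predicted fit."""
    actual = np.asarray(actual, float)
    predicted = np.asarray(predicted, float)
    ss_res = np.sum((actual - predicted) ** 2)
    ss_tot = np.sum((actual - actual.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def jitter(values, scale=0.05, seed=0):
    """Add small uniform random offsets so repeated integer values
    (e.g. numbers of leaves) spread out instead of stacking at one point."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, float)
    return values + rng.uniform(-scale, scale, size=values.shape)
```

Jitter is purely cosmetic: it is applied to the plotted coordinates only, never to the values used to compute the fit statistics.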
Figure 6. Box plots of the data distribution for each primary and secondary seedling parameter: (A) number of leaves; (B) plant height; (C) stem diameter.
Figure 7. Model convergence during testing: (A) accuracy variation comparison; (B) loss variation comparison.
Figure 8. Confusion matrices of the different models.
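The confusion matrices in Figure 8 tabulate per-class predictions. A minimal sketch of how such a matrix is accumulated is below; the three class indices stand in for the paper's primary, secondary, and unhealthy seedling categories, and the code is illustrative rather than the authors' implementation.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes=3):
    """Rows index the true class, columns the predicted class.
    Entry (i, j) counts samples of class i predicted as class j."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m
```

Diagonal entries are correct classifications; per-class precision and recall follow from the column and row sums respectively.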
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Materials
2.2. Experiment Design
2.3. Image Acquisition
2.4. 2D Image Classification Models
2.5. 3D Reconstruction Based on SfM
2.6. Point Cloud Preprocessing
2.6.1. Fast Euclidean Clustering Algorithm for Image Background Removal
2.6.2. Point Cloud Filtering Based on Colour Threshold
2.6.3. Point Cloud Down Sampling Based on Voxel Filtering
2.6.4. Point Cloud Completion through a Point Fractal Network (PF-Net)
2.7. Phenotypic Feature Extraction Based on 3D Point Cloud
2.7.1. Phenotype Calculation Method
2.7.2. Point Cloud Segmentation
2.7.3. Point Cloud Coordinate Scale Transformation
2.8. 3D Point Cloud Classification Models
2.8.1. PointNet++ Model
2.8.2. DGCNN Model
2.8.3. PointConv Model
3. Results
3.1. Point Cloud Preprocessing Results
3.2. Phenotypic Feature Extraction Results
3.3. 2D Image and 3D Point Cloud Classification Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Model | Average Accuracy (%) | Average Precision (%) | Average Recall (%) | Average F1-Score (%) |
---|---|---|---|---|
ResNet50 | 92.18 | 92.18 | 92.19 | 92.19 |
VGG16 | 93.23 | 93.23 | 93.23 | 93.23 |
MobileNetV2 | 91.67 | 91.67 | 91.68 | 91.67 |
PointNet++ | 94.44 | 94.44 | 94.53 | 94.49 |
DGCNN | 93.06 | 93.06 | 93.10 | 93.08 |
PointConv | 95.83 | 95.83 | 95.88 | 95.86 |
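The F1 scores in the table can be cross-checked from the precision and recall columns, since F1 is their harmonic mean. The sketch below verifies the PointConv row; the small discrepancy with the printed 95.86 comes from the table's values already being rounded to two decimals.

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall (both in percent)."""
    return 2 * precision * recall / (precision + recall)

# Cross-check the PointConv row: precision 95.83, recall 95.88.
print(round(f1_score(95.83, 95.88), 2))  # ≈ 95.85, vs. 95.86 in the table
```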
Model | Average Score (%) | 95% Lower (%) | 95% Upper (%) |
---|---|---|---|
ResNet50 | 92.19 | 91.50 | 92.87 |
VGG16 | 93.23 | 92.58 | 93.88 |
MobileNetV2 | 91.67 | 90.95 | 92.40 |
PointNet++ | 94.48 | 93.82 | 95.15 |
DGCNN | 93.07 | 92.38 | 93.78 |
PointConv | 95.85 | 95.30 | 96.41 |
Model Comparison | t-Statistic | p-Value |
---|---|---|
PointConv vs. ResNet50 | 3.46 | 0.002 |
PointConv vs. VGG16 | 2.81 | 0.01 |
PointConv vs. MobilNetV2 | 4.13 | 0.001 |
PointConv vs. PointNet++ | 2.16 | 0.037 |
PointConv vs. DGCNN | 3.51 | 0.002 |
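The table reports paired t-tests between PointConv and each baseline. A minimal sketch of the paired t-statistic over matched per-run scores is below; the sample scores in the comment are invented placeholders, not the paper's fold accuracies, and a library routine such as `scipy.stats.ttest_rel` would normally supply the p-value as well.

```python
import math

def paired_t_statistic(scores_a, scores_b):
    """Paired t-statistic: t = mean(d) / (s_d / sqrt(n)),
    where d_i = a_i - b_i are the matched per-run differences."""
    d = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Placeholder accuracies for two models over four runs (not the paper's data):
# paired_t_statistic([96, 95, 97, 96], [92, 93, 93, 94])  # ≈ 5.20
```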
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Yuan, X.; Liu, J.; Wang, H.; Zhang, Y.; Tian, R.; Fan, X. Prediction of Useful Eggplant Seedling Transplants Using Multi-View Images. Agronomy 2024, 14, 2016. https://doi.org/10.3390/agronomy14092016