Use of Unmanned Aerial Vehicle Imagery and Deep Learning UNet to Extract Rice Lodging
Figure 1. Schematic diagram of the study area.
Figure 2. Data preprocessing: (a) the resulting RGB image; (b) the resulting multispectral image.
Figure 3. Sample dataset; the manually processed ground-truth images are displayed as binary images: (a) RGB image dataset, displayed as a color image; (b) multispectral image dataset, displayed as a color image and a pseudo-color image.
Figure 4. Improved UNet architecture. Conv is a convolutional layer (3 × 3 kernel, padding 1), BN is batch normalization, ReLU is the activation layer, MaxPool is the maximum pooling layer, and UpConv is a transposed convolution.
Figure 5. Loss and dice curves of the RGB image dataset model.
Figure 6. Loss and dice curves of the Red, Green, NIR (RGN) image dataset model.
Figure 7. Test dataset segmentation results: (a) RGB image test dataset; (b) multispectral image test dataset.
Figure 8. RGB original image (left), segmentation result (middle), and manually processed ground-truth image (right).
Figure 9. Multispectral original image (left), segmentation result (middle), and manually processed ground-truth image (right).
Figure 10. Field markers: (a) field marker in the RGB image; (b) field marker in the multispectral image.
Figure 11. Comparison of different sensor applications: (a) loss comparison; (b) dice comparison.
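The Figure 4 caption names the building blocks of the improved UNet. Below is a minimal PyTorch sketch of how such blocks typically compose; the two-convolutions-per-stage layout and all channel widths are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Two (Conv 3x3, padding 1) -> BN -> ReLU units, as named in the Figure 4 caption."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class DecoderStage(nn.Module):
    """UpConv (transposed convolution) + skip concatenation + ConvBlock."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        # after concatenation: out_ch (upsampled) + out_ch (skip) = in_ch when in_ch = 2 * out_ch
        self.conv = ConvBlock(in_ch, out_ch)

    def forward(self, x, skip):
        x = self.up(x)
        return self.conv(torch.cat([x, skip], dim=1))

# e.g. enc = ConvBlock(3, 64); dec = DecoderStage(128, 64)  # channel widths are illustrative
```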
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data Acquisition
2.3. Data Preprocessing
2.4. Dataset Generation
2.5. Model Establishment and Evaluation
2.6. Vegetation Index Calculation
3. Results
3.1. Model Training
3.2. Model Prediction
3.3. Damage Assessment
3.4. Vegetation Indices as Inputs of UNet
4. Discussion
4.1. Influence of Feature Selection on the Recognition Accuracy of Lodged Rice
4.2. Influence of Different Sensors on the Recognition Accuracy of Rice Lodging
4.3. Influence of Different External Environmental Factors on the Recognition Accuracy of Rice Lodging
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Juliano, B.O. Rice in Human Nutrition; International Rice Research Institute: Los Banos, Philippines, 1993.
- Setter, T.; Laureles, E.; Mazaredo, A. Lodging reduces yield of rice by self-shading and reductions in canopy photosynthesis. Field Crop. Res. 1997, 49, 95–106.
- Zhong-xu, Z.; Wenfu, C.; Zhenyu, Y. Effect of Lodging Resistance on Yield and Its Relationship with Stalk Physical Characteristics. J. Shenyang Agric. Univ. 1999, 29, 6–11.
- Hitaka, H. Studies on the lodging of rice plants. Jpn. Agric. Res. Q. 1969, 4, 1–6.
- Li, R.; Tingbo, J.; Taiquan, Q. Study on Effect of Lodging to Yield and Relationship between Lodging and Plant Height in Rice. Heilongjiang Agric. Sci. 1996, 1, 13–17.
- Lang, Y.; Yang, X.; Wang, M.; Zhu, Q. Effects of lodging at different filling stages on rice yield and grain quality. Rice Sci. 2012, 19, 315–319.
- Chu, T.; Starek, M.; Brewer, M.; Murray, S.; Pruter, L. Assessing lodging severity over an experimental maize (Zea mays L.) field using UAS images. Remote Sens. 2017, 9, 923.
- Kumpumäki, T.; Linna, P.; Lipping, T. Crop Lodging Analysis from UAS Orthophoto Mosaic, Sentinel-2 Image and Crop Yield Monitor Data. In Proceedings of the 2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 7723–7726.
- Han, D.; Yang, H.; Yang, G.; Qiu, C. Monitoring model of maize lodging based on Sentinel-1 radar image. Trans. Chin. Soc. Agric. Eng. 2018, 34, 166–172.
- Wang, L.; Gu, X.; Hu, S.; Yang, G.; Wang, L.; Fan, Y.; Wang, Y. Remote Sensing Monitoring of Maize Lodging Disaster with Multi-Temporal HJ-1B CCD Image. Sci. Agric. Sin. 2016, 49, 4120–4129.
- Li, Z.; Chen, Z.; Wang, L.; Liu, J.; Zhou, Q. Area extraction of maize lodging based on remote sensing by small unmanned aerial vehicle. Trans. Chin. Soc. Agric. Eng. 2014, 30, 207–213.
- Yang, M.-D.; Huang, K.-S.; Kuo, Y.-H.; Tsai, H.; Lin, L.-M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583.
- Jin, X.; Jie, L.; Wang, S.; Qi, H.; Li, S. Classifying wheat hyperspectral pixels of healthy heads and Fusarium head blight disease using a deep neural network in the wild field. Remote Sens. 2018, 10, 395.
- Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using deep learning for image-based plant disease detection. Front. Plant Sci. 2016, 7, 1419.
- Morales, G.; Kemper, G.; Sevillano, G.; Arteaga, D.; Ortega, I.; Telles, J. Automatic Segmentation of Mauritia flexuosa in Unmanned Aerial Vehicle (UAV) Imagery Using Deep Learning. Forests 2018, 9, 736.
- Zeggada, A.; Melgani, F.; Bazi, Y. A deep learning approach to UAV image multilabeling. IEEE Geosci. Remote Sens. Lett. 2017, 14, 694–698.
- Yuan, P.; Li, W.; Ren, S.; Xu, H. Recognition for flower type and variety of chrysanthemum with convolutional neural network. Trans. Chin. Soc. Agric. Eng. 2018, 34, 152–158.
- Tang, J.; Wang, D.; Zhang, Z.; He, L.; Xin, J.; Xu, Y. Weed identification based on K-means feature learning combined with convolutional neural network. Comput. Electron. Agric. 2017, 135, 63–70.
- Chen, S.W.; Shivakumar, S.S.; Dcunha, S.; Das, J.; Okon, E.; Qu, C.; Taylor, C.J.; Kumar, V. Counting apples and oranges with deep learning: A data-driven approach. IEEE Robot. Autom. Lett. 2017, 2, 781–788.
- Li, W.; Fu, H.; Yu, L.; Cracknell, A. Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sens. 2016, 9, 22.
- Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318.
- Uzal, L.C.; Grinblat, G.L.; Namías, R.; Larese, M.G.; Bianchi, J.; Morandi, E.; Granitto, P.M. Seed-per-pod estimation for plant breeding using deep learning. Comput. Electron. Agric. 2018, 150, 196–204.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, 13–15 May 2010; pp. 249–256.
- Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269.
- Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine vision detection parameters for plant species identification. Precis. Agric. Biol. Qual. 1999, 3543, 327–335.
- Wang, X.; Wang, M.; Wang, S.; Wu, Y. Extraction of vegetation information from visible unmanned aerial vehicle images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 152–159.
- Gamon, J.; Surfus, J. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117.
- Hunt, E.R.; Cavigelli, M.; Daughtry, C.S.; McMurtrey, J.E.; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 2005, 6, 359–378.
- Sun, G.; Wang, X.; Yan, T.; Li, X.; Chen, M.; Shi, Y.; Chen, J. Inversion method of flora growth parameters based on machine vision. Trans. Chin. Soc. Agric. Eng. 2014, 30, 187–195.
- Rouse, J.W., Jr.; Haas, R.; Schell, J.; Deering, D. Monitoring Vegetation Systems in the Great Plains with ERTS; NASA: Washington, DC, USA, 1974.
- Pearson, R.L.; Miller, L.D. Remote mapping of standing crop biomass for estimation of the productivity of the shortgrass prairie. Remote Sens. Environ. VIII 1972, 1355.
- McFeeters, S.K. The use of the Normalized Difference Water Index (NDWI) in the delineation of open water features. Int. J. Remote Sens. 1996, 17, 1425–1432.
- Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666.
- Richardson, A.J.; Wiegand, C. Distinguishing vegetation from soil background information. Photogramm. Eng. Remote Sens. 1977, 43, 1541–1552.
- Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
- Wilke, N.; Siegmann, B.; Klingbeil, L.; Burkart, A.; Kraska, T.; Muller, O.; van Doorn, A.; Heinemann, S.; Rascher, U. Quantifying Lodging Percentage and Lodging Severity Using a UAV-Based Canopy Height Model Combined with an Objective Threshold Approach. Remote Sens. 2019, 11, 515.
- Murakami, T.; Yui, M.; Amaha, K. Canopy height measurement by photogrammetric analysis of aerial images: Application to buckwheat (Fagopyrum esculentum Moench) lodging evaluation. Comput. Electron. Agric. 2012, 89, 70–75.
- De Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Height estimation of sugarcane using an unmanned aerial system (UAS) based on structure from motion (SfM) point clouds. Int. J. Remote Sens. 2017, 38, 2218–2230.
- Chu, T.; Starek, M.J.; Brewer, M.J.; Masiane, T.; Murray, S.C. UAS imaging for automated crop lodging detection: A case study over an experimental maize field. Auton. Air Ground Sens. Syst. Agric. Optim. Phenotyping II 2017, 10218, 102180E.
Type | Vegetation Index | Calculation Formula | References |
---|---|---|---|
Visible vegetation index | ExG (Excess Green) | 2g - r - b | [25] |
 | ExR (Excess Red) | 1.4R - G | [26] |
 | VDVI (Visible-Band Difference Vegetation Index) | (2G - R - B)/(2G + R + B) | [27] |
 | RGRI (Red-Green Ratio Index) | R/G | [28] |
 | NGRDI (Normalized Green-Red Difference Index) | (G - R)/(G + R) | [29] |
 | ExGR (Excess Green minus Excess Red) | ExG - ExR | [30] |
Multispectral vegetation index | NDVI (Normalized Difference Vegetation Index) | (NIR - R)/(NIR + R) | [31] |
 | RVI (Ratio Vegetation Index) | NIR/R | [32] |
 | NDWI (Normalized Difference Water Index) | (G - NIR)/(G + NIR) | [33] |
 | DVI (Difference Vegetation Index) | NIR - R | [34] |
 | PVI (Perpendicular Vegetation Index) | (NIR - aR - b)/√(a² + 1) | [35] |
 | SAVI (Soil-Adjusted Vegetation Index) | (1 + L)(NIR - R)/(NIR + R + L) | [36] |

Lowercase r, g, b denote normalized (chromaticity) band values. In the PVI formula, a and b are the slope and intercept of the soil line [35]; in SAVI, L is the soil adjustment factor, commonly 0.5 [36].
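To make the index formulas concrete, the sketch below computes a few of them with NumPy from per-band float arrays. The function names and the small epsilon guard against zero denominators are illustrative assumptions, not from the paper.

```python
import numpy as np

def normalized_rgb(r, g, b):
    """Chromaticity coordinates: the lowercase r, g, b used in the ExG formula."""
    total = r + g + b + 1e-8          # epsilon avoids division by zero
    return r / total, g / total, b / total

def excess_green(r, g, b):
    """ExG = 2g - r - b, computed on normalized chromaticities."""
    rn, gn, bn = normalized_rgb(r, g, b)
    return 2 * gn - rn - bn

def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red + 1e-8)

def savi(nir, red, L=0.5):
    """SAVI = (1 + L)(NIR - R) / (NIR + R + L); L is the soil adjustment factor."""
    return (1 + L) * (nir - red) / (nir + red + L)
```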
Dataset | Training Dice Coefficient | Validation Dice Coefficient |
---|---|---|
RGB | 0.9468 | 0.9382 |
Multispectral | 0.9291 | 0.9222 |
Dataset | Loss | Dice Coefficient |
---|---|---|
RGB | 0.0961 | 0.9442 |
Multispectral | 0.1188 | 0.9284 |
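The dice coefficient reported in these tables measures the overlap between the predicted lodging mask A and the ground-truth mask B: Dice = 2|A ∩ B| / (|A| + |B|). A minimal NumPy sketch, assuming binary masks as input:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum() + eps)
```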
Dataset | Lodging Ratio |
---|---|
RGB | 54.71% |
Multispectral | 61.26% |
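The lodging ratio above is the share of pixels classified as lodged. A minimal sketch, assuming a binary NumPy mask that covers only the paddy area:

```python
import numpy as np

def lodging_ratio(mask: np.ndarray) -> float:
    """Fraction of pixels labeled lodged (1) in a binary mask of the paddy area."""
    return float(mask.sum()) / mask.size
```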
Vegetation Index Type | Dataset | Loss | Dice Coefficient |
---|---|---|---|
3 indices | RGB | 0.1144 | 0.9396 |
3 indices | Multispectral | 0.1534 | 0.9158 |
6 indices | RGB | 0.1047 | 0.9348 |
6 indices | Multispectral | 0.1207 | 0.9270 |
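This comparison feeds stacks of vegetation-index maps (three or six indices) to UNet in place of raw bands. Below is a minimal sketch of assembling such a multi-channel input; the min-max normalization and the particular indices named in the usage comment are assumptions, not the paper's stated choices.

```python
import numpy as np

def stack_index_channels(index_maps):
    """Min-max normalize each vegetation-index map and stack them
    into a (channels, height, width) array for the network input."""
    channels = []
    for m in index_maps:
        span = m.max() - m.min() + 1e-8   # guard against flat maps
        channels.append((m - m.min()) / span)
    return np.stack(channels, axis=0).astype(np.float32)

# Hypothetical 3-index input built from the functions sketched earlier:
# x = stack_index_channels([excess_green(r, g, b), ndvi(nir, red), savi(nir, red)])
```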
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Zhao, X.; Yuan, Y.; Song, M.; Ding, Y.; Lin, F.; Liang, D.; Zhang, D. Use of Unmanned Aerial Vehicle Imagery and Deep Learning UNet to Extract Rice Lodging. Sensors 2019, 19, 3859. https://doi.org/10.3390/s19183859