The Optimal Phenological Phase of Maize for Yield Prediction with High-Frequency UAV Remote Sensing
Figure 1. A brief framework for maize yield prediction in this study (GLCM = grey level co-occurrence matrix; V2, V4, V6, V14, VT, R1, R3, R4, and R6 denote the second-leaf, fourth-leaf, sixth-leaf, fourteenth-leaf, tasseling, silking, milking, dough, and maturity stages, respectively; UAV = unmanned aerial vehicle; D = DOY = day of year).
Figure 2. (a) Location of the YCES of the Chinese Academy of Sciences; (b) overview of the NBES field; (c) overview of the WNCR field. (N, P, and K denote nitrogen, phosphate, and potash fertilizer, respectively, and S denotes returning all straw to the field; CK is no NPK fertilizer; N0, N70, N140, N210, and N280 denote that 0, 70, 140, 210, and 280 kg·N·ha−1 were applied in each growing season, respectively; L and H denote 100 mm and 140 mm irrigation, respectively.)
Figure 3. Observed grain yield under varying nutrient treatments: (a) NBES plots; WNCR plots across various (b) nitrogen and (c) irrigation levels.
Figure 4. (1) Importance values of the top 20 ranked variables for the yield prediction model and (2) RMSE variation (t·ha−1) as the number of input variables increases, using each of the nine mono-temporal UAV datasets: (a) second-leaf, (b) fourth-leaf, (c) sixth-leaf, (d) fourteenth-leaf, (e) tasseling, (f) silking, (g) milking, (h) dough, and (i) maturity stages. The importance values were normalized by the value of the first-ranked variable for each stage [58]. Texture indices (TIs) calculated from the blue, green, red, red edge, and NIR bands are denoted TI-Blue, TI-Green, TI-Red, TI-Edge, and TI-NIR, respectively. con = contrast, dis = dissimilarity, cor = correlation, sem = second moment, ent = entropy, homo = homogeneity, var = variance.
Figure 5. Yield prediction accuracy using different mono-temporal UAV data: (a) R² and (b) RMSE (t·ha−1) values.
Figure 6. Compositions of the eight multi-temporal UAV datasets. (Red/grey denotes that a mono-temporal dataset is included/excluded in the multi-temporal dataset; yellow dots denote the mono-temporal datasets selected after feature variable screening, as described in Section 3.2.2.)
Figure 7. (1) Importance values of the top 20 ranked variables for the prediction model and (2) RMSE variation (t·ha−1) as the number of input variables increases, using each of the eight groups of multi-temporal UAV data: (a) group 1; (b) group 2; (c) group 3; (d) group 4; (e) group 5; (f) group 6; (g) group 7; (h) group 8. The subscript of a variable denotes the sequence number of the UAV acquisition, ordered by acquisition date; for example, NDVI1 denotes the NDVI generated from the first campaign, i.e., at the second-leaf stage. dis = dissimilarity, homo = homogeneity, con = contrast, var = variance, ent = entropy, sem = second moment.
Figure 8. Yield prediction accuracy using different multi-temporal UAV data: (a) R² and (b) RMSE (t·ha−1) values.
Abstract
1. Introduction
2. Materials and Methods
- (1) UAV-based multispectral observations with high frequency were conducted at nine development stages of maize (Section 2.2.1).
- (2) Pre-processing of the UAV multispectral data to generate orthomosaics (Section 2.2.2).
- (3) Calculation of the UAV-derived spectral and texture variables used as inputs to the RF model (Section 2.2.3).
- (4) Running the RF model for yield prediction using mono-temporal data (Section 2.4.1) and multi-temporal data (Section 2.4.2), respectively.
- (5) Assessing the performance of the prediction models and determining the optimal UAV observation time (Section 2.5) (Figure 1). A minimal code sketch of steps (4) and (5) is given after this list.
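The RF-based prediction and feature screening in steps (4) and (5) can be illustrated with a short sketch. This is not the authors' implementation: it assumes scikit-learn and pandas are available, and the file name `plot_features.csv`, the column `yield_t_ha`, and the leave-one-out split are illustrative placeholders for the per-plot UAV variables, observed yields, and validation scheme.

```python
# Hedged sketch of RF yield prediction with importance-based forward screening.
# Assumptions (not from the paper): data live in "plot_features.csv" with one
# row per plot and a "yield_t_ha" column holding observed grain yield (t/ha).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

df = pd.read_csv("plot_features.csv")
y = df["yield_t_ha"].to_numpy()
X = df.drop(columns=["yield_t_ha"])           # spectral indices + texture indices

# Step 1: rank variables by RF importance, normalized by the top-ranked variable
# (mirroring the normalization described in the Figure 4 caption).
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X, y)
ranking = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
print((ranking / ranking.iloc[0]).head(10))

# Step 2: forward screening -- add variables in ranked order and track the
# leave-one-out cross-validated RMSE as the number of inputs grows.
loo = LeaveOneOut()
rmse_by_size = {}
for k in range(1, min(20, len(ranking)) + 1):
    cols = ranking.index[:k]
    pred = cross_val_predict(RandomForestRegressor(n_estimators=500, random_state=0),
                             X[cols], y, cv=loo)
    rmse_by_size[k] = float(np.sqrt(mean_squared_error(y, pred)))

# Step 3: keep the variable subset with the lowest RMSE and report accuracy.
best_k = min(rmse_by_size, key=rmse_by_size.get)
pred = cross_val_predict(RandomForestRegressor(n_estimators=500, random_state=0),
                         X[ranking.index[:best_k]], y, cv=loo)
print(f"{best_k} variables: R2 = {r2_score(y, pred):.2f}, "
      f"RMSE = {rmse_by_size[best_k]:.2f} t/ha")
```

Leave-one-out validation is used here only because plot-level samples in such field trials are typically few; the study's exact validation scheme may differ.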
2.1. Study Area
2.2. UAV Multispectral Data Acquisition and Processing
2.2.1. UAV Flight Campaign
2.2.2. Pre-Processing of UAV-Based Multispectral Remote Sensing Data
2.2.3. Spectral and Texture Variable Calculations
2.3. Ground Measurement of Maize Yield
2.4. Data-Driven Model for Yield Prediction
2.4.1. Implementation of Mono-Temporal UAV Data
2.4.2. Implementation of Multi-Temporal UAV Data
2.5. Performance Analysis
3. Results
3.1. Estimating Maize Yield Using Mono-Temporal UAV Data
3.1.1. Feature Variables Screening of Mono-Temporal UAV Data
3.1.2. Performance of the Prediction Model Using Mono-Temporal UAV Data
3.2. Estimating Maize Yield Using Multi-Temporal UAV Data
3.2.1. Generation of Multi-Temporal UAV Groups
3.2.2. Feature Variables Screening of Multi-Temporal UAV Data
3.2.3. Performance of the Prediction Model Using Multi-Temporal UAV Data
4. Discussion
4.1. Selection of Feature Variables
4.2. The Optimal Growth Stage for UAV-Based Maize Yield Prediction
4.3. The Optimal Combination of UAV Observation Time for Maize Yield Prediction
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A. Calculations of UAV Texture Indices
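Appendix A lists the GLCM texture index formulas; as a hedged illustration of how such indices can be computed per spectral band, the sketch below uses scikit-image (>= 0.19, where the functions are named graycomatrix/graycoprops). The function name `glcm_textures`, the single distance/angle setting, and the 8-bit rescaling are illustrative assumptions, not the study's exact configuration.

```python
# Hedged sketch: GLCM texture measures for one UAV band (abbreviations follow
# the figure captions: con, dis, cor, sem, ent, homo, var, plus the GLCM mean).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_textures(band_8bit, distance=1, angle=0.0, levels=256):
    """band_8bit: 2-D uint8 array (one band of a plot, rescaled to 0-255)."""
    glcm = graycomatrix(band_8bit, distances=[distance], angles=[angle],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                                  # normalized GLCM
    i, _ = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mean = float(np.sum(i * p))
    return {
        "mean": mean,
        "var": float(np.sum((i - mean) ** 2 * p)),           # GLCM variance
        "ent": float(-np.sum(p[p > 0] * np.log(p[p > 0]))),  # entropy
        "con": float(graycoprops(glcm, "contrast")[0, 0]),
        "dis": float(graycoprops(glcm, "dissimilarity")[0, 0]),
        "homo": float(graycoprops(glcm, "homogeneity")[0, 0]),
        "sem": float(graycoprops(glcm, "ASM")[0, 0]),         # second moment
        "cor": float(graycoprops(glcm, "correlation")[0, 0]),
    }
```

In practice these measures would be computed for each band (giving TI-Blue, TI-Green, TI-Red, TI-Edge, and TI-NIR) and aggregated per plot before being fed to the RF model.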
References
- Wang, L.; Tian, Y.; Yao, X.; Zhu, Y.; Cao, W. Predicting grain yield and protein content in wheat by fusing multi-sensor and multi-temporal remote-sensing images. Field Crops Res. 2014, 164, 178–188. [Google Scholar] [CrossRef]
- Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
- Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y.; et al. Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer-a case study of small farmlands in the South of China. Agric. For. Meteorol. 2020, 291, 108096. [Google Scholar] [CrossRef]
- Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
- Furukawa, F.; Maruyama, K.; Saito, Y.K.; Kaneko, M. Corn Height Estimation Using UAV for Yield Prediction and Crop Monitoring. Unmanned Aer. Veh. Appl. Agric. Environ. 2020, 51–69. [Google Scholar] [CrossRef]
- Chivasa, W.; Mutanga, O.; Biradar, C. Application of remote sensing in estimating maize grain yield in heterogeneous African agricultural landscapes: A review. Int. J. Remote Sens. 2017, 38, 6816–6845. [Google Scholar] [CrossRef]
- Leroux, L.; Castets, M.; Baron, C.; Escorihuela, M.; Bégué, A.; Seen, D. Maize yield estimation in West Africa from crop process-induced combinations of multi-domain remote sensing indices. Eur. J. Agron. 2019, 108, 11–26. [Google Scholar] [CrossRef]
- Yao, F.; Tang, Y.; Wang, P.; Zhang, J. Estimation of maize yield by using a process-based model and remote sensing data in the Northeast China Plain. Phys. Chem. Earth Parts A/B/C 2015, 87–88, 142–152. [Google Scholar] [CrossRef]
- Verrelst, J.; Pablo Rivera, J.; Veroustraete, F.; Munoz-Mari, J.; Clevers, J.G.P.W.; Camps-Valls, G.; Moreno, J. Experimental Sentinel-2 LAI estimation using parametric, non-parametric and physical retrieval methods-A comparison. ISPRS J. Photogramm. Remote Sens. 2015, 108, 260–272. [Google Scholar] [CrossRef]
- Xie, Q.; Dash, J.; Huete, A.; Jiang, A.; Yin, G.; Ding, Y.; Peng, D.; Christopher, C.H.; Brown, L.; Shi, Y.; et al. Retrieval of crop biophysical parameters from Sentinel-2 remote sensing imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 80, 187–195. [Google Scholar] [CrossRef]
- Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Crop Yield Prediction Using Multitemporal UAV Data and Spatio-Temporal Deep Learning Models. Remote Sens. 2020, 12, 4000. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Sankey, J.B.; Sankey, T.T.; Li, J.; Ravi, S.; Wang, G.; Caster, J.; Kasprak, A. Quantifying plant-soil-nutrient dynamics in rangelands: Fusion of UAV hyperspectral-LiDAR, UAV multispectral-photogrammetry, and ground-based LiDAR-digital photography in a shrub-encroached desert grassland. Remote Sens. Environ. 2021, 253, 112223. [Google Scholar] [CrossRef]
- Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of Vegetation Indices to Determine Nitrogen Application and Yield Prediction in Maize (Zea mays L.) from a Standard UAV Service. Remote Sens. 2016, 8, 973. [Google Scholar] [CrossRef] [Green Version]
- Hassan, M.A.; Yang, M.; Rasheed, A.; Yang, G.; Reynolds, M.; Xia, X.; Xiao, Y.; He, Z. A rapid monitoring of NDVI across the wheat growth cycle for grain yield prediction using a multi-spectral UAV platform. Plant Sci. 2019, 282, 95–103. [Google Scholar] [CrossRef] [PubMed]
- Silva, E.E.; Baio, F.H.R.; Teodoro, L.P.R.; Junior, C.A.; Borges, R.S.; Teodoro, P.E. UAV-multispectral and vegetation indices in soybean grain yield prediction based on in situ observation. Remote Sens. Appl. Soc. Environ. 2020, 18, 100318. [Google Scholar] [CrossRef]
- Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
- Yang, M.; Hassan, M.A.; Xu, K.; Zheng, C.; Rasheed, A.; Zhang, Y.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Assessment of Water and Nitrogen Use Efficiencies Through UAV-Based Multispectral Phenotyping in Winter Wheat. Front. Plant Sci. 2020, 11, 927. [Google Scholar] [CrossRef]
- Potgieter, A.B.; George-Jaeggli, B.; Chapman, S.C.; Laws, K.; Suárez Cadavid, L.A.; Wixted, J.; Watson, J.; Eldridge, M.; Jordan, D.R.; Hammer, G.L. Multi-spectral imaging from an unmanned aerial vehicle enables the assessment of seasonal leaf area dynamics of sorghum breeding lines. Front. Plant Sci. 2017, 8, 1532. [Google Scholar] [CrossRef]
- Zhang, F.; Zhou, G. Estimation of vegetation water content using hyperspectral vegetation indices: A comparison of crop water indicators in response to water stress treatments for summer maize. BMC Ecol. 2019, 19, 18. [Google Scholar] [CrossRef] [Green Version]
- Li, J.; Shi, Y.; Veeranampalayam-Sivakumar, A.N.; Schachtman, D.P. Elucidating sorghum biomass, nitrogen and chlorophyll contents with spectral and morphological traits derived from unmanned aircraft system. Front. Plant Sci. 2018, 9, 1406. [Google Scholar] [CrossRef] [PubMed]
- Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef] [Green Version]
- Rotili, D.H.; Voil, P.; Eyre, J.; Serafin, L.; Aisthorpe, D.; Maddonni, G.A.; Rodríguez, D. Untangling genotype x management interactions in multi-environment on-farm experimentation. Field Crops Res. 2020, 255, 107900. [Google Scholar] [CrossRef]
- Fu, Y.; Yang, G.; Wang, J.; Song, X.; Feng, H. Winter wheat biomass estimation based on spectral indices, band depth analysis and partial least squares regression using hyperspectral measurements. Comput. Electron. Agr. 2014, 100, 51–59. [Google Scholar] [CrossRef]
- Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244. [Google Scholar] [CrossRef]
- Sulik, J.J.; Long, D.S. Spectral considerations for modeling yield of canola. Remote Sens. Environ. 2016, 184, 161–174. [Google Scholar] [CrossRef] [Green Version]
- Wang, F.; Yi, Q.; Hu, J.; Xie, L.; Yao, X.; Xu, T.; Zheng, J. Combining spectral and textural information in UAV hyperspectral images to estimate rice grain yield. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102397. [Google Scholar] [CrossRef]
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res. 2019, 235, 142–153. [Google Scholar] [CrossRef]
- Serele, C.Z.; Gwyn, Q.H.J.; Boisvert, J.B.; Pattey, E.; McLaughlin, N.; Daoust, G. Corn yield prediction with artificial neural network trained using airborne remote sensing and topographic data. In Proceedings of the IGARSS 2000, IEEE 2000 International Geoscience and Remote Sensing Symposium, Taking the Pulse of the Planet: The Role of Remote Sensing in Managing the Environment, Honolulu, HI, USA, 24–28 July 2000. [Google Scholar] [CrossRef]
- Zhou, X.; Kono, Y.; Win, A.; Matsui, T.; Tanaka, T.S.T. Predicting within-field variability in grain yield and protein content of winter wheat using UAV-based multispectral imagery and machine learning approaches. Plant Prod. Sci. 2020, 24, 137–151. [Google Scholar] [CrossRef]
- Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
- Klompenburg, T.; Kassahun, A.; Catal, C. Crop yield prediction using machine learning: A systematic literature review. Comput. Electron. Agric. 2020, 177, 105709. [Google Scholar] [CrossRef]
- Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
- Khanal, S.; Fulton, J.; Klopfenstein, A.; Douridas, N.; Shearer, S. Integration of high resolution remotely sensed data and machine learning techniques for spatial prediction of soil properties and corn yield. Comput. Electron. Agric. 2018, 153, 213–225. [Google Scholar] [CrossRef]
- Latella, M.; Sola, F.; Camporeale, C. A Density-Based Algorithm for the Detection of Individual Trees from LiDAR Data. Remote Sens. 2021, 13, 322. [Google Scholar] [CrossRef]
- Lu, J.; Cheng, D.; Geng, C.; Zhang, Z.; Xiang, Y.; Hu, T. Combining Plant Height, Canopy Coverage and Vegetation Index from UAV-Based RGB Images to Estimate Leaf Nitrogen Concentration of Summer Maize. Biosyst. Eng. 2021, 202, 42–54. [Google Scholar] [CrossRef]
- Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef] [Green Version]
- Zhu, W.; Sun, Z.; Yang, T.; Li, J.; Peng, J.; Zhu, K.; Li, S.; Gong, H.; Lyu, Y.; Li, B.; et al. Estimating Leaf Chlorophyll Content of Crops via Optimal Unmanned Aerial Vehicle Hyperspectral Data at Multi-Scales. Comput. Electron. Agric. 2020, 178, 105786. [Google Scholar] [CrossRef]
- Du, M.; Noguchi, N. Monitoring of Wheat Growth Status and Mapping of Wheat Yield’s within-Field Spatial Variations Using Color Images Acquired from UAV-camera System. Remote Sens. 2017, 9, 289. [Google Scholar] [CrossRef] [Green Version]
- Ashapure, A.; Jung, J.; Chang, A.; Oh, S.; Yeom, J.; Maeda, M.; Maeda, A.; Dube, N.; Landivar, J.; Hague, S.; et al. Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data. ISPRS J. Photogramm. Remote Sens. 2020, 169, 180–194. [Google Scholar] [CrossRef]
- Song, Y.; Wang, J.; Shan, B. Estimation of Winter Wheat Yield from UAV-Based Multi-Temporal Imagery Using Crop Allometric Relationship and SAFY Model. Drones 2021, 5, 78. [Google Scholar] [CrossRef]
- Hanway, J.J. How a corn plant develops. In Special Report; No. 38; Iowa Agricultural and Home Economics Experiment Station Publications at Iowa State University Digital Repository: Ames, IA, USA, 1966. [Google Scholar]
- Sánchez, B.; Rasmussen, A.; Porter, J. Temperatures and the growth and development of maize and rice: A review. Glob. Chang. Biol. 2013, 20, 408–417. [Google Scholar] [CrossRef] [PubMed]
- Ramos, A.P.M.; Osco, L.P.; Furuya, D.E.G.; Gonçalves, W.N.; Santana, D.C.; Teodoro, L.P.T.; Junior, C.A.S.; Capristo-Silva, G.F.; Li, J.; Baio, F.H.R.; et al. A random forest ranking approach to predict yield in maize with uav-based vegetation spectral indices. Comput. Electron. Agric. 2020, 178, 105791. [Google Scholar] [CrossRef]
- Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Uno, Y.; Prasher, S.O.; Lacroix, R.; Goel, P.K.; Karimi, Y.; Viau, A.; Patel, R.M. Artificial neural networks to predict corn yield from Compact Airborne Spectrographic Imager data. Comput. Electron. Agric. 2005, 47, 149–161. [Google Scholar] [CrossRef]
- García-Martínez, H.; Flores-Magdaleno, H.; Ascencio-Hernández, R.; Khalil-Gardezi, A.; Tijerina-Chávez, L.; Mancilla-Villa, O.R.; Vázquez-Peña, M.A. Corn Grain Yield Estimation from Vegetation Indices, Canopy Cover, Plant Density, and a Neural Network Using Multispectral and RGB Images Acquired with Unmanned Aerial Vehicles. Agriculture 2020, 10, 277. [Google Scholar] [CrossRef]
- Han, W.; Peng, X.; Zhang, L.; Niu, Y. Summer Maize Yield Estimation Based on Vegetation Index Derived from Multi-temporal UAV Remote Sensing. Trans. Chin. Soc. Agric. Mach. 2020, 51, 148–155. [Google Scholar] [CrossRef]
- Random Forests. Available online: https://www.stat.berkeley.edu/~breiman/RandomForests/ (accessed on 7 February 2022).
- Li, X.; Long, J.; Zhang, M.; Liu, Z.; Lin, H. Coniferous Plantations Growing Stock Volume Estimation Using Advanced Remote Sensing Algorithms and Various Fused Data. Remote Sens. 2021, 13, 3468. [Google Scholar] [CrossRef]
- Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol Ind. 2016, 67, 637–648. [Google Scholar] [CrossRef]
- Kuhn, M.; Johnson, K. Applied Predictive Modeling, 1st ed.; Springer: New York, NY, USA, 2013. [Google Scholar]
- Zhu, W.; Sun, Z.; Huang, Y.; Yang, T.; Li, J.; Zhu, K.; Zhang, J.; Yang, B.; Shao, C.; Peng, J.; et al. Optimization of multi-source UAV RS agro-monitoring schemes designed for field-scale crop phenotyping. Precis. Agric 2021, 22, 1768–1802. [Google Scholar] [CrossRef]
- Kearns, M.; Ron, D. Algorithmic stability and sanity-check bounds for leave-one-out cross-validation. Neural Comput. 1999, 11, 1427–1453. [Google Scholar] [CrossRef]
- Jurjević, L.; Gašparović, M.; Milas, A.S.; Balenović, I. Impact of UAS Image Orientation on Accuracy of Forest Inventory Attributes. Remote Sens. 2020, 12, 404. [Google Scholar] [CrossRef] [Green Version]
- Féret, J.; François, C.; Gitelson, A.; Asner, G.; Barry, K.; Panigada, C.; Richardson, A.; Jacquemoud, S. Optimizing spectral indices and chemometric analysis of leaf chemical properties using radiative transfer modeling. Remote Sens. Environ. 2011, 115, 2742–2750. [Google Scholar] [CrossRef] [Green Version]
- Almeida, C.T.; Galvão, L.S.; Aragão, L.E.; Ometto, J.P.H.B.; Jacon, A.D.; Pereira, F.R.S.; Sato, L.Y.; Lopes, A.P.; Graça, P.M.L.; Silva, A.; et al. Combining LiDAR and hyperspectral data for aboveground biomass modeling in the Brazilian Amazon using different regression algorithms. Remote Sens. Environ. 2019, 232, 111323. [Google Scholar] [CrossRef]
- Colombo, R.; Bellingeri, D.; Fasolini, D.; Marino, C.M. Retrieval of leaf area index in different vegetation types using high resolution satellite data. Remote Sens. Environ. 2003, 86, 120–131. [Google Scholar] [CrossRef]
- Sun, H.; Li, M.Z.; Zhao, Y.; Zhang, Y.E.; Wang, X.M.; Li, X.H. The spectral characteristics and chlorophyll content at winter wheat growth stages. Spectrosc. Spectr. Anal. 2010, 30, 192–196. [Google Scholar] [CrossRef]
- Yue, J.; Feng, H.; Yang, G.; Li, Z. A comparison of regression techniques for estimation of above-ground winter wheat biomass using near-surface spectroscopy. Remote Sens. 2018, 10, 66. [Google Scholar] [CrossRef] [Green Version]
- Treitz, P.M.; Howarth, P.J.; Filho, O.R.; Soulis, E.D. Agricultural Crop Classification Using SAR Tone and Texture Statistics. Can. J. Remote Sens. 2000, 26, 18–29. [Google Scholar] [CrossRef]
- Wan, S.; Chang, S.H. Crop classification with WorldView-2 imagery using Support Vector Machine comparing texture analysis approaches and grey relational analysis in Jianan Plain, Taiwan. Int. J. Remote Sens. 2019, 40, 8076–8092. [Google Scholar] [CrossRef]
- Fieuzal, R.; Sicre, C.M.; Baup, F. Estimation of corn yield using multi-temporal optical and radar satellite data and artificial neural networks. Int. J. Appl. Earth Obs. Geoinf. 2017, 57, 14–23. [Google Scholar] [CrossRef]
- Ji, Z.; Pan, Y.; Zhu, X.; Wang, J.; Li, Q. Prediction of Crop Yield Using Phenological Information Extracted from Remote Sensing Vegetation Index. Sensors 2021, 21, 1406. [Google Scholar] [CrossRef] [PubMed]
- LÜ, G.; Wu, Y.; Bai, W.; Ma, B.; Wang, C.; Song, J. Influence of High Temperature Stress on Net Photosynthesis, Dry Matter Partitioning and Rice Grain Yield at Flowering and Grain Filling Stages. J. Integr. Agric. 2013, 12, 603–609. [Google Scholar] [CrossRef]
- Széles, A.V.; Megyes, A.; Nagy, J. Irrigation and nitrogen effects on the leaf chlorophyll content and grain yield of maize in different crop years. Agric. Water Manag. 2012, 107, 133–144. [Google Scholar] [CrossRef]
- Qader, S.H.; Dash, J.; Atkinson, P.M. Forecasting wheat and barley crop production in arid and semi-arid regions using remotely sensed primary productivity and crop phenology: A case study in Iraq. Sci. Total Environ. 2018, 613, 250–262. [Google Scholar] [CrossRef] [Green Version]
- Li, W.; Jiang, J.; Weiss, M.; Madec, S.; Tison, F.; Philippe, B.; Comar, A.; Baret, F. Impact of the reproductive organs on crop BRDF as observed from a UAV. Remote Sens. Environ. 2021, 259, 112433. [Google Scholar] [CrossRef]
- Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural features for image classification. IEEE Trans. Syst. Man. Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef] [Green Version]
Treatments | Fertilizer N-P-K (kg·ha−1) | Treatments | Fertilizer N-P-K (kg·ha−1) |
---|---|---|---|
CK 1 | 0-0-0 | N0 | 0-120-80 |
NK | 245-0-120 | N70 | 70-120-80 |
NP | 245-30-0 | N140 | 140-120-80 |
PK | 0-30-120 | N210 | 210-120-80 |
NPK | 245-30-120 | N280 | 280-120-80
NPKS 2 | 245-30-120 | / | /
Vegetation Indices | Formulations | References |
---|---|---|
Normalized difference vegetation index | NDVI = (Rnir − Rred)/(Rnir + Rred) | [45] |
Normalized difference red edge index | NDRE = (Rnir − Rrededge)/(Rnir + Rrededge) | [45] |
Green normalized difference vegetation index | GNDVI = (Rnir − Rgreen)/(Rnir + Rgreen) | [45] |
Normalized green-red difference Index | NGRDI = (Rgreen − Rred)/(Rgreen + Rred) | [46] |
Simple ratio | SR = Rnir/Rred | [47] |
Triangular greenness index | TGI = Rgreen − 0.39Rred − 0.61Rblue | [48] |
Wide dynamic range vegetation index | WDRVI = (0.1Rnir − Rred)/(0.1Rnir + Rred) | [48] |
Visible atmospherically resistant index | VARI = (Rgreen − Rred)/(Rgreen + Rred − Rblue) | [48] |
Soil adjusted vegetation index | SAVI = 1.5(Rnir − Rred)/(Rnir + Rred + 0.5) | [29] |
Optimized soil adjusted vegetation index | OSAVI = 1.16(Rnir − Rred)/(Rnir + Rred + 0.16) | [49] |
Enhanced vegetation index 2 | EVI2 = 2.5(Rnir − Rred)/(Rnir + 2.4Rred + 1) | [49] |
Modified soil adjusted vegetation index 2 | MSAVI2 = (2Rnir + 1 − √((2Rnir + 1)² − 8(Rnir − Rred)))/2 | [49]
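The index formulas in the table translate directly into array arithmetic. The sketch below is a minimal illustration assuming NumPy and per-plot mean reflectances for the five bands (blue, green, red, red edge, NIR); the function and argument names are hypothetical, and the MSAVI2 expression is the standard closed-form version shown in the table.

```python
# Hedged sketch: vegetation indices from band reflectances (scalars or arrays).
import numpy as np

def vegetation_indices(blue, green, red, rededge, nir):
    """Return the spectral indices listed in the table from band reflectances."""
    return {
        "NDVI":   (nir - red) / (nir + red),
        "NDRE":   (nir - rededge) / (nir + rededge),
        "GNDVI":  (nir - green) / (nir + green),
        "NGRDI":  (green - red) / (green + red),
        "SR":     nir / red,
        "TGI":    green - 0.39 * red - 0.61 * blue,
        "WDRVI":  (0.1 * nir - red) / (0.1 * nir + red),
        "VARI":   (green - red) / (green + red - blue),
        "SAVI":   1.5 * (nir - red) / (nir + red + 0.5),
        "OSAVI":  1.16 * (nir - red) / (nir + red + 0.16),
        "EVI2":   2.5 * (nir - red) / (nir + 2.4 * red + 1.0),
        "MSAVI2": (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2,
    }
```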
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, B.; Zhu, W.; Rezaei, E.E.; Li, J.; Sun, Z.; Zhang, J. The Optimal Phenological Phase of Maize for Yield Prediction with High-Frequency UAV Remote Sensing. Remote Sens. 2022, 14, 1559. https://doi.org/10.3390/rs14071559