Rice-Yield Prediction with Multi-Temporal Sentinel-2 Data and 3D CNN: A Case Study in Nepal
"> Figure 1
<p>Study area with S2 tiles and elevation profile.</p> "> Figure 2
<p>Building blocks considered for the head (<b>a</b>), body (<b>b</b>) and tail (<b>c</b>) of the proposed architecture. In (<b>a</b>,<b>b</b>), <span class="html-italic">K</span>, <span class="html-italic">N</span>, <span class="html-italic">T</span> and <span class="html-italic">U</span> represent the considered number of kernels, kernel size, number of timestamps and number of fully connected neurons, respectively. Note that, in (<b>b</b>), both 3D convolutinal layers are defined with the same dimensions. Besides, a 3-dimensional stride is used in the pooling layers (Pool) to summarize spatial and temporal dimensions, where appropriate.</p> "> Figure 3
<p>Graphical visualization of the proposed 3D-Convolutional Neural Network (CNN) architecture for rice crop yield estimation. The input rice crop multi-temporal data (start of the season—SoS, peak of season—PoS and end of the season—EoS), made of soil, climate and Sentinel-2 information, are passed through four different 3D convolutional blocks that compose the head and body of the proposed network. Finally, a fully connected block and a final fully connected layer are used to estimate the rice production. Note that <span class="html-italic">M</span>, <span class="html-italic">P</span> and <span class="html-italic">T</span> represent the total number of input channels, patch size and considered timestamps, respectively.</p> "> Figure 4
<p>Training process of the proposed rice yield estimation network.</p> "> Figure 5
<p>Qualitative results (kg/pixel) for experiment 4 with S/P/E-S2+C+S over the T45RUK Sentinel-2 tile in 2018. In the case of CNN-2D, CNN-2D and the proposed network, a 33 patch size is considered.</p> "> Figure 5 Cont.
<p>Qualitative results (kg/pixel) for experiment 4 with S/P/E-S2+C+S over the T45RUK Sentinel-2 tile in 2018. In the case of CNN-2D, CNN-2D and the proposed network, a 33 patch size is considered.</p> "> Figure 6
<p>Regression plots for experiment 4 with S/P/E-S2+C+S. In rows, the tested patch sizes (9, 15, 21, 27 and 33). In columns, the considered CNN-based methods (CNN-2D, CNN-2D and the proposed network).</p> "> Figure 6 Cont.
<p>Regression plots for experiment 4 with S/P/E-S2+C+S. In rows, the tested patch sizes (9, 15, 21, 27 and 33). In columns, the considered CNN-based methods (CNN-2D, CNN-2D and the proposed network).</p> ">
Abstract
1. Introduction
1. The suitability of using S2 imagery to effectively estimate strategic crop yields in developing countries, with a case study of Nepal and its local rice production, is investigated.
2. A new large-scale rice crop database of Nepal (RicePAL), composed of multi-temporal S2 products and ground-truth rice yield information from 2016 to 2018, is built.
3. A novel operational CNN-based framework, adapted to the intrinsic data constraints of developing countries, is designed for accurate rice yield estimation.
4. The effect of different temporal, climate and soil data configurations on the performance achieved by the proposed approach and several state-of-the-art regression and CNN-based yield estimation methods is studied. The codes related to this work will be released for reproducible research inside the remote sensing community (https://github.com/rufernan/RicePAL accessed on 3 April 2021).
2. Related Work
3. Study Area and Dataset
3.1. Study Area
3.2. Dataset
3.2.1. Sentinel-2 Data
3.2.2. Auxiliary Data
3.2.3. Ground-Truth Data
4. Methodology
4.1. Convolutional Neural Networks
- 2D-CNN [53], also called spatial methods, which consider the input data as a complete spatial-spectral volume where the kernel slides along the two spatial dimensions (i.e., width and height). Hence, the multi-temporal data are treated as part of the spectral dimension and the kernel depth is adjusted to the number of available channels. In other words, each kernel spans the full channel depth and produces a single two-dimensional feature map for each input data volume.
- 3D-CNN [40], also known as spatio-temporal methods, which introduce an additional temporal dimension that allows the convolutional kernel to operate over a specific timestamp window. In a 2D-CNN, all the spatio-temporal data are stacked together in the input and, after the first convolution, the temporal information is completely collapsed [71]. To prevent this, 3D-CNN models make use of 3D convolutions, i.e., 3-dimensional kernels that slide along the width, height and time dimensions of the image. In this case, each kernel generates a 3D feature map that preserves the temporal structure of each spatial-spectral data volume.
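The two convolution types can be contrasted compactly. The following is a sketch with generic symbols (input X with M channels, k-th kernel W^k with spatial size N and temporal extent N_t, bias b^k, activation function σ); it follows the standard formulations rather than the paper's exact notation:

```latex
% 2D convolution: the kernel spans all M channels (stacked timestamps
% included), so the temporal axis is summed out after the first layer
F^{k}_{i,j} = \sigma\Big( b^{k} + \sum_{c=1}^{M} \sum_{u=1}^{N} \sum_{v=1}^{N}
              W^{k}_{c,u,v}\, X_{c,\,i+u,\,j+v} \Big)

% 3D convolution: the kernel additionally slides along the temporal axis t,
% producing a feature map that preserves the time dimension
F^{k}_{i,j,t} = \sigma\Big( b^{k} + \sum_{c=1}^{M} \sum_{u=1}^{N} \sum_{v=1}^{N}
                \sum_{w=1}^{N_t} W^{k}_{c,u,v,w}\, X_{c,\,i+u,\,j+v,\,t+w} \Big)
```

Note that the only structural difference is the extra temporal summation index w in the 3D case, which is what keeps the output indexed by time.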
1. Convolutional layers (Conv3D): The convolution layer computes the output of neurons that are connected to local regions of the input by taking the dot product between the neuron weights and the corresponding input region, and adding a bias term [72].
2. Nonlinear layers (Rectified Linear Unit—ReLU): Activation functions embed non-linearity into the neural network, enabling it to learn nonlinear representations. In this layer, the Rectified Linear Unit (ReLU) [73] is used, as it has proven to be computationally efficient as well as effective for convergence.
3. Batch normalization (BN): Batch normalization is a type of layer aimed at reducing internal covariate shift by normalizing the layer’s inputs over a mini-batch. It enables a more independent learning process in each layer and both regularizes and accelerates training.
4. Pooling layers (Pool): The pooling layer is a sub-sampling operation along the dimensions of the feature maps which provides a degree of spatial invariance [72]. Usually, a predefined function (e.g., maximum, average, etc.) is applied to summarize the signal while preserving discriminant information.
5. Fully connected layers (FC): The fully connected layer takes the output of the previous layers and flattens it into a single vector of values. Each output neuron is then fully connected to this vector, representing the activation that the feature maps produce for one of the predefined outputs.
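Two of these layers follow directly from their definitions. The NumPy sketch below (helper names are illustrative, not from the released code) normalizes a toy mini-batch over its batch axis and then applies ReLU:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the mini-batch (axis 0), then scale/shift.
    x_hat = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)
    return gamma * x_hat + beta

def relu(x):
    # Element-wise Rectified Linear Unit: max(x, 0).
    return np.maximum(x, 0.0)

batch = np.array([[1.0, -2.0],
                  [3.0,  2.0]])   # toy mini-batch: 2 samples, 2 features
normed = batch_norm(batch)        # each column becomes (near) zero-mean, unit-variance
activated = relu(normed)          # negative normalized values are clipped to zero
```

The learnable scale γ and shift β (here fixed scalars for simplicity) are what let BN undo the normalization when that helps the fit.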
4.2. Proposed Yield Prediction Network
1. Network’s Head: The proposed architecture starts with two initial head building blocks (HB) that transform the input patches with the multi-temporal data into a first-level feature representation that is fed to the subsequent network parts. Note that input patches have a size of M × P × P × T, where M is the total number of bands (made of the concatenation of S2 spectral bands and climate/soil data), P is the spatial size of the input data and T is the temporal dimension representing the considered rice seasons (SoS, PoS and EoS). Each of these head blocks is made of four layers: (1) Conv3D, (2) BN, (3) ReLU and (4) Pool. In the case of (1), K 3D convolutional kernels of size N are employed in each head block. In the case of (4), a max pooling is conducted to reduce the spatial size of the resulting feature maps.
2. Network’s Body: This part can be considered the main body of the network, since it contains its highest structural complexity. More specifically, the network’s body consists of two sequential building blocks (BB) with the following seven layers: (1) Conv3D, (2) BN, (3) ReLU, (4) Conv3D, (5) BN, (6) ReLU and (7) Pool. This configuration allows us to increase the block depth without reducing the size of the feature maps, which eventually produces higher-level features over the same spatio-temporal receptive field. It is important to note that layers (1) and (4) are defined with the same settings, i.e., K convolutional kernels of size N per body block, to work towards those high-level features that may help the most to predict the corresponding yield estimations. In the case of (7), a max pooling is conducted to reduce the spatial size while maintaining the depth of the feature maps. Finally, a max pooling over the temporal axis is applied to summarize the temporal dimension after the network’s body.
3. Network’s Tail: Once the head and body of the network have been executed, an output 3D volume with deep multi-temporal features is produced. The last part of the proposed network thus aims at projecting these high-level feature maps onto their corresponding rice crop yield values. To achieve this goal, we initially define a tail building block (TB) with the following layers: (1) FC, (2) BN and (3) ReLU. Then, a final fully connected layer (FC) is used to generate the desired yield estimates. In the case of (1), U fully connected neurons are employed. The final FC layer logically makes use of only one neuron to produce a single output for a given multi-temporal input.
4.3. Proposed Network Training
5. Experiments
5.1. Experimental Settings
- Experiment 1: In the first experiment, we tested the performance of the considered rice yield estimation methods when using S2 data with all the possible combinations of the rice crop stages (SoS, PoS and EoS). Note that this scheme led to the following seven multi-temporal data alternatives: SoS, PoS, EoS, SoS/PoS, PoS/EoS, SoS/EoS and SoS/PoS/EoS. In each case, the input data volume contained the S2 bands as channels, a P × P spatial extent and T timestamps, where T is adjusted to the number of available timestamps.
- Experiment 2: In the second experiment, we used the same multi-temporal configuration as the first experiment but added the climate data into the input data volume, which enlarges its channel dimension accordingly.
- Experiment 3: In this experiment, we adopted the same scheme as the first one but included the soil data instead of the climate data, again enlarging the channel dimension.
- Experiment 4: In the last experiment, we again used the same multi-temporal configuration but included all the available S2, climate and soil data in the input volume.
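The seven stage combinations enumerated in Experiment 1 are exactly the non-empty subsets of {SoS, PoS, EoS}. A quick sketch (the enumeration order differs slightly from the listing above):

```python
from itertools import combinations

stages = ["SoS", "PoS", "EoS"]
configs = ["/".join(c)                        # e.g. "SoS/PoS"
           for r in range(1, len(stages) + 1)
           for c in combinations(stages, r)]  # all non-empty stage subsets
print(configs)
# ['SoS', 'PoS', 'EoS', 'SoS/PoS', 'SoS/EoS', 'PoS/EoS', 'SoS/PoS/EoS']
```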
5.2. Results
6. Discussion
7. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- United Nations. Transforming Our World: The 2030 Agenda for Sustainable Development; United Nations, Department of Economic and Social Affairs: New York, NY, USA, 2015.
- Toth, C.; Jóźków, G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 2016, 115, 22–36.
- Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
- Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136.
- Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent Advances of Hyperspectral Imaging Technology and Applications in Agriculture. Remote Sens. 2020, 12, 2659.
- Fernandez-Beltran, R.; Latorre-Carmona, P.; Pla, F. Single-frame super-resolution in remote sensing: A practical overview. Int. J. Remote Sens. 2017, 38, 314–354.
- Kogan, F. Remote Sensing for Food Security; Springer: Berlin/Heidelberg, Germany, 2019.
- Kerényi, A.; McIntosh, R.W. Sustainable Development in Changing Complex Earth Systems; Springer: Berlin/Heidelberg, Germany, 2020.
- Tey, Y.S.; Li, E.; Bruwer, J.; Abdullah, A.M.; Brindal, M.; Radam, A.; Ismail, M.M.; Darham, S. Factors influencing the adoption of sustainable agricultural practices in developing countries: A review. Environ. Eng. Manag. J. 2017, 16, 337–349.
- Haraguchi, N.; Martorano, B.; Sanfilippo, M. What factors drive successful industrialization? Evidence and implications for developing countries. Struct. Chang. Econ. Dyn. 2019, 49, 266–276.
- Roy, T. The Economy of South Asia: From 1950 to the Present; Springer: Berlin/Heidelberg, Germany, 2017.
- Gadal, N.; Shrestha, J.; Poudel, M.N.; Pokharel, B. A review on production status and growing environments of rice in Nepal and in the world. Arch. Agric. Environ. Sci. 2019, 4, 83–87.
- Chalise, S.; Naranpanawa, A. Climate change adaptation in agriculture: A computable general equilibrium analysis of land-use change in Nepal. Land Use Policy 2016, 59, 241–250.
- Paudel, M.N. Prospects and limitations of agriculture industrialization in Nepal. Agron. J. Nepal 2016, 4, 38–63.
- Chauhan, S.; Darvishzadeh, R.; Boschetti, M.; Pepe, M.; Nelson, A. Remote sensing-based crop lodging assessment: Current status and perspectives. ISPRS J. Photogramm. Remote Sens. 2019, 151, 124–140.
- Wang, L.; Zhang, G.; Wang, Z.; Liu, J.; Shang, J.; Liang, L. Bibliometric Analysis of Remote Sensing Research Trend in Crop Growth Monitoring: A Case Study in China. Remote Sens. 2019, 11, 809.
- Awad, M.M. Toward precision in crop yield estimation using remote sensing and optimization techniques. Agriculture 2019, 9, 54.
- Peng, D.; Huang, J.; Li, C.; Liu, L.; Huang, W.; Wang, F.; Yang, X. Modelling paddy rice yield using MODIS data. Agric. For. Meteorol. 2014, 184, 107–116.
- Hong, S.Y.; Hur, J.; Ahn, J.B.; Lee, J.M.; Min, B.K.; Lee, C.K.; Kim, Y.; Lee, K.D.; Kim, S.H.; Kim, G.Y.; et al. Estimating rice yield using MODIS NDVI and meteorological data in Korea. Korean J. Remote Sens. 2012, 28, 509–520.
- Son, N.; Chen, C.; Chen, C.; Minh, V.; Trung, N. A comparative analysis of multitemporal MODIS EVI and NDVI data for large-scale rice yield estimation. Agric. For. Meteorol. 2014, 197, 52–64.
- Siyal, A.A.; Dempewolf, J.; Becker-Reshef, I. Rice yield estimation using Landsat ETM+ Data. J. Appl. Remote Sens. 2015, 9, 095986.
- Nuarsa, I.W.; Nishio, F.; Hongo, C. Rice yield estimation using Landsat ETM+ data and field observation. J. Agric. Sci. 2012, 4, 45.
- Setiyono, T.D.; Quicho, E.D.; Gatti, L.; Campos-Taberner, M.; Busetto, L.; Collivignarelli, F.; García-Haro, F.J.; Boschetti, M.; Khan, N.I.; Holecz, F. Spatial rice yield estimation based on MODIS and Sentinel-1 SAR data and ORYZA crop growth model. Remote Sens. 2018, 10, 293.
- You, N.; Dong, J. Examining earliest identifiable timing of crops using all available Sentinel 1/2 imagery and Google Earth Engine. ISPRS J. Photogramm. Remote Sens. 2020, 161, 109–123.
- Ashourloo, D.; Shahrabi, H.S.; Azadbakht, M.; Aghighi, H.; Nematollahi, H.; Alimohammadi, A.; Matkan, A.A. Automatic canola mapping using time series of sentinel 2 images. ISPRS J. Photogramm. Remote Sens. 2019, 156, 63–76.
- Mercier, A.; Betbeder, J.; Baudry, J.; Le Roux, V.; Spicher, F.; Lacoux, J.; Roger, D.; Hubert-Moy, L. Evaluation of Sentinel-1 & 2 time series for predicting wheat and rapeseed phenological stages. ISPRS J. Photogramm. Remote Sens. 2020, 163, 231–256.
- Drusch, M.; Del Bello, U.; Carlier, S.; Colin, O.; Fernandez, V.; Gascon, F.; Hoersch, B.; Isola, C.; Laberinti, P.; Martimort, P.; et al. Sentinel-2: ESA’s optical high-resolution mission for GMES operational services. Remote Sens. Environ. 2012, 120, 25–36.
- He, L.; Mostovoy, G. Cotton Yield Estimate Using Sentinel-2 Data and an Ecosystem Model over the Southern US. Remote Sens. 2019, 11, 2000.
- Zhao, Y.; Potgieter, A.B.; Zhang, M.; Wu, B.; Hammer, G.L. Predicting wheat yield at the field scale by combining high-resolution Sentinel-2 satellite imagery and crop modelling. Remote Sens. 2020, 12, 1024.
- Gómez, D.; Salvador, P.; Sanz, J.; Casanova, J.L. Potato yield prediction using machine learning techniques and sentinel 2 data. Remote Sens. 2019, 11, 1745.
- Kayad, A.; Sozzi, M.; Gatto, S.; Marinello, F.; Pirotti, F. Monitoring Within-Field Variability of Corn Yield using Sentinel-2 and Machine Learning Techniques. Remote Sens. 2019, 11, 2873.
- Hunt, M.L.; Blackburn, G.A.; Carrasco, L.; Redhead, J.W.; Rowland, C.S. High resolution wheat yield mapping using Sentinel-2. Remote Sens. Environ. 2019, 233, 111410.
- De Wit, A.; Clevers, J. Efficiency and accuracy of per-field classification for operational crop mapping. Int. J. Remote Sens. 2004, 25, 4091–4112.
- Waldner, F.; Canto, G.S.; Defourny, P. Automated annual cropland mapping using knowledge-based temporal features. ISPRS J. Photogramm. Remote Sens. 2015, 110, 1–13.
- You, J.; Li, X.; Low, M.; Lobell, D.; Ermon, S. Deep gaussian process for crop yield prediction based on remote sensing data. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; pp. 4559–4565.
- Gumma, M.K.; Thenkabail, P.S.; Maunahan, A.; Islam, S.; Nelson, A. Mapping seasonal rice cropland extent and area in the high cropping intensity environment of Bangladesh using MODIS 500 m data for the year 2010. ISPRS J. Photogramm. Remote Sens. 2014, 91, 98–113.
- Picoli, M.C.A.; Camara, G.; Sanches, I.; Simões, R.; Carvalho, A.; Maciel, A.; Coutinho, A.; Esquerdo, J.; Antunes, J.; Begotti, R.A.; et al. Big earth observation time series analysis for monitoring Brazilian agriculture. ISPRS J. Photogramm. Remote Sens. 2018, 145, 328–339.
- Song, P.; Mansaray, L.R.; Huang, J.; Huang, W. Mapping paddy rice agriculture over China using AMSR-E time series data. ISPRS J. Photogramm. Remote Sens. 2018, 144, 469–482.
- Qamer, F.M.; Shah, S.P.; Murthy, M.; Baidar, T.; Dhonju, K.; Hari, B.G. Operationalizing crop monitoring system for informed decision making related to food security in Nepal. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 1325.
- Russello, H. Convolutional Neural Networks for Crop Yield Prediction Using Satellite Images; IBM Center for Advanced Studies, University of Amsterdam: Amsterdam, The Netherlands, 2018.
- Sun, J.; Di, L.; Sun, Z.; Shen, Y.; Lai, Z. County-Level Soybean Yield Prediction Using Deep CNN-LSTM Model. Sensors 2019, 19, 4363.
- Van Klompenburg, T.; Kassahun, A.; Catal, C. Crop yield prediction using machine learning: A systematic literature review. Comput. Electron. Agric. 2020, 177, 105709.
- Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782.
- Cai, Y.; Guan, K.; Peng, J.; Wang, S.; Seifert, C.; Wardlow, B.; Li, Z. A high-performance and in-season classification system of field-level crop types using time-series Landsat data and a machine learning approach. Remote Sens. Environ. 2018, 210, 35–47.
- Dyson, J.; Mancini, A.; Frontoni, E.; Zingaretti, P. Deep learning for soil and crop segmentation from remotely sensed data. Remote Sens. 2019, 11, 1859.
- Shrisath, P. Real-Time Crop Yield Monitoring in Nepal for Food Security Planning and Climatic Risk Management; CGIAR Research Program on Climate Change Agriculture and Food Security, International Water Management Institute (IWMI): Kathmandu, Nepal, 2016.
- Prasad, A.K.; Chai, L.; Singh, R.P.; Kafatos, M. Crop yield estimation model for Iowa using remote sensing and surface parameters. Int. J. Appl. Earth Obs. Geoinf. 2006, 8, 26–33.
- Ren, J.; Chen, Z.; Zhou, Q.; Tang, H. Regional yield estimation for winter wheat with MODIS-NDVI data in Shandong, China. Int. J. Appl. Earth Obs. Geoinf. 2008, 10, 403–413.
- Kim, N.; Lee, Y.W. Estimation of corn and soybeans yield using remote sensing and crop yield data in the United States. In Remote Sensing for Agriculture, Ecosystems, and Hydrology XVI; International Society for Optics and Photonics: Washington, DC, USA, 2014; Volume 9239.
- Jiang, D.; Yang, X.; Clinton, N.; Wang, N. An artificial neural network model for estimating crop yields using remotely sensed information. Int. J. Remote Sens. 2004, 25, 1723–1732.
- Kim, N.; Lee, Y.W. Machine learning approaches to corn yield estimation using satellite images and climate data: A case of Iowa State. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2016, 34, 383–390.
- Kuwata, K.; Shibasaki, R. Estimating crop yields with deep learning and remotely sensed data. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 858–861.
- Nevavuori, P.; Narra, N.; Lipping, T. Crop yield prediction with deep convolutional neural networks. Comput. Electron. Agric. 2019, 163, 104859.
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop. Res. 2019, 235, 142–153.
- MOF. Economic Survey, Fiscal Year 2009/10; Technical Report; Ministry of Finance (MOF), Government of Nepal: Kathmandu, Nepal, 2010.
- MOAC. Statistical Information on Nepalese Agriculture, 2008/2009; Technical Report; Agri-Business Promotion and Statistical Division, Ministry of Agriculture and Cooperatives: Kathmandu, Nepal, 2009.
- Ghimire, S.; Dhungana, S.M.; Krishna, V.; Teufel, N.; Sherchan, D. Biophysical and Socio-Economic Characterization of Cereal Production Systems of Central Nepal; CIMMYT Research Data & Software Repository Network: Mexico City, Mexico, 2013.
- Gascon, F.; Cadau, E.; Colin, O.; Hoersch, B.; Isola, C.; Fernández, B.L.; Martimort, P. Copernicus Sentinel-2 mission: Products, algorithms and Cal/Val. In Earth Observing Systems XIX; International Society for Optics and Photonics: Washington, DC, USA, 2014; Volume 9218, p. 92181E.
- Richter, R.; Schläpfer, D. Atmospheric/Topographic Correction for Satellite Imagery (ATCOR-2/3 User Guide, Version 8.3.1, February 2014). ReSe Appl. Schläpfer Langeggweg 2013, 3, 77.
- Vuolo, F.; Żółtak, M.; Pipitone, C.; Zappa, L.; Wenng, H.; Immitzer, M.; Weiss, M.; Baret, F.; Atzberger, C. Data service platform for Sentinel-2 surface reflectance and value-added products: System use and examples. Remote Sens. 2016, 8, 938.
- Immitzer, M.; Vuolo, F.; Atzberger, C. First experience with Sentinel-2 data for crop and tree species classifications in central Europe. Remote Sens. 2016, 8, 166.
- Fan, X.; Liu, Y. A global study of NDVI difference among moderate-resolution satellite sensors. ISPRS J. Photogramm. Remote Sens. 2016, 121, 177–191.
- De la Casa, A.; Ovando, G.; Bressanini, L.; Martinez, J.; Diaz, G.; Miranda, C. Soybean crop coverage estimation from NDVI images with different spatial resolution to evaluate yield variability in a plot. ISPRS J. Photogramm. Remote Sens. 2018, 146, 531–547.
- Bai, T.; Li, D.; Sun, K.; Chen, Y.; Li, W. Cloud detection for high-resolution satellite imagery using machine learning and multi-feature fusion. Remote Sens. 2016, 8, 715.
- Gandhi, N.; Armstrong, L.J.; Petkar, O.; Tripathy, A.K. Rice crop yield prediction in India using support vector machines. In Proceedings of the 2016 13th International Joint Conference on Computer Science and Software Engineering (JCSSE), Khon Kaen, Thailand, 13–15 July 2016; pp. 1–5.
- Karki, R.; Talchabhadel, R.; Aalto, J.; Baidya, S.K. New climatic classification of Nepal. Theor. Appl. Climatol. 2016, 125, 799–808.
- Shrestha, S.; Baidar, T. Spatial Distribution and Temporal Change of Extreme Precipitation Events on the Koshi Basin of Nepal. Nepal. J. Geoinform. 2018, 17, 38–46.
- Uddin, K.; Shrestha, H.L.; Murthy, M.; Bajracharya, B.; Shrestha, B.; Gilani, H.; Pradhan, S.; Dangol, B. Development of 2010 national land cover database for the Nepal. J. Environ. Manag. 2015, 148, 82–90.
- Paudel, G.; Maharjan, S.; Guerena, D.; Rai, A.; McDonald, A.J. Nepal Rice Crop Cut and Survey Data 2016; CIMMYT Research Data & Software Repository Network: Mexico City, Mexico, 2017.
- Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117.
- Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D convolutional neural networks for crop classification with multi-temporal remote sensing images. Remote Sens. 2018, 10, 75.
- O’Shea, K.; Nash, R. An introduction to convolutional neural networks. arXiv 2015, arXiv:1511.08458.
- Dahl, G.E.; Sainath, T.N.; Hinton, G.E. Improving deep neural networks for LVCSR using rectified linear units and dropout. In Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013; pp. 8609–8613.
- Onojeghuo, A.O.; Blackburn, G.A.; Wang, Q.; Atkinson, P.M.; Kindred, D.; Miao, Y. Mapping paddy rice fields by applying machine learning algorithms to multi-temporal Sentinel-1A and Landsat data. Int. J. Remote Sens. 2018, 39, 1042–1067.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Hernandez, J.; Lobos, G.A.; Matus, I.; Del Pozo, A.; Silva, P.; Galleguillos, M. Using ridge regression models to estimate grain yield from field spectral data in bread wheat (Triticum aestivum L.) grown under three water regimes. Remote Sens. 2015, 7, 2109.
- Shiu, Y.S.; Chuang, Y.C. Yield Estimation of Paddy Rice Based on Satellite Imagery: Comparison of Global and Local Regression Models. Remote Sens. 2019, 11, 111.
Group | Data | Source
---|---|---
Sentinel-2 | B02–B08, B8A, B11 and B12 | https://scihub.copernicus.eu/ accessed on 3 April 2021
Sentinel-2 | NDVI | Calculated using bands B04 and B08
Sentinel-2 | Cloud mask | Available with L1C products
Auxiliary | Climate | https://www.dhm.gov.np/ accessed on 3 April 2021
Auxiliary | Soil | https://krishiprabidhi.net/ accessed on 3 April 2021
Ground-truth | Rice crop mask | Based on Qamer et al. [39]
Ground-truth | Rice crop yield | https://mold.gov.np/ accessed on 3 April 2021
Climate Variable | Unit
---|---
Rainfall | Millimeter (mm)
Maximum temperature | Degree Celsius (°C)
Minimum temperature | Degree Celsius (°C)
Relative humidity | Percentage (%)
DATA | LIN | RID | SVR | GPR | CNN-2D (9) | CNN-2D (15) | CNN-2D (21) | CNN-2D (27) | CNN-2D (33) | CNN-3D (9) | CNN-3D (15) | CNN-3D (21) | CNN-3D (27) | CNN-3D (33) | Proposed (9) | Proposed (15) | Proposed (21) | Proposed (27) | Proposed (33)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
S-S2 | 495.24 | 503.33 | 456.35 | 438.33 | 332.87 | 294.45 | 265.43 | 271.26 | 235.47 | 410.89 | 348.73 | 319.38 | 311.77 | 283.12 | 287.34 | 255.97 | 217.55 | 192.12 | 170.16 |
P-S2 | 529.26 | 534.71 | 462.29 | 442.86 | 370.34 | 343.60 | 305.64 | 314.96 | 310.22 | 412.47 | 372.55 | 336.13 | 365.40 | 323.37 | 339.35 | 282.32 | 214.32 | 215.80 | 215.71 |
E-S2 | 532.75 | 540.30 | 481.50 | 465.38 | 340.29 | 319.86 | 285.41 | 256.32 | 252.94 | 413.85 | 314.87 | 288.36 | 272.97 | 234.54 | 278.80 | 201.19 | 247.31 | 158.53 | 160.67 |
S/P-S2 | 480.22 | 492.34 | 386.30 | 370.18 | 251.60 | 224.61 | 207.30 | 206.76 | 198.93 | 261.15 | 199.06 | 184.32 | 166.64 | 163.27 | 188.97 | 140.40 | 121.85 | 114.80 | 115.96 |
P/E-S2 | 512.42 | 526.10 | 419.01 | 408.46 | 295.89 | 227.63 | 211.20 | 223.45 | 178.12 | 264.48 | 223.83 | 199.49 | 176.52 | 179.84 | 202.18 | 147.00 | 134.33 | 127.03 | 117.97 |
S/E-S2 | 482.01 | 497.63 | 393.40 | 389.02 | 238.63 | 213.19 | 176.69 | 176.03 | 168.83 | 265.99 | 230.56 | 198.44 | 156.82 | 151.91 | 169.52 | 182.02 | 106.65 | 116.26 | 117.20 |
S/P/E-S2 | 468.30 | 487.37 | 359.51 | 349.14 | 216.24 | 187.95 | 168.32 | 157.00 | 158.90 | 215.42 | 174.37 | 162.82 | 123.60 | 115.96 | 181.77 | 119.38 | 117.84 | 103.80 | 89.03 |
Avg. | 500.03 | 511.68 | 422.62 | 409.05 | 292.27 | 258.76 | 231.43 | 229.40 | 214.77 | 320.61 | 266.28 | 241.28 | 224.82 | 207.43 | 235.42 | 189.75 | 165.69 | 146.91 | 140.96 |
DATA | LIN | RID | SVR | GPR | CNN-2D (9) | CNN-2D (15) | CNN-2D (21) | CNN-2D (27) | CNN-2D (33) | CNN-3D (9) | CNN-3D (15) | CNN-3D (21) | CNN-3D (27) | CNN-3D (33) | Proposed (9) | Proposed (15) | Proposed (21) | Proposed (27) | Proposed (33)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
S-S2+C | 485.56 | 496.50 | 439.26 | 432.87 | 336.85 | 296.08 | 280.71 | 270.69 | 261.94 | 431.73 | 363.31 | 306.86 | 275.56 | 267.95 | 297.90 | 253.40 | 253.59 | 206.93 | 174.14 |
P-S2+C | 515.11 | 521.01 | 447.23 | 430.08 | 379.00 | 336.51 | 295.16 | 305.76 | 324.00 | 431.39 | 388.09 | 353.88 | 332.59 | 297.94 | 329.11 | 310.26 | 254.31 | 240.12 | 223.17 |
E-S2+C | 530.89 | 538.30 | 465.35 | 456.47 | 349.84 | 300.20 | 247.51 | 251.86 | 240.66 | 380.69 | 360.55 | 304.01 | 340.33 | 243.33 | 302.22 | 265.71 | 201.16 | 198.38 | 179.44 |
S/P-S2+C | 470.92 | 485.13 | 380.21 | 376.56 | 289.66 | 235.30 | 215.59 | 195.79 | 187.61 | 276.73 | 231.51 | 201.26 | 172.41 | 168.97 | 171.48 | 179.83 | 140.46 | 132.57 | 138.87 |
P/E-S2+C | 488.04 | 501.99 | 398.19 | 399.14 | 277.58 | 240.85 | 200.47 | 216.91 | 193.91 | 298.59 | 222.10 | 214.16 | 170.12 | 162.96 | 197.40 | 163.29 | 159.53 | 136.89 | 157.19 |
S/E-S2+C | 465.73 | 482.20 | 379.82 | 371.43 | 248.35 | 212.57 | 178.93 | 182.06 | 171.85 | 260.58 | 200.27 | 186.53 | 169.67 | 161.05 | 164.65 | 185.40 | 127.87 | 105.39 | 114.25 |
S/P/E-S2+C | 452.90 | 473.06 | 346.14 | 336.34 | 212.12 | 193.83 | 214.40 | 194.49 | 181.20 | 190.95 | 216.35 | 159.52 | 131.16 | 113.47 | 159.12 | 120.56 | 122.95 | 156.41 | 107.69 |
Avg. | 487.02 | 499.74 | 408.03 | 400.41 | 299.06 | 259.33 | 233.25 | 231.08 | 223.02 | 324.38 | 283.17 | 246.60 | 227.41 | 202.24 | 231.70 | 211.21 | 179.98 | 168.10 | 156.39 |
DATA | LIN | RID | SVR | GPR | CNN-2D (9) | CNN-2D (15) | CNN-2D (21) | CNN-2D (27) | CNN-2D (33) | CNN-3D (9) | CNN-3D (15) | CNN-3D (21) | CNN-3D (27) | CNN-3D (33) | Proposed (9) | Proposed (15) | Proposed (21) | Proposed (27) | Proposed (33)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
S-S2+S | 495.15 | 503.16 | 447.32 | 430.88 | 332.08 | 293.20 | 287.92 | 254.69 | 236.77 | 411.11 | 329.71 | 313.52 | 301.75 | 272.50 | 299.93 | 234.14 | 206.71 | 191.80 | 188.58 |
P-S2+S | 529.17 | 534.60 | 455.64 | 446.67 | 377.25 | 359.96 | 317.79 | 319.12 | 324.43 | 396.86 | 358.92 | 338.78 | 330.88 | 295.28 | 311.55 | 271.84 | 231.51 | 240.32 | 218.29 |
E-S2+S | 532.56 | 540.16 | 476.87 | 467.09 | 358.70 | 312.74 | 270.84 | 237.39 | 225.53 | 411.98 | 330.43 | 305.51 | 270.15 | 255.74 | 302.53 | 221.49 | 192.84 | 198.88 | 165.13 |
S/P-S2+S | 480.41 | 492.48 | 379.55 | 382.08 | 250.64 | 247.28 | 205.30 | 202.43 | 214.76 | 261.63 | 211.86 | 206.61 | 173.77 | 165.89 | 200.38 | 163.96 | 148.50 | 130.75 | 94.19 |
P/E-S2+S | 512.35 | 526.07 | 417.70 | 408.12 | 274.77 | 295.89 | 221.70 | 231.88 | 259.55 | 275.95 | 216.97 | 197.82 | 169.72 | 159.86 | 199.36 | 170.12 | 134.63 | 123.04 | 140.72 |
S/E-S2+S | 481.89 | 497.52 | 386.97 | 383.77 | 273.69 | 250.01 | 248.09 | 171.57 | 200.49 | 245.98 | 184.04 | 193.61 | 165.58 | 145.45 | 182.61 | 126.60 | 115.58 | 120.51 | 99.95 |
S/P/E-S2+S | 468.49 | 487.37 | 356.67 | 348.40 | 251.79 | 199.39 | 167.91 | 171.83 | 165.26 | 208.29 | 167.74 | 136.63 | 137.61 | 107.26 | 172.25 | 131.74 | 145.81 | 96.89 | 115.95 |
Avg. | 500.00 | 511.62 | 417.25 | 409.57 | 302.70 | 279.78 | 245.65 | 226.99 | 232.40 | 315.97 | 257.10 | 241.78 | 221.35 | 200.28 | 238.37 | 188.56 | 167.94 | 157.46 | 146.11 |
| DATA | LIN | RID | SVR | GPR | CNN-2D (9) | CNN-2D (15) | CNN-2D (21) | CNN-2D (27) | CNN-2D (33) | CNN-3D (9) | CNN-3D (15) | CNN-3D (21) | CNN-3D (27) | CNN-3D (33) | Proposed (9) | Proposed (15) | Proposed (21) | Proposed (27) | Proposed (33) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| S-S2+C+S | 485.62 | 496.54 | 438.00 | 431.88 | 334.85 | 309.86 | 275.38 | 269.76 | 240.92 | 445.60 | 375.98 | 335.11 | 292.70 | 270.40 | 292.13 | 255.26 | 220.48 | 186.67 | 196.91 |
| P-S2+C+S | 515.17 | 521.03 | 442.93 | 436.52 | 365.72 | 377.89 | 312.69 | 309.62 | 289.73 | 406.37 | 368.45 | 344.81 | 328.45 | 296.09 | 333.73 | 318.97 | 263.84 | 223.47 | 236.45 |
| E-S2+C+S | 530.65 | 538.09 | 463.05 | 453.01 | 326.07 | 296.11 | 258.14 | 249.32 | 252.56 | 373.47 | 358.05 | 294.38 | 319.73 | 269.55 | 293.80 | 255.81 | 203.88 | 188.81 | 190.90 |
| S/P-S2+C+S | 471.21 | 485.20 | 374.12 | 363.00 | 250.07 | 237.04 | 230.39 | 216.44 | 186.00 | 262.65 | 231.24 | 212.40 | 188.13 | 176.68 | 175.25 | 185.50 | 167.74 | 135.69 | 133.02 |
| P/E-S2+C+S | 488.30 | 502.02 | 398.57 | 394.72 | 270.10 | 211.63 | 207.69 | 228.69 | 227.35 | 260.76 | 207.65 | 202.63 | 191.30 | 169.11 | 201.04 | 162.45 | 155.18 | 131.43 | 145.19 |
| S/E-S2+C+S | 465.93 | 482.30 | 375.80 | 369.94 | 271.27 | 207.56 | 176.92 | 177.63 | 188.86 | 262.06 | 199.32 | 201.29 | 186.16 | 161.08 | 176.23 | 138.77 | 119.00 | 126.32 | 109.39 |
| S/P/E-S2+C+S | 453.30 | 473.33 | 345.97 | 339.85 | 230.09 | 183.46 | 167.45 | 181.84 | 165.45 | 189.95 | 163.66 | 147.56 | 132.14 | 120.91 | 159.29 | 125.54 | 117.50 | 113.50 | 108.49 |
| Avg. | 487.17 | 499.79 | 405.49 | 398.42 | 292.60 | 260.51 | 232.67 | 233.33 | 221.55 | 314.41 | 272.05 | 248.31 | 234.09 | 209.12 | 233.07 | 206.04 | 178.23 | 157.98 | 160.05 |
| EXPERIMENTS | LIN | RID | SVR | GPR | CNN-2D (9) | CNN-2D (15) | CNN-2D (21) | CNN-2D (27) | CNN-2D (33) | CNN-3D (9) | CNN-3D (15) | CNN-3D (21) | CNN-3D (27) | CNN-3D (33) | Proposed (9) | Proposed (15) | Proposed (21) | Proposed (27) | Proposed (33) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Experiment 1 | 468.30 | 487.37 | 359.51 | 349.14 | 216.24 | 187.95 | 168.32 | 157.00 | 158.90 | 215.42 | 174.37 | 162.82 | 123.60 | 115.96 | 169.52 | 119.38 | 106.65 | 103.80 | 89.03 |
| Experiment 2 | 452.90 | 473.06 | 346.14 | 336.34 | 212.12 | 193.83 | 178.93 | 182.06 | 171.85 | 190.95 | 200.27 | 159.52 | 131.16 | 113.47 | 159.12 | 120.56 | 122.95 | 105.39 | 107.69 |
| Experiment 3 | 468.49 | 487.37 | 356.67 | 348.40 | 250.64 | 199.39 | 167.91 | 171.57 | 165.26 | 208.29 | 167.74 | 136.63 | 137.61 | 107.26 | 172.25 | 126.60 | 115.58 | 96.89 | 94.19 |
| Experiment 4 | 453.30 | 473.33 | 345.97 | 339.85 | 230.09 | 183.46 | 167.45 | 177.63 | 165.45 | 189.95 | 163.66 | 147.56 | 132.14 | 120.91 | 159.29 | 125.54 | 117.50 | 113.50 | 108.49 |
| Best | 452.90 | 473.06 | 345.97 | 336.34 | 212.12 | 183.46 | 167.45 | 157.00 | 158.90 | 189.95 | 163.66 | 136.63 | 123.60 | 107.26 | 159.12 | 119.38 | 106.65 | 96.89 | 89.03 |
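The "Best" row above is simply the column-wise minimum of the four experiment rows. A minimal sketch of that aggregation, using only the Python standard library and illustrative values copied from the first four columns (LIN, RID, SVR, GPR) of the table:

```python
# Each inner list holds one experiment's error scores for the same ordered
# set of model columns (only LIN, RID, SVR, GPR shown here for brevity).
experiments = [
    [468.30, 487.37, 359.51, 349.14],  # Experiment 1
    [452.90, 473.06, 346.14, 336.34],  # Experiment 2
    [468.49, 487.37, 356.67, 348.40],  # Experiment 3
    [453.30, 473.33, 345.97, 339.85],  # Experiment 4
]

# zip(*experiments) transposes rows into columns; min picks the best
# (lowest) score per column, reproducing the table's "Best" row.
best = [min(col) for col in zip(*experiments)]
print(best)  # -> [452.9, 473.06, 345.97, 336.34]
```

The same element-wise minimum, applied across all 19 score columns, yields the full "Best" row of the table.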
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Fernandez-Beltran, R.; Baidar, T.; Kang, J.; Pla, F. Rice-Yield Prediction with Multi-Temporal Sentinel-2 Data and 3D CNN: A Case Study in Nepal. Remote Sens. 2021, 13, 1391. https://doi.org/10.3390/rs13071391