An Adaptive Feature Fusion Network with Superpixel Optimization for Crop Classification Using Sentinel-2 Imagery
Figure 1. Coverage of different patch sizes in complex scenarios. (a) Mixed planting scenario, where crops grow in a random pattern and multiple crops occur within a small area. (b) Mountain terrace scenario, where crops grow in valleys over smaller areas.
Figure 2. Spatial distribution map of the study area. The left panel shows the location of the study area in China, and the right panel shows zoomed-in details of the study area.
Figure 3. Phenology of corn, peanut, and rice.
Figure 4. Distribution of sample points in the study area.
Figure 5. Example of sample expansion. The point vectors were obtained through ground surveys, and the polygon vectors were obtained through manual interpretation based on the point vectors. The expanded crop samples are assumed to lie inside the manually interpreted crop plots.
Figure 6. Overview of the proposed framework for crop mapping.
Figure 7. Framework of the proposed SPTNet for crop mapping using a novel DNN architecture. (a) The selective patch module. (b) The branch of central pixel spectral feature extraction.
Figure 8. Reflectance of corn, peanut, and rice in different bands of Sentinel-2 images. Red represents corn, yellow represents peanut, and blue represents rice.
Figure 9. Schematic of the superpixel optimization process. (a) Enhanced Band 8, Band 11, Band 4 false-color image. (b) SLIC extraction results. (c) Band 8, Band 11, Band 4 false-color composite of the Sentinel-2 image. (d) Classified result. (e) Optimized result.
Figure 10. Examples of the results on the dataset. From left to right: (a) Band 8, Band 11, Band 4 false-color composite of the Sentinel-2 image; (b) RF; (c) XGBoost; (d) CNN; (e) CNN-RF; (f) S³ANet; (g) SPTNet.
Figure 11. Examples from the crop classification ablation experiments. From left to right: (a) Band 8, Band 11, Band 4 false-color composite of the Sentinel-2 image; (b) baseline; (c) Experiment 1; (d) Experiment 2; (e) Experiment 3; (f) Experiment 4.
Figure 12. Mapping results of corn, peanut, and rice from Sentinel-2 images in Henan Province, China, obtained by the proposed SPTNet method.
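The superpixel optimization illustrated in Figure 9 refines the per-pixel classification by majority voting inside SLIC segments. The paper's exact settings are not reproduced here; the following is a minimal Python sketch of that post-processing idea, assuming `composite` is a false-color image array and `pred` is the classified label map (the segment count and compactness are illustrative placeholders, and scikit-image >= 0.19 is assumed).

```python
# Minimal sketch of the superpixel majority-vote refinement shown in Figure 9.
# `composite` is an (H, W, 3) float array (e.g., the B8/B11/B4 false-color image);
# `pred` is the (H, W) integer map of per-pixel class labels.
import numpy as np
from skimage.segmentation import slic

def superpixel_refine(composite: np.ndarray, pred: np.ndarray,
                      n_segments: int = 2000, compactness: float = 10.0) -> np.ndarray:
    """Replace each pixel's label with the majority label of its SLIC superpixel."""
    segments = slic(composite, n_segments=n_segments, compactness=compactness,
                    start_label=0, channel_axis=-1)
    refined = pred.copy()
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        labels, counts = np.unique(pred[mask], return_counts=True)
        refined[mask] = labels[np.argmax(counts)]  # majority vote within the superpixel
    return refined
```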
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data and Processing
2.2.1. Remote Sensing Data
2.2.2. Reference Samples
2.3. Methods
2.3.1. Selective Patch Module
2.3.2. Branch of Central Pixel Spectral Feature Extraction
2.3.3. Loss Function
2.3.4. Superpixel Optimization
2.3.5. Evaluation Metrics
2.4. Training Details
3. Results
3.1. Comparing Methods
3.1.1. Quantitative Comparisons
3.1.2. Visualization Results
3.2. Ablation Study
3.2.1. Quantitative Comparisons
3.2.2. Visualization Results
3.3. Crop Mapping in Henan Province
4. Discussion
4.1. The Split Strategy Selection of SPM
4.2. The Contribution of Different Sizes of Patches to Crop Extraction
4.3. Advantages and Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
CE | cross entropy
CNN | convolutional neural network
DNN | deep neural network
EVI | enhanced vegetation index
FN | false negative
FP | false positive
GRU | gated recurrent unit
KC | kappa coefficient
LSTM | long short-term memory
LSWI | land surface water index
NDVI | normalized difference vegetation index
PA | producer accuracy
PCA | principal component analysis
RF | random forest
RNN | recurrent neural network
SLIC | simple linear iterative clustering
SPM | selective patch module
TP | true positive
UA | user accuracy
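For reference, the accuracy measures abbreviated above (PA, UA, KC) and the F1 and OA scores reported in the tables below follow the standard confusion-matrix definitions. With $TP_c$, $FP_c$, $FN_c$ for class $c$, confusion matrix $N$, $k$ classes, and $n$ test samples:

```latex
\mathrm{PA}_c = \frac{TP_c}{TP_c + FN_c}, \qquad
\mathrm{UA}_c = \frac{TP_c}{TP_c + FP_c}, \qquad
F1_c = \frac{2\,\mathrm{UA}_c\,\mathrm{PA}_c}{\mathrm{UA}_c + \mathrm{PA}_c}

\mathrm{OA} = \frac{1}{n}\sum_{c=1}^{k} N_{cc}, \qquad
\mathrm{KC} = \frac{\mathrm{OA} - p_e}{1 - p_e}, \qquad
p_e = \frac{1}{n^{2}}\sum_{c=1}^{k} N_{c+}\, N_{+c}
```

where $N_{c+}$ and $N_{+c}$ are the row (reference) and column (predicted) totals of the confusion matrix, and mF1 is the mean of the per-class $F1_c$ values.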
Sentinel-2 Bands | Central Wavelength (nm) | Resolution (m) | Description |
---|---|---|---|
Band 1 | 443.9 | 60 | Aerosols |
Band 2 | 496.6 | 10 | Blue |
Band 3 | 560.0 | 10 | Green |
Band 4 | 664.5 | 10 | Red |
Band 5 | 703.9 | 20 | Red Edge 1 |
Band 6 | 740.2 | 20 | Red Edge 2 |
Band 7 | 782.5 | 20 | Red Edge 3 |
Band 8 | 835.1 | 10 | NIR |
Band 8A | 864.8 | 20 | Red Edge 4 |
Band 9 | 945.0 | 60 | Water Vapor |
Band 11 | 1613.7 | 20 | SWIR 1 |
Band 12 | 2202.4 | 20 | SWIR 2 |
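Several figures use a Band 8, Band 11, Band 4 false-color composite built from the bands listed above. A minimal sketch, assuming the three bands have already been read and resampled to a common 10 m grid as NumPy reflectance arrays (the percentile stretch is an illustrative choice, not the paper's):

```python
# Sketch: build the B8/B11/B4 false-color composite referenced in the figures.
# b8, b11, b4 are 2-D reflectance arrays on a common grid.
import numpy as np

def stretch(band: np.ndarray, low: float = 2, high: float = 98) -> np.ndarray:
    """Percentile stretch to [0, 1] for display."""
    lo, hi = np.percentile(band, [low, high])
    return np.clip((band - lo) / (hi - lo + 1e-12), 0, 1)

def false_color(b8: np.ndarray, b11: np.ndarray, b4: np.ndarray) -> np.ndarray:
    """Stack NIR (B8), SWIR-1 (B11), and Red (B4) as an RGB composite."""
    return np.dstack([stretch(b8), stretch(b11), stretch(b4)])
```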
Types | Number of Field Samples | Number of Expanded Samples | Training (50%) | Validation (5%) | Test (45%) |
---|---|---|---|---|---|
Peanut | 1501 | 112,134 | 56,067 | 5607 | 50,460 |
Corn | 1528 | 69,185 | 34,593 | 3459 | 31,133 |
Rice | 1045 | 76,966 | 38,483 | 3849 | 34,635 |
Others | 319 | 128,659 | 64,329 | 6433 | 57,897 |
Total | 4392 | 386,945 | 193,471 | 19,349 | 174,125 |
Band Combination | Peanut–Corn | Peanut–Rice | Corn–Rice |
---|---|---|---|
B4, B3, B2 | 10.6995 | 3.7830 | 7.8800 |
B8, B4, B3 | 14.2403 | 7.2344 | 7.7433 |
B8, B5, B4 | 6.6000 | 13.1003 | 9.2887 |
B8, B6, B4 | 10.5776 | 14.3992 | 7.8531 |
B8, B7, B4 | 6.7159 | 5.3856 | 3.0826 |
B8, B9, B4 | 5.1530 | 4.2164 | 3.9393 |
B8, B11, B4 | 26.5757 | 20.7087 | 6.9893 |
PCA | 13.5527 | 9.1596 | 2.9907 |
Method | Others | Peanut | Corn | Rice | mF1 | KC | OA | Parameters (KB) | Inference Time (min)
---|---|---|---|---|---|---|---|---|---
RF | 0.7246 | 0.8719 | 0.9562 | 0.8076 | 0.8401 | 0.8199 | 0.8521 | 158,851 | 43 |
XGBoost | 0.8050 | 0.9587 | 0.9230 | 0.7950 | 0.8704 | 0.8234 | 0.8741 | 12,456 | 39 |
CNN | 0.6037 | 0.9131 | 0.7051 | 0.8862 | 0.7770 | 0.8217 | 0.7221 | 3581 | 20 |
CNN-RF | 0.8162 | 0.9703 | 0.8199 | 0.9068 | 0.8774 | 0.8936 | 0.8801 | 738,547 | 78 |
S3ANet | 0.9109 | 0.9599 | 0.9329 | 0.9182 | 0.9305 | 0.9358 | 0.9305 | 24,250 | 52 |
SPTNet | 0.9624 | 0.9730 | 0.9664 | 0.9590 | 0.9652 | 0.9531 | 0.9656 | 9731 | 45 |
Method | Metrics | Others | Peanut | Corn | Rice
---|---|---|---|---|---
RF | PA | 0.6074 | 0.9452 | 0.9705 | 0.8851
RF | UA | 0.8980 | 0.8092 | 0.9424 | 0.7426
XGBoost | PA | 0.7395 | 0.9646 | 0.9397 | 0.8831
XGBoost | UA | 0.8833 | 0.9530 | 0.9069 | 0.7229
CNN | PA | 0.4428 | 0.9667 | 0.9653 | 0.9510
CNN | UA | 0.9484 | 0.8653 | 0.5554 | 0.8298
CNN-RF | PA | 0.6974 | 0.9729 | 0.9662 | 0.9728
CNN-RF | UA | 0.9734 | 0.9678 | 0.7121 | 0.8493
S3ANet | PA | 0.8845 | 0.9376 | 0.9533 | 0.9756
S3ANet | UA | 0.9391 | 0.9834 | 0.9135 | 0.8672
SPTNet | PA | 0.9639 | 0.9689 | 0.9696 | 0.9599
SPTNet | UA | 0.9610 | 0.9773 | 0.9634 | 0.9582
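PA and UA in this table are, respectively, the per-class recall and precision derived from the confusion matrix. A short sketch of that computation with scikit-learn (the class ordering below is assumed for illustration only):

```python
# Sketch: per-class PA (producer accuracy = recall) and UA (user accuracy = precision)
# from reference and predicted labels.
import numpy as np
from sklearn.metrics import confusion_matrix

def pa_ua(y_true, y_pred, labels=("others", "peanut", "corn", "rice")):
    cm = confusion_matrix(y_true, y_pred, labels=list(labels))  # rows = reference, cols = predicted
    pa = np.diag(cm) / cm.sum(axis=1)  # correct / reference totals  -> producer accuracy
    ua = np.diag(cm) / cm.sum(axis=0)  # correct / predicted totals  -> user accuracy
    return {c: {"PA": p, "UA": u} for c, p, u in zip(labels, pa, ua)}
```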
Baseline | SPM | TabNet | Multitask | SLIC | Others | Peanut | Corn | Rice | mF1 | KC | OA
---|---|---|---|---|---|---|---|---|---|---|---
✓ | | | | | 0.6037 | 0.9131 | 0.7051 | 0.8862 | 0.7770 | 0.8217 | 0.7891
✓ | ✓ | | | | 0.8274 | 0.9369 | 0.9675 | 0.8723 | 0.9010 | 0.8836 | 0.9256
✓ | ✓ | ✓ | | | 0.9332 | 0.9525 | 0.9492 | 0.9434 | 0.9446 | 0.9226 | 0.9435
✓ | ✓ | ✓ | ✓ | | 0.9411 | 0.9698 | 0.9562 | 0.9409 | 0.9520 | 0.9530 | 0.9522
✓ | ✓ | ✓ | ✓ | ✓ | 0.9624 | 0.9730 | 0.9664 | 0.9590 | 0.9652 | 0.9531 | 0.9656
Baseline | SPM | TabNet | Multitask | SLIC | Metrics | Others | Peanut | Corn | Rice
---|---|---|---|---|---|---|---|---|---
✓ | | | | | PA | 0.4428 | 0.9667 | 0.9653 | 0.9510
✓ | | | | | UA | 0.9484 | 0.8653 | 0.5554 | 0.8298
✓ | ✓ | | | | PA | 0.7576 | 0.9552 | 0.9705 | 0.9551
✓ | ✓ | | | | UA | 0.9115 | 0.9194 | 0.9646 | 0.8028
✓ | ✓ | ✓ | | | PA | 0.9790 | 0.9303 | 0.9230 | 0.9217
✓ | ✓ | ✓ | | | UA | 0.8916 | 0.9758 | 0.9771 | 0.9663
✓ | ✓ | ✓ | ✓ | | PA | 0.9236 | 0.9541 | 0.9624 | 0.9639
✓ | ✓ | ✓ | ✓ | | UA | 0.9694 | 0.9708 | 0.9433 | 0.9228
✓ | ✓ | ✓ | ✓ | ✓ | PA | 0.9639 | 0.9689 | 0.9696 | 0.9599
✓ | ✓ | ✓ | ✓ | ✓ | UA | 0.9610 | 0.9773 | 0.9634 | 0.9582
SPM Split Strategy | Others | Peanut | Corn | Rice | mF1 | KC | OA
---|---|---|---|---|---|---|---
Split with convolutional kernel size | 0.9341 | 0.9572 | 0.9535 | 0.9470 | 0.9480 | 0.9290 | 0.9475
Split with patch size | 0.9624 | 0.9730 | 0.9664 | 0.9590 | 0.9653 | 0.9531 | 0.9656
3 × 3 Pixel Patch | 5 × 5 Pixel Patch | 7 × 7 Pixel Patch | Others | Peanut | Corn | Rice | mF1 | KC | OA |
---|---|---|---|---|---|---|---|---|---|
✓ | ✓ | 0.9286 | 0.9634 | 0.9534 | 0.9283 | 0.9549 | 0.9312 | 0.9502 | |
✓ | ✓ | 0.9452 | 0.9667 | 0.9638 | 0.9375 | 0.9534 | 0.9374 | 0.9537 | |
✓ | ✓ | 0.9332 | 0.9525 | 0.9492 | 0.9434 | 0.9556 | 0.9226 | 0.9535 | |
✓ | ✓ | ✓ | 0.9624 | 0.9730 | 0.9664 | 0.9590 | 0.9653 | 0.9531 | 0.9656 |
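The two tables above concern how the selective patch module splits and fuses information from 3 × 3, 5 × 5, and 7 × 7 patches. The published SPTNet layer details are not reproduced here; the PyTorch sketch below only illustrates the general idea of attention-weighted fusion of multi-size patch branches, and all layer sizes, channel counts, and names are hypothetical placeholders rather than the authors' implementation.

```python
# Illustrative sketch (not the authors' code): fuse features extracted from 3x3, 5x5
# and 7x7 patches with learned attention weights over the three branches.
import torch
import torch.nn as nn

class MultiPatchFusion(nn.Module):
    def __init__(self, in_channels: int = 12, feat_dim: int = 64):
        super().__init__()
        # One small conv branch per patch size; global pooling removes the spatial extent.
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_channels, feat_dim, kernel_size=3, padding=1),
                          nn.ReLU(), nn.AdaptiveAvgPool2d(1))
            for _ in range(3)
        ])
        # Attention head: one weight per branch from the concatenated branch features.
        self.attn = nn.Sequential(nn.Linear(3 * feat_dim, 3), nn.Softmax(dim=-1))

    def forward(self, p3: torch.Tensor, p5: torch.Tensor, p7: torch.Tensor) -> torch.Tensor:
        feats = [b(p).flatten(1) for b, p in zip(self.branches, (p3, p5, p7))]  # 3 x (B, feat_dim)
        weights = self.attn(torch.cat(feats, dim=1))                            # (B, 3)
        fused = sum(w.unsqueeze(1) * f for w, f in zip(weights.unbind(dim=1), feats))
        return fused                                                            # (B, feat_dim)

# Example: a batch of 8 samples with 12 Sentinel-2 bands and three patch sizes.
x3, x5, x7 = (torch.randn(8, 12, s, s) for s in (3, 5, 7))
fused = MultiPatchFusion()(x3, x5, x7)  # -> torch.Size([8, 64])
```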