Classifying Individual Shrub Species in UAV Images—A Case Study of the Gobi Region of Northwest China
"> Figure 1
<p>Location of the study area (yellow rectangle) in the Gobi region. The backdrop is a Landsat-8 satellite image.</p> "> Figure 2
<p>Unmanned aerial vehicle (UAV) images processing of shrub vegetation alliances used for shrub species classification experiments and the sampling plots within the study area.</p> "> Figure 3
<p>Ground-based photos of different shrub species (<b>a</b>–<b>d</b>) and their corresponding unmanned aerial vehicle (UAV) images (<b>e</b>–<b>h</b>).</p> "> Figure 4
<p>Workflow for classification of individual shrub species. Abbreviations used in the classification step: excess green red index (ExGR), green blue ratio index (GBRI), vegetative index (VEG), red green blue vegetation index (RGBVI), excess green index (ExG), k-nearest neighbors (k-NN), support vector machine (SVM), random forest (RF), principal component analysis (PCA). The A-PC, B-PC, C-PC, and D-PC represent the sets of variables used in the classification experiments of shrub species after principal component analysis (PCA) of layers A, B, C, and D, respectively.</p> "> Figure 5
<p>Example of the multi-scale segmentation results with various scale, shape, and tightness parameters. (<b>a</b>) Segmentation results of various scales (shape = 0.5, tightness = 0.5). (<b>b</b>) Segmentation results of various shapes (scale = 80, tightness = 0.5). (<b>c</b>) Segmentation results of various tightness parameters (scale = 80, shape = 0.3).</p> "> Figure 6
<p>Results of extraction using different vegetation indices in the Gobi region (30 × 30 m). The white color is vegetation area, and black is non-vegetation area. Abbreviations: excess green red index (ExGR), green blue ratio index (GBRI), vegetative index (VEG), red green blue vegetation index (RGBVI), excess green index (ExG).</p> "> Figure 7
<p>Unmanned aerial vehicle (UAV) RGB image of the three shrub vegetation alliances (EPPR–SAXA–GYPR; EPPR–SALA; and EPPR–SALA–GYPR) and the corresponding prediction results for the four sets of variables (A-PC, B-PC, C-PC, and D-PC) generated by the random forest (RF) model.</p> "> Figure 8
<p>Spatial distribution of shrub species across different areas (<b>top</b>, <b>middle</b>, and <b>bottom</b>) of alluvial fans.</p> ">
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Field Sampling Data
2.3. Acquisition and Pre-Processing of UAV RGB Images
2.4. Fine Scale Classification of Individual Shrub Species
2.4.1. UAV RGB Image Segmentation
2.4.2. Variable Derivation for Classification
2.4.3. Training and Testing Data Construction
2.4.4. Shrub Species Classification
2.4.5. Classification Model Validation
3. Results
3.1. Image Segmentation Parameter Selection
3.2. Classification Results for Vegetation and Non-Vegetation
3.3. Classification Results for Individual Shrub Species
4. Discussion
4.1. Use of Multi-Scale Segmentation to Segment UAV Images
4.2. Individual Shrub-Based Species Classification
4.3. Spatial Distribution of Vegetation in Different Areas of Alluvial Fans
4.4. Advantages and Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Appendix A
Type | Variable | Description | Formula
---|---|---|---
Gray-level co-occurrence matrix (GLCM) indices | Mean | The average of the gray-level values in the image window. |
 | Variance | Measures texture heterogeneity; variance increases when the gray-level values differ from their mean. |
 | Homogeneity | A measure of homogeneity; sensitive to the presence of near-diagonal elements in the GLCM. |
 | Contrast | Measures abrupt changes in gray level between contiguous pixels; a low-contrast image features low spatial frequencies. |
 | Dissimilarity | Similar to contrast, but instead of weighting the diagonal terms exponentially, the weights increase linearly. |
 | Entropy | A measure of disorder in the image, strongly (inversely) related to energy; entropy is high when an image is not texturally uniform. |
 | Energy | Measures texture uniformity, i.e., pixel-pair repetitions; energy is high when the distribution of gray-level values is constant or periodic. |
 | Correlation | Measures the linear dependency of gray levels in the image; high values indicate a linear relationship between the gray levels of contiguous pixel pairs. |
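For reference, the conventional Haralick definitions matching the descriptions above are sketched below, where P(i,j) is the normalized co-occurrence probability of gray levels i and j and the GLCM means and standard deviations are denoted by mu and sigma. These standard forms are assumed here rather than taken from the original table.

```latex
% Standard Haralick GLCM measures (assumed forms); P_{i,j}: normalized co-occurrence probability
\begin{aligned}
\text{Mean: }          & \mu_i = \sum_{i,j=0}^{N-1} i\,P_{i,j}                   &
\text{Variance: }      & \sigma_i^2 = \sum_{i,j=0}^{N-1} P_{i,j}\,(i-\mu_i)^2     \\
\text{Homogeneity: }   & \sum_{i,j=0}^{N-1} \frac{P_{i,j}}{1+(i-j)^2}             &
\text{Contrast: }      & \sum_{i,j=0}^{N-1} P_{i,j}\,(i-j)^2                      \\
\text{Dissimilarity: } & \sum_{i,j=0}^{N-1} P_{i,j}\,\lvert i-j\rvert             &
\text{Entropy: }       & -\sum_{i,j=0}^{N-1} P_{i,j}\,\ln P_{i,j}                 \\
\text{Energy: }        & \sum_{i,j=0}^{N-1} P_{i,j}^{2}                           &
\text{Correlation: }   & \sum_{i,j=0}^{N-1} P_{i,j}\,\frac{(i-\mu_i)(j-\mu_j)}{\sigma_i\,\sigma_j}
\end{aligned}
```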
Name | Formula
---|---
Producer’s Accuracy (PA) |
User’s Accuracy (UA) |
Overall Accuracy (OA) |
Kappa Coefficient (Kappa) |
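For a k-class confusion matrix with entries n(i,j) (row i = reference class, column j = predicted class), row totals n(i+), column totals n(+i), and N total samples, the standard definitions of these four measures are assumed to be:

```latex
% Assumed standard forms; n_{ij}: confusion-matrix entry, n_{i+}: row total,
% n_{+i}: column total, N: total number of samples, k: number of classes
\begin{aligned}
\mathrm{PA}_i &= \frac{n_{ii}}{n_{i+}}, \qquad
\mathrm{UA}_i = \frac{n_{ii}}{n_{+i}}, \qquad
\mathrm{OA} = \frac{1}{N}\sum_{i=1}^{k} n_{ii}, \\
\kappa &= \frac{N\sum_{i=1}^{k} n_{ii}-\sum_{i=1}^{k} n_{i+}\,n_{+i}}{N^{2}-\sum_{i=1}^{k} n_{i+}\,n_{+i}}
\end{aligned}
```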
Appendix B
Layer | Type | Variable | Equation | Reference
---|---|---|---|---
1 | Original bands | blue band (B) | — | [13]
2 | | green band (G) | — |
3 | | red band (R) | — |
4 | Spectral indices | blue green ratio index (BGRI) | B/G | [64]
5 | | blue ratio (Bratio) | B/(B + G + R) | [65]
6 | | blue red ratio index (BRRI) | B/R | [64]
7 | | excess blue index (ExB) | 1.4 × Bratio − Gratio | [66]
8 | | excess green index (ExG) | 2 × Gratio − Rratio − Bratio | [66,67]
9 | | excess green red index (ExGR) | ExG − ExR | [33]
10 | | excess red index (ExR) | 1.4 × Rratio − Gratio | [66]
11 | | green blue ratio index (GBRI) | G/B | [65]
12 | | green ratio (Gratio) | G/(B + G + R) | [65]
13 | | green red ratio index (GRRI) | G/R | [65]
14 | | Kawashima index (IKAW) | (R − B)/(R + B) | [68]
15 | | color intensity index (INT) | (R + G + B)/3 | [65]
16 | | modified green red vegetation index (MGRVI) | (G² − R²)/(G² + R²) | [69]
17 | | modified VARI (MVARI) | (G − B)/(G + R − B) | [24]
18 | | normalized green blue difference index (NGBDI) | (G − B)/(G + B) | [70]
19 | | red blue ratio index (RBRI) | R/B | [65]
20 | | red green blue vegetation index (RGBVI) | (G² − R × B)/(G² + R × B) | [66]
21 | | red ratio (Rratio) | R/(B + G + R) | [65]
22 | | triangular greenness index (TGI) | G − (0.39 × R) − (0.61 × B) | [71]
23 | | visible atmospherically resistant index (VARI) | (G − R)/(G + R − B) | [24]
24 | | vegetative index (VEG) | G/(R^0.667 × B^0.333) | [72]
25 | | visible-band difference vegetation index (VDVI) | (2 × G − R − B)/(2 × G + R + B) | [24]
26 | | Woebbecke index (WI) | (G − B)/(G + R) | [67]
27–34 | Texture indices | B_GLCM (mean, variance, homogeneity, contrast, dissimilarity, entropy, energy, correlation) | — | [45]
35–42 | | G_GLCM (mean, variance, homogeneity, contrast, dissimilarity, entropy, energy, correlation) | — |
43–50 | | R_GLCM (mean, variance, homogeneity, contrast, dissimilarity, entropy, energy, correlation) | — |
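A minimal sketch of how a few of the visible-band indices in this table can be computed from the red, green, and blue bands of a UAV orthomosaic. The function name and the NumPy implementation are illustrative assumptions, not the processing chain used in the study.

```python
import numpy as np

def rgb_indices(R, G, B, eps=1e-6):
    """Compute a subset of the Appendix B visible-band indices from float band arrays."""
    total = R + G + B + eps
    r, g, b = R / total, G / total, B / total        # Rratio, Gratio, Bratio (layers 21, 12, 5)
    exg = 2 * g - r - b                              # excess green index (ExG, layer 8)
    exr = 1.4 * r - g                                # excess red index (ExR, layer 10)
    exgr = exg - exr                                 # excess green red index (ExGR, layer 9)
    rgbvi = (G**2 - R * B) / (G**2 + R * B + eps)    # red green blue vegetation index (layer 20)
    veg = G / (R**0.667 * B**0.333 + eps)            # vegetative index (VEG, layer 24)
    return {"ExG": exg, "ExGR": exgr, "RGBVI": rgbvi, "VEG": veg}

# Toy example: a random 2 x 2 patch standing in for orthomosaic bands scaled to [0, 1].
rng = np.random.default_rng(0)
R, G, B = rng.random((3, 2, 2))
print({k: v.round(3) for k, v in rgb_indices(R, G, B).items()})
```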
Appendix C
Variable Set | Principal Component | Variance (%)
---|---|---
A-PC | PC1 | 87.87
 | PC2 | 8.22
 | PC3 | 2.22
 | PC4 | 1.06
 | PC5 | 0.49
 | PC6 | 0.14
B-PC | PC1 | 85.67
 | PC2 | 8.24
 | PC3 | 2.03
 | PC4 | 1.28
 | PC5 | 0.52
 | PC6 | 0.36
C-PC | PC1 | 82.03
 | PC2 | 9.92
 | PC3 | 1.76
 | PC4 | 0.74
 | PC5 | 0.31
 | PC6 | 0.25
D-PC | PC1 | 84.74
 | PC2 | 10.46
 | PC3 | 1.37
 | PC4 | 0.42
 | PC5 | 0.40
 | PC6 | 0.23
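A minimal sketch of the PCA step that produces tables like the one above, assuming the per-segment variables of Appendix B are stacked into an (n_segments × n_variables) matrix. The scikit-learn calls and the random stand-in data are illustrative assumptions, not the study's actual implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for the (n_segments x n_variables) matrix of Appendix B variables.
rng = np.random.default_rng(42)
X = rng.random((500, 50))

pca = PCA(n_components=6)                                  # keep six components, as reported above
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print((pca.explained_variance_ratio_ * 100).round(2))      # variance explained (%) per component
```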
Appendix D
Confusion matrices for the EPPR–SAXA–GYPR alliance (rows: reference samples; columns: predicted samples).

Variable Set | Class | Producer Accuracy | Number of Samples | Ephedra przewalskii | Sarcozygium xanthoxylon | Gymnocarpos przewalskii
---|---|---|---|---|---|---
A-PC | Ephedra przewalskii | 0.83 | 176 | 146 | 21 | 9
 | Sarcozygium xanthoxylon | 0.43 | 68 | 31 | 29 | 8
 | Gymnocarpos przewalskii | 0.50 | 44 | 14 | 8 | 22
 | Total | | 288 | 191 | 58 | 39
 | User Accuracy | | | 0.76 | 0.50 | 0.56
 | Overall Accuracy (197/288) = 68.40% | | | Kappa = 0.40 | |
B-PC | Ephedra przewalskii | 0.89 | 176 | 156 | 13 | 7
 | Sarcozygium xanthoxylon | 0.54 | 68 | 27 | 37 | 4
 | Gymnocarpos przewalskii | 0.70 | 44 | 8 | 5 | 31
 | Total | | 288 | 191 | 55 | 42
 | User Accuracy | | | 0.82 | 0.67 | 0.74
 | Overall Accuracy (224/288) = 77.78% | | | Kappa = 0.58 | |
C-PC | Ephedra przewalskii | 0.92 | 176 | 162 | 9 | 5
 | Sarcozygium xanthoxylon | 0.66 | 68 | 9 | 45 | 14
 | Gymnocarpos przewalskii | 0.68 | 44 | 1 | 13 | 30
 | Total | | 288 | 172 | 67 | 49
 | User Accuracy | | | 0.94 | 0.67 | 0.61
 | Overall Accuracy (237/288) = 82.29% | | | Kappa = 0.68 | |
D-PC | Ephedra przewalskii | 0.92 | 176 | 162 | 10 | 4
 | Sarcozygium xanthoxylon | 0.68 | 68 | 19 | 46 | 3
 | Gymnocarpos przewalskii | 0.75 | 44 | 7 | 4 | 33
 | Total | | 288 | 188 | 60 | 40
 | User Accuracy | | | 0.86 | 0.77 | 0.83
 | Overall Accuracy (237/288) = 82.29% | | | Kappa = 0.68 | |
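As a cross-check, the overall accuracy and Kappa reported for the A-PC block above can be reproduced directly from its confusion matrix. The short NumPy sketch below is illustrative and uses only the numbers already given in the table.

```python
import numpy as np

# A-PC confusion matrix from the table above
# (rows = reference: E. przewalskii, S. xanthoxylon, G. przewalskii; columns = predicted).
cm = np.array([[146, 21, 9],
               [ 31, 29, 8],
               [ 14,  8, 22]])

N = cm.sum()
oa = np.trace(cm) / N                                   # overall accuracy
pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / N**2     # expected chance agreement
kappa = (oa - pe) / (1 - pe)
print(f"OA = {oa:.2%}, Kappa = {kappa:.2f}")            # OA = 68.40%, Kappa = 0.40
```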
Confusion matrices for the EPPR–SALA alliance (rows: reference samples; columns: predicted samples).

Variable Set | Class | Producer Accuracy | Number of Samples | Ephedra przewalskii | Salsola laricifolia
---|---|---|---|---|---
A-PC | Ephedra przewalskii | 0.70 | 137 | 96 | 41
 | Salsola laricifolia | 0.71 | 148 | 43 | 105
 | Total | | 285 | 139 | 146
 | User Accuracy | | | 0.69 | 0.72
 | Overall Accuracy (201/285) = 70.53% | | | Kappa = 0.41 |
B-PC | Ephedra przewalskii | 0.80 | 137 | 109 | 28
 | Salsola laricifolia | 0.84 | 148 | 23 | 125
 | Total | | 285 | 132 | 153
 | User Accuracy | | | 0.83 | 0.82
 | Overall Accuracy (234/285) = 82.11% | | | Kappa = 0.64 |
C-PC | Ephedra przewalskii | 0.74 | 137 | 101 | 36
 | Salsola laricifolia | 0.76 | 148 | 36 | 112
 | Total | | 285 | 137 | 148
 | User Accuracy | | | 0.74 | 0.76
 | Overall Accuracy (213/285) = 74.74% | | | Kappa = 0.49 |
D-PC | Ephedra przewalskii | 0.92 | 137 | 126 | 11
 | Salsola laricifolia | 0.97 | 148 | 4 | 144
 | Total | | 285 | 130 | 155
 | User Accuracy | | | 0.97 | 0.93
 | Overall Accuracy (270/285) = 94.74% | | | Kappa = 0.89 |
Confusion matrices for the EPPR–SALA–GYPR alliance (rows: reference samples; columns: predicted samples).

Variable Set | Class | Producer Accuracy | Number of Samples | Ephedra przewalskii | Salsola laricifolia | Gymnocarpos przewalskii
---|---|---|---|---|---|---
A-PC | Ephedra przewalskii | 0.73 | 144 | 105 | 25 | 14
 | Salsola laricifolia | 0.74 | 146 | 21 | 108 | 17
 | Gymnocarpos przewalskii | 0.29 | 34 | 2 | 22 | 10
 | Total | | 324 | 128 | 155 | 41
 | User Accuracy | | | 0.82 | 0.70 | 0.24
 | Overall Accuracy (223/324) = 68.83% | | | Kappa = 0.48 | |
B-PC | Ephedra przewalskii | 0.76 | 144 | 110 | 26 | 8
 | Salsola laricifolia | 0.85 | 146 | 10 | 124 | 12
 | Gymnocarpos przewalskii | 0.44 | 34 | 4 | 15 | 15
 | Total | | 324 | 124 | 165 | 35
 | User Accuracy | | | 0.89 | 0.75 | 0.43
 | Overall Accuracy (249/324) = 76.85% | | | Kappa = 0.61 | |
C-PC | Ephedra przewalskii | 0.64 | 144 | 92 | 46 | 6
 | Salsola laricifolia | 0.79 | 146 | 23 | 116 | 7
 | Gymnocarpos przewalskii | 0.65 | 34 | 6 | 6 | 22
 | Total | | 324 | 121 | 168 | 35
 | User Accuracy | | | 0.76 | 0.69 | 0.63
 | Overall Accuracy (230/324) = 70.99% | | | Kappa = 0.51 | |
D-PC | Ephedra przewalskii | 0.90 | 144 | 130 | 5 | 9
 | Salsola laricifolia | 0.89 | 146 | 6 | 130 | 10
 | Gymnocarpos przewalskii | 0.71 | 34 | — | 10 | 24
 | Total | | 324 | 136 | 145 | 43
 | User Accuracy | | | 0.96 | 0.90 | 0.56
 | Overall Accuracy (284/324) = 87.65% | | | Kappa = 0.79 | |
References
- Li, X.R. Study on shrub community diversity of Ordos Plateau, Inner Mongolia, Northern China. J. Arid Environ. 2001, 47, 271–279.
- Hao, H.M.; Huang, Z.; Lu, R.; Jia, C.; Liu, Y.; Liu, B.R.; Wu, G.L. Patches structure succession based on spatial point pattern features in semi-arid ecosystems of the water-wind erosion crisscross region. Glob. Ecol. Conserv. 2017, 12, 158–165.
- von Wehrden, H.; Wesche, K.; Miehe, G. Plant communities of the southern Mongolian Gobi. Phytocoenologia 2009, 39, 331–376.
- Li, F.R.; Liu, J.L.; Liu, C.A.; Niu, R.X. Shrubs and species identity effects on the distribution and diversity of ground-dwelling arthropods in a Gobi desert. J. Insect Conserv. 2013, 17, 319–331.
- Guo, Z.C.; Wang, T.; Liu, S.L.; Kang, W.P.; Chen, X.; Feng, K.; Zhang, X.Q.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land-based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 102239.
- Ripley, B.D. Modeling spatial patterns. J. R. Stat. Soc. 1977, 39, 172–212.
- Huang, H.Y.; Li, X.; Chen, C.C. Individual tree crown detection and delineation from very-high-Resolution UAV images based on Bias field and marker-controlled watershed segmentation algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2253–2262.
- Chang, Y.M.; Baddeley, A.; Wallace, J.; Canci, M. Spatial statistical analysis of tree deaths using airborne digital imagery. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 418–426.
- Salovaara, K.J.; Thessler, S.; Malik, R.N.; Tuomisto, H. Classification of Amazonian primary rain forest vegetation using Landsat ETM+ satellite imagery. Remote Sens. Environ. 2005, 97, 39–51.
- Pant, P.; Heikkinen, V.; Hovi, I.A.; Korpela, I.; Hauta-Kasari, M.; Tokola, T. Evaluation of simulated bands in airborne optical sensors for tree species identification. Remote Sens. Environ. 2013, 138, 27–37.
- Fassnacht, F.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87.
- Ozdarici-Ok, A. Automatic detection and delineation of citrus trees from VHR satellite imagery. Int. J. Remote Sens. 2015, 36, 4275–4296.
- Zhang, X.L.; Zhang, F.; Qi, Y.X.; Deng, L.F.; Wang, X.L.; Yang, S.T. New research methods for vegetation information extraction based on visible light remote sensing images from an unmanned aerial vehicle (UAV). Int. J. Appl. Earth Obs. Geoinf. 2019, 78, 215–226.
- Picos, J.; Bastos, G.; Míguez, D.; Alonso, L.; Armesto, J. Individual tree detection in a eucalyptus plantation using unmanned aerial vehicle (UAV)-LiDAR. Remote Sens. 2020, 12, 885.
- Onishi, M.; Ise, T. Explainable identification and mapping of trees using UAV RGB image and deep learning. Sci. Rep. 2021, 11, 903.
- Marcial-Pablo, M.d.J.; Gonzalez-Sanchez, A.; Jimenez-Jimenez, S.I.; Ontiveros-Capurata, R.E.; Ojeda-Bustamante, W. Estimation of vegetation fraction using RGB and multispectral images from UAV. Int. J. Remote Sens. 2019, 40, 420–438.
- Riaño, D.; Chuvieco, E.; Condés, S.; González-Matesanz, J.; Ustin, S.L. Generation of crown bulk density for Pinus sylvestris L. from lidar. Remote Sens. Environ. 2004, 92, 345–352.
- Suratno, A.; Seielstad, C.; Queen, L. Tree species identification in mixed coniferous forest using airborne laser scanning. ISPRS J. Photogramm. Remote Sens. 2009, 64, 683–693.
- Korpela, I.; Ørka, H.O.; Hyyppä, J.; Heikkinen, V.; Tokola, T. Range and AGC normalization in airborne discrete-return LiDAR intensity data for forest canopies. ISPRS J. Photogramm. Remote Sens. 2010, 65, 369–379.
- Zhang, K.W.; Hu, B.X. Individual urban tree species classification using very high spatial resolution airborne multi-spectral imagery using longitudinal profiles. Remote Sens. 2012, 4, 1741–1757.
- Franklin, S.E. Pixel- and object-based multispectral classification of forest tree species from small unmanned aerial vehicles. J. Unmanned Veh. Syst. 2018, 6, 195–211.
- Koc-San, D.; Selim, S.; Aslan, N.; San, B.T. Automatic citrus tree extraction from UAV images and digital surface models using circular Hough transform. Comput. Electron. Agric. 2018, 150, 289–301.
- Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating tree height and diameter at breast height (DBH) from digital surface models and orthophotos obtained with an unmanned aerial system for a Japanese Cypress (Chamaecyparis obtusa) Forest. Remote Sens. 2018, 10, 13.
- Cen, H.Y.; Wan, L.; Zhu, J.P.; Li, Y.J.; Li, X.R.; Zhu, Y.M.; Weng, H.Y.; Wu, W.K.; Yin, W.X.; Xu, C.; et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 2019, 15, 32.
- Karimi, Y.; Prasher, S.O.; Patel, R.M.; Kim, S.H. Application of support vector machine technology for weed and nitrogen stress detection in corn. Comput. Electron. Agric. 2006, 51, 99–109.
- Safonova, A.; Tabik, S.; Alcaraz-Segura, D.; Rubtsov, A.; Maglinets, Y.; Herrera, F. Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning. Remote Sens. 2019, 11, 643.
- Weinstein, B.G.; Marconi, S.; Bohlman, S.A.; Zare, A.; White, E.P. Cross-site learning in deep learning RGB tree crown detection. Ecol. Inform. 2020, 56, 101061.
- Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual tree detection and classification with UAV-Based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2017, 9, 185.
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
- Räsänen, A.; Juutinen, S.; Tuittila, E.; Aurela, M.; Virtanen, T. Comparing ultra-high spatial resolution remote-sensing methods in mapping peatland vegetation. J. Veg. Sci. 2019, 30, 1016–1026.
- Sheykhmousa, M.; Mahdianpari, M.; Ghanbari, H.; Mohammadimanesh, F.; Ghamisi, P.; Homayouni, S. Support Vector Machine versus Random Forest for Remote Sensing Image Classification: A Meta-Analysis and Systematic Review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6308–6325.
- Talukdar, S.; Singha, P.; Mahato, S.; Shahfahad; Pal, S.; Liou, Y.-A.; Rahman, A. Land-Use Land-Cover Classification by Machine Learning Classifiers for Satellite Observations—A Review. Remote Sens. 2020, 12, 1135.
- Niederheiser, R.; Rutzinger, M.; Bremer, M.; Wichmann, V. Dense image matching of terrestrial imagery for deriving high-resolution topographic properties of vegetation locations in alpine terrain. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 146–158.
- Tian, X.M.; Chen, L.; Zhang, X.L. Classifying tree species in the plantations of southern China based on wavelet analysis and mathematical morphology. Comput. Geosci. 2021, 151, 104757.
- Zhao, C.H.; Gao, B.; Zhang, L.J.; Wan, X.Q. Classification of Hyperspectral Imagery based on spectral gradient, SVM and spatial random forest. Infrared Phys. Technol. 2018, 95, 61–69.
- Xu, J.; Gu, H.B.; Meng, Q.M.; Cheng, J.H.; Liu, Y.H.; Jiang, P.A.; Sheng, J.D.; Deng, J.; Bai, X. Spatial pattern analysis of Haloxylon ammodendron using UAV imagery—A case study in the Gurbantunggut Desert. Int. J. Appl. Earth Obs. Geoinf. 2019, 83, 101891.
- Mellor, A.; Haywood, A.; Stone, C.; Jones, S. The Performance of Random Forests in an Operational Setting for Large Area Sclerophyll Forest Classification. Remote Sens. 2013, 5, 2838–2856.
- Alvarez-Taboada, F.; Paredes, C.; Julián-Pelaz, J. Mapping of the invasive species Hakea sericea using Unmanned Aerial Vehicle (UAV) and WorldView-2 imagery and an object-oriented approach. Remote Sens. 2017, 9, 913.
- Mallinis, G.; Koutsias, N.; Tsakiri-Strati, M.; Karteris, M. Object-based classification using Quickbird imagery for delineating forest vegetation polygons in a Mediterranean test site. ISPRS J. Photogramm. Remote Sens. 2008, 63, 237–250.
- Salamí, E.; Barrado, C.; Pastor, E. UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas. Remote Sens. 2014, 6, 11051–11081.
- Gebreslasie, M.T.; Ahmed, F.B.; van Aardt, J.A.N. Extracting structural attributes from IKONOS imagery for Eucalyptus plantation forests in KwaZulu-Natal, South Africa, using image texture analysis and artificial neural networks. Int. J. Remote Sens. 2011, 32, 7677–7701.
- Charoenjit, K.; Zuddas, P.; Allemand, P.; Pattanakiat, S.; Pachana, K. Estimation of biomass and carbon stock in Para rubber plantations using object-based classification from Thaichote satellite data in Eastern Thailand. J. Appl. Remote Sens. 2015, 9, 096072.
- Herold, M.; Liu, X.H.; Clarke, K.C. Spatial Metrics and Image Texture for Mapping Urban Land Use. Photogramm. Eng. Remote Sens. 2003, 11, 991–1001.
- Wood, E.M.; Pidgeon, A.M.; Radeloff, V.C.; Keuler, N.S. Image texture as a remotely sensed measure of vegetation structure. Remote Sens. Environ. 2012, 121, 516–526.
- Park, Y.; Guldmann, J.M. Measuring continuous landscape patterns with Gray-Level Co-Occurrence Matrix (GLCM) indices: An alternative to patch metrics? Ecol. Indic. 2020, 109, 105802.
- Zhang, F.; Yang, X.J. Improving land cover classification in an urbanized coastal area by random forests: The role of variable selection. Remote Sens. Environ. 2020, 251, 112105.
- Simonetti, E.; Simonetti, D.; Preatoni, D. Phenology-Based Land Cover Classification Using Landsat 8 Time Series; European Commission Joint Research Center: Ispra, Italy, 2014.
- Wicaksono, P. Improving the accuracy of Multispectral-based benthic habitats mapping using image rotations: The application of Principle Component Analysis and Independent Component Analysis. Eur. J. Remote Sens. 2016, 49, 433–463.
- Herkül, K.; Kotta, J.; Kutser, T.; Vahtmäe, E. Relating Remotely Sensed Optical Variability to Marine Benthic Biodiversity. PLoS ONE 2013, 8, e55624.
- Pu, R.L.; Liu, D.S. Segmented canonical discriminant analysis of in situ hyperspectral data for identifying 13 urban tree species. Int. J. Remote Sens. 2011, 32, 2207–2226.
- Immitzer, M.; Atzberger, C.; Koukal, T. Tree species classification with random Forest using very high spatial resolution 8-Band WorldView-2 satellite data. Remote Sens. 2012, 4, 2661–2693.
- Ke, Y.; Quackenbush, L.J.; Im, J. Synergistic use of QuickBird multispectral imagery and LIDAR data for object-based forest species classification. Remote Sens. Environ. 2010, 114, 1141–1154.
- Brandt, M.; Tucker, C.J.; Kariryaa, A.; Rasmussen, K.; Abel, C.; Small, J.; Chave, J.; Rasmussen, L.V.; Hiernaux, P.; Diouf, A.A.; et al. An unexpectedly large count of trees in the West African Sahara and Sahel. Nature 2020, 587, 78–82.
- Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.; Tiede, D. Geographic Object-Based Image Analysis—Towards a new paradigm. ISPRS Int. J. Geo-Inf. 2014, 87, 180–191.
- Cao, J.J.; Leng, W.C.; Liu, K.; Liu, L.; He, Z.; Zhu, Y.H. Object-based mangrove species classification using unmanned aerial vehicle hyperspectral images and digital surface models. Remote Sens. 2018, 10, 89.
- Robson, B.A.; Bolch, T.; MacDonell, S.; Hölbling, D.; Rastner, P.; Schaffer, N. Automated detection of rock glaciers using deep learning and object-based image analysis. Remote Sens. Environ. 2020, 250, 112033.
- Tang, Y.W.; Jing, L.H.; Li, H.; Atkinson, P.M. A multiple-point spatially weighted k-NN method for object-based classification. Int. J. Appl. Earth Obs. Geoinf. 2016, 52, 263–274.
- Machala, M.; Zejdová, L. Forest mapping through object-based image analysis of multispectral and LiDAR aerial data. Eur. J. Remote Sens. 2014, 47, 117–131.
- Korznikov, K.A.; Kislov, D.E.; Altman, J.; Doležal, J.; Vozmishcheva, A.S.; Krestov, P.V. Using U-Net-Like Deep Convolutional Neural Networks for Precise Tree Recognition in Very High Resolution RGB (Red, Green, Blue) Satellite Images. Forests 2021, 12, 66.
- Ma, L.; Li, M.; Ma, X.; Cheng, L.; Du, P.; Liu, Y. A review of supervised object-based land-cover image classification. ISPRS J. Photogramm. Remote Sens. 2017, 130, 277–293.
- Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. 2010, 65, 2–16.
- Hossain, M.D.; Chen, D. Segmentation for object-based image analysis (OBIA): A review of algorithms and challenges from remote sensing perspective. ISPRS J. Photogramm. 2019, 150, 115–134.
- Johnson, B.A.; Ma, L. Image Segmentation and Object-Based Image Analysis for Environmental Monitoring: Recent Areas of Interest, Researchers’ Views on the Future Priorities. Remote Sens. 2020, 12, 1772.
- Yue, J.B.; Feng, H.K.; Jin, X.L.; Yuan, H.H.; Li, Z.H.; Zhou, C.Q.; Yang, G.J.; Tian, Q.J. A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera. Remote Sens. 2018, 10, 1138.
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation Index Weighted Canopy Volume Model (CVMVI) for soybean biomass estimation from Unmanned Aerial System-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41.
- Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
- Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Color Indices for Weed Identification under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269.
- Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54.
- Nie, S.; Wang, C.; Dong, P.; Xi, X.; Zhou, H. Estimating leaf area index of maize using airborne discrete-return lidar data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 3259–3266.
- Hunt, E.R.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E.; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 2005, 6, 359–378.
- Hunt, E.R.; Daughtry, C.S.T.; Mirsky, S.B.; Hively, W.D. Remote sensing with simulated unmanned aircraft imagery for precision agriculture applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4566–4571.
- Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric. 2006, 7, 21–32.
- Niazmardi, S.; Homayouni, S.; Safari, A.; McNairn, H.; Shang, J.L.; Beckett, K. Histogram-based spatio-temporal feature classification of vegetation indices time-series for crop mapping. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 34–41.
- Kuhn, M. Building Predictive Models in R Using the caret Package. J. Stat. Softw. 2008, 28, 1–26.
- Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High throughput determination of plant height, ground cover, and above-ground biomass in Wheat with LiDAR. Front. Plant Sci. 2018, 9, 237.
- Jing, L.H.; Hu, B.X.; Noland, T.; Li, J.L. An individual tree crown delineation method based on multi-scale segmentation of imagery. J. Photogramm. Remote Sens. 2012, 70, 88–98.
- Wang, H.Y.; Shen, Z.F.; Zhang, Z.H.; Xu, Z.Y.; Li, S.; Jiao, S.H.; Lei, Y.T. Improvement of Region-Merging Image Segmentation Accuracy Using Multiple Merging Criteria. Remote Sens. 2021, 13, 2782.
- Han, L.; Yang, G.J.; Dai, H.Y.; Xu, B.; Yang, H.; Feng, H.K.; Li, Z.H.; Yang, X.D. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 1–19.
- Liu, Y.N.; Liu, S.S.; Li, J.; Guo, X.Y.; Wang, S.Q.; Lu, J.W. Estimating biomass of winter oilseed rape using vegetation indices and texture metrics derived from UAV multispectral images. Comput. Electron. Agric. 2019, 166, 105026.
- Yan, S.; Yao, X.C.; Zhu, D.H.; Liu, D.Y.; Zhang, L.; Yu, G.J.; Gao, B.B.; Yang, J.Y.; Yun, W.J. Large-scale crop mapping from multi-source optical satellite imageries using machine learning with discrete grids. Int. J. Appl. Earth Obs. 2021, 103, 102485.
- Casapia, X.T.; Falen, L.; Bartholomeus, H.; Cárdenas, R.; Flores, G.; Herold, M.; Coronado, E.N.H.; Baker, T.R. Identifying and Quantifying the Abundance of Economically Important Palms in Tropical Moist Forest Using UAV Imagery. Remote Sens. 2019, 12, 9.
- Peerbhay, K.Y.; Mutanga, O.; Ismail, R. Investigating the capability of few strategically placed worldview-2 multispectral bands to discriminate Forest species in KwaZulu-Natal, South Africa. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 307–316.
- Li, Y.S.; Li, Q.T.; Liu, Y.; Xie, W.X. A spatial-spectral SIFT for hyperspectral image matching and classification. Pattern Recogn. Lett. 2019, 127, 18–26.
- Ise, T.; Minagawa, M.; Onishi, M. Identifying 3 moss species by deep learning, using the “chopped picture” method. Open J. Ecol. 2018, 8, 166–173.
- Reichstein, M.; Camps-Valls, G.; Stevens, B.; Jung, M.; Denzler, J.; Carvalhais, N. Deep learning and process understanding for data-driven Earth system science. Nature 2019, 566, 195–204.
- Ma, L.; Liu, Y.; Zhang, X.; Ye, Y.; Yin, G.; Johnson, B.A. Deep learning in remote sensing applications: A meta-analysis and review. ISPRS J. Photogramm. Remote Sens. 2019, 152, 166–177.
- Minaee, S.; Boykov, Y.Y.; Porikli, F.; Plaza, A.J.; Kehtarnavaz, N.; Terzopoulos, D. Image Segmentation Using Deep Learning: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2021.
- Muhammad, Z.K.; Mohan, K.G.; Yugyung, L.; Muazzam, A.K. Deep Neural Architectures for Medical Image Semantic Segmentation: Review. Access IEEE 2021, 9, 83002–83024.
Class | Description | Training (Number of Segments) | Testing (Number of Segments)
---|---|---|---
Ephedra przewalskii | The average height is 0.41 m and the average crown width is 1 m. Growth differs greatly between individual plants. In UAV images, individual shrubs mostly appear yellowish green. | 1067 | 457
Salsola laricifolia | The average height is 0.25 m and the average crown width is 0.49 m. It often grows in association with individual Ephedra przewalskii plants. In UAV images, individual shrubs mostly appear dark green. | 686 | 294
Sarcozygium xanthoxylon | A shrub with few leaves and many thick branches. The average height is 0.47 m and the average crown width is 0.83 m. In UAV images, individual shrubs mostly appear light green. | 159 | 68
Gymnocarpos przewalskii | A shrub with few leaves and many twigs. The average height is 0.34 m, the average crown width is 0.64 m, and individuals are relatively few. In UAV images, individual shrubs mostly appear light green. | 182 | 78
Vegetation Indices | Overall Accuracy (%) | Kappa Coefficient
---|---|---
ExGR | 67.07 | 0.35
GBRI | 71.27 | 0.42
VEG | 86.19 | 0.71
RGBVI | 94.53 | 0.88
ExG | 96.31 | 0.93
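A minimal sketch of vegetation/non-vegetation masking with the best-performing index above (ExG). The thresholding rule (Otsu's method from scikit-image) and the function name are assumptions for illustration; the threshold actually used in the study is not specified here.

```python
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_mask(R, G, B, eps=1e-6):
    """Binary vegetation mask from RGB bands using the excess green index (ExG)."""
    total = R + G + B + eps
    exg = 2 * G / total - R / total - B / total     # ExG computed on band ratios
    return exg > threshold_otsu(exg)                # True = vegetation, False = background

# Toy example standing in for a 30 x 30 m plot mosaic.
rng = np.random.default_rng(1)
R, G, B = rng.random((3, 100, 100))
mask = vegetation_mask(R, G, B)
print(f"Vegetation fraction: {mask.mean():.2f}")
```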
Variable Set | k-NN OA | k-NN Kappa | SVM OA | SVM Kappa | RF OA | RF Kappa
---|---|---|---|---|---|---
A-PC | 0.53 | 0.42 | 0.65 | 0.48 | 0.69 | 0.50
B-PC | 0.62 | 0.46 | 0.77 | 0.63 | 0.79 | 0.66
C-PC | 0.61 | 0.44 | 0.74 | 0.61 | 0.76 | 0.61
D-PC | 0.73 | 0.68 | 0.86 | 0.80 | 0.89 | 0.82
Mean | 0.62 | 0.50 | 0.76 | 0.63 | 0.78 | 0.65
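A minimal sketch of comparing the three classifiers on one PCA-reduced variable set. The study's references point to the R caret package; the scikit-learn analogue below, with random stand-in features, labels, and hyperparameters, is an illustrative assumption rather than the original setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Hypothetical stand-ins: X = PCA scores per segment (e.g., D-PC), y = species label per segment.
rng = np.random.default_rng(0)
X = rng.random((900, 6))
y = rng.integers(0, 4, 900)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
    "RF": RandomForestClassifier(n_estimators=500, random_state=0),
}
for name, model in models.items():
    oa = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean OA = {oa:.2f}")
```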
Confusion matrices for the four shrub species across all test segments (rows: reference samples; columns: predicted samples).

Variable Set | Class | Producer Accuracy | Number of Samples | Ephedra przewalskii | Salsola laricifolia | Sarcozygium xanthoxylon | Gymnocarpos przewalskii
---|---|---|---|---|---|---|---
A-PC | Ephedra przewalskii | 0.76 | 457 | 347 | 66 | 21 | 23
 | Salsola laricifolia | 0.72 | 294 | 64 | 213 | — | 17
 | Sarcozygium xanthoxylon | 0.43 | 68 | 31 | — | 29 | 8
 | Gymnocarpos przewalskii | 0.41 | 78 | 16 | 22 | 8 | 32
 | Total | | 897 | 458 | 301 | 58 | 80
 | User Accuracy | | | 0.76 | 0.71 | 0.50 | 0.40
 | Overall Accuracy (621/897) = 69.23% | | | Kappa = 0.50 | | |
B-PC | Ephedra przewalskii | 0.82 | 457 | 375 | 54 | 13 | 15
 | Salsola laricifolia | 0.85 | 294 | 33 | 249 | — | 12
 | Sarcozygium xanthoxylon | 0.54 | 68 | 27 | — | 37 | 4
 | Gymnocarpos przewalskii | 0.59 | 78 | 12 | 15 | 5 | 46
 | Total | | 897 | 447 | 318 | 55 | 77
 | User Accuracy | | | 0.84 | 0.78 | 0.67 | 0.60
 | Overall Accuracy (707/897) = 78.82% | | | Kappa = 0.66 | | |
C-PC | Ephedra przewalskii | 0.78 | 457 | 355 | 82 | 9 | 11
 | Salsola laricifolia | 0.78 | 294 | 59 | 228 | — | 7
 | Sarcozygium xanthoxylon | 0.66 | 68 | 9 | — | 45 | 14
 | Gymnocarpos przewalskii | 0.67 | 78 | 7 | 6 | 13 | 52
 | Total | | 897 | 430 | 316 | 67 | 84
 | User Accuracy | | | 0.83 | 0.72 | 0.67 | 0.62
 | Overall Accuracy (680/897) = 75.81% | | | Kappa = 0.61 | | |
D-PC | Ephedra przewalskii | 0.91 | 457 | 418 | 16 | 10 | 13
 | Salsola laricifolia | 0.93 | 294 | 10 | 274 | — | 10
 | Sarcozygium xanthoxylon | 0.68 | 68 | 19 | — | 46 | 3
 | Gymnocarpos przewalskii | 0.73 | 78 | 7 | 10 | 4 | 57
 | Total | | 897 | 454 | 300 | 60 | 83
 | User Accuracy | | | 0.92 | 0.91 | 0.77 | 0.69
 | Overall Accuracy (795/897) = 88.63% | | | Kappa = 0.82 | | |
Shrub Species | Top of Alluvial Fans | Middle of Alluvial Fans | Bottom of Alluvial Fans
---|---|---|---
Ephedra przewalskii | 228 | 189 | 43
Salsola laricifolia | 332 | 219 | 0
Sarcozygium xanthoxylon | 36 | 31 | 0
Gymnocarpos przewalskii | 42 | 20 | 0
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).