Tree Species Classification of Drone Hyperspectral and RGB Imagery with Deep Learning Convolutional Neural Networks
Figure 1. Location and distribution of all regions. Each class is individually colored.
Figure 2. Location and distribution of trees in the third region. Each class is individually colored.
Figure 3. Treetop shape in 37 normalized RGB, canopy height model (CHM), and spectral channels.
Figure 4. Feedforward neural network structure. Hidden and output nodes are magnified as blue and green boxes, respectively.
Figure 5. Structure of the 3D convolutional neural network (CNN) model for an input of i layers of size 25 × 25.
Figure 6. The optimization process of the feedforward network.
Figure 7. Receiver operating characteristic (ROC) curve of the feedforward network for all layers.
Figure 8. Confusion matrices of the multi-layer perceptron (MLP) with all 37 input layers (HS+RGB+CHM); (a) case numbers, (b) normalized percentages.
Figure 9. Receiver operating characteristic (ROC) curve of the 3D-CNN model for all layers.
Figure 10. Confusion matrices of the 3D-CNN with all 37 input layers (HS+RGB+CHM); (a) case numbers, (b) normalized percentages.
Figure 11. Confusion matrices of the 3D-CNN with CHM and spectral channels; (a) case numbers, (b) normalized percentages.
Figure 12. Confusion matrices of the 3D-CNN with RGB and CHM channels; (a) case numbers, (b) normalized percentages.
Figure 13. Confusion matrices of the 3D-CNN with RGB and spectral channels; (a) case numbers, (b) normalized percentages.
Figure 14. Confusion matrices of the 3D-CNN with only spectral channels (33 bands); (a) case numbers, (b) normalized percentages.
Figure 15. Confusion matrices of the 3D-CNN with only RGB channels; (a) case numbers, (b) normalized percentages.
Abstract
1. Introduction
- An efficient structure for a 3D-CNN network suitable for tree species classification is proposed and investigated. The proposed structure achieves very high classification accuracy while being simpler than previously proposed structures [2];
- The proposed model is evaluated by comparing it to a multi-layer perceptron (MLP) classifier;
- Different feature combinations originating from different potential sensors are compared to identify the most relevant feature set.
2. Materials and Methods
2.1. Test Site and Remote Sensing Datasets
2.2. Dataset Preparation for Classification
2.3. Classification Models
2.3.1. Neural Network (Multi-Layer Perceptron)
2.3.2. 3D Convolutional Neural Network
2.4. Performance Assessment
3. Results
3.1. Multi-Layer Perceptron
3.2. Convolutional Neural Network
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
Layout of the confusion matrix used in the accuracy assessment (rows: real labels; columns: predicted labels):

Real Labels | Pine | Spruce | Birch | Total Producer
---|---|---|---|---
Pine | | | |
Spruce | | | |
Birch | | | |
Total User | | | |
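The producer's, user's, and overall accuracies reported in the results follow directly from this layout: producer's accuracy divides each diagonal count by its row total (real labels), user's accuracy divides it by its column total (predicted labels), and overall accuracy divides the diagonal sum by all cases. A minimal NumPy sketch of these definitions, using placeholder counts rather than values from the study:

```python
import numpy as np

# Placeholder confusion matrix in the layout above: rows are real labels,
# columns are predicted labels, in the order Pine, Spruce, Birch.
cm = np.array([
    [96,  3,  1],   # real Pine
    [ 4, 90,  6],   # real Spruce
    [ 1,  2, 97],   # real Birch
])

producers_acc = np.diag(cm) / cm.sum(axis=1)  # per class, divided by row totals
users_acc     = np.diag(cm) / cm.sum(axis=0)  # per class, divided by column totals
overall_acc   = np.trace(cm) / cm.sum()       # correctly classified / all cases

print(producers_acc, users_acc, overall_acc)
```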
References
- Yu, X.; Hyyppä, J.; Litkey, P.; Kaartinen, H.; Vastaranta, M.; Holopainen, M. Single-Sensor Solution to Tree Species Classification Using Multispectral Airborne Laser Scanning. Remote Sens. 2017, 9, 108.
- Pölönen, I.; Annala, L.; Rahkonen, S.; Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T. Tree Species Identification Using 3D Spectral Data and 3D Convolutional Neural Network. In Proceedings of the 2018 9th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Amsterdam, The Netherlands, 23–26 September 2018; pp. 1–5.
- Peña, J.; Gutiérrez, P.; Hervás-Martínez, C.; Six, J.; Plant, R.; López-Granados, F. Object-based image classification of summer crops with machine learning methods. Remote Sens. 2014, 6, 5019–5041.
- Li, Y.; Zhang, H.; Shen, Q. Spectral–spatial classification of hyperspectral imagery with 3D convolutional neural network. Remote Sens. 2017, 9, 67.
- Chen, Y.; Lin, Z.; Zhao, X.; Wang, G.; Gu, Y. Deep learning-based classification of hyperspectral data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2094–2107.
- Xie, Z.; Chen, Y.; Lu, D.; Li, G.; Chen, E. Classification of Land Cover, Forest, and Tree Species Classes with ZiYuan-3 Multispectral and Stereo Data. Remote Sens. 2019, 11, 164.
- Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual tree detection and classification with UAV-based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2017, 9, 185.
- Safonova, A.; Tabik, S.; Alcaraz-Segura, D.; Rubtsov, A.; Maglinets, Y.; Herrera, F. Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning. Remote Sens. 2019, 11, 643.
- Shrestha, S.; Bochenek, Z.; Smith, C. Extreme Learning Machine for classification of high resolution remote sensing images and its comparison with traditional Artificial Neural Networks (ANN). EARSeL eProc. 2014, 13, 49.
- Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D convolutional neural networks for crop classification with multi-temporal remote sensing images. Remote Sens. 2018, 10, 75.
- Honkavaara, E.; Khoramshahi, E. Radiometric correction of close-range spectral image blocks captured using an unmanned aerial vehicle with a radiometric block adjustment. Remote Sens. 2018, 10, 256.
- Friedman, J.; Hastie, T.; Tibshirani, R. The Elements of Statistical Learning; Springer Series in Statistics: New York, NY, USA, 2001; Volume 1.
- Raczko, E.; Zagajewski, B. Comparison of support vector machine, random forest and neural network classifiers for tree species classification on airborne hyperspectral APEX images. Eur. J. Remote Sens. 2017, 50, 144–154.
- Franklin, S.E.; Ahmed, O.S. Deciduous tree species classification using object-based analysis and machine learning with unmanned aerial vehicle multispectral data. Int. J. Remote Sens. 2018, 39, 5236–5245.
- Holmgren, J.; Persson, A. Identifying species of individual trees using airborne laser scanner. Remote Sens. Environ. 2004, 90, 415–423.
- Hadi; Rautiainen, M. A study on the drivers of canopy reflectance variability in a boreal forest. Remote Sens. Lett. 2018, 9, 666–675.
- Hovi, A.; Raitio, P.; Rautiainen, M. A spectral analysis of 25 boreal tree species. Silva Fenn. 2017, 51.
- Korpela, I.; Ørka, H.O.; Maltamo, M.; Tokola, T.; Hyyppä, J. Tree species classification using airborne LiDAR – effects of stand and tree parameters, downsizing of training set, intensity normalization, and sensor type. Silva Fenn. 2010, 44, 319–339.
- Dalponte, M.; Ørka, H.O.; Gobakken, T.; Gianelle, D.; Næsset, E. Tree Species Classification in Boreal Forests with Hyperspectral Data. IEEE Trans. Geosci. Remote Sens. 2013, 51, 2632–2645.
- Heinzel, J.; Koch, B. Exploring full-waveform LiDAR parameters for tree species classification. Int. J. Appl. Earth Obs. Geoinf. 2011, 13, 152–160.
- Heinzel, J.; Koch, B. Investigating multiple data sources for tree species classification in temperate forest and use for single tree delineation. Int. J. Appl. Earth Obs. Geoinf. 2012, 18, 101–110.
- Yao, W.; Krzystek, P.; Heurich, M. Tree species classification and estimation of stem volume and DBH based on single tree extraction by exploiting airborne full-waveform LiDAR data. Remote Sens. Environ. 2012, 123, 368–380.
- Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87.
- Ferreira, M.P.; Wagner, F.H.; Aragão, L.E.; Shimabukuro, Y.E.; de Souza Filho, C.R. Tree species classification in tropical forests using visible to shortwave infrared WorldView-3 images and texture analysis. ISPRS J. Photogramm. Remote Sens. 2019, 149, 119–131.
- Tuominen, S.; Balazs, A.; Honkavaara, E.; Pölönen, I.; Saari, H.; Hakala, T.; Viljanen, N. Hyperspectral UAV-imagery and photogrammetric canopy height model in estimating forest stand variables. Silva Fenn. 2017, 51.
- Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039.
- Mäkynen, J.; Holmlund, C.; Saari, H.; Ojala, K.; Antila, T. Unmanned aerial vehicle (UAV) operated megapixel spectral camera. In Electro-Optical Remote Sensing, Photonic Technologies, and Applications V; International Society for Optics and Photonics: Bellingham, WA, USA, 2011; Volume 8186, p. 81860Y.
- Saari, H.; Pölönen, I.; Salo, H.; Honkavaara, E.; Hakala, T.; Holmlund, C.; Mäkynen, J.; Mannila, R.; Antila, T.; Akujärvi, A. Miniaturized hyperspectral imager calibration and UAV flight campaigns. In Sensors, Systems, and Next-Generation Satellites XVII; International Society for Optics and Photonics: Bellingham, WA, USA, 2013; Volume 8889, p. 88891O.
- Honkavaara, E.; Rosnell, T.; Oliveira, R.; Tommaselli, A. Band registration of tuneable frame format hyperspectral UAV imagers in complex scenes. ISPRS J. Photogramm. Remote Sens. 2017, 134, 96–109.
- Cybenko, G. Approximation by superpositions of a sigmoidal function. Math. Control Signals Syst. 1989, 2, 303–314.
- Hornik, K. Approximation capabilities of multilayer feedforward networks. Neural Netw. 1991, 4, 251–257.
- Dumoulin, V.; Visin, F. A guide to convolution arithmetic for deep learning. arXiv 2016, arXiv:1603.07285.
- Bradley, A.P. The use of the area under the ROC curve in the evaluation of machine learning algorithms. Pattern Recognit. 1997, 30, 1145–1159.
- Vedaldi, A.; Lenc, K. MatConvNet: Convolutional neural networks for MATLAB. In Proceedings of the 23rd ACM International Conference on Multimedia, Brisbane, Australia, 26–30 October 2015; pp. 689–692.
- Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 2018, 10, 1091.
Central wavelengths (λ0) and full widths at half maximum (FWHM) of the 33 hyperspectral bands:

λ0 (nm): 507.60, 509.50, 514.50, 520.80, 529.00, 537.40, 545.80, 554.40, 562.70, 574.20, 583.60, 590.40, 598.80, 605.70, 617.50, 630.70, 644.20, 657.20, 670.10, 677.80, 691.10, 698.40, 705.30, 711.10, 717.90, 731.30, 738.50, 751.50, 763.70, 778.50, 794.00, 806.30, 819.70
FWHM (nm): 11.2, 13.6, 19.4, 21.8, 22.6, 20.7, 22.0, 22.2, 22.1, 21.6, 18.0, 19.8, 22.7, 27.8, 29.3, 29.9, 26.9, 30.3, 28.5, 27.8, 30.7, 28.3, 25.4, 26.6, 27.5, 28.2, 27.4, 27.5, 30.5, 29.5, 25.9, 27.3, 29.9
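These band parameters can be carried directly in analysis code. A small sketch with the values copied from the lines above; the nearest_band helper is illustrative only and not part of the study:

```python
import numpy as np

# Central wavelengths and FWHMs (nm) of the 33 hyperspectral bands, from the table above.
CENTERS_NM = np.array([
    507.60, 509.50, 514.50, 520.80, 529.00, 537.40, 545.80, 554.40, 562.70,
    574.20, 583.60, 590.40, 598.80, 605.70, 617.50, 630.70, 644.20, 657.20,
    670.10, 677.80, 691.10, 698.40, 705.30, 711.10, 717.90, 731.30, 738.50,
    751.50, 763.70, 778.50, 794.00, 806.30, 819.70,
])
FWHM_NM = np.array([
    11.2, 13.6, 19.4, 21.8, 22.6, 20.7, 22.0, 22.2, 22.1, 21.6, 18.0, 19.8,
    22.7, 27.8, 29.3, 29.9, 26.9, 30.3, 28.5, 27.8, 30.7, 28.3, 25.4, 26.6,
    27.5, 28.2, 27.4, 27.5, 30.5, 29.5, 25.9, 27.3, 29.9,
])

def nearest_band(wavelength_nm: float) -> int:
    """Index of the band whose central wavelength is closest to the given wavelength."""
    return int(np.argmin(np.abs(CENTERS_NM - wavelength_nm)))

print(nearest_band(670.0))  # 18 (0-based), i.e. the 670.10 nm red band
```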
Structure of the 3D-CNN model for an input of i layers of size 25 × 25:

Layer | Kernel Size | Kernel Number | Stride | Output Size
---|---|---|---|---
Input | – | – | – | 25 × 25 × i
Conv1 | 5 × 5 × i | 20 | 1 | 21 × 21 × 20
Max Pool | 3 × 3 | 1 | 3 | 7 × 7 × 20
Conv2 | 5 × 5 × 20 | 50 | 1 | 3 × 3 × 50
Max Pool | 3 × 3 | 1 | 3 | 1 × 1 × 50
ReLU | – | – | – | 1 × 1 × 50
Conv3 | 1 × 1 × 50 | 3 | 1 | 1 × 1 × 3
Soft Max loss | – | – | – | 3
Total parameters | 43,650 (for 37 input layers) | | |
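As an illustration of the layer dimensions above, the structure can be sketched in PyTorch (an assumed framework for this sketch; the study itself references MatConvNet). The 5 × 5 × i kernels that span all i input layers are written here as 2D convolutions over i input channels, which yields the same weights and reproduces the 43,650 weight parameters listed for 37 input layers (biases add a further 73). The build_3dcnn helper is illustrative only:

```python
import torch
import torch.nn as nn

def build_3dcnn(i: int = 37, n_classes: int = 3) -> nn.Sequential:
    """Sketch of the tabulated structure for 25 x 25 x i treetop patches."""
    return nn.Sequential(
        nn.Conv2d(i, 20, kernel_size=5),          # Conv1: 25x25xi  -> 21x21x20
        nn.MaxPool2d(kernel_size=3, stride=3),    # Max Pool:       -> 7x7x20
        nn.Conv2d(20, 50, kernel_size=5),         # Conv2:          -> 3x3x50
        nn.MaxPool2d(kernel_size=3, stride=3),    # Max Pool:       -> 1x1x50
        nn.ReLU(),
        nn.Conv2d(50, n_classes, kernel_size=1),  # Conv3:          -> 1x1x3 class scores
        nn.Flatten(),                             # -> (batch, 3) logits for the softmax loss
    )

model = build_3dcnn()
n_weights = sum(p.numel() for name, p in model.named_parameters() if name.endswith("weight"))
print(n_weights)  # 43650 for i = 37

# Training feeds batches of patches to the softmax (cross-entropy) loss:
logits = model(torch.randn(8, 37, 25, 25))
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 3, (8,)))
```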
Classification accuracies of the 3D-CNN for the different feature sets, and of the MLP with all input layers (last row):

Feature Set | Producer's Acc. (Pine) | Producer's Acc. (Spruce) | Producer's Acc. (Birch) | User's Acc. (Pine) | User's Acc. (Spruce) | User's Acc. (Birch) | Overall Accuracy
---|---|---|---|---|---|---|---
HS | 0.990 | 0.910 | 0.970 | 0.970 | 0.952 | 1.000 | 0.970
RGB | 0.986 | 0.959 | 0.920 | 0.977 | 0.943 | 0.990 | 0.971
CHM | 0.965 | 0.184 | 0.000 | 0.665 | 0.593 | 0.000 | 0.660
HS+RGB | 0.996 | 0.948 | 0.974 | 0.981 | 0.976 | 1.000 | 0.983
HS+CHM | 0.990 | 0.897 | 0.965 | 0.964 | 0.951 | 1.000 | 0.966
HS+Blue | 0.986 | 0.920 | 0.974 | 0.971 | 0.947 | 1.000 | 0.970
RGB+CHM | 0.994 | 0.960 | 0.912 | 0.975 | 0.971 | 0.981 | 0.975
HS+RGB+CHM | 0.986 | 0.943 | 0.982 | 0.979 | 0.953 | 1.000 | 0.976
HS+RGB+CHM (MLP) | 0.984 | 0.822 | 0.956 | 0.937 | 0.935 | 1.000 | 0.945
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).