Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning
"> Figure 1
<p>Location of the four plots in the nature reserve “Stolby,” Krasnoyarsk city (Russia), where <b>A</b>, <b>B</b> plots are fragments from the orthophotos used to build the training dataset; and <b>C</b>, <b>D</b> plots are fragments from the orthophotos used to build the testing dataset used for external validation or independent testing.</p> "> Figure 2
<p>Damage categories of Siberian fir trees used in this study (adapted from Krivets et al. [<a href="#B32-remotesensing-11-00643" class="html-bibr">32</a>]): (<b>a</b>) completely healthy tree or recently attacked by beetles, (<b>b</b>) tree colonized by beetles, (<b>c</b>) recently died tree and (<b>d</b>) deadwood. Top figures illustrate the vertical orthoimages corresponding to the bottom horizontal pictures.</p> "> Figure 3
<p>The architecture of our CNN model of deep machine learning.</p> "> Figure 4
<p>Pre-processing consisted of the following steps: first, converting the three band image into one grey-scale band image (PAN); second, converting grey-scale band image into blurred image; third, converting the blurred image into a binary image based on a 100 over 256 digital value threshold; and fourth, detecting categories of trees on RGB images. The 48 candidate patches identified in test area C and 40 candidate patches identified in test area D are labelled with red contour in the right panel.</p> "> Figure 5
<p>Loss and accuracy for each epoch of the CNN model training with data augmentation.</p> "> Figure 6
<p>Confusion matrix of the proposed CNN model with data augmentation on the candidate regions obtained from test areas C and D.</p> "> Figure 7
<p>Results of the detection of damaged tree categories on test area C. C1, C2, C3 and C4 indicate the estimated class by our CNN classification model together with the corresponding probability. The symbols “+” and “−” indicate respectively correct and incorrect class estimation by our model.</p> "> Figure 8
<p>Results of the detection of damaged tree categories on test area D. C1, C2, C3 and C4 indicate the estimated class by our CNN classification model together with the corresponding probability. The symbols “+” and “−” indicate respectively correct and incorrect class estimation by our model.</p> ">
Abstract
1. Introduction
- As far as we know, this is the first work to address the problem of detecting forest damage caused by the P. proximus beetle in very high resolution UAV images with deep learning.
- We built a new labelled dataset of orthoimages with four categories of damage stages in fir trees.
- We designed a new CNN architecture to accurately classify trees in UAV images into the different damage categories and compared it against the most powerful CNN models in the state of the art.
- We developed the detection model as follows. First, a new detection method selects the candidate regions that contain trees in the UAV images. Then, these candidate regions are processed by the multiclass model to predict the category of tree damage.
- We provide a complete description of the methodology used, together with the source code, so that other researchers can replicate it to detect and classify tree damage by bark beetles in other images. Our source code can be found at https://github.com/ansafo/OurCNN.
2. Introduction to Deep Learning and CNNs
- A convolution layer, the main building block of a CNN, is based on a fundamental image-processing operation called convolution, which filters a 2D input image with a small 2D matrix called a filter or kernel.
- A pooling layer reduces the spatial size of the input matrix, in general by half. Conceptually, this operation increases the abstraction of the extracted features, and it usually follows a convolution layer.
- A fully connected layer acts as a classifier on the previously computed high-level features, deriving a score for each target class. (A minimal code sketch of these three layer types follows this list.)
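To make the three layer types concrete, here is a minimal Keras sketch (Keras is the framework referenced in this work; see keras.io in the references) that stacks one convolution layer, one pooling layer and one fully connected layer. The filter counts and sizes here are illustrative assumptions, not those of our model, which is given in Section 5.3:

```python
# Minimal illustration of the three basic CNN layer types in Keras.
# Layer sizes are illustrative only, not those of the model in this work.
from tensorflow.keras import layers, models

model = models.Sequential([
    # Convolution: slide 32 learnable 3x3 kernels over the RGB input.
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(150, 200, 3)),
    # Pooling: halve the spatial resolution, keeping the strongest responses.
    layers.MaxPooling2D(pool_size=(2, 2)),
    # Fully connected: classify the flattened feature maps into 4 classes.
    layers.Flatten(),
    layers.Dense(4, activation='softmax'),
])
model.summary()
```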
2.1. CNN Models
2.2. Transfer Learning and Data Augmentation
3. Classification of Trees in High Resolution Imagery and Related Works
4. Study Area and Data Acquisition
5. Methods
5.1. Definition of Fir Trees Damage Categories
5.2. Dataset Preprocessing and Data Augmentation Techniques of Sample Patches for Training the CNN Models
- First, the dataset for training, validation and internal testing consisted of 50 manually sampled image patches of single trees per tree damage category, resulting in 200 image patches from plots A and B. We used 80% of this selection to train the CNN model; the remaining 20% of the images were used for internal validation of the model.
- Second, the dataset for external testing (external validation) consisted of 88 image patches generated by our proposed detection technique from areas C and D.
5.3. Classification Model
5.4. Proposed Detection Process
6. Results and Analysis
6.1. Evaluation Metrics
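For reference, these are the standard confusion-matrix definitions behind the Accuracy, Precision, Recall and F-score columns reported in the tables below, in terms of true positives (TP), true negatives (TN), false positives (FP) and false negatives (FN):

```latex
\begin{aligned}
\mathrm{Accuracy} &= \frac{TP+TN}{TP+TN+FP+FN}, &
\mathrm{Precision} &= \frac{TP}{TP+FP},\\[4pt]
\mathrm{Recall} &= \frac{TP}{TP+FN}, &
F\text{-score} &= \frac{2\,\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}}.
\end{aligned}
```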
6.2. Evaluation of the Training Process and Comparison
6.3. External Test Detection Results
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| CNN | Convolutional neural network |
| UAV | Unmanned aerial vehicle |
| P. proximus | Polygraphus proximus |
| RGB | Red, green, blue |
| ERS | Earth remote sensing |
| SVM | Support vector machine |
| RBF | Radial basis function |
| EPSG | European Petroleum Survey Group |
| OBIA | Object-based image analysis |
| LDA | Linear discriminant analysis |
| PCA | Principal component analysis |
| RSI | Remote sensing indices |
| DJI | Dajiang Innovation Technology |
| QGIS | Quantum Geographic Information System |
| ADAM | Adaptive moment estimation |
| ReLU | Rectified linear unit |
| TP | True positives |
| TN | True negatives |
| FP | False positives |
| FN | False negatives |
| API | Application programming interface |
| GPU | Graphics processing unit |
| CPU | Central processing unit |
Appendix A
| Model | Loss (without augmentation) | Accuracy (without augmentation) | Loss (with augmentation) | Accuracy (with augmentation) |
|---|---|---|---|---|
| Our CNN model | 0.05 | 1 | 0.001 | 1 |
| Xception | 0.3 | 0.89 | 0.002 | 1 |
| VGG-16 | 0.03 | 1 | 0.03 | 1 |
| VGG-19 | 0.07 | 1 | 0.01 | 0.99 |
| ResNet-50 + 4 | 0.21 | 0.94 | 0.01 | 1 |
| Inception-V3 + 4 | 0.34 | 0.89 | 0.003 | 1 |
| InceptionResNet-V2 + 4 | 0.13 | 0.94 | 0.04 | 0.99 |
| DenseNet-121 + 4 | 0.17 | 0.94 | 0.02 | 0.99 |
| DenseNet-169 + 4 | 0.08 | 1 | 0.004 | 0.99 |
| DenseNet-201 + 4 | 0.12 | 0.94 | 0.01 | 1 |
| Categories of Trees | TP | TN | FP | FN | Acc (%) | Precision (%) | Recall (%) | F-Score (%) |
|---|---|---|---|---|---|---|---|---|
| **Xception** | | | | | | | | |
| 1 | 30 | 26 | 19 | 2 | 72.73 | 61.22 | 93.75 | 74.07 |
| 2 | 15 | 41 | 8 | 21 | 65.88 | 65.22 | 41.67 | 50.85 |
| 3 | 7 | 49 | 1 | 9 | 84.85 | 87.5 | 43.75 | 58.33 |
| 4 | 4 | 52 | 4 | 0 | 93.33 | 50 | 100 | 66.67 |
| **VGG-16** | | | | | | | | |
| 1 | 24 | 43 | 3 | 8 | 85.90 | 88.89 | 75 | 81.36 |
| 2 | 33 | 34 | 14 | 3 | 79.76 | 70.21 | 91.67 | 79.52 |
| 3 | 6 | 61 | 0 | 10 | 87.01 | 100 | 37.5 | 54.55 |
| 4 | 4 | 63 | 4 | 0 | 94.37 | 50 | 100 | 66.67 |
| **VGG-19** | | | | | | | | |
| 1 | 26 | 38 | 6 | 6 | 84.21 | 81.25 | 81.25 | 81.25 |
| 2 | 29 | 35 | 11 | 7 | 78.05 | 72.5 | 80.56 | 76.32 |
| 3 | 5 | 59 | 0 | 11 | 85.33 | 100 | 31.25 | 47.62 |
| 4 | 4 | 60 | 7 | 0 | 90.14 | 36.36 | 100 | 53.33 |
| **ResNet-50** | | | | | | | | |
| 1 | 32 | 32 | 16 | 0 | 80 | 66.67 | 100 | 80 |
| 2 | 20 | 44 | 1 | 16 | 79.01 | 95.24 | 55.56 | 70.18 |
| 3 | 9 | 55 | 1 | 7 | 88.89 | 90 | 56.25 | 69.23 |
| 4 | 3 | 61 | 6 | 1 | 90.14 | 33.33 | 75 | 46.15 |
| **Inception-V3** | | | | | | | | |
| 1 | 30 | 25 | 21 | 2 | 70.51 | 58.82 | 93.75 | 72.29 |
| 2 | 16 | 39 | 8 | 20 | 66.27 | 66.67 | 44.44 | 53.33 |
| 3 | 6 | 49 | 1 | 10 | 83.33 | 85.71 | 37.50 | 52.17 |
| 4 | 3 | 52 | 3 | 1 | 93.22 | 50 | 75 | 60 |
| **InceptionResNet-V2** | | | | | | | | |
| 1 | 23 | 27 | 14 | 9 | 68.49 | 62.16 | 71.88 | 66.67 |
| 2 | 18 | 32 | 11 | 18 | 63.29 | 62.07 | 50 | 55.38 |
| 3 | 6 | 44 | 6 | 10 | 75.76 | 50 | 37.5 | 42.86 |
| 4 | 3 | 47 | 7 | 1 | 86.21 | 30 | 75 | 42.86 |
| **DenseNet-121** | | | | | | | | |
| 1 | 26 | 33 | 10 | 6 | 78.67 | 72.22 | 81.25 | 76.47 |
| 2 | 24 | 35 | 8 | 12 | 74.68 | 75 | 66.67 | 70.59 |
| 3 | 6 | 53 | 1 | 10 | 84.29 | 85.71 | 37.50 | 52.17 |
| 4 | 3 | 56 | 10 | 1 | 84.29 | 23.08 | 75 | 35.29 |
| **DenseNet-169** | | | | | | | | |
| 1 | 32 | 26 | 21 | 0 | 73.42 | 60.38 | 100 | 75.29 |
| 2 | 15 | 43 | 5 | 21 | 69.05 | 75 | 41.67 | 53.57 |
| 3 | 8 | 50 | 1 | 8 | 86.57 | 88.89 | 50 | 64 |
| 4 | 3 | 55 | 3 | 1 | 93.55 | 50 | 75 | 60 |
| **DenseNet-201** | | | | | | | | |
| 1 | 32 | 22 | 22 | 0 | 71.05 | 59.26 | 100 | 74.42 |
| 2 | 15 | 39 | 7 | 21 | 65.85 | 68.18 | 41.67 | 51.72 |
| 3 | 4 | 50 | 1 | 12 | 80.6 | 80 | 25 | 38.1 |
| 4 | 3 | 51 | 4 | 1 | 91.53 | 42.86 | 75 | 54.55 |
Appendix B. Brief Description of the CNNs Compared against the CNN Developed in This Work: VGG, ResNet, Inception-V3, InceptionResNet-V2, Xception and DenseNet
References
- Bonan, G.B. Forests and climate change: Forcings, feedbacks, and the climate benefits of forests. Science 2008, 320, 1444–1449.
- Hansen, M.C.; Potapov, P.V.; Moore, R.; Hancher, M.; Turubanova, S.A.; Tyukavina, A.; Thau, D.; Stehman, S.V.; Goetz, S.J.; Loveland, T.R.; et al. High-resolution global maps of 21st-century forest cover change. Science 2013, 342, 850–853.
- Kuznetsov, V.; Sinev, S.; Yu, C.; Lvovsky, A. Key to Insects of the Russian Far East (in 6 Volumes). Volume 5. Trichoptera and Lepidoptera. Part 3. Available online: https://www.rfbr.ru/rffi/ru/books/o_66092 (accessed on 4 March 2019).
- Kerchev, I. Ecology of four-eyed fir bark beetle Polygraphus proximus Blandford (Coleoptera, Curculionidae, Scolytinae) in the West Siberian region of invasion. Rus. J. Biol. Invasions 2014, 5, 176–185.
- Pashenova, N.V.; Kononov, A.V.; Ustyantsev, K.V.; Blinov, A.G.; Pertsovaya, A.A.; Baranchikov, Y.N. Ophiostomatoid fungi associated with the four-eyed fir bark beetle on the territory of Russia. Rus. J. Biol. Invasions 2018, 9, 63–74.
- Baranchikov, Y.; Akulov, E.; Astapenko, S. Bark beetle Polygraphus proximus: A new aggressive far eastern invader on Abies species in Siberia and European Russia. In Proceedings of the 21st U.S. Department of Agriculture Interagency Research Forum on Invasive Species, Annapolis, MD, USA, 12–15 January 2010.
- Helbig, M.; Wischnewski, K.; Kljun, N.; Chasmer, L.E.; Quinton, W.L.; Detto, M.; Sonnentag, O. Regional atmospheric cooling and wetting effect of permafrost thaw-induced boreal forest loss. Glob. Chang. Biol. 2016, 22, 4048–4066.
- Ma, Z. The Effects of Climate Stability on Northern Temperate Forests. Ph.D. Thesis, Aarhus University, Aarhus, Denmark, March 2016.
- Lehmann, J.R.K.; Nieberding, F.; Prinz, T.; Knoth, C. Analysis of unmanned aerial system-based CIR images in forestry—A new perspective to monitor pest infestation levels. Forests 2015, 6, 594–612.
- Li, W.; Fu, H.; Yu, L.; Cracknell, A. Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sens. 2016, 9, 22.
- Guirado, E.; Tabik, S.; Alcaraz-Segura, D.; Cabello, J.; Herrera, F. Deep-learning versus OBIA for scattered shrub detection with Google Earth imagery: Ziziphus lotus as case study. Remote Sens. 2017, 9, 1220.
- Baeta, R.; Nogueira, K.; Menotti, D.; dos Santos, J.A. Learning deep features on multiple scales for coffee crop recognition. In Proceedings of the 2017 30th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Niteroi, Brazil, 17–20 October 2017; pp. 262–268.
- Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1–5.
- Waser, L.T.; Küchler, M.; Jütte, K.; Stampfer, T. Evaluating the potential of WorldView-2 data to classify tree species and different levels of ash mortality. Remote Sens. 2014, 6, 4515–4545.
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press. Available online: https://mitpress.mit.edu/books/deep-learning (accessed on 4 March 2019).
- Tabik, S.; Peralta, D.; Herrera-Poyatos, A.; Herrera, F. A snapshot of image pre-processing for convolutional neural networks: Case study of MNIST. Int. J. Comput. Intell. Syst. 2017, 10, 555–568.
- Deli, Z.; Bingqi, C.; Yunong, Y. Farmland scene classification based on convolutional neural network. In Proceedings of the 2016 International Conference on Cyberworlds (CW), Chongqing, China, 28–30 September 2016; pp. 159–162.
- Längkvist, M.; Kiselev, A.; Alirezaie, M.; Loutfi, A. Classification and segmentation of satellite orthoimagery using convolutional neural networks. Remote Sens. 2016, 8, 329.
- Dyrmann, M.; Karstoft, H.; Midtiby, H. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80.
- Razavi, S.; Yalcin, H. Using convolutional neural networks for plant classification. In Proceedings of the 25th Signal Processing and Communications Applications Conference (SIU), Antalya, Turkey, 15–18 May 2017; pp. 1–4.
- Dos Santos Ferreira, A.; Matte Freitas, D.; da Silva, G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324.
- Onishi, M.; Ise, T. Automatic classification of trees using a UAV onboard camera and deep learning. arXiv 2018, arXiv:1804.10390.
- Abdullah, H.; Darvishzadeh, R.; Skidmore, A.K.; Groen, T.A.; Heurich, M. European spruce bark beetle (Ips typographus, L.) green attack affects foliar reflectance and biochemical properties. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 199–209.
- Heurich, M.; Ochs, T.; Andresen, T.; Schneider, T. Object-orientated image analysis for the semi-automatic detection of dead trees following a spruce bark beetle (Ips typographus) outbreak. Eur. J. For. Res. 2009, 129, 313–324.
- Ortiz, S.M.; Breidenbach, J.; Kändler, G. Early detection of bark beetle green attack using TerraSAR-X and RapidEye data. Remote Sens. 2013, 5, 1912–1931.
- Meddens, A.J.H.; Hicke, J.A.; Vierling, L.A.; Hudak, A.T. Evaluating methods to detect bark beetle-caused tree mortality using single-date and multi-date Landsat imagery. Remote Sens. Environ. 2013, 132, 49–58.
- Näsi, R.; Honkavaara, E.; Lyytikäinen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Tanhuanpää, T.; Holopainen, M. Using UAV-based photogrammetry and hyperspectral imaging for mapping bark beetle damage at tree-level. Remote Sens. 2015, 7, 15467–15493.
- Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing very high resolution UAV imagery for monitoring forest health during a simulated disease outbreak. ISPRS J. Photogramm. Remote Sens. 2017, 131, 1–14.
- Näsi, R.; Honkavaara, E.; Blomqvist, M.; Lyytikäinen-Saarenmaa, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Holopainen, M. Remote sensing of bark beetle damage in urban forests at individual tree level using a novel hyperspectral camera from UAV and aircraft. Urban For. Urban Green. 2018, 30, 72–83.
- Ryabovol, S.V. The Vegetation of Krasnoyarsk. Modern Problems of Science and Education. Available online: https://www.science-education.ru/en/article/view?id=7582 (accessed on 1 March 2019).
- Agisoft PhotoScan User Manual—Professional Edition, Version 1.4; 127p. Available online: https://www.agisoft.com/pdf/photoscan-pro_1_4_en.pdf (accessed on 15 March 2019).
- Krivets, S.A.; Kerchev, I.A.; Bisirova, E.M.; Pashenova, N.V.; Demidko, D.A.; Petko, V.M.; Baranchikov, Y.N. Four-Eyed Fir Bark Beetle in Siberian Forests (Distribution, Biology, Ecology, Detection and Survey of Damaged Stands); UMIUM: Krasnoyarsk, Russia, 2015.
- Dawkins, P. Calculus III–Green's Theorem. Available online: http://tutorial.math.lamar.edu/Classes/CalcIII/GreensTheorem.aspx (accessed on 23 November 2018).
- Basic Evaluation Measures from the Confusion Matrix. Classifier Evaluation with Imbalanced Datasets 2015. Available online: https://classeval.wordpress.com/introduction/basic-evaluation-measures/ (accessed on 15 March 2019).
- Keras Documentation. Available online: https://keras.io/ (accessed on 23 November 2018).
- Murray, C. Deep Learning CNN's in Tensorflow with GPUs. Available online: https://hackernoon.com/deep-learning-cnns-in-tensorflow-with-gpus-cba6efe0acc2 (accessed on 23 November 2018).
- Jordan, J. Common Architectures in Convolutional Neural Networks. Available online: https://www.jeremyjordan.me/convnet-architectures/ (accessed on 2 August 2018).
- Mohan, M.; Silva, C.A.; Klauberg, C.; Jat, P.; Catts, G.; Cardil, A.; Hudak, A.T.; Dia, M. Individual tree detection from unmanned aerial vehicle (UAV) derived canopy height model in an open canopy mixed conifer forest. Forests 2017, 8, 340.
- ImageNet Large Scale Visual Recognition Competition 2014 (ILSVRC2014). Available online: http://www.image-net.org/challenges/LSVRC/2014/results (accessed on 28 February 2019).
- COCO–Common Objects in Context. Available online: http://cocodataset.org/#home (accessed on 28 February 2019).
- Chollet, F. Xception: Deep learning with depthwise separable convolutions. arXiv 2016, arXiv:1610.02357.
| Steps | The Pre-Processing Stages of Input UAV Imagery |
|---|---|
| Step 1 | Selection of image patches: a set of pictures manually cropped from the orthophotomosaics using QGIS 7.2.2. |
| Step 2 | Resizing of patches to 150 × 200 RGB pixels using cubic interpolation. |
| Step 3 | Manual assignment of each patch to the appropriate tree damage category. |

| Steps | Methods for Increasing the Amount of Data |
|---|---|
| Step 1 | Changing the saturation of the RGB channels. |
| Step 2 | Applying a Gaussian blur filter to smooth out noise, with a blur value of 5% and a kernel width and height of 0.5. |
| Step 3 | Pixel averaging by convolving the image with a normalized 4 × 4 pixel window filter. |
| Step 4 | Rotation of the image about its centre by 5°, 15°, 50°, 90° and 180°. |
| Step 5 | Cropping the central rectangle of each image to half the distance from each edge (discarding the framing borders) and resizing the central rectangle back to 150 × 200 RGB pixels using cubic interpolation. |
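A minimal sketch of how such augmentations could be applied with OpenCV and NumPy. The exact parameter mapping (e.g., how the 5% blur value translates to a kernel, or the saturation factor) is our assumption, since the table does not fix an implementation:

```python
# Hypothetical sketch of the five augmentation steps, using OpenCV + NumPy.
# Saturation factor and blur kernel size are illustrative assumptions.
import cv2
import numpy as np

def change_saturation(img, factor=1.2):
    # Step 1: scale the saturation channel in HSV space.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * factor, 0, 255)
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)

def gaussian_blur(img):
    # Step 2: mild Gaussian blur to smooth out noise.
    return cv2.GaussianBlur(img, (5, 5), 0)

def pixel_average(img):
    # Step 3: convolve with a normalized 4x4 averaging window.
    return cv2.blur(img, (4, 4))

def rotate(img, angle):
    # Step 4: rotate the image about its centre.
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    return cv2.warpAffine(img, m, (w, h))

def centre_crop(img):
    # Step 5: crop the central rectangle (half the distance from each edge)
    # and resize back to the original size with cubic interpolation.
    h, w = img.shape[:2]
    crop = img[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_CUBIC)

patch = cv2.imread('tree_patch.png')  # hypothetical 150 x 200 input patch
augmented = [rotate(patch, a) for a in (5, 15, 50, 90, 180)]
augmented += [change_saturation(patch), gaussian_blur(patch),
              pixel_average(patch), centre_crop(patch)]
```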
Number of image patches per tree damage category in the training dataset (plots A and B), without and with data augmentation, and in the testing dataset used for the external test (areas C and D):

| Categories of Trees | Training (without augmentation) | Internal Validation (without augmentation) | Training (with augmentation) | Internal Validation (with augmentation) | Test Area C | Test Area D |
|---|---|---|---|---|---|---|
| 1 | 40 | 10 | 880 | 220 | 5 | 13 |
| 2 | 40 | 10 | 880 | 220 | 36 | 14 |
| 3 | 40 | 10 | 880 | 220 | 5 | 7 |
| 4 | 40 | 10 | 880 | 220 | 2 | 6 |
| Total | 160 | 40 | 3520 | 880 | 48 | 40 |
| Layers | Output Size (pixels) | Network |
|---|---|---|
| Convolution | 150 × 200 | 3 × 3, 96 |
| Max pooling | 75 × 100 | 2 × 2, stride 2 |
| Convolution | 75 × 100 | 5 × 5, 128 |
| Dropout | 75 × 100 | 0.25 |
| Convolution | 75 × 100 | 3 × 3, 128 |
| Max pooling | 38 × 50 | 2 × 2, stride 2 |
| Convolution | 38 × 50 | 3 × 3, 128 |
| Dropout | 38 × 50 | 0.25 |
| Convolution | 38 × 50 | 5 × 5, 128 |
| Dropout | 38 × 50 | 0.5 |
| Convolution | 38 × 50 | 5 × 5, 512 |
| Global average pooling | 1 × 972,800 | stride 1 |
| Dense | 1 × 972,800 | ReLU |
| Dropout | 1 × 400 | 0.5 |
| Dense | 1 × 100 | ReLU |
| Dense | 1 × 4 | Softmax |
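A minimal Keras sketch of this layer stack, under our reading of the table. Where the table is ambiguous we make assumptions: ReLU activations on the convolution layers, 'same' padding (so that 75 pools to 38), and dense widths of 400 and 100 between the global average pooling and the output:

```python
# Sketch of the layer stack in the table above, written in Keras.
# Conv activations, padding and the 400/100 dense widths are assumptions.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(96, (3, 3), padding='same', activation='relu',
                  input_shape=(150, 200, 3)),
    layers.MaxPooling2D((2, 2), strides=2),                  # 75 x 100
    layers.Conv2D(128, (5, 5), padding='same', activation='relu'),
    layers.Dropout(0.25),
    layers.Conv2D(128, (3, 3), padding='same', activation='relu'),
    layers.MaxPooling2D((2, 2), strides=2, padding='same'),  # 38 x 50
    layers.Conv2D(128, (3, 3), padding='same', activation='relu'),
    layers.Dropout(0.25),
    layers.Conv2D(128, (5, 5), padding='same', activation='relu'),
    layers.Dropout(0.5),
    layers.Conv2D(512, (5, 5), padding='same', activation='relu'),
    layers.GlobalAveragePooling2D(),
    layers.Dense(400, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(100, activation='relu'),
    layers.Dense(4, activation='softmax'),   # four tree damage categories
])
model.compile(optimizer='adam',              # ADAM, per the abbreviations list
              loss='categorical_crossentropy', metrics=['accuracy'])
```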
| Steps | Data Processing Algorithm |
|---|---|
| Step 1 | Conversion of the RGB images to a grayscale colour palette. |
| Step 2 | Blurring of the grayscale images with a Gaussian filter to reduce Gaussian noise, using a kernel size of 11 × 11 pixels and a standard deviation of 0. |
| Step 3 | Creation of binary images from the blurred grayscale images by applying a threshold function, with an optimal brightness threshold of 100 for the input image pixels. |
| Step 4 | Structuring of the outlines of picture elements by applying two successive morphological operations (erosion and dilation) for several iterations to the binary images, in order to distinguish the contours of individual tree crowns and to minimize their confluence (fusion) into one object. |
| Step 5 | Detection of image patches using a contour area calculation function based on Green's formula (Dawkins [33]); the object size for this function was set in the range between 50 × 50 and 200 × 200 pixels. |
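A minimal sketch of these five steps assuming an OpenCV-style implementation (the table does not name a library); the erosion/dilation iteration counts and the structuring-element size are illustrative assumptions:

```python
# Sketch of the five-step candidate-region detection, assuming OpenCV (>= 4).
# Erosion/dilation iterations and the 3x3 structuring element are assumptions.
import cv2
import numpy as np

img = cv2.imread('orthophoto_fragment.png')   # hypothetical input image

# Step 1: RGB -> grayscale.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Step 2: Gaussian blur, 11x11 kernel, sigma = 0 (derived from kernel size).
blurred = cv2.GaussianBlur(gray, (11, 11), 0)

# Step 3: binarize with a brightness threshold of 100.
_, binary = cv2.threshold(blurred, 100, 255, cv2.THRESH_BINARY)

# Step 4: erosion then dilation to separate touching crown contours.
kernel = np.ones((3, 3), np.uint8)
binary = cv2.erode(binary, kernel, iterations=2)
binary = cv2.dilate(binary, kernel, iterations=2)

# Step 5: keep contours sized between 50x50 and 200x200 pixels;
# cv2.contourArea computes the enclosed area via Green's formula.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
patches = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if 50 <= w <= 200 and 50 <= h <= 200:
        patches.append(img[y:y + h, x:x + w])  # candidate tree patch
```

The extracted patches would then be resized and passed to the classification model to predict the damage category of each candidate tree.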
| Categories of Trees | TP | TN | FP | FN | Accuracy (%) | Precision (%) | Recall (%) | F-Score (%) |
|---|---|---|---|---|---|---|---|---|
| **Without Augmentation** | | | | | | | | |
| 1 | 18 | 44 | 3 | 14 | 78.48 | 85.71 | 56.25 | 67.92 |
| 2 | 30 | 32 | 15 | 6 | 74.7 | 66.67 | 83.33 | 74.07 |
| 3 | 10 | 52 | 0 | 6 | 91.18 | 100 | 62.5 | 76.92 |
| 4 | 4 | 58 | 8 | 0 | 88.57 | 33.33 | 100 | 50 |
| **With Augmentation** | | | | | | | | |
| 1 | 32 | 48 | 5 | 0 | 94.12 | 86.49 | 100 | 92.75 |
| 2 | 31 | 49 | 2 | 5 | 91.95 | 93.94 | 86.11 | 89.86 |
| 3 | 13 | 67 | 0 | 3 | 96.39 | 100 | 81.25 | 89.66 |
| 4 | 4 | 76 | 1 | 0 | 98.77 | 80 | 100 | 88.89 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).