
Review

Recent Advances in Forest Insect Pests and Diseases Monitoring Using UAV-Based Data: A Systematic Review

1 RAIZ—Forest and Paper Research Institute, Quinta de S. Francisco, Rua José Estevão (EN 230-1), Eixo, 3800-783 Aveiro, Portugal
2 NOVA Information Management School (NOVA IMS), Universidade NOVA de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal
3 DGT—Direção Geral do Território, 1099-052 Lisboa, Portugal
* Author to whom correspondence should be addressed.
Forests 2022, 13(6), 911; https://doi.org/10.3390/f13060911
Submission received: 4 May 2022 / Revised: 7 June 2022 / Accepted: 9 June 2022 / Published: 10 June 2022
(This article belongs to the Special Issue Advanced Applications of UAV Remote Sensing in Forest Structure)
Figure 1
Search query design ("Platform" AND "Field" AND "Issue") used.
Figure 2
PRISMA flow diagram for the selection of relevant papers (n = number of documents).
Figure 3
Temporal distribution of published papers during the period included.
Figure 4
World distribution of papers published focusing on UAV-based data.
Figure 5
Keyword co-occurrence diagram for the selected papers.
Figure 6
Summary of UAV types and model brands identified in the studies.
Figure 7
Summary of sensor types, including: (a) types of remote sensing technology identified in each study; (b) top 10 model camera brands.
Figure 8
Ground sampling distance (GSD) versus flight height for different sensor types: (a) hyperspectral sensors (R² = 0.16); (b) multispectral sensors (R² = 0.39); (c) RGB sensors (R² = 0.44).
Figure 9
Frontal and side overlap distribution of UAV imagery included in every study.
Figure 10
Percentage of each category of ancillary field and laboratory data for UAV–FIPD. (i) No fieldwork; (ii) field visual assessment of the crown vigor or discoloration; (iii) field visual assessment and forest inventory; (iv) field visual assessment, spectroscopy, and laboratory analysis; (v) visual field assessment, forest inventory, and spectroscopy; (vi) visual field assessment, forest inventory, spectroscopy, and laboratory.
Figure 11
Summary of the algorithms used in the studies: CNN: convolutional neural network; ITCD: individual tree crown delineation; KNN: K-nearest neighbor; LOGR: logistic regression; LR: linear regression; MLC: maximum likelihood; MSS: multiscale segmentation; PLS: partial least squares; RF: random forest; SVM: support vector machine; TA: thresholding analysis; XGBoost: eXtreme gradient boosting.
Figure 12
The overall accuracy of the different classifiers.
Figure 13
Processing and analysis software applied in the studies. (a) Image processing software brands; (b) analysis software used.

Abstract

Unmanned aerial vehicles (UAVs) are platforms that have been increasingly used over the last decade to collect data for forest insect pest and disease (FIPD) monitoring. These machines provide flexibility, cost efficiency, and a high temporal and spatial resolution of remotely sensed data. The purpose of this review is to summarize recent contributions and to identify knowledge gaps in UAV remote sensing for FIPD monitoring. A systematic review was performed using the preferred reporting items for systematic reviews and meta-analysis (PRISMA) protocol. We reviewed the full text of 49 studies published between 2015 and 2021. The parameters examined were the taxonomic characteristics, the type of UAV and sensor, data collection and pre-processing, processing and analytical methods, and software used. We found that the number of papers on this topic has increased in recent years, with most studies located in China and Europe. The main FIPDs studied were pine wilt disease (PWD) and bark beetles (BB) using UAV multirotor architectures. Among the sensor types, multispectral and red–green–blue (RGB) bands were preferred for the monitoring tasks. Regarding the analytical methods, random forest (RF) and deep learning (DL) classifiers were the most frequently applied in UAV imagery processing. This paper discusses the advantages and limitations associated with the use of UAVs and the processing methods for FIPDs, and research gaps and challenges are presented.

1. Introduction

Forests play a fundamental role in human well-being [1]. They are crucial carbon pools [2], contributing to mitigating the impacts of climate change [3,4] while ensuring important economic and social benefits, providing soil and water protection, and many other relevant environmental services [5].
In recent decades, changes in the frequency and severity of meteorological events seem to be related to a concomitant drop in the vitality of forests, namely with the outbreak of new insect pests and diseases [5,6,7]. These environmental disturbances can facilitate a change in the frequency of the occurrence of forest pests [8], which undoubtedly impacts the development, survival, reproduction, and dissemination of the species [5]. Insects have been recognized as the first indicators of climate change [9]. Reducing forest degradation and increasing its resilience involves managing and preventing these stressors and disturbing agents [10]. In this context, accurate and timely forest health monitoring is needed to mitigate climate change and support sustainable forest management [11].
Field sampling and symptom observation on foliage and trunks are the main methods to identify and register forest pests and diseases [11,12]. When remotely sensed data with high spatial and spectral resolution are collected at ideal times, we can differentiate canopy reflectance signals from noise in forests affected by pests and diseases [13,14]. Traditional field surveys based on forest inventories and observations are restricted by small area coverage and subjectivity [15]. However, when combined with unmanned aerial vehicles (UAVs), spatial coverage can be expanded, response time minimized, and the costs of monitoring forested areas reduced. UAV systems provide images of high spatial resolution and can obtain updated and timely data with different sensors [16,17]. In addition, they can complement the already well-known and explored satellites with airborne remote sensing capabilities [16,18].
UAVs can also be a valuable field data source to calibrate and validate remote sensing monitoring systems [19]. UAVs offer automatic movement and navigation, support different sensors, provide safe access to difficult locations, and enable data collection under cloudy conditions [20]. In addition, these systems can be operated to monitor specific phenological phases of plants or during pest/disease outbreaks [18,21]. In this sense, UAVs are versatile, flexible, and adaptable to different contexts [22]. Despite the relevant advantageous characteristics of UAVs, some limitations can also be identified, such as limited area coverage, battery duration, payload weight, and local regulations [23].
Several reviews have already examined critical aspects of the application of UAVs to forest insect pest and disease (FIPD) monitoring (Table 1). Some of them focused on UAVs in general—their applications, capabilities, and European regulations [24]—and highlighted only three studies related to forest pests and diseases. Adão et al. [25] provided another relevant review on UAV-based hyperspectral sensors and data processing for agriculture and forestry applications; these authors also included only three studies about FIPDs. Eugenio et al. [26] presented a global state-of-the-art overview of the development and application of UAV technology in forestry, addressing six studies about forest health monitoring among other forestry applications. Focusing on the data, processing, and potentialities, Guimarães et al. [16] presented nine studies related to FIPDs and other forestry applications. In 2021, a systematic review focusing on forest research applications was completed by Dainelli et al. [27], highlighting 17 studies in which host–pathogen systems and causal agents were classified; their research question concerned forest types, pests and diseases, and their incidence. Torres et al. [28] also proposed a systematic evidence synthesis to identify and analyze studies about forest health issues addressed with remote sensing techniques from multiple platforms; their work included 10 UAV studies. Recently, Eugenio et al. [29] proposed a systematic bibliometric literature review on the use of UAVs in forest pest and disease research, studying the temporal trends of the last decade based on 33 scientific articles and examining the monitored pests and diseases, with a focus on sensor types, technical flight parameters, and the applied analytical methods.
Despite the diversity of UAV–FIPD reviews, the rapid growth of these technologies and related computational advances have led to a need for the constant updating of the literature. On the other hand, the standards for mapping in the forestry context are unclear, so it is necessary to aggregate available scientific studies to improve the current UAV procedures. In this context, we propose this review to address these gaps to analyze the trends, challenges, and future development prospects for UAV–FIPD.
The main objective of this systematic review is to present the current practices and techniques in use and to identify the knowledge gaps in UAV remote sensing for FIPD monitoring. For this purpose, we utilized the preferred reporting items for systematic reviews and meta-analysis (PRISMA) approach to review 49 peer-reviewed articles. A database was built based on bibliometric data, the taxonomic characterization of FIPDs, UAV and sensor types, data collection and pre-processing, data processing and analytical methods, and software used, in order to answer these questions: (1) Which platforms and sensors are commonly used? (2) What are the optimal flight parameters? (3) What are the main strategies for monitoring FIPDs? The quantitative results of this systematic review will reveal new insights, trends, and challenges for UAV–FIPD.
This systematic review is structured as follows: Section 2 presents the method used to gather data using the main databases, the eligibility criteria, bibliometric analysis, and quantitative analysis details. Section 3 provides our results and discussions, identifying the major sources of information, keyword co-occurrence, the taxonomic characterization of each pest or disease, the frequency of UAV data collection procedures, and the analytical methods applied. Section 4 outlines the research gaps, challenges, and ideas for further research. Finally, in Section 5, we present our conclusions and outline future work.

2. Methods

We reviewed studies using UAV-based data to detect and monitor FIPDs published on the major international journals of remote sensing, drones, plant ecology, and forests indexed by the Scopus and Web of Science (WoS) databases. The systematic review was conducted by adopting the PRISMA methodology [30]. A constructed search query (“Platform” AND “Field” AND “Issue”) was applied on Scopus and WoS scientific databases (Figure 1), making it possible to obtain the bibliographic resources used in this analysis.
The papers were filtered on 31 December 2021 using the search engines of both databases. The biennial UAV-g conference, first organized by the international photogrammetry community in Zurich, Switzerland, in 2011, was the basis for the start of the search time period. According to Colomina and Molina [31], UAS-related conferences and publications increased markedly from that period onward.
Our analysis considered only original articles and conference papers published in high-impact journals. Therefore, we excluded review papers, reports, book chapters, and Ph.D. theses. Furthermore, other search engines such as Google Scholar were utilized to ensure that no relevant studies were omitted. The eligibility criteria for study selection were defined as follows: (1) studies of FIPDs using UAV-based imagery; (2) studies providing the type of equipment used and the most critical flight plan parameters; (3) studies related to agroforestry systems; (4) studies of FIPD monitoring using artificial simulations.
A total of 471 records were returned by the query in the selected databases (Figure 2). This set was enriched with three additional studies found using a Google Scholar search. The subsequent analysis involved merging these studies and removing the duplicates using the bibliometrix package (University of Naples Federico II, Naples, Italy) [32] in R Studio (RStudio Team, Boston, MA, USA) [33]. Then, through an abstract screening process, 277 articles that were not within the scope of the research were excluded, such as studies of UAV pest and disease mapping in crops (e.g., citrus or olive trees). A further 28 articles were excluded because they concerned other types of forestry damage or disturbance (e.g., abiotic disturbances such as windthrow and fire) or lacked appropriate photogrammetric and remote sensing methods for UAV imagery.
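The merge-and-deduplicate step described above (performed by the authors with the bibliometrix R package) can be sketched in Python; the record fields and sample entries below are hypothetical, not drawn from the actual dataset.

```python
# Illustrative sketch of merging database exports and removing duplicates.
# Assumes each record is a dict with "doi" and "title" fields (hypothetical schema).

def deduplicate(records):
    """Keep one copy per DOI, falling back to a normalized title
    when the DOI is missing."""
    seen = set()
    unique = []
    for rec in records:
        key = rec.get("doi") or rec["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

scopus = [{"doi": "10.3390/f13060911", "title": "UAV review"}]
wos = [{"doi": "10.3390/f13060911", "title": "UAV review"},
       {"doi": None, "title": "Bark beetle mapping"}]
merged = deduplicate(scopus + wos)
print(len(merged))  # 2 unique records
```

Keying on DOI first avoids false duplicates from minor title variations; the title fallback handles records that lack a DOI, as conference papers sometimes do.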
Subsequently, we extracted the categories, parameters, and detailed descriptions from each article. To ensure all parameters were included, we considered the procedures and parameters presented by Eskandari et al. [34] and Dash et al. [35]. Studies were categorized according to their general characteristics (i.e., source of the study, year, authors, study location), taxonomy (i.e., host species, pest, or disease species), UAV and sensor types (i.e., type of UAV, active or passive sensor, manufacturer and model), data collection and pre-processing (i.e., study area size, flight altitude, spatial resolution, frontal and side overlap, field data collection, radiometric and geometric correction), data processing and analytical methods (i.e., spatial analysis unit, single-tree object segmentation, feature extraction and selection, analysis type, algorithms, overall accuracy), and finally the software used to pre-process imagery and to perform the analytical methods (Table 2). These categories aimed to reflect the vast number of procedures, techniques, and methods commonly used in forest insect pest and disease monitoring with UAVs. According to the target categories, the full text of each selected article retained for the literature review was revised. The dataset created was analyzed using RStudio [33].
The keyword clustering analysis was performed using Zotero (George Mason University, Fairfax, VA, USA) [36] to create the Research Information Systems Citation (.ris) file and VosViewer software (Leiden University, Leiden, The Netherlands) [37]. The quantitative analysis, focused on acquiring the frequencies of each parameter, was summarized using tables and figures.

3. Results and Discussion

3.1. General Characterization of Selected Studies

Among the 49 publications selected, 45 were published in peer-reviewed journals and 4 in conference proceedings. As shown in Table 3, most articles appeared in Q1-ranked journals (40, representing 89%), and the remainder in Q2 journals (5). The top publishers identified were the Multidisciplinary Digital Publishing Institute (MDPI) (Switzerland) (26), Elsevier (United States, The Netherlands, and Germany) (8), Taylor & Francis Ltd. (China and United Kingdom) (3), and Springer (Germany) (3).
The main journals were the Remote Sensing journal, which published 17 papers related to FIPD, followed by the Forests journal, with 5 articles. The conference proceedings identified in this analysis included the International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences (ISPRS) Archives with three works, and the ISPRS Annals Photogrammetry Remote Sensing and Spatial Information Sciences with one, as shown in Table 4.
Figure 3 illustrates how FIPD monitoring studies using UAV platforms have increased over the seven-year period. Out of the 49 studies, 18 were published in 2021 (37%), 11 in 2020 (22%), and 7 each in 2018 and 2019 (14% each). In recent years, there has been a significant increase in the number of publications, corroborating the result obtained by Eugenio et al. [29]. Advances in UAV capabilities and miniaturization are an essential factor contributing to interest in this field.
With the growing risks to forests worldwide, forest health monitoring is critical to maintaining forest sustainability [11,38]. Thus, information obtained by UAV offers a variety of monitoring possibilities. Such opportunities include reaching otherwise inaccessible areas using high spatiotemporal resolution, which could complement or completely substitute time-consuming fieldwork [39,40].
Figure 4 illustrates the worldwide distribution of the included studies across four continents (Asia, Europe, Oceania, and North America). As shown, the studies using UAV-based data were located in China (14), the Czech Republic (6), Portugal (4), Spain (4), Finland (3), Scotland (2), South Korea (2), New Zealand (2), the United States (2) and Australia (2). This result may be associated with the type of biome [41] (temperate and boreal forests) and commercial coniferous and hardwood species in these areas [29].
The diversity of keywords used by authors and the number of clusters (3) can be observed in Figure 5. The size of the circle describes the number of occurrences of the keywords, and the color determines which cluster it belongs to. The width of the link between two keywords determines the strength of the connection. A keyword cluster analysis (text mining) was performed using VosViewer based on the frequency of the terms. We merged similar terms and synonyms in a thesaurus file. The words were included in cluster analysis if they occurred at least twice. We applied the node-repulsion LinLog modularity as normalization. Out of 460 keywords, 28 met the threshold. According to Figure 5, the most frequently used keywords were “Forestry”, “UAV”, “Remote sensing”, “Airborne sensing”, “Forest health monitoring” and “multispectral”. Each cluster represents the different study approaches. For instance, the link between “Forestry” and “UAV” is a different approach than the link between “Random Forest” and “UAV”. On the other hand, the strength between “Forestry” and “UAV” is stronger, because they belong to the same cluster. The link between “Random Forest” and “UAV” is less robust because they belong to different clusters.
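The co-occurrence counting that underlies a diagram like Figure 5 can be sketched as follows; the keyword lists are invented examples, not the review's actual data. Link strength between two keywords is simply the number of papers whose keyword sets contain both.

```python
# Minimal keyword co-occurrence counting of the kind VosViewer performs.
from collections import Counter
from itertools import combinations

papers = [
    ["forestry", "uav", "remote sensing"],
    ["uav", "random forest", "remote sensing"],
    ["forestry", "uav", "forest health monitoring"],
]

cooc = Counter()
for keywords in papers:
    # Each unordered pair of distinct keywords in a paper strengthens one link.
    for a, b in combinations(sorted(set(keywords)), 2):
        cooc[(a, b)] += 1

print(cooc[("forestry", "uav")])        # 2 papers share this pair
print(cooc[("random forest", "uav")])   # 1 paper shares this pair
```

In a full pipeline, synonyms would first be merged via a thesaurus file (as the authors did) and pairs below a minimum occurrence threshold discarded before clustering.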
The keyword analysis revealed how UAV technology is used in forestry and forest health monitoring, with various procedures and approaches for different purposes [42]. For instance, the first cluster (blue color) includes pine wilt disease (PWD) detection studies using deep learning techniques such as the convolutional neural network (CNN). The second cluster (green color) contains all the studies about bark beetle (BB) detection and the classification process of insect outbreaks. Hence, this analysis reveals both the types of FIPDs studied and the analysis types applied.

3.2. Taxonomic Characterization

The aggregation of the number of publications on pests, diseases, and related hosts is shown in Table 5. We separated the studies according to the taxonomy of the pests, diseases, and related host tree species. Regarding forest pests, the BB was the most frequently studied (11), followed by the processionary moth (4), pine shoot beetles (3), and the Chinese pine caterpillar (2); the remaining studies each mentioned only one pest species. The most frequently studied disease was PWD (13), followed by red band needle blight (2) and pathogenic microorganisms (2); the remaining studies each mentioned only one disease.
The research by Briechle et al. [87] did not address a specific pest or disease, only the host tree species (Pinus sylvestris), because this work was performed in the Chernobyl Exclusion Zone. Meanwhile, Dash et al. [15,40] conducted a simulated disease outbreak using herbicide on Pinus radiata.
Due to the high number of hosts, the BB, PWD, and the processionary moth have been widely studied. Moreover, they have a tremendous economic impact worldwide. The BB was mainly studied in temperate forest ecosystems, while coniferous defoliators such as processionary moths were mostly studied in boreal and Mediterranean forests [41].
Although the most studied species in this field are coniferous, we verified an increase in studies of hardwood species (3), such as Eucalyptus sp. [57,58,71]. The Eucalyptus genus is one of the most widely planted worldwide [88,89], especially in temperate regions [90].

3.3. UAV and Sensor Types

3.3.1. UAV Types

Figure 6 shows a circular packing graph in which each circle is a group of UAV types according to the number of propellers and architecture. The bubbles inside the circles represent the sub-groups, and each bubble's size is proportional to the number of studies using that UAV category. We extracted the quantities of each UAV type considering the number of propellers and the commercial brands. We found that 84% of the studies used multirotor drones, fixed-wing drones represented 12%, 2% used both (fixed-wing and multirotor), and 4% did not indicate the type. Quadcopters were used in 58% of the studies, hexacopters in 15%, octocopters in 15%, and fixed-wing drones in 12%.
Regarding the models used by the number of propellers, the quadcopter model DJI Phantom 4 Pro was used in 30% of the studies and DJI Phantom 3 in 14%. With regard to octocopters, the most used models were the DJI S1000 (25%), Arealtronics (25%), and the MicroKopter Droidwors AD-8 (25%). Thirteen percent made no distinction based on the model used. The hexacopter DJI Matrice 600 model was used in 36% of the works. Finally, in the fixed-wing segment, the most popular was the eBee Sense Fly model with 71% usage, followed by the Quest UAV Qpod (14%) and DB-2 (14%).
Regarding the choice of platform, the most widely adopted was the rotary wing, which stands out due to its flexibility, versatility, maneuverability, and ability to hover, offering a much easier automated experience [20,91,92]. Fixed-wing drones are more efficient, more stable in crosswind flights, and have shorter flight times per unit of mapped area [93]. However, they are less versatile for short flights when compared with rotary-wing drones. In addition, rotary-wing drones are more suitable for mapping small and complex sites, while fixed-wing drones are more appropriate for covering more extensive areas [94]. Conversely, a faster vehicle may have issues mapping small objects with insufficient overlap [92]. In spite of this, both UAV types offer the possibility to collect data at short intervals and at a local scale, which is relevant for multitemporal studies [15,40]. Notably, the preference for quadcopters may be related to their low acquisition cost, wide availability on the market, and the assessment of FIPD in small areas [26]; for example, the DJI Phantom series was the most frequently used in this segment. In the remaining studies, DJI-series hexacopters and octocopters were chosen for their payload capabilities. Finally, the eBee Sense Fly stands out for its maturity in the market. The arguments presented indicate that rotary-wing drones are the most suitable for FIPD monitoring; however, more comparative studies are needed to identify the most appropriate UAV architecture for this forestry application. Ultimately, platform choice depends on the survey requirements, the budget, and the experience of the researcher or practitioner. An important point to mention is the market offer of hybrid VTOL (vertical take-off and landing) platforms, whose only disadvantage is their complex system mechanism [95,96]. We anticipate that this UAV type will be used in FIPD studies in the near future.

3.3.2. Sensor Types

Figure 7a illustrates the number of remote sensing sensors, and Figure 7b shows the top 10 camera model brands coupled with UAVs. The passive remote sensors were grouped into four categories: (i) RGB, i.e., the simplification of multispectral to red–green–blue (RGB); (ii) multispectral, including RGB, near-infrared, and red-edge bands; (iii) hyperspectral; and (iv) thermal sensors. Light detection and ranging (LiDAR) was the only active sensor found in the studies. As shown in Figure 7a, RGB sensors were used in 12 studies, multispectral cameras in 10, hyperspectral in 3, and thermal sensors in 1 study. The most widely applied combination of remote sensing technologies was RGB and multispectral, in nine studies, followed by RGB and hyperspectral in three studies and hyperspectral and LiDAR in three. The remaining combinations (RGB and thermal, multispectral and thermal, and multispectral and LiDAR) were each used in one study. The most relevant sensors found operated in the visible light (RGB) and NIR regions, which may be related to their low acquisition cost and lesser complexity, size, and weight [23,39]. In terms of wavelength, visible light spans 400 to 700 nm, while NIR lies above 700 nm. Most DJI consumer drones are equipped with RGB cameras with the minimal features and specifications needed to perform quality mapping. In addition, we found that researchers and practitioners couple multispectral cameras with consumer UAV types; for instance, Cardil et al. [65] used a multispectral Parrot Sequoia coupled with a Phantom 3 UAV, and Iordache et al. [72] used a Micasense Red-Edge MX connected to a Phantom 4 Pro. On the other hand, hyperspectral and LiDAR sensors are more expensive, have more complex specifications, and are commonly mounted on drones with a higher payload (professional UAVs), such as the Matrice 600 used by Lin et al. [63,64].
Concerning the sensor model brands coupled with different UAV architectures, the multispectral cameras Micasense Red-edge and Parrot Sequoia were the most widely used, with nine and eight studies, respectively (Figure 7b). The Phantom 4 Pro Camera (multispectral RGB) was applied in seven studies, followed by the DJI Phantom 3 camera in four studies. Regarding the hyperspectral and LiDAR sensors, the Nano-Hyperspec sensor was used in four studies and LiAir 200 in two studies.
The preferred camera model brands—related to the type and payload of the drones used in FIPD studies—were the DJI Phantom camera, for the reasons discussed, and the Sony camera, which is known for its quality and specifications [12,46,55,56,73,77]. The Micasense series led the multispectral cameras, containing five bands that capture data in the RGB, near-infrared, and red-edge regions (400–900 nm); its compact size and weight allow it to be used on a large variety of UAV types. Another preferred multispectral sensor is the Parrot Sequoia, which has a low price compared with the Micasense series. This camera collects four discrete bands: green, red, red-edge, and NIR (530–810 nm). The interest in this type of camera is due to its ability to capture information on the state of vegetation, thereby offering the chance to calculate vegetation indices for disease detection [21], since vegetation is more reflective in the infrared region [97]. On the other hand, it is possible to transform RGB cameras into NIR cameras by changing the filters [20,98,99]. For instance, Lehmann et al. [60] removed the visible light filter and placed a neutral glass filter to capture NIR radiation. A similar approach was applied by Brovkina et al. [12], who used an infra-red filter after removing the visible filter.
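As an illustration of the vegetation-index calculation that such NIR-capable cameras enable, the sketch below computes NDVI, one common index (the review does not state which indices individual studies used); the reflectance values are invented examples.

```python
# NDVI from red and NIR reflectance bands, a typical index computed from
# a four- or five-band multispectral sensor. Values are illustrative only.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation scores high
    because leaves reflect strongly in the near-infrared."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectances: a healthy crown vs. a declining one.
healthy = ndvi(0.50, 0.05)   # ~0.82
stressed = ndvi(0.30, 0.15)  # ~0.33
print(round(float(healthy), 2), round(float(stressed), 2))
```

In practice, the function would be applied pixel-wise to the co-registered NIR and red rasters of an orthomosaic, and a drop in NDVI over time can flag candidate stressed crowns for field checking.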
As for the hyperspectral sensors, the Nano-Hyperspec, the Pika L imaging spectrometer, and the UHD S185 spectrometer were the most used because they can be mounted on a considerable variety of professional drone types. These sensors cover a much broader spectrum than multispectral sensors, which allows the discrimination of small changes in pigmentation and minor anomalies [43], such as water content and the structure of the tree crown [55]. For these reasons, their use is growing. Despite this, the authors of [72] stress that the operational effort, the storage needed for the high-dimensional and noisy data, and the weight [100] are the main constraints of this type of sensor.
We found three studies that used thermal cameras. Smigaj et al. [84,85] associated the temperature of the vegetation, using a thermal camera, with red band needle blight severity. Using infrared thermography, Maes et al. [60] studied the canopy temperatures of mistletoe plants and infected and uninfected trees.
Active sensors such as LiDAR were used in five studies, mainly to extract structural features of the forest such as tree segmentation, tree crown delineation, and height percentiles for combination with passive sensors [63,81]. LiAir 200 and LR1601-IRIS LiDAR model brands were the most used in the studies analyzed. These models have compatible gimbals with the DJI Matrice series.
Notwithstanding the authors' preferences and costs, hyperspectral sensors register more precise spectral information and are more sensitive to small changes than multispectral sensors [56]. Therefore, they are suitable for identifying changes in vegetation at early stages [55,56,80], mid-term, and post-disturbance. In spite of this, their spatial resolution is lower than that of multispectral and RGB cameras, and their imagery registration process is complicated [22]. According to Tmušić et al. [92], multisensor combination has been particularly advantageous for FIPDs; for instance, the authors of [79] used airborne hyperspectral and LiDAR data to detect PWD.

3.4. UAV Data Collection

3.4.1. Area Coverage

We identified 35 experimental studies that specified the area coverage and 14 that did not. The largest mapped area was 16,043 ha, distributed over four sections of 3397 ha, 3825 ha, 5283 ha, and 3537 ha; the smallest was 0.12 ha. Eighty-seven percent of the studies carried out mappings of up to 200 ha, and the remainder were exclusively above 200 ha. The median covered area was 12.25 ha.
The parameters analyzed indicate that most of the studies were carried out in relatively small geographical areas (median = 12.25 ha). However, Xia et al. [78] mapped 16,043 ha in China, distributed in four sections, at a 700 m altitude using a fixed-wing UAV to detect dead or diseased trees with PWD. This remarkable mapping shows the high capacity of professional civil UAVs. Nevertheless, despite UAVs' technological improvements and operational capabilities, there are barriers to research and development due to the regulatory frameworks adopted by countries worldwide [101]. Under the recent European Commission Implementing Regulation (EU) 2019/947 [102], choices of area coverage, flight height, and UAV type are constrained. Firstly, remote pilots need a specific category course to perform this type of flight. Any UAV flight above 120 m or operating beyond visual line of sight (BVLOS) is only possible through a declaration of operational authorization, and a risk analysis carried out through a Specific Operations Risk Assessment (SORA) is also required. This harmonized legislation poses a significant challenge to researchers, foresters, and practitioners, since the bureaucracy around operations is very complex.

3.4.2. Technical Flight Parameters

Table 6 shows the flight height and GSD descriptive statistics by sensor type. GSD results from the combination of flight height, focal length, and sensor resolution [92] and corresponds to the distance between the centers of adjacent pixels on the ground; defining the camera settings is therefore crucial to determine it. The highest flight altitude was 700 m, and the lowest was 20 m, performed with a hyperspectral sensor. The median flight height was 75 m for thermal sensors, and the highest median (100 m) corresponded to multispectral sensors.
In terms of GSD, the maximum value was 0.98 m, recorded with a thermal sensor, and the minimum was 0.015 m, acquired by an RGB sensor. The median GSD for RGB sensors was 0.028 m, and the highest median (0.211 m) corresponded to thermal sensors.
Figure 8 illustrates GSD versus flight height by sensor type; thermal sensors were excluded because of the low number of studies. There is a positive correlation between flight height and GSD for each sensor type (hyperspectral, multispectral, and RGB).
A positive correlation between flight height and GSD was found for each sensor type, excluding thermal sensors due to the low number of samples. Increasing the flight height increases the GSD, i.e., coarsens the spatial resolution [103]. However, the relationship is not always proportional, because GSD results from the combination of flight height, sensor resolution, and focal length [92]. Thus, the lower spatial resolution resulting from a high flight height may degrade feature delineation. Conversely, a very low flight height yields an insufficient field of view, which might be detrimental to photogrammetric products [31].
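The geometric relationship described above can be sketched as a simple calculation. The camera parameters below (sensor width, focal length, image width) are illustrative values typical of a small-format RGB camera, not taken from any reviewed study:

```python
def gsd_cm_per_px(flight_height_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance in cm/px from basic camera geometry.

    GSD grows linearly with flight height: doubling the altitude doubles
    the ground distance covered by each pixel.
    """
    return (sensor_width_mm * flight_height_m * 100.0) / (focal_length_mm * image_width_px)

# illustrative parameters: 13.2 mm sensor width, 8.8 mm focal length,
# 5472 px wide images, flown at 100 m
print(round(gsd_cm_per_px(100, 8.8, 13.2, 5472), 2))  # ≈ 2.74 cm/px
```

The linearity in flight height is why the studies above show a positive correlation per sensor type, while different focal lengths and sensor resolutions break strict proportionality across sensors.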
Image overlap is an essential component of structure from motion (SfM) photogrammetric reconstruction, which produces digital surface models, orthomosaics, and 3D models [92,103,104]. SfM is a computer vision technique used to construct 3D models and composite orthomosaic images. Figure 9a,b indicate that multispectral and visible sensors share the same median frontal and side overlap of 80.0%.
Hyperspectral and LiDAR sensors also share the same median frontal and side overlap, at about 60.0%. In the studies that used a multispectral camera, the mean frontal and side overlaps were 81.3% and 77.3%, respectively. For visible light cameras, the mean frontal and side overlaps were 79.6% and 72.0%, respectively, while for hyperspectral cameras, the medians were 75.0% (frontal) and 55.1% (side). Finally, the LiDAR boxplot showed means of 57.0% for both frontal and side overlap.
The inter-quartile range (IQR) for LiDAR showed lower variability than the other sensors, at about 60.0% and 50.0% for the frontal and side overlaps, respectively. For multispectral cameras, the IQR reached a higher variation, at about 90.0% for the frontal and 50.0% for the side overlap. Comparing the ranges of frontal and side overlap, we identified a much larger IQR in side overlap, except for LiDAR.
Image overlap (frontal and side lap) is an essential component of flight mission planning, specifically for the structure from motion (SfM) photogrammetric process. The appropriate image overlap depends on flight height and on the forest's texture, repeated patterns, and tree movement caused by wind, all of which introduce greater uncertainty. The 3D point cloud obtained in the SfM process allows the detection of single trees and may be combined with spectral data for crown segmentation to identify discolored trees [64]. In the studies analyzed, the frontal and side overlaps in the visible light and multispectral regions both showed medians of 80%. Although there is no standardized protocol, there is a tendency to use a high overlap. The studies with the greatest frontal/side overlap were Dell et al. [71] and Cessna et al. [54], using 95%/95% and 90%/90%, respectively. Although a high overlap percentage increases the number of images, flight time, data volume, and computational requirements [34], these authors combined it with geo-auxiliary structural metrics to improve tree detection.
For hyperspectral and LiDAR sensors, the frontal and side overlap percentages were lower than for the other sensors. This choice may reflect the need to save battery, decrease mapping time, or fly at a higher altitude, which reduces the required overlap percentage [105]. Indeed, the weight of these sensors strongly influences battery consumption.
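The trade-off between overlap and the number of exposures can be illustrated with a back-of-the-envelope flight-planning sketch; the GSD, frame size, and overlap fractions below are hypothetical mission values, not figures from the reviewed studies:

```python
def footprint_m(gsd_m, image_px):
    """Ground footprint of one image dimension in metres."""
    return gsd_m * image_px

def spacing_m(footprint, overlap):
    """Distance between consecutive image centres (or between flight lines)
    for a given overlap fraction: higher overlap means tighter spacing,
    hence more images, longer flights, and larger data volumes."""
    return footprint * (1.0 - overlap)

# hypothetical mission: 0.03 m GSD, 5472 x 3648 px frames,
# 80% frontal and 70% side overlap
along_track = spacing_m(footprint_m(0.03, 3648), 0.80)   # ≈ 21.9 m between exposures
across_track = spacing_m(footprint_m(0.03, 5472), 0.70)  # ≈ 49.2 m between lines
print(along_track, across_track)
```

Halving the spacing (e.g., moving from 60% to 80% frontal overlap) roughly doubles the image count per line, which is why the heavier hyperspectral and LiDAR payloads above tended toward lower overlaps.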
Radiometric calibration and correction to reduce the atmospheric effects (for instance, cloud percentage and illumination) were performed in 71.4% of the studies. Another critical aspect is geometric correction using GCPs, which consists of determining the absolute vertical and horizontal errors of the artificial or natural features with known locations [98]. However, only 53% of the studies performed geometric correction.
Most authors used empirical line methods with Lambertian targets (calibration panels) to avoid radiometric problems in multispectral and hyperspectral images. This procedure is fundamental to reducing noise, together with vignetting and lens corrections. In multitemporal studies, however, this issue is difficult to avoid due to imprecise calibration of the imagery, as noted by Fraser and Congalton [86], since illumination and atmospheric conditions differ between acquisitions.
With respect to thermal sensors, the authors performed the calibration under laboratory conditions against a thermally controlled blackbody radiation source [84,85]. The studies that used RGB sensors performed gimbal calibration and adjusted the camera parameters according to the meteorological conditions. Finally, we note that the studies did not report LiDAR calibration procedures.
For georeferencing, the authors used the traditional method based on GCPs. Although time-consuming, it remains an accurate and low-cost solution.

3.4.3. Ancillary Field and Laboratory Data for UAV–FIPD

Figure 10 shows the ancillary field and laboratory data for UAV–FIPD used in the studies. Most of the studies included fieldwork (91.8%), employing different strategies. For this analysis, we grouped the ancillary field and laboratory data into six categories: (i) no fieldwork; (ii) field visual assessment of crown vigor or discoloration; (iii) field visual assessment and forest inventory; (iv) field visual assessment, spectroscopy, and laboratory analysis; (v) field visual assessment, forest inventory, and spectroscopy; (vi) field visual assessment, forest inventory, spectroscopy, and laboratory analysis. The most applied strategy was category (ii), with 67.4%, followed by categories (iii) and (vi), with 10.2% each.
Ancillary data for FIPDs are essential for fully understanding spatio-temporal processes and for model validation procedures. The strategy is highly dependent on the research goals. For example, the study of Briechle et al. [87] was conducted using only the interpretation of imagery collected with UAVs, due to the radiation hazard in the Chernobyl Exclusion Zone. The authors of [75,77,83] performed their research through imagery interpretation, without in situ measurements or laboratory data collection, to investigate the feasibility of specific classification algorithms.
Most authors used a visual field assessment of the crown vigor or discoloration for model validation. For instance, Näsi et al. [43] identified damaged trees using healthy, infected, and dead classes. Safonova et al. [47,52] assessed the damage to fir trees caused by the attacks of bark beetles using four health classes: (a) completely healthy tree or recently attacked by beetles; (b) tree colonized by beetles; (c) recently died tree; (d) deadwood. The authors of [46,68] turned to experts in order to support the damage or attack assessments.
Combining a visual field assessment with a forest inventory allows the determination of defoliation rates for each sample tree [55] and the characterization of the stand using fundamental dendrometric variables, such as diameter at breast height (DBH), tree height, and the social categorization of trees [12]. In addition, a preexisting continuous forest inventory may help to complement monitoring and provide collections of trees to use as ground references [86].
Classical damage identification and sampling methods are limited in detecting changes after infections or pest attacks. In this sense, other field collection strategies can be applied, such as measuring biochemical parameters with spectrometers [80], namely the leaf chlorophyll content (Cab) and water content (WC) of each tree; spectral measurements and laboratory analysis [72]; and the assessment of leaf area index (LAI) using a plant canopy analyzer [62].

3.5. Data Processing and Analytical Methods

3.5.1. Spatial Unit Analysis

Regarding the spatial unit of analysis, 67.4% (33 studies) used an object-based approach, 22.4% (11 studies) used a pixel-based approach, and 10.2% (5 studies) applied both. As the minimal unit of a digital image, pixels can be used at any study scale; however, pixel-based analytical methods consider only spectral properties and cannot incorporate contextual information [28]. Object-based approaches, in contrast, rely on segmentation to group pixels into objects based on statistical or feature similarity, a step mainly performed before feature extraction and classification.
The authors preferred the object-based approach because the high-resolution imagery (submeter, <1 m) makes individual tree crown extraction and delineation possible, substituting traditional fieldwork [19,44,50]. Zhang et al. [55] stress that tree crown extraction is a prerequisite for disease detection and mapping.

3.5.2. Segmentation of Single Tree Objects

Table 7 summarizes the single-tree segmentation methods used in the studies. Individual tree crown delineation (ITCD) studies using photogrammetric or LiDAR point clouds utilize a canopy height model (CHM) or digital surface model (DSM) to calculate local height maxima; treetops are found or trees located using algorithms such as local maxima filtering, image binarization, scale analysis, and template matching. Tree delineation methods are grouped into valley following, region growing, and watershed segmentation. Treetops are usually used as seeds for region growing and watershed segmentation, so tree detection is required prior to crown delineation [106]. Many studies combine tree detection and crown delineation to extract the crown shape [107,108].
Data aggregation followed the categories of Zhen et al. [109]: the single-tree segmentation classes established were manual digitalization, raster-based, and vector-based methods.
We highlight the manual method, used in 12 studies, which consists of manual tree crown delineation in geographic information system (GIS) software. A local maxima filter with a posterior buffer was applied in five studies.
The most frequently used raster-based method was watershed segmentation (six studies), in its original and marker-controlled variants. Among region-growing algorithms, Dalponte individual tree segmentation was applied in two studies and Voronoi tessellation in one. Geospatial object-based image analysis (GEOBIA) on multispectral or RGB imagery was used with multiresolution segmentation (3) and the mean shift algorithm (1). Original approaches, such as wavelet-based local thresholding and treetop detection on normalized digital surface model (nDSM) data, were also found.
The vector-based approaches were the 3D region-growing algorithm (4) and the normalized cut algorithm (1).
The analysis of the summarized methods revealed that the manual approach was preferred because it avoids background noise such as shadows and other vegetation types. The manual approach also prevents missed tree crowns and the potential errors introduced by interpolation and smoothing procedures [109], which are the disadvantages of raster-based methods. However, it becomes impractical for areas with many trees.
Raster-based methods are easy to implement, despite the drawbacks mentioned earlier. Vector-based techniques could be useful for detecting small and understory trees, despite being harder to implement [118].
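A minimal sketch of the raster-based pipeline described above, local maxima filtering for treetop detection followed by marker-controlled watershed segmentation for crown delineation, run on a synthetic CHM (the Gaussian "crowns" are invented data, not from any reviewed study):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def detect_and_delineate(chm, min_height=2.0, window=5):
    """Treetop detection by local-maxima filtering, then marker-controlled
    watershed segmentation of the inverted CHM to delineate crowns."""
    # a pixel is a treetop if it equals the maximum of its neighbourhood
    # and rises above the minimum tree height
    local_max = (chm == ndi.maximum_filter(chm, size=window)) & (chm > min_height)
    markers, n_trees = ndi.label(local_max)
    # invert the CHM so crowns become catchment basins; restrict to canopy pixels
    crowns = watershed(-chm, markers, mask=chm > min_height)
    return n_trees, crowns

# synthetic CHM with two Gaussian "crowns" (illustrative data only)
y, x = np.mgrid[0:60, 0:60]
chm = 10 * np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 40) \
    + 8 * np.exp(-((x - 42) ** 2 + (y - 40) ** 2) / 30)
n_trees, crowns = detect_and_delineate(chm)
print(n_trees)  # 2 detected treetops
```

The `min_height` threshold plays the role of masking ground and understory, which is exactly where the interpolation and smoothing errors of raster-based methods noted above tend to arise.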

3.5.3. Feature Extraction and Selection

Table 8 summarizes the feature extraction techniques for UAV imagery applied in the studies. The investigated features were catalogued according to the types suggested by the authors in [119,120]. These features are obtainable attributes or properties of objects and scenes [121], computed from the original bands or combinations of bands [122]. They include spectral features, textural features, linear transformations, and multisensor and multitemporal images. Geo-auxiliary features extracted from LiDAR [118] or photogrammetric point clouds [103] include digital surface models (DSMs), canopy height models (CHMs), individual tree detection, and topographic features. Spectral features, including statistics of the original bands, ratios between bands, and vegetation indices, were the most popular feature type, followed by geo-auxiliary features. Multisensor and multitemporal imagery variables were used in three studies, as were textural features and linear transformations.
Spectral features are applied to capture symptom-related differences in the canopy [19,123]. For instance, Klouček et al. [46] calculated selected vegetation indices and evaluated them based on visual differences in the spectral curves of infested and healthy trees. The indices evaluated included the Greenness Index (GI), Simple Ratio (SR), Green Ratio Vegetation Index (GRVI), Normalized Difference Vegetation Index (NDVI), and Green Normalized Difference Vegetation Index (GNDVI).
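Several of the indices listed are simple normalized band ratios; a minimal sketch follows, where the reflectance values are illustrative only (not measurements from the reviewed studies):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green Normalized Difference Vegetation Index."""
    return (nir - green) / (nir + green)

def simple_ratio(nir, red):
    """Simple Ratio (SR)."""
    return nir / red

# illustrative per-crown mean reflectances: healthy conifers keep high NIR
# reflectance and low red reflectance, while infested crowns lose that contrast
healthy = ndvi(0.45, 0.05)    # (0.40 / 0.50) = 0.80
stressed = ndvi(0.25, 0.15)   # (0.10 / 0.40) = 0.25
print(healthy, stressed)
```

The same functions apply elementwise to NumPy band arrays, so an index raster for a whole orthomosaic is a one-line call on the NIR and red bands.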
In terms of geo-auxiliary features used to improve the analytical methods, Minařík et al. [50] extracted elevation features (crown area, height percentiles) and three vegetation indices (NDVI, Normalized Difference Red-Edge Index (NDRE), and Enhanced Normalized Difference Vegetation Index (ENDVI)) to detect a bark beetle disturbance in a mixed urban forest. We highlight Nguyen et al. [53], who used nine orthomosaics and normalized digital surface models (nDSM) to detect and classify healthy and declining fir trees and deciduous trees.
Considering the inclusion of multitemporal features in the analytical methods, Abdollahnejad and Panagiotidis [48] used a combination of bi-temporal integrated spectral and textural information to discriminate tree species and health status, achieving satisfactory results. Using multisensor features, Lin et al. [63] assessed the potential of a hyperspectral approach, a LiDAR approach, and a combined approach to characterize pine shoot beetle (PSB) damage in individual trees.
Less frequently applied were textural features, such as the GLCM used by Guerra-Hernández et al. [65], and linear transformations, such as the HSI transformation [61].
Regarding feature selection, 83.7% of the studies used variables without reduction techniques. The authors of [52,57,83] used the mean decrease in impurity (MDI) test to quantify feature importance and excluded the least important features. Yu et al. [79,81] and Zhang et al. [56] used principal component analysis (PCA) to reduce the data dimensionality. Recursive feature elimination (RFE) for each flight campaign was applied by Pádua et al. [69]. Finally, Yu et al. [80] calculated Pearson's correlation coefficient between the features and used a stepwise regression method to test their multicollinearity, excluding redundant variables.
The above studies show that feature extraction and selection may improve the analytical methods applied to discriminate between unhealthy and healthy canopies. However, a small portion of the studies used a feature reduction or selection technique. This result can be explained by the high use of a stable image classification algorithm such as random forest, which is insensitive to the dimensionality of data [124,125]. On the other hand, most authors probably calculated a limited number of features due to high correlation. Other reasons may have been to avoid overfitting, a decrease in classification accuracy, or high computational costs [126].
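The two reduction strategies mentioned above (PCA and RFE) can be sketched on synthetic per-crown features; the data, the 10-feature layout, and the choice of a random forest as RFE's base estimator are illustrative assumptions, not a reconstruction of any reviewed study:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                 # 10 hypothetical per-crown features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # health label driven by features 0 and 1

# PCA: project the feature space onto the 3 components of highest variance
X_pca = PCA(n_components=3).fit_transform(X)

# RFE: recursively drop the feature ranked least important by the RF model
rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=2)
rfe.fit(X, y)
selected = np.flatnonzero(rfe.support_)
print(X_pca.shape, selected)
```

PCA replaces the features with uninterpretable linear combinations, whereas RFE keeps a subset of the original, named features, which matters when variable importance must be reported.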

3.5.4. Analysis Type, Algorithms, and Overall Accuracy (OA)

Figure 11 summarizes the algorithms used in the studies by analysis method. For ease of description, we considered only each algorithm's best performance in terms of accuracy or a shared metric.
Most of the studies (79.6%) used a classification approach to quantify trees damaged by stressors, discriminate species, or estimate the total area affected; 12.2% used regression; and 8.2% used other methods, such as statistical and histogram analysis. Regression studies focus on different levels of damage, providing statistical significance for the regression coefficients and the relationships between classes. Statistical methods, physically based models such as the radiosity applicable to porous individual objects (RAPID) model for calculating vegetation variables, and specific frameworks were also used to estimate the level of damage.
As previously stated, most of the studies used the classification approach. The most frequently used classifiers were random forest (RF) and convolutional neural networks (CNN), with 11 studies each (Figure 12). Five studies applied the support vector machine (SVM) algorithm, and three applied K-Nearest Neighbors (KNN) and the individual tree crown delineation (ITCD) algorithm. We found three studies using linear regression (LR), one using logistic regression (LOG), and two using RF regression models. We also highlight the class "Others", which included the radiosity applicable to porous individual objects (RAPID) model, the ISIC-SPA-P-PLSR framework, histogram analysis, and the Getis-Ord Gi* statistic, among other analytical methods.
The main CNN architectures used were R-CNN [77], AlexNET [74], YOLOv3 [76], ResNet50 [53,82], DeepLabv3+ [78], PointNet [87], 3D-ResCNN [81], 3D-CNN [51], SCANet [75], and other types [47].
Out of the 32 studies that calculated OA, we used only 28 for this analysis (Figure 12). Algorithms with only one sample, such as logistic regression, linear regression, the 3D RAPID model, the ISI-SPA-P-PLSR framework, and thresholding analysis, were excluded. Figure 12 shows that the median overall accuracy for all algorithms was higher than 0.85. SVM achieved the highest median (0.93), followed by ITCD (0.90) and CNN (0.88). The best individual performances were achieved by SVM (0.99) and ITCD (0.99); the worst was RF (0.55).
A wide range of non-parametric and parametric algorithms have been applied in FIPD monitoring. Non-parametric machine learning algorithms such as RF, CNN, and SVM, whose input data need not follow a normal distribution, are preferred for quantifying and detecting damage. RF can be used for classification and regression problems, allows a straightforward interpretation of the model structure, and determines variable importance. For example, the authors of [72] used pixel-based RF to classify three levels of PWD infection (infected, suspicious, and healthy) and achieved 95% accuracy. Duarte et al. [57] performed large-scale mean shift segmentation (LSMSS) on a single-date multispectral image to extract tree crowns with binary classification (healthy or dead trees) and achieved 98% overall accuracy using an RF algorithm.
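An RF workflow of the kind described, classifying crowns into health classes from a few spectral and structural features, can be sketched on synthetic data; the feature names, class centers, and noise level are invented for illustration and do not reproduce any reviewed study:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# three hypothetical health classes: 0 = healthy, 1 = suspicious, 2 = infected,
# each crown described by (mean NDVI, mean NDRE, crown height in m)
labels = np.repeat([0, 1, 2], 50)
centers = np.array([[0.80, 0.50, 18.0],
                    [0.50, 0.30, 17.0],
                    [0.20, 0.10, 16.0]])
X = centers[labels] + rng.normal(scale=0.05, size=(150, 3))

X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.3, stratify=labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
oa = accuracy_score(y_te, clf.predict(X_te))
print(round(oa, 2))
```

`clf.feature_importances_` then exposes the variable importance ranking that makes RF attractive for interpreting which spectral or structural features drive the damage classes.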
CNNs are a class of deep learning algorithms widely used for spatial pattern analysis in remote sensing imagery. Classification and regression approaches can be applied to remotely sensed data in the following tasks: scene-wise classification, object detection, and semantic and instance segmentation [127,128]. We highlight the work of Safonova et al. [47], which includes two steps to detect bark beetle damage in a fir forest using an RGB image: first, a strategy to find the tree crowns; second, a new CNN architecture to classify each canopy. Using data augmentation, the authors achieved 95% accuracy. Briechle et al. [87] performed classification with the 3D deep neural network PointNet++ using UAV-based LiDAR and multispectral imagery of multiple species and standing dead trees, obtaining an overall accuracy of 90.2%.
The SVM method, based on statistical learning theory, has been commonly used in FIPD detection and monitoring [48]. An example is the study by Safonova et al. [52], which combined automated individual tree crown delineation with a Gaussian SVM to extract particular species pixel by pixel and assess tree canopy vitality, achieving 99% overall accuracy on a multispectral image.
In summary, across the different classification approaches, the first stage consisted of manual or automated tree segmentation for crown delineation; the authors then used spectral indices, topographic variables, texture, or contextual information based on the different image types.

3.6. Pre-Processing and Analysis Software

Most of the studies used more than one processing and analysis software. Therefore, to count them, all the software brands in each paper were considered. Figure 13a illustrates the preferred processing software used in the studies. The photogrammetric software Agisoft (Metashape or Photoscan) was used in 27 studies, followed by Pix4D Mapper in 6 studies. The point cloud processing software LiDAR 360 was used in four studies. For hyperspectral imagery, Spectronon processing was used in three studies and Headwall Spectral View was also used in three. As shown in Figure 13b, ArcGIS was used in 17 studies, Python libraries in 15, and R software in 13 studies.
Agisoft was the most popular software due to its automatic image quality assessment, which excludes low-quality images, and its standardized workflow [23]. Pix4D Mapper was the second choice due to its dedicated, automated photogrammetry workflow. The most frequently used software for imagery classification was ArcGIS, which is user-friendly and offers a complete, standardized remote sensing workflow. Sophisticated computer vision algorithms are also available in Python libraries such as TensorFlow, PyTorch, and scikit-learn, which are easy to implement and adaptable to other programming languages.

4. Research Gaps, Challenges, and Further Research

The scientific literature analyzed shows a strong interest in improving the detection and monitoring of pests and diseases using UAV data, as evidenced by the increased number of studies in recent years. However, we found that most studies were carried out in small experimental areas that do not always represent the reality of disturbances in forests. In addition, the effects of climate change could promote the development of other pests and diseases.
Our systematic review analysis highlights that the first research gap is related to the lack of flight parameter standards in FIPD monitoring, since each case is unique. Moreover, there is no base protocol for different UAV systems or sensor types. Therefore, the UAV type, sensor type, flight parameters, pre-processing and processing steps, weather conditions, and regulations can affect the results. Hence, providing a detailed description of all flight parameters and processing activities is essential to be taken as a reference for practitioners, researchers, and forest professionals [29,92].
A second research gap is how to take advantage of different UAV imagery and point clouds to detect pests and diseases. Many features can be extracted from UAV-based images and point clouds, such as spectral indices, gray level co-occurrence matrices (GLCMs), digital surface models (DSMs), digital terrain models (DTMs), and point cloud metrics, to integrate classification or regression models. We found a lack of studies using data fusion between optical sensors and LiDAR. Combining these technologies is advantageous for studying vegetation structure, especially tree crown delineation [54,85]. One obstacle is the difficulty in performing image alignment due to the repeated patterns in forests [129]. On the other hand, the high price of sensors is also an important constraint.
In terms of feature extraction, we noticed that genetic programming (GP) has been used to combine spectral bands in satellite imagery [130], to improve land use and land cover (LULC) classification [131], and to identify burned areas [132]. Although the work of Mejia-Zuluaga et al. [133] falls outside the time interval of this review, the GP algorithm they applied achieved an overall accuracy of 96% in classifying mistletoe. This approach thus shows the ability to extract features and improve damage classification. However, to the best of our knowledge, multiclass genetic programming (M3GP) has not been used on UAV images to classify different levels of vegetation vitality.
Deep learning approaches such as CNN may be a robust option for segmentation tasks and identifying different levels of tree crown vitality, as revealed in the studies performed in [47,51,53,74,78].
The third research gap is the limitation of UAV-based imagery in covering large scales. This limitation was highlighted by Eugenio et al. [26,29], who noted that UAV data can be "upscaled" to satellites to expand coverage without losing accuracy.
The challenges of FIPD monitoring also include the recently imposed UAV regulations: flight altitude is limited to 120 m with a maximum radius of 500 m, and BVLOS rules are problematic for forest surveys.
UAV technologies improve every year, and UAV and sensor miniaturization bring new challenges. Battery duration issues are being addressed (for instance, the Matrice 300 battery lasts 55 min), and increasingly efficient platforms such as VTOLs (e.g., the Wingtra drone series) may further improve research. Moreover, the miniaturization of hyperspectral cameras and their image collection process is steadily improving, so costs may drop significantly in the short term.
One of the biggest challenges is the popularization of drones to complement or substitute field data collection. Eugenio et al. [26,29] stress that breaking resistance to UAVs and disseminating them in the forest community is essential to monitoring our forests.
Future research will benefit from upscaling with satellite imagery to increase area coverage and improve early detection systems. Dash et al. [40] found that RapidEye satellite data can expand stress monitoring and be improved with UAV sensor data. Therefore, area coverage can be increased by combining these different platforms [39]. To this end, intelligent algorithms based on deep learning and genetic programming are necessary for detecting and monitoring disturbances in forest contexts.

5. Conclusions

This systematic literature review aimed to identify the contribution of UAVs to forest insect pest and disease monitoring. Using a PRISMA protocol, we reviewed 49 peer-reviewed articles on FIPD monitoring to provide readers with the current practices and techniques and to identify knowledge gaps and challenges.
We conclude that the number of papers has increased in recent years, especially in 2021 (18 articles). Based on our analysis, China and European Union (EU) countries have produced the most studies on FIPD monitoring using UAV-based data. The most studied disease was pine wilt disease (PWD), and the most common pests were bark beetles (BB). Pine, European spruce, and fir were the most frequently studied conifers, while the most common hardwoods were eucalypts.
Rotary-wing drones were the most frequently used, owing to market availability and costs. Our findings document that multispectral and visible light sensors are preferred for monitoring FIPDs. Among RGB sensors, the DJI Phantom series camera was the most widely used, while the MicaSense series dominated the multispectral segment. In addition, we noticed increasing use of hyperspectral and LiDAR sensors in FIPD research.
Despite the lack of standards for UAV data collection for FIPDs, our findings may serve as a reference for further research. We found a positive correlation between GSD and flight altitude by sensor type and characterized the median frontal and side overlaps for visible, multispectral, and hyperspectral sensors. Most studies included fieldwork to validate the research, and a significant number performed radiometric and geometric calibration.
Concerning the methodological approach, most works used an object-based analysis unit. Owing to the high spatial resolution of the images, the authors applied several methods for tree crown delineation, an essential prerequisite for FIPD detection and monitoring. Spectral and geo-auxiliary features were the most used in feature extraction and selection. Regarding analytical methods, random forest (RF) and deep learning (DL) classifiers were the most frequently applied in UAV imagery processing.
Our literature review suggests the lack of flight parameter standards in FIPD monitoring. Data fusion procedures for studying vegetation structure could potentially be improved by combining optical and LiDAR technologies. Other possible improvements for feature extraction include evolutionary algorithms, such as multiclass genetic programming. Deep learning algorithms can be fundamental for pattern recognition and automatic data processing regarding classification or regression.
Finally, upscaling UAV data for satellites to expand data collection without losing accuracy is essential for monitoring our forests.

Author Contributions

Conceptualization, A.D. and M.C.; methodology, A.D.; software, A.D.; formal analysis, A.D.; data curation, A.D.; writing—original draft preparation, A.D., P.C. and M.C.; writing—review and editing, A.D., P.C., N.B. and M.C.; visualization, A.D.; supervision, M.C.; funding acquisition, M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by national funds through Fundação para a Ciência e a Tecnologia (FCT) under the project UIDB/04152/2020—Centro de Investigação em Gestão de Informação (MagIC).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Cindy Santos, Luís Acevedo-Muñoz, João Rocha and Sérgio Fabres, for all their valuable comments and support.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Search query design (“Platform” AND “Field” AND “Issue”) used.
Figure 2. PRISMA flow diagram for the selection of relevant papers (n = number of documents).
Figure 3. Temporal distribution of the published papers over the period covered by the review.
Figure 4. World distribution of papers published focusing on UAV-based data.
Figure 5. Keyword co-occurrence diagram for the selected papers.
Figure 6. Summary of UAV types and model brands identified in the studies.
Figure 7. Summary of sensor types, including: (a) types of remote sensing technology identified in each study; (b) top 10 model camera brands.
Figure 8. Ground sampling distance (GSD) versus flight height for different sensor types: (a) hyperspectral sensors (R2 = 0.16); (b) multispectral sensors (R2 = 0.39); (c) RGB sensors (R2 = 0.44).
Figure 9. Distribution of frontal and side overlap of the UAV imagery across the reviewed studies.
Figure 10. Percentage of each category of ancillary field and laboratory data for UAV–FIPD: (i) no fieldwork; (ii) field visual assessment of crown vigor or discoloration; (iii) field visual assessment and forest inventory; (iv) field visual assessment, spectroscopy, and laboratory analysis; (v) field visual assessment, forest inventory, and spectroscopy; (vi) field visual assessment, forest inventory, spectroscopy, and laboratory analysis.
Figure 11. Summary of the algorithms used in the studies: CNN: convolutional neural network; ITCD: individual tree crown delineation; KNN: K-nearest neighbors; LOGR: logistic regression; LR: linear regression; MLC: maximum likelihood classification; MSS: multiscale segmentation; PLS: partial least squares; RF: random forest; SVM: support vector machine; TA: thresholding analysis; XGBoost: eXtreme gradient boosting.
Figure 12. The overall accuracy of the different classifiers.
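Overall accuracy, the metric compared across classifiers in Figure 12, is the proportion of correctly classified samples: the trace of the confusion matrix divided by its grand total. A minimal sketch (the confusion matrix below is a made-up example, not data from any reviewed study):

```python
import numpy as np

def overall_accuracy(confusion) -> float:
    """Overall accuracy = correctly classified samples / total samples,
    i.e. the trace of the confusion matrix over its grand total."""
    confusion = np.asarray(confusion, dtype=float)
    return float(confusion.trace() / confusion.sum())

# Hypothetical 2-class result: rows = reference, columns = predicted
cm = np.array([[50, 5],
               [10, 35]])
print(overall_accuracy(cm))  # prints 0.85
```

Note that overall accuracy alone can be misleading for imbalanced classes (e.g. few damaged trees among many healthy ones), which is why several studies also report per-class metrics or the kappa coefficient.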
Figure 13. Processing and analysis software applied in the studies. (a) Image processing software brands; (b) analysis software used.
Table 1. Review studies on unmanned aerial vehicle (UAV) remote sensing for forest insect pests and diseases.
| No. | Ref. | Year | Title | Journal | Contents |
|---|---|---|---|---|---|
| 1 | [24] | 2017 | Forestry applications of UAVs in Europe: a review | International Journal of Remote Sensing | A review of UAV-based forestry applications and aspects of regulations in Europe. Three studies about FIPDs were reviewed. |
| 2 | [25] | 2017 | Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry | Remote Sensing | A review on UAV-based hyperspectral sensors, data processing, and applications for agriculture and forestry. Three studies about FIPDs were reviewed. |
| 3 | [26] | 2020 | Remotely piloted aircraft systems and forests: a global state of the art and future challenges | Canadian Journal of Forest Research | A review of UAV-based forestry applications. Six studies about FIPDs were reviewed. |
| 4 | [16] | 2020 | Forestry Remote Sensing from Unmanned Aerial Vehicles: A Review Focusing on the Data, Processing and Potentialities | Remote Sensing | A review focusing on data, processing, and potentialities. It covers all types of procedures and provides examples. Nine studies about FIPDs were reviewed. |
| 5 | [27] | 2021 | Recent Advances in Unmanned Aerial Vehicles Forest Remote Sensing—A Systematic Review. Part II: Research Applications | Forests | A systematic review of UAV system solutions, technical advantages, drawbacks of the technology, and considerations on technology transfer. Seventeen studies about FIPDs were reviewed. |
| 6 | [28] | 2021 | The Role of Remote Sensing for the Assessment and Monitoring of Forest Health: A Systematic Evidence Synthesis | Forests | A systematic evidence synthesis about forest health issues with reference to different remote sensing platforms and techniques. Ten studies about UAV–FIPDs were reviewed. |
| 7 | [29] | 2021 | Remotely Piloted Aircraft Systems to Identify Pests and Diseases in Forest Species: The Global State of the Art and Future Challenges | IEEE Geoscience and Remote Sensing Magazine | A literature review of UAV-based forest pest and disease monitoring. Thirty-three studies about FIPDs were reviewed. |
Table 2. Categories of the parameters extracted from screened articles in the database.
| Category | Parameter | Description |
|---|---|---|
| General | Source | Refereed journals and conference proceedings |
| | Year | – |
| | Authors | – |
| | Study location | The geographic location of the study area |
| Taxonomy | Species | Name of the host tree species |
| | Pest or disease | Name of the pest or disease |
| UAV and sensor types | UAV type | Type of the UAV (fixed-wing, rotary-wing) |
| | Sensor type | Active or passive sensor, manufacturer, model |
| Data collection and pre-processing | Study area size | Area coverage in hectares |
| | Flight altitude | Measured in meters (m) |
| | Spatial resolution | Measured in centimeters (cm) |
| | Imagery overlap | Percentage of frontal and side overlap |
| | Field data collection | Ancillary field and laboratory data about FIPD |
| | Radiometric calibration | Calibrated panels |
| | Geometric calibration | Ground control points (GCPs) |
| Data processing and analytical methods | Spatial unit of analysis | Pixel-based, object-based |
| | Single-tree segmentation | Manual, raster-based, vector-based |
| | Feature extraction and selection | No feature extraction, vegetation indices, textural or contextual image features, linear transformations, auxiliary data |
| | Analysis type | Classification, regression, other |
| | Algorithms | Statistical, machine learning, deep learning, other |
| | Accuracy metrics | Measured in percentage |
| Software used | Software brands | Software used to process imagery and run analytical methods |
Table 3. Studies published by journal, quartile rank, and publisher. No. indicates the number of papers.
| Journal | No. | Quartile Rank | Publisher |
|---|---|---|---|
| Remote Sensing | 17 | Q1 | MDPI |
| Forests | 5 | Q1 | MDPI |
| Forest Ecology and Management | 3 | Q1 | Elsevier Inc. |
| Drones | 2 | Q1 | MDPI |
| Forest Ecosystems | 2 | Q1 | Springer |
| Remote Sensing of Environment | 2 | Q1 | Elsevier Inc. |
| Sensors | 2 | Q2 | MDPI |
| Australian Forestry | 1 | Q1 | Taylor & Francis Ltd. |
| Engineering | 1 | Q1 | Elsevier Inc. |
| Geo-Spatial Information Science | 1 | Q1 | Taylor & Francis Ltd. |
| IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 1 | Q2 | Institute of Electrical and Electronics Engineers Inc. |
| International Journal of Applied Earth Observation and Geoinformation | 1 | Q1 | Elsevier Inc. |
| International Journal of Remote Sensing | 1 | Q1 | Taylor & Francis Ltd. |
| ISPRS Journal of Photogrammetry and Remote Sensing | 1 | Q1 | Elsevier Inc. |
| Journal of Forestry Research | 1 | Q2 | Northeast Forestry University |
| Journal of Plant Diseases and Protection | 1 | Q2 | Springer International Publishing AG |
| Plant Methods | 1 | Q1 | BioMed Central Ltd. |
| PLoS ONE | 1 | Q1 | Public Library of Science |
| Urban Forestry and Urban Greening | 1 | Q1 | Urban und Fischer Verlag GmbH und Co. KG |
Table 4. Studies presented in conference proceedings by publisher country and publisher. No. indicates the number of conference proceedings.
| Conference Proceedings | No. | Publisher |
|---|---|---|
| International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS Archives) | 3 | International Society for Photogrammetry and Remote Sensing |
| ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences | 1 | Copernicus GmbH |
Table 5. Summary of common names of pests or diseases and related host tree species in the studies analyzed.
| Type | Common Name | Host Tree Species | Studies |
|---|---|---|---|
| Pests | Bark beetle | Abies sibirica, Abies mariesii, Picea abies, Pinus sylvestris, Pinus nigra | [43,44,45,46,47,48,49,50,51,52,53,54] |
| | Chinese pine caterpillar | Pinus tabulaeformis | [55,56] |
| | Longhorned borer | Eucalyptus globulus | [57] |
| | Mosquito bugs | Eucalyptus pellita | [58] |
| | Mistletoe | Parrotia persica | [59,60] |
| | Oak splendor beetle | Quercus robur | [61] |
| | Pine shoot beetle | Pinus yunnanensis | [62,63,64] |
| | Processionary moth | Pinus sylvestris, Pinus nigra, Pinus halepensis | [39,65,66,67] |
| | Stem borer | Eucalyptus pellita | [58] |
| | Tortrix moth | Abies mariesii | [53] |
| Diseases | Armillaria root rot | Picea abies | [12] |
| | Alder Phytophthora | Alnus glutinosa | [68] |
| | Chestnut ink disease | Castanea sativa | [69] |
| | Myrtle rust | Melaleuca quinquenervia | [70] |
| | Bacterial wilt | Eucalyptus pellita | [58,71] |
| | Pine wilt disease | Pinus pinaster, P. densiflora, P. massoniana | [72,73,74,75,76,77,78,79,80,81,82,83] |
| | Red band needle blight | Pinus sylvestris and P. contorta | [84,85] |
| | White pine needle cast | Pinus strobus and Pinus resinosa | [86] |
| Simulated | – | Pinus radiata | [15,40] |
Table 6. Flight height and GSD descriptive statistics by sensor type.
| Sensor Type | No. | Flight Height Max (m) | Flight Height Min (m) | Flight Height Median (m) | GSD Max (m) | GSD Min (m) | GSD Median (m) |
|---|---|---|---|---|---|---|---|
| RGB | 29 | 700 | 30 | 90 | 0.080 | 0.015 | 0.028 |
| Multispectral | 27 | 200 | 50 | 100 | 0.170 | 0.020 | 0.070 |
| Hyperspectral | 12 | 140 | 20 | 95 | 0.560 | 0.047 | 0.200 |
| Thermal | 4 | 122 | 60 | 75 | 0.980 | 0.150 | 0.211 |
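The flight heights and GSDs in Table 6 are linked by basic camera geometry: GSD grows linearly with flying height and inversely with focal length, which is why low-flown RGB sensors reach centimeter-level GSD. A minimal sketch (the sensor parameters below are illustrative, not taken from any reviewed study):

```python
def gsd_cm(flight_height_m: float, sensor_width_mm: float,
           focal_length_mm: float, image_width_px: int) -> float:
    """Ground sampling distance (cm/pixel) from camera geometry:
    GSD = (sensor width * flight height) / (focal length * image width)."""
    return (sensor_width_mm * flight_height_m * 100.0) / (focal_length_mm * image_width_px)

# Illustrative 1-inch RGB sensor: 13.2 mm wide, 8.8 mm focal length, 5472 px across
print(round(gsd_cm(100, 13.2, 8.8, 5472), 2))  # prints 2.74 (cm/pixel at 100 m)
```

Doubling the flight height doubles the GSD, so a target GSD fixes the maximum flying height for a given camera.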
Table 7. Summary of single-tree segmentation methods applied in the studies.
| Single-Tree Segmentation | Method | Synopsis | Studies |
|---|---|---|---|
| Manual | Manually segmented trees | Digitization of each tree crown over the imagery using GIS software. | [15,39,40,50,54,56,60,64,68,79,80] |
| | Local maxima filter and buffer | Local maxima filter within a rasterized CHM to detect the treetops, followed by a buffer applied around each treetop using GIS software. | [39,46,48,84,85] |
| Raster-based | Mean shift algorithm | GEOBIA method. Multispectral image segmentation using the ArcGIS segment mean shift tool. | [66] |
| | Multiresolution segmentation | GEOBIA method. Multispectral image segmentation using the eCognition multiresolution segmentation tool. | [12,61,83] |
| | Local maxima filter and mean shift algorithm | Local maxima of a sliding window using the brightness of the multispectral image; the select-by-location tool then links the treetops to the large-scale mean shift segments (GEOBIA). | [57] |
| | Safonova et al. wavelet-based local thresholding | Tree crown delineation using RGB images. The steps are contrast enhancement, crown segmentation based on wavelet transformation and morphological operations, and boundary detection. | [52] |
| | Safonova et al. treetop detection | RGB images are transformed into a single grey-scale band, which is then blurred and finally converted into a binary image. | [47] |
| | Voronoi tessellations | Local maxima filter within a rasterized CHM detects the treetops, followed by a Voronoi tessellation algorithm [110]. | [65] |
| | Dalponte individual tree segmentation | Local maxima within a rasterized CHM detect the treetops, followed by a region-growing algorithm for individual segmentation [111,112]. | [50,59] |
| | Watershed segmentation | Vincent and Soille original algorithm [113]. When the CHM is inverted, treetops or vegetation clusters look like "basins". | [49] |
| | Marker-controlled watershed [109] | Marker and segmentation functions are used for multi-tree identification and segmentation using a rasterized CHM [114]. | [50,86] |
| | Binary watershed | Binary watershed analysis and the Euclidean distance using a rasterized CHM or the NIR band. | [69,79] |
| | Hyyppä et al. methodology | Hyyppä et al. [115] methodology. | [43] |
| | Nguyen treetops in nDSM data | Based on pixel intensity, an iterative sliding window is passed over the nDSM; a refinement step then eliminates treetops that are too close to each other. | [53] |
| Vector-based | 3D region-growing algorithm | 3D region-growing algorithm applied to a point cloud (LiDAR or photogrammetric) using a built-in function for treetop detection [116]. | [50,63,79] |
| | 3D segmentation of single trees | Point cloud-based method with tree segmentation using a normalized cut algorithm [117]. | [87] |
| | Voxel-based single tree | LiDAR point cloud data are converted into voxels to estimate the leaf area density and construct the 3D forest scene. | [63] |
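Many of the raster-based methods in Table 7 share the same skeleton: detect treetops as local maxima of a rasterized CHM, then delineate a crown around each treetop. A minimal sketch of the local-maxima-plus-Voronoi-tessellation variant, using only NumPy and SciPy (the window size, height threshold, and synthetic CHM are illustrative choices, not parameters from any reviewed study):

```python
import numpy as np
from scipy import ndimage as ndi

def segment_trees(chm, min_height=2.0, window=3):
    """Treetop detection + Voronoi crown delineation on a rasterized CHM."""
    # 1) Treetops: pixels that are the local maximum of their neighborhood
    #    and taller than a minimum canopy height.
    local_max = ndi.maximum_filter(chm, size=window)
    treetops = (chm == local_max) & (chm >= min_height)
    markers, n_trees = ndi.label(treetops)
    # 2) Voronoi tessellation: every pixel takes the label of its nearest
    #    treetop, and the cells are then clipped to the canopy mask.
    idx = ndi.distance_transform_edt(markers == 0,
                                     return_distances=False,
                                     return_indices=True)
    labels = markers[tuple(idx)]
    labels[chm < min_height] = 0  # mask out ground / low vegetation
    return labels, n_trees

# Synthetic CHM with two Gaussian "crowns" (illustrative data)
yy, xx = np.mgrid[0:40, 0:40]
chm = (10 * np.exp(-((yy - 10) ** 2 + (xx - 10) ** 2) / 18.0)
       + 8 * np.exp(-((yy - 28) ** 2 + (xx - 28) ** 2) / 18.0))
labels, n_trees = segment_trees(chm)
print(n_trees)  # prints 2
```

Replacing step 2 with a flooding of the inverted CHM would give the watershed variants listed in the table; the treetop-detection step is identical.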
Table 8. Summary of feature extraction techniques of UAV imagery applied in the studies.
| Feature Type | Description | Studies |
|---|---|---|
| Spectral features | Statistics of original bands, ratios between bands, vegetation indices | [12,15,39,40,43,44,45,46,48,49,50,51,54,55,56,57,58,59,60,61,62,63,64,66,68,69,70,71,72,75,77,79,80,81,82,83,84,85,86,87] |
| Textural features | Gray level co-occurrence matrix (GLCM), gray level difference vector (GLDV) | [48,68,86] |
| Linear transformations | Hue, saturation, and intensity (HSI); principal component analysis (PCA) | [55,61,79] |
| Geo-auxiliary | Original and normalized digital surface models (DSM), digital elevation models (DEM), canopy height models (CHM), slope, aspect, height percentiles | [12,39,48,50,53,54,62,63,65,68,71,81,85,86,87] |
| Multisensor | Inclusion of data obtained from different sensors in the analytical methods | [44,62,79,84,87] |
| Multitemporal | Inclusion of multitemporal data in the classification and analytical methods | [15,40,48,59,69] |
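The spectral features that dominate Table 8 (band statistics, band ratios, vegetation indices) are straightforward to derive once crowns or plots are delimited. A minimal sketch computing a few such features, with made-up reflectance values for a healthy and a declining crown:

```python
import numpy as np

def spectral_features(red: np.ndarray, nir: np.ndarray) -> dict:
    """Per-crown spectral features: band statistics, a band ratio, and NDVI."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-9, None)  # avoid division by zero
    return {
        "red_mean": float(red.mean()),
        "nir_mean": float(nir.mean()),
        "nir_red_ratio": float(nir.mean() / max(float(red.mean()), 1e-9)),
        "ndvi_mean": float(ndvi.mean()),
        "ndvi_std": float(ndvi.std()),
    }

# Illustrative reflectance values: healthy canopies reflect strongly in the
# NIR and absorb red, while stressed/declining canopies lose that contrast.
healthy = spectral_features(np.full((8, 8), 0.05), np.full((8, 8), 0.50))
declining = spectral_features(np.full((8, 8), 0.20), np.full((8, 8), 0.30))
print(round(healthy["ndvi_mean"], 3), round(declining["ndvi_mean"], 3))  # prints 0.818 0.2
```

The same per-crown dictionary pattern extends naturally to the textural (GLCM) and geo-auxiliary (CHM-derived) features in the table, which are then fed to the classifiers of Figure 11.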
Duarte, A.; Borralho, N.; Cabral, P.; Caetano, M. Recent Advances in Forest Insect Pests and Diseases Monitoring Using UAV-Based Data: A Systematic Review. Forests 2022, 13, 911. https://doi.org/10.3390/f13060911
