

Drones, Volume 8, Issue 3 (March 2024) – 47 articles

Cover Story (view full-size image): In response to the rising demand for autonomous quadrotor flights, this study addresses the lack of comprehensive analyses in existing reviews. By examining experimental results from leading publications, it identifies trends and research gaps in quadrotor tracking control. Through historical insights and data-driven analyses, it objectively identifies top-performing controllers across diverse applications. Aimed at early-career researchers, this review seeks to facilitate meaningful contributions to quadrotor control technology while highlighting three crucial gaps that hinder effective comparison and progress. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
22 pages, 1971 KiB  
Article
Estimating Total Length of Partially Submerged Crocodylians from Drone Imagery
by Clément Aubert, Gilles Le Moguédec, Alvaro Velasco, Xander Combrink, Jeffrey W. Lang, Phoebe Griffith, Gualberto Pacheco-Sierra, Etiam Pérez, Pierre Charruau, Francisco Villamarín, Igor J. Roberto, Boris Marioni, Joseph E. Colbert, Asghar Mobaraki, Allan R. Woodward, Ruchira Somaweera, Marisa Tellez, Matthew Brien and Matthew H. Shirley
Drones 2024, 8(3), 115; https://doi.org/10.3390/drones8030115 - 21 Mar 2024
Cited by 2 | Viewed by 4432
Abstract
Understanding the demographic structure is vital for wildlife research and conservation. For crocodylians, accurately estimating total length and demographic class usually necessitates close observation or capture, often of partially immersed individuals, leading to potential imprecision and risk. Drone technology offers a bias-free, safer alternative for classification. We evaluated the effectiveness of drone photos combined with head length allometric relationships to estimate total length, and propose a standardized method for drone-based crocodylian demographic classification. We evaluated error sources related to drone flight parameters using standardized targets. An allometric framework correlating head to total length for 17 crocodylian species was developed, incorporating confidence intervals to account for imprecision sources (e.g., allometric accuracy, head inclination, observer bias, terrain variability). This method was applied to wild crocodylians through drone photography. Target measurements from drone imagery, across various resolutions and sizes, were consistent with their actual dimensions. Terrain effects were less impactful than Ground-Sample Distance (GSD) errors from photogrammetric software. The allometric framework predicted lengths within ≃11–18% accuracy across species, with natural allometric variation among individuals explaining much of this range. Compared to traditional methods that can be subjective and risky, our drone-based approach is objective, efficient, fast, cheap, non-invasive, and safe. Nonetheless, further refinements are needed to extend survey times and better include smaller size classes. Full article
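The two quantitative steps of the abstract, ground-sample distance from flight height and log–log head-to-total-length allometry with a plausibility filter, can be sketched as follows. The camera parameters and allometric coefficients below are illustrative placeholders, not the paper's fitted values:

```python
import math

def gsd_cm_per_px(height_m, sensor_width_mm=13.2, focal_length_mm=8.8,
                  image_width_px=5472):
    """Ground-sample distance in cm/pixel for a nadir photo.
    Camera parameters are illustrative, not those used in the study."""
    return (sensor_width_mm / focal_length_mm) * height_m / image_width_px * 100.0

def estimate_total_length_cm(head_length_cm, intercept=0.85, slope=1.0):
    """Log-log allometric prediction: log10(TL) = intercept + slope*log10(HL).
    Coefficients are hypothetical; the paper fits them per species."""
    return 10.0 ** (intercept + slope * math.log10(head_length_cm))

def is_plausible(head_length_cm, total_length_cm):
    """The paper keeps only observations with a TL:HL ratio between 4 and 9."""
    ratio = total_length_cm / head_length_cm
    return 4.0 <= ratio <= 9.0
```

With these placeholder values, a 30 cm head measured at 40 m flight height (GSD ≈ 1.1 cm/pixel) maps to a total length of roughly 2.1 m, which passes the TL:HL plausibility check.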
Figure 1
<p>Estimating the total length of crocodylians from drone-captured images and the various sources of imprecision. This schematic represents the various steps (bolded) to obtain demographic information from crocodiles observed by drone: I. picture capture with the drone; II. image processing; III. measurement of head length (HL); and IV. model-based estimation of the total body length (TL) of the crocodile (dashed lines represent the CI on the TL estimation). The various sources of imprecision at each step are also indicated (green, italicized): (<span class="html-italic">a</span>) head inclination, (<span class="html-italic">b</span>) topology, (<span class="html-italic">c</span>) orthophoto treatment (i.e., software effects), (<span class="html-italic">d</span>) head delimitation (i.e., observer effects), and (<span class="html-italic">e</span>) allometry and individual variation.</p>
Figure 2
<p><span class="html-italic">Crocodylus suchus</span> measurements using QGIS. For head length (HLp): (<b>a</b>) partially submerged with only the head visible, (<b>b</b>) partially submerged, (<b>c</b>) on the shore. For total length (TLp): (<b>d</b>) on the shore. Pendjari National Park (Benin) and W National Park (Niger).</p>
Figure 3
<p>Effect of flight height, target length and ground topology on measurement precision in drone photos. Parts (<b>a</b>,<b>b</b>): Distribution of the difference between target length as measured in the drone photo and actual target length at different flight heights (<b>a</b>) and for different target lengths (<b>b</b>). The boxplots represent the median and the 25th and 75th percentiles, with whiskers at the 5th and 95th percentiles and dots marking outliers. Part (<b>c</b>): the true Ground-Sample Distance (GSD) was calculated relative to flight height (red line, see Equation (1)) and compared to the GSD automatically estimated by the photogrammetric processing software Agisoft Metashape Pro (blue dots) to assess whether minor differences (±2 m) in ground topology would affect the estimation. The inaccuracy automatically computed (±0.070 cm/pixel at 40 m) was larger than the true difference in GSD (0.025 cm/pixel per meter). The topology effect is thus overshadowed by the GSD variation in the processing software.</p>
Figure 4
<p>Simple allometric relationship between head length (HL) and total length (TL) in log–log scale for 17 crocodylian species. The allometric relationships were derived from measurements of 7368 wild-caught individuals (<a href="#drones-08-00115-t001" class="html-table">Table 1</a>). For the allometry computation, only data considered realistic, i.e., those with a TL:HL ratio between 4 and 9, were used.</p>
Figure 5
<p>Allometric relationship between head length (HL) and total length (TL) in log–log scale of wild-caught West African crocodiles (<span class="html-italic">Crocodylus suchus</span>) measured in natural populations. (<b>a</b>) The allometric relationship is derived from HL and TL measurements from 116 individual <span class="html-italic">C. suchus</span> captured from throughout the species distribution. We discarded all observations (red stars; kept individuals are indicated by a blue dot) for which the ratio was greater than 1:9 (grey dotted line) and less than 1:4 (grey dashed line). The allometry prediction curve (red line) and its 95% confidence envelope (blue lines) are illustrated. (<b>b</b>) We estimated the variance by simulating 125,000 values (i.e., 50 head inclinations × 50 target length acquisitions × 50 allometry values randomly chosen from their respective distributions) to assess the contribution of each source of bias to the overall imprecision in the predicted total length estimations based on the allometric relationship: (i) head inclination (light green), (ii) head length measurement (blue), (iii) allometry estimation (red), and (iv) natural allometric variability, i.e., biological variation (grey).</p>
Figure 6
<p>Assessing the accuracy of our reference allometric framework model to estimate the TL from HL measured in drone photos of crocodiles in natural populations. We measured head length (HL<sub>p</sub>) and total length (TL<sub>p</sub>) for <span class="html-italic">Crocodylus suchus</span> individuals detected in drone photos (from [<a href="#B10-drones-08-00115" class="html-bibr">10</a>]) from the Tapoa River (W National Park, Niger) (light green dots, n = 67) and Bali Pond (Pendjari National Park, Benin) (orange dots, n = 32). For each HL<sub>p</sub> value, we also estimated the TL (TL<sub>e</sub>) using our reference allometric framework, which is represented by the blue solid-line and including its 95% confidence interval envelope (blue dotted-lines). TL<sub>p</sub> were slightly lower than TL<sub>e</sub> in most cases, though not statistically significantly, and for 26 out of 99 individuals the TL<sub>p</sub> was below the 95% CI of TL<sub>e</sub>.</p>
Figure 7
<p>Size–class distribution inferred from drone-captured pictures. We counted <span class="html-italic">C. suchus</span> from pictures captured by drones in a 2018 survey in a 2 km-long transect of the Tapoa River, W National Park (Niger; n = 131; light blue bars) and the Bali pond (1.32 ha), Pendjari National Park (Benin; n = 38; red bars) [<a href="#B10-drones-08-00115" class="html-bibr">10</a>]. Their total body length (TL<sub>e</sub>) was estimated using our reference allometric framework, and they were then categorized as hatchlings (&lt;50 cm), individuals smaller than 100 cm (&lt;100), then into 25 cm classes from TL = 100 to 350 cm, and finally &gt;350 cm. Some individuals were detected in the pictures but their HL could not be measured (blurred, partially visible, etc.) (Niger = 42, Benin = 12).</p>
34 pages, 2373 KiB  
Article
Modeling of the Flight Performance of a Plasma-Propelled Drone: Limitations and Prospects
by Sylvain Grosse, Eric Moreau and Nicolas Binder
Drones 2024, 8(3), 114; https://doi.org/10.3390/drones8030114 - 21 Mar 2024
Cited by 3 | Viewed by 3427
Abstract
The resurgence in interest in aircraft electro-aerodynamic (EAD) propulsion has been sparked by recent advancements in EAD thrusters, which generate thrust by employing a plasma generated through electrical discharge. With potentially quieter propulsion that could contribute to the generation of lift or the control of attitude, it is important to determine the feasibility of an EAD-propelled airplane. First, the main propulsive characteristics (thrust generation and power consumption) of EAD thrusters were drawn from the literature and compared with existing technologies. Second, an algorithm was developed to couple standard equations of flight with EAD propulsion performance and treat the first-order interactions. It fairly replicated the performance of the only available autonomous EAD-propelled drone. A test case based on an existing commercial UAV of 10 kg equipped with current-generation EAD thrusters anticipated a flight of less than 10 min, lower than 30 m in height, and below 8 m/s in velocity. Achieving over 2 h of flight at 30 m of height at 10 m/s requires the current EAD thrust to be doubled without altering the power consumption. For the same flight performance as the baseline UAV, the prediction asked for a tenfold increase in the thrust at the same power consumption. Full article
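The order of magnitude in the abstract can be reproduced with a first-order endurance estimate. All numbers used below (lift-to-drag ratio, thrust-to-power ratio, battery capacity) are assumed placeholders, not values from the paper's model:

```python
G = 9.81  # gravitational acceleration, m/s^2

def cruise_endurance_min(mass_kg, lift_to_drag, thrust_per_power_n_per_kw,
                         battery_wh):
    """First-order cruise endurance: in level flight thrust balances drag,
    and the electrical power draw follows from the thruster's
    thrust-to-power ratio."""
    thrust_n = mass_kg * G / lift_to_drag                       # drag to overcome
    power_w = thrust_n / (thrust_per_power_n_per_kw / 1000.0)   # electrical draw
    return battery_wh * 3600.0 / power_w / 60.0                 # minutes aloft
```

With a 10 kg airframe, a lift-to-drag ratio of 10, 10 N/kW, and a 100 Wh battery, this gives about 6 minutes, consistent with the "less than 10 min" prediction; doubling the thrust-to-power ratio at fixed power doubles the endurance, mirroring the abstract's scaling argument.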
Figure 1
<p>Schematic of a single-stage wire-to-cylinder thruster.</p>
Figure 2
<p>Schematic of different configurations of single-stage thrusters.</p>
Figure 3
<p>Thrust versus current (<b>a</b>) and thrust-to-power ratio versus thrust (<b>b</b>) for single (red) and dual (blue) collectors (experimental data from [<a href="#B5-drones-08-00114" class="html-bibr">5</a>]).</p>
Figure 4
<p>Overall efficiency against flight velocity and thrust-to-power ratio as (<b>a</b>) a contour plot and (<b>b</b>) at three relevant velocities.</p>
Figure 5
<p>Electric and aerodynamic forces against flight velocity for single (<b>a</b>) airfoil and (<b>b</b>) cylinder collectors and (<b>c</b>,<b>d</b>) dual airfoil collectors. Remarkable points: (0) values reported in [<a href="#B1-drones-08-00114" class="html-bibr">1</a>], (1) balance of drag and EAD forces, and (2) cancellation of the thrust.</p>
Figure 6
<p>Schematic of the proposed design of a multistage thruster (four stages in parallel and three stages in series with dual collectors).</p>
Figure 7
<p>Multistage thrusters with (<b>a</b>) single and (<b>b</b>) dual collectors (<math display="inline"><semantics> <mrow> <mi>N</mi> <mo>=</mo> <mn>4</mn> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <mi>M</mi> <mo>=</mo> <mn>3</mn> </mrow> </semantics></math>).</p>
Figure 8
<p>The mass of the multistage thruster as a function of the cruise velocity for aircraft masses of 10, 100, and 1000 kg.</p>
Figure 9
<p>Thrust–power curves used in the model (based on the data in [<a href="#B5-drones-08-00114" class="html-bibr">5</a>]).</p>
Figure 10
<p>Schematic of the architecture of the algorithm.</p>
Figure 11
<p>Contribution of the EAD thruster to the airplane lift in relation to the lift coefficient of the electrodes.</p>
Figure A1
<p>Schematic of the control volume around (<b>a</b>) an EAD thruster and (<b>b</b>) a propeller.</p>
Figure A2
<p>Schematic of the aircraft in different flight phases (<b>a</b>) and balance of the forces in (<b>b</b>) cruising and (<b>c</b>) climbing.</p>
Figure A3
<p>Schematic of the modeled airplane with (<b>a</b>) its different components and (<b>b</b>) their masses and dimensions.</p>
22 pages, 35809 KiB  
Article
UAV-Based Wetland Monitoring: Multispectral and Lidar Fusion with Random Forest Classification
by Robert Van Alphen, Kai C. Rains, Mel Rodgers, Rocco Malservisi and Timothy H. Dixon
Drones 2024, 8(3), 113; https://doi.org/10.3390/drones8030113 - 21 Mar 2024
Viewed by 2892
Abstract
As sea levels rise and temperatures increase, vegetation communities in tropical and sub-tropical coastal areas will be stressed; some will migrate northward and inland. The transition from coastal marshes and scrub–shrubs to woody mangroves is a fundamental change to coastal community structure and species composition. However, this transition will likely be episodic, complicating monitoring efforts, as mangrove advances are countered by dieback from increasingly impactful storms. Coastal habitat monitoring has traditionally been conducted through satellite and ground-based surveys. Here we investigate the use of UAV-LiDAR (unoccupied aerial vehicle–light detection and ranging) and multispectral photogrammetry to study a Florida coastal wetland. These data have higher resolution than satellite-derived data and are cheaper and faster to collect compared to crewed aircraft or ground surveys. We detected significant canopy change in the period between our survey (2020–2022) and a previous survey (2015), including loss at the scale of individual buttonwood trees (Conocarpus erectus), a woody mangrove associate. The UAV-derived data were collected to investigate the utility of simplified processing and data inputs for habitat classification and were validated with standard metrics and additional ground truth. UAV surveys combined with machine learning can streamline coastal habitat monitoring, facilitating repeat surveys to assess the effects of climate change and other change agents. Full article
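The classification step can be sketched with scikit-learn's random forest on synthetic per-pixel features (five reflectance bands plus a LiDAR-derived canopy height). The feature values below are fabricated for illustration, not taken from the survey:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000  # pixels per class

# Synthetic features: 5 spectral bands + canopy height (m); values are made up.
mangrove = np.hstack([rng.normal(0.45, 0.05, (n, 5)),
                      rng.normal(6.0, 1.5, (n, 1))])
low_veg = np.hstack([rng.normal(0.35, 0.05, (n, 5)),
                     rng.normal(0.5, 0.3, (n, 1))])
X = np.vstack([mangrove, low_veg])
y = np.array([1] * n + [0] * n)  # 1 = mangrove, 0 = low vegetation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

After fitting, `clf.feature_importances_` ranks the inputs, which is the same kind of diagnostic as the paper's feature importance graphs; here the well-separated canopy height should dominate.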
Figure 1
<p>Study site and the surrounding landscape. (<b>A</b>): Map of Florida (inset) and the greater Tampa Bay area. The location of the study site is indicated by a red star. (<b>B</b>): Aerial photograph of the study area. The training site is indicated by the solid green box. The testing site is indicated by the dashed blue polygon. Validation survey sites in the testing site are indicated by red points.</p>
Figure 2
<p>Simplified methods workflow emphasizing the steps taken for the random forest classification.</p>
Figure 3
<p>UAV structure-from-motion-derived orthomosaic maps of the testing ((<b>A</b>) to the north) and training ((<b>B</b>) to the south) field sites.</p>
Figure 4
<p>Representative images of habitat types. (<b>A</b>): Mixed hardwood. (<b>B</b>): Water. (<b>C</b>): Low vegetation. (<b>D</b>): Road. (<b>E</b>): Sand. (<b>F</b>): Mangrove (author for scale).</p>
Figure 5
<p>The digital terrain model results based on the 2022 UAV-LiDAR. This model was included as one of the classification features and to create the canopy height models. White points correspond to points in <a href="#drones-08-00113-f006" class="html-fig">Figure 6</a>. Black points correspond to points in <a href="#drones-08-00113-f007" class="html-fig">Figure 7</a>.</p>
Figure 6
<p>A comparison of the 2022 UAV-LiDAR, 2015 DEM, and ground-based survey elevation data obtained at point locations indicated by white dots in <a href="#drones-08-00113-f005" class="html-fig">Figure 5</a>. Black circles are the ground-based survey, red squares are the 2022 UAV-LiDAR, and blue triangles are the 2015 DTM. RMSE values were computed between the ground-based survey and the respective LiDAR in the vertical.</p>
Figure 7
<p>A comparison of canopy elevation data obtained in 2022 and in 2015 at point locations indicated by the black dots in <a href="#drones-08-00113-f005" class="html-fig">Figure 5</a>. The transect includes points obtained along the paved road (gray bar, left) and within vegetated areas (center and right). East and west are indicated by the letter in the top corners.</p>
Figure 8
<p>Comparison of the DEM of difference with color imagery. (<b>A</b>): DEM of difference between the 2015 aircraft-LiDAR and the 2022 UAV-LiDAR canopy height models, both normalized to the 2020 ground surface. (<b>B</b>): 2020 orthophoto from UAV photogrammetry. (<b>C</b>): Satellite image from 2015 of the training site. Blue circles highlight some areas of negative change. Imagery in (<b>C</b>) taken from Google Earth.</p>
Figure 9
<p>Three-dimensional plots of the principal component analysis. (<b>A</b>): Comparison of the PC space between mangroves and mixed hardwood. (<b>B</b>): Comparison between shadow and low vegetation. (<b>C</b>): Comparison of all the classifications. (<b>D</b>): Comparison between mangroves, mixed hardwood, shadow, and low vegetation.</p>
Figure 10
<p>Feature importance graphs. (<b>A</b>): Feature importance calculated for the 2000-pixel model. (<b>B</b>): Feature importance calculated for the 5000-pixel model.</p>
Figure 11
<p>Habitat classification maps with shadow pixels removed (testing site). (<b>A</b>): The 2000-pixel (no shadow) habitat map. (<b>B</b>): The 5000-pixel (no shadow) habitat map.</p>
Figure 12
<p>Habitat classification maps smoothed (testing site). (<b>A</b>): The 2000-pixel smoothed habitat map. (<b>B</b>): The 5000-pixel smoothed habitat map.</p>
21 pages, 7411 KiB  
Article
MFMG-Net: Multispectral Feature Mutual Guidance Network for Visible–Infrared Object Detection
by Fei Zhao, Wenzhong Lou, Hengzhen Feng, Nanxi Ding and Chenglong Li
Drones 2024, 8(3), 112; https://doi.org/10.3390/drones8030112 - 21 Mar 2024
Viewed by 1943
Abstract
Drones equipped with visible and infrared sensors play a vital role in urban road supervision. However, conventional methods using RGB-IR image pairs often struggle to extract effective features. These methods treat these spectra independently, missing the potential benefits of their interaction and complementary information. To address these challenges, we designed the Multispectral Feature Mutual Guidance Network (MFMG-Net). To prevent learning bias between spectra, we have developed a Data Augmentation (DA) technique based on the mask strategy. The MFMG module is embedded between two backbone networks, promoting the exchange of feature information between spectra to enhance extraction. We also designed a Dual-Branch Feature Fusion (DBFF) module based on attention mechanisms, enabling deep feature fusion by emphasizing correlations between the two spectra in both the feature channel and space dimensions. Finally, the fused features feed into the neck network and detection head, yielding ultimate inference results. Our experiments, conducted on the Aerial Imagery (VEDAI) dataset and two other public datasets (M3FD and LLVIP), showcase the superior performance of our method and the effectiveness of MFMG in enhancing multispectral feature extraction for drone ground detection. Full article
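The mask-based DA idea, blanking a random patch in one modality so the detector cannot over-rely on it, can be sketched as follows; patch size and the equal three-way choice are illustrative assumptions, not the paper's exact recipe:

```python
import numpy as np

def random_patch_mask(img, frac=0.3, rng=None):
    """Zero out a random square patch spanning `frac` of each image side,
    mimicking the black-square masks of the DA strategy (size assumed)."""
    rng = rng or np.random.default_rng()
    h, w = img.shape[:2]
    mh, mw = int(h * frac), int(w * frac)
    y0 = int(rng.integers(0, h - mh + 1))
    x0 = int(rng.integers(0, w - mw + 1))
    out = img.copy()
    out[y0:y0 + mh, x0:x0 + mw] = 0
    return out

def augment_pair(rgb, ir, rng=None):
    """Mask the RGB image, the IR image, or neither, so the model sees
    RGB-weakened, IR-weakened, and untouched multimodal samples."""
    rng = rng or np.random.default_rng()
    choice = int(rng.integers(0, 3))
    if choice == 0:
        return random_patch_mask(rgb, rng=rng), ir
    if choice == 1:
        return rgb, random_patch_mask(ir, rng=rng)
    return rgb, ir
```

Feeding such pairs during training forces the detector to stay accurate when either spectrum is degraded, which is the bias-prevention goal the abstract describes.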
Figure 1
<p>Comparison of network structure: (<b>a</b>) represents the conventional RGB-IR joint target detection methods; (<b>b</b>) depicts the RGB-IR joint target detection method designed by our team.</p>
Figure 2
<p>The network framework of the proposed MFMG-Net.</p>
Figure 3
<p>Example of an RGB-IR image pair. (<b>a</b>) In this scenario, the RGB image is affected by the intense light from car headlights, which can obscure details of traffic conditions. (<b>b</b>) In this scenario, the RGB image is advantageous due to its rich texture and color information. The red box indicates potential targets.</p>
Figure 4
<p>The data augmentation (DA) method proposed in this study. The two black squares represent randomly generated masks: one blocks the RGB image and the other blocks the IR image. Making the model learn RGB-only detection, IR-only detection, and multi-modal detection prevents it from becoming biased during multi-modal learning.</p>
Figure 5
<p>Proposed MFMG module.</p>
Figure 6
<p>Dual-branch feature fusion module.</p>
Figure 7
<p>The baseline detector and the proposed detector network. (<b>a</b>) Baseline detector: the feature extraction of each spectrum is independent, and the features are fused by element-wise addition. (<b>b</b>) The proposed MFMG-Net: features are extracted under mutual guidance and fused by the dual-branch feature fusion module.</p>
Figure 8
<p>Detection results for four representative scenarios in the VEDAI dataset. Note that red inverted triangles indicate FNs, and green inverted triangles show FPs. Zoomed in to see details.</p>
Figure 9
<p>Detection results for four representative scenarios in the M3FD dataset. Note that red inverted triangles indicate FNs, and green inverted triangles show FPs. Zoomed in to see details.</p>
Figure 10
<p>Detection results for four representative scenarios in the LLVIP dataset. Note that green inverted triangles show FPs. Zoomed in to see details.</p>
Figure 11
<p>Experimental equipment and scenarios, as well as sample data.</p>
Figure 12
<p>Some typical RGB-IR data pairs taken at different time periods and different sampling places.</p>
Figure 13
<p>Detection results of typical RGB-IR data pairs captured in the field.</p>
23 pages, 2415 KiB  
Article
Hybrid Encryption for Securing and Tracking Goods Delivery by Multipurpose Unmanned Aerial Vehicles in Rural Areas Using Cipher Block Chaining and Physical Layer Security
by Elias Yaacoub, Khalid Abualsaud and Mohamed Mahmoud
Drones 2024, 8(3), 111; https://doi.org/10.3390/drones8030111 - 21 Mar 2024
Cited by 1 | Viewed by 2045
Abstract
This paper investigated the use of unmanned aerial vehicles (UAVs) for the delivery of critical goods to remote areas in the absence of network connectivity. Under such conditions, it is important to track the delivery process and record the transactions in a delay-tolerant fashion so that this information can be recovered after the UAV’s return to base. We propose a novel framework that combines the strengths of cipher block chaining, physical layer security, and symmetric and asymmetric encryption techniques in order to safely encrypt the transaction logs of remote delivery operations. The proposed approach is shown to provide high security levels, making the keys undetectable, in addition to being robust to attacks. Thus, it is very useful in drone systems used for logistics and autonomous goods delivery to multiple destinations. This is particularly important in health applications, e.g., for vaccine transport, or in relief and rescue operations. Full article
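The chaining idea, each log block mixed with the previous ciphertext block before encryption so that identical entries never repeat in the ciphertext, can be shown with a deliberately toy cipher. XOR-with-key stands in for a real block cipher such as AES and is NOT secure; only the chaining structure is illustrated:

```python
def xor_bytes(a, b):
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(blocks, key, iv):
    """Toy CBC: C_i = E_k(P_i XOR C_{i-1}), with C_0 = IV.
    E_k is XOR-with-key here, purely for illustration."""
    prev, out = iv, []
    for p in blocks:
        c = xor_bytes(xor_bytes(p, prev), key)
        out.append(c)
        prev = c
    return out

def cbc_decrypt(blocks, key, iv):
    """Inverse chaining: P_i = D_k(C_i) XOR C_{i-1}."""
    prev, out = iv, []
    for c in blocks:
        p = xor_bytes(xor_bytes(c, key), prev)
        out.append(p)
        prev = c
    return out
```

Two identical plaintext blocks encrypt to different ciphertext blocks because each is mixed with a different predecessor, which is why CBC is suitable for repetitive transaction logs.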
(This article belongs to the Special Issue Advances of Drones in Logistics)
Figure 1
<p>DTN connectivity provided by UAVs benefiting from railroad stations as recharge points: (<b>a</b>) village with central BS; (<b>b</b>) village without central BS.</p>
Figure 2
<p>Tracking goods delivery.</p>
Figure 3
<p>CBC process for encrypting the transaction data.</p>
Figure 4
<p>(<b>a</b>) Encryption at the UAV; (<b>b</b>) Decryption at the Control Center.</p>
Figure 5
<p>Percentage of eavesdropper’s correct “guesses” of the bits of a given key.</p>
Figure 6
<p>Example of trajectory interruption.</p>
20 pages, 11298 KiB  
Article
Air–Ground Collaborative Multi-Target Detection Task Assignment and Path Planning Optimization
by Tianxiao Ma, Ping Lu, Fangwei Deng and Keke Geng
Drones 2024, 8(3), 110; https://doi.org/10.3390/drones8030110 - 21 Mar 2024
Cited by 4 | Viewed by 2065
Abstract
Collaborative exploration in environments involving multiple unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) represents a crucial research direction in multi-agent systems. However, there is still a lack of research in the areas of multi-target detection task assignment and swarm path planning, both of which play a vital role in enhancing the efficiency of environment exploration and reducing energy consumption. In this paper, we propose an air–ground collaborative multi-target detection task model based on Mixed Integer Linear Programming (MILP). To make the model better reflect real situations, kinematic constraints of the UAVs and UGVs, dynamic collision avoidance constraints, task allocation constraints, and obstacle avoidance constraints are added to the model. We also establish an objective function that comprehensively considers time consumption, energy consumption, and trajectory smoothness to make the model more realistic. Meanwhile, a Branch-and-Bound method combined with the Improved Genetic Algorithm (IGA-B&B) is proposed to solve the objective function, yielding the optimal task assignment and optimal paths for air–ground collaborative multi-target detection. A simulation environment with multi-agents, multi-obstacles, and multi-task points is established. The simulation results show that the proposed IGA-B&B algorithm can reduce the computation time cost by 30% compared to the traditional Branch-and-Bound (B&B) method. In addition, an experiment is carried out in an outdoor environment, which further validates the effectiveness and feasibility of the proposed method. Full article
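The "maximum speed limit circle and linearization" of the paper's Figure 1 is a standard MILP device: the nonlinear constraint ‖v‖ ≤ v_max is replaced by a set of half-plane constraints tangent to the circle. A sketch, where the number of facets is an arbitrary choice:

```python
import math

def speed_limit_constraints(v_max, m_sides=8):
    """Linearize |v| <= v_max into m half-plane constraints
    vx*cos(theta_k) + vy*sin(theta_k) <= v_max, theta_k = 2*pi*k/m.
    The intersection is a polygon circumscribing the speed circle,
    so it is a slight outer approximation usable in a MILP."""
    return [(math.cos(2 * math.pi * k / m_sides),
             math.sin(2 * math.pi * k / m_sides),
             v_max)
            for k in range(m_sides)]

def satisfies(vx, vy, constraints):
    """Check a velocity against every linearized facet."""
    return all(a * vx + b * vy <= c + 1e-9 for a, b, c in constraints)
```

Any velocity inside the circle satisfies all facets, while velocities clearly above v_max violate at least one; velocities just outside the circle but inside the polygon slip through, which is the approximation error that shrinks as the facet count grows.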
Figure 1
<p>Maximum speed limit circle and linearization.</p>
Figure 2
<p>Polygonal and circular descriptions of complex obstacles: (<b>a</b>) polygonal description; (<b>b</b>) circular description.</p>
Figure 3
<p>The four orientations of the agent in the anti-collision constraint: (<b>a</b>) front, (<b>b</b>) rear, (<b>c</b>) right, (<b>d</b>) left.</p>
Figure 4
<p>Four different example scenarios: (<b>a</b>) scene suitable for UAV; (<b>b</b>) scene suitable for UGVs; (<b>c</b>) scene suitable for both UGVs and UAVs; (<b>d</b>) scene that needs to be detected by both UGVs and UAVs.</p>
Figure 5
<p>Schematic diagram of the task completion constraints of the target point.</p>
Figure 6
<p>Description after chromosome encoding and meaning after decoding.</p>
Figure 7
<p>Schematic diagram of chromosome crossover operation.</p>
Figure 8
<p>Schematic diagram of chromosome crossover-adjustment operation.</p>
Figure 9
<p>Schematic diagram of chromosome mutation operation.</p>
Figure 10
<p>The effect of iteration time on the results of the Branch-and-Bound method.</p>
Figure 11
<p>The results of optimization with or without energy consumption constraints: (<b>a</b>) with only time constraint; (<b>b</b>) with time and energy consumption constraints.</p>
Figure 12
<p>Optimization results with and without trajectory smoothness: (<b>a</b>) with time and energy consumption constraints; (<b>b</b>) with optimized time, energy consumption, and trajectory smoothness constraints.</p>
Figure 13
<p>Comparison of several optimization methods: (<b>a</b>) Genetic Algorithm; (<b>b</b>) B&amp;B; (<b>c</b>) IGA-B&amp;B.</p>
Figure 14
<p>Equipment used during the experimental test: (<b>a</b>) UGV; (<b>b</b>) UAV.</p>
Figure 15
<p>The experimental scene: (<b>a</b>) schematic diagram; (<b>b</b>) side view of the experimental scene.</p>
Figure 16
<p>Trajectories of UAVs and UGVs: (<b>a</b>) trajectory of UAVs; (<b>b</b>) trajectory of UGVs.</p>
19 pages, 7383 KiB  
Article
Global Navigation Satellite Systems Signal Vulnerabilities in Unmanned Aerial Vehicle Operations: Impact of Affordable Software-Defined Radio
by Andrej Novák, Kristína Kováčiková, Branislav Kandera and Alena Novák Sedláčková
Drones 2024, 8(3), 109; https://doi.org/10.3390/drones8030109 - 20 Mar 2024
Cited by 9 | Viewed by 2057
Abstract
Spoofing, alongside jamming of the Global Navigation Satellite System signal, remains a significant hazard during general aviation or Unmanned Aerial Vehicle operations. As aircraft utilize various support systems for navigation, such as INS, an insufficient Global Navigation Satellite System signal renders Unmanned Aerial Vehicles nearly uncontrollable, thereby posing increased danger to operations within airspace and to individuals on the ground. This paper primarily focuses on assessing the impact of the budget-friendly Software-Defined Radio, HackRF One 1.0, on the safety of Unmanned Aerial Vehicle operations. Considering the widespread use of Software-Defined Radio devices today, with some being reasonably inexpensive, understanding their influence on Unmanned Aerial Vehicle safety is crucial. The generation of artificial interference capable of posing a potential threat in expanding Unmanned Aerial Vehicle airspace is deemed unacceptable. Full article
(This article belongs to the Section Drone Communications)
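For context on how such an SDR transmits a pre-generated binary file: tools in this workflow typically consume raw interleaved signed 8-bit I/Q samples. The sketch below writes a plain complex tone in that container format; it is purely illustrative (file name, tone offset, and sample rate are made-up values), and a real GNSS spoof file would instead carry C/A-code-modulated signals derived from ephemeris data:

```python
import numpy as np

def make_iq_file(path, f_offset_hz=1e3, fs=2.6e6, duration_s=0.01):
    """Write a complex tone as interleaved signed 8-bit I/Q samples --
    the raw container format consumed by common SDR transmit tools.
    Returns the number of bytes written (2 per complex sample)."""
    t = np.arange(int(fs * duration_s)) / fs
    iq = np.exp(2j * np.pi * f_offset_hz * t)        # unit-amplitude tone
    interleaved = np.empty(2 * iq.size, dtype=np.int8)
    interleaved[0::2] = np.round(127 * iq.real).astype(np.int8)
    interleaved[1::2] = np.round(127 * iq.imag).astype(np.int8)
    interleaved.tofile(path)
    return interleaved.size

n = make_iq_file("tone.bin")
print(n, "bytes written")
```

A real experiment would then hand such a file to an SDR transmit utility tuned to the carrier frequency of interest, which is the step the paper's Figures 5 and 6 depict.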
Show Figures

Figure 1
<p>Scheme of SDR.</p>
Full article ">Figure 2
<p>Scheme of SDR configuration for GNSS signal transmitting.</p>
Full article ">Figure 3
<p>Faulty response from SDR.</p>
Full article ">Figure 4
<p>Correct answer from SDR.</p>
Full article ">Figure 5
<p>Process of generating binary file.</p>
Full article ">Figure 6
<p>Process of transmitting the spoof signal.</p>
Full article ">Figure 7
<p>Spectral analysis of the spectrum interest.</p>
Full article ">Figure 8
<p>Spectrum of interest while transmitting the spoof signal.</p>
Full article ">Figure 9
<p>Course of the reference measurements.</p>
Full article ">Figure 10
<p>Course of the measurements during transmission of a spoof signal, with scatter plot visualization.</p>
Full article ">Figure 11
<p>Course of the measurements during transmission of a spoof signal without an active GLONASS receiver, with scatter plot visualization.</p>
Full article ">
22 pages, 6075 KiB  
Article
Impact of Drone Battery Recharging Policy on Overall Carbon Emissions: The Traveling Salesman Problem with Drone
by Emine Es Yurek
Drones 2024, 8(3), 108; https://doi.org/10.3390/drones8030108 - 20 Mar 2024
Cited by 3 | Viewed by 2461
Abstract
This study investigates the traveling salesman problem with drone (TSP-D) from a sustainability perspective. In this problem, a truck and a drone simultaneously serve customers. Due to the limited battery and load capacity, the drone temporarily launches from and returns to the truck after each customer visit. Previous studies indicate the potential of deploying drones to reduce delivery time and carbon emissions. However, they assume that the drone battery is swapped after each flight. In this study, we analyze the carbon emissions of the TSP-D under the recharging policy and provide a comparative analysis with the swapping policy. In the recharging policy, the drone is recharged simultaneously on top of the truck while the truck travels. A simulated annealing algorithm is proposed to solve this problem. The computational results demonstrate that the recharging policy can provide faster delivery and lower emissions than the swapping policy if the recharging is fast enough. Full article
(This article belongs to the Special Issue Advances of Drones in Logistics)
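The simulated annealing metaheuristic used in the paper follows the classic scheme of accepting worse solutions with probability exp(−Δ/T) under a cooling schedule. A generic, self-contained sketch on a plain TSP tour (not the paper's TSP-D neighborhood operators such as swap_in() or relocate_out()) looks like this:

```python
import math
import random

def tour_length(tour, pts):
    """Total length of a closed tour over 2-D points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def simulated_annealing(pts, T0=1.0, cooling=0.995, iters=20000, seed=0):
    """SA skeleton: propose a 2-opt segment reversal, always accept
    improvements, accept worse tours with probability exp(-delta/T)."""
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    best, best_len = tour[:], tour_length(tour, pts)
    cur_len, T = best_len, T0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        cand_len = tour_length(cand, pts)
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / T):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        T = max(T * cooling, 1e-9)   # geometric cooling with a floor
    return best, best_len

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(20)]
tour, length = simulated_annealing(pts)
print(round(length, 3))
```

A TSP-D variant would additionally track drone sorties and, under the recharging policy, the charge accumulated while the drone rides on the truck.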
Show Figures

Figure 1
<p>An illustrative example of the TSP-D.</p>
Full article ">Figure 2
<p>An illustrative example of the solution vector.</p>
Full article ">Figure 3
<p>An illustrative example of the tour vector.</p>
Full article ">Figure 4
<p>The TSP-D solution after swap_in().</p>
Full article ">Figure 5
<p>The TSP-D solution after swap_cross().</p>
Full article ">Figure 6
<p>The TSP-D solution after swap_out().</p>
Full article ">Figure 7
<p>Three cases in swap_position().</p>
Full article ">Figure 8
<p>The TSP-D solution after swap_position().</p>
Full article ">Figure 9
<p>The TSP-D solution after relocating a truck-only node as an in-tandem node.</p>
Full article ">Figure 10
<p>The TSP-D solution after relocating a truck-only node in another tour as a truck-only node.</p>
Full article ">Figure 11
<p>Illustrative examples for relocate_out().</p>
Full article ">Figure 12
<p>The TSP-D solution after the insertion of an in-tandem node.</p>
Full article ">Figure 13
<p>The TSP-D solution after the insertion of a tour.</p>
Full article ">Figure 14
<p>The TSP-D solution after group().</p>
Full article ">Figure 15
<p>The TSP-D solution after destroy.</p>
Full article ">Figure 16
<p>Comparison of average distance per visit traveled by the truck and the drone for 50-customer data.</p>
Full article ">Figure 17
<p>Comparison of average distance per visit traveled by the truck and the drone for 100-customer data.</p>
Full article ">
25 pages, 3769 KiB  
Article
Harmonized Skies: A Survey on Drone Acceptance across Europe
by Maria Stolz, Anne Papenfuß, Franziska Dunkel and Eva Linhuber
Drones 2024, 8(3), 107; https://doi.org/10.3390/drones8030107 - 20 Mar 2024
Cited by 2 | Viewed by 3097
Abstract
This study investigated the public acceptance of drones in six European countries. For this purpose, an online questionnaire was created, which was completed by 2998 participants. The general attitude towards drones, concerns, approval for different use cases, minimum tolerable flight altitude, acceptable flight areas, and the impact of personal and demographic attributes on drone acceptance were analyzed. Overall, attitudes towards drones were quite positive in the entire sample and even improved slightly in a second measurement at the end of the questionnaire. However, the results also show that acceptance strongly depends on the use case. Drones for civil and public applications are more widely accepted than those for private and commercial applications. Moreover, the population still has high concerns about privacy and safety. Knowledge about drones, interest in technologies, and age proved essential to predicting acceptance. Thus, tailored communication strategies, for example, through social media, can enhance public awareness and acceptance. Full article
(This article belongs to the Collection Feature Papers of Drones Volume II)
Show Figures

Figure 1
<p>Mean values of attitude toward drones at the beginning and end of the survey for the participating countries and the total sample. Whiskers indicate standard deviations.</p>
Full article ">Figure 2
<p>Participants’ approval for different drone applications. Whiskers indicate standard deviations.</p>
Full article ">Figure 3
<p>Approval for public and civil use cases compared to approval for private and commercial use cases in the different countries and the total sample. Whiskers indicate standard deviations.</p>
Full article ">Figure 4
<p>Participants’ single concerns about drones. Whiskers indicate standard deviations.</p>
Full article ">Figure 5
<p>Mean values of the scale related to concerns for the participating countries and the total sample. Whiskers indicate standard deviations.</p>
Full article ">Figure 6
<p>The word cloud shows the risks and concerns people associate with drones. The bigger the words, the more frequently they were stated by the participants.</p>
Full article ">Figure 7
<p>People’s approval for drones flying in different community types. Whiskers indicate standard deviations.</p>
Full article ">Figure 8
<p>People’s approval for drones flying in different city areas. Whiskers indicate standard deviations.</p>
Full article ">Figure 9
<p>Relative frequencies of acceptable minimum flight altitudes compared between public, private, and commercial drones.</p>
Full article ">Figure 10
<p>Correlations among the factors of the random forest model.</p>
Full article ">Figure 11
<p>The Feature Importance Plots indicate the relative importance of each feature according to its impact on the target variable. A relative importance of 0 corresponds to one percent, and 1 corresponds to 100 percent.</p>
Full article ">Figure 12
<p>The Partial Dependence Plots show the predicted values of the target variables for each feature. The predicted values are presented on the <span class="html-italic">y</span>-axis, and the features’ categories are on the <span class="html-italic">x</span>-axis.</p>
Full article ">
14 pages, 9816 KiB  
Article
UAV Photogrammetric Surveys for Tree Height Estimation
by Giuseppina Vacca and Enrica Vecchi
Drones 2024, 8(3), 106; https://doi.org/10.3390/drones8030106 - 20 Mar 2024
Cited by 5 | Viewed by 2424
Abstract
In the context of precision agriculture (PA), geomatic surveys exploiting UAV (unmanned aerial vehicle) platforms allow the dimensional characterization of trees. This paper focuses on the use of low-cost UAV photogrammetry to estimate tree height, as part of a project for the phytoremediation of contaminated soils. Two study areas with different characteristics in terms of mean tree height (5 m; 0.7 m) are chosen to test the procedure even in a challenging context. Three campaigns are performed in an olive grove (Area 1) at different flying altitudes (30 m, 40 m, and 50 m), and one UAV flight is available for Area 2 (42 m of altitude), where three species are present: oleander, lentisk, and poplar. The workflow involves the elaboration of the UAV point clouds through the SfM (structure from motion) approach, digital surface models (DSMs), vegetation filtering, and a GIS-based analysis to obtain canopy height models (CHMs) for height extraction based on a local maxima approach. UAV-derived heights are compared with in-field measurements, and promising results are obtained for Area 1, confirming the applicability of the procedure for tree height extraction, while the application in Area 2 (shorter tree seedlings) is more problematic. Full article
(This article belongs to the Section Drones in Agriculture and Forestry)
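The core of the GIS-based step is CHM = DSM − DTM followed by local-maxima extraction. A minimal numpy sketch of that idea on a toy grid (the window size and the 0.5 m height threshold are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """CHM = DSM - DTM, clipped at zero (height above ground)."""
    return np.clip(dsm - dtm, 0.0, None)

def local_maxima_heights(chm, window=3, min_height=0.5):
    """Return (row, col, height) of cells that are the maximum of their
    window x window neighbourhood and exceed a minimum height."""
    r = window // 2
    peaks = []
    for i in range(r, chm.shape[0] - r):
        for j in range(r, chm.shape[1] - r):
            patch = chm[i - r:i + r + 1, j - r:j + r + 1]
            if chm[i, j] >= min_height and chm[i, j] == patch.max():
                peaks.append((i, j, float(chm[i, j])))
    return peaks

# Toy grid: flat ground at 100 m with one 5 m crown apex at (2, 2).
dtm = np.full((5, 5), 100.0)
dsm = dtm.copy()
dsm[1:4, 1:4] += 2.0      # crown shoulder
dsm[2, 2] = 105.0         # crown apex
chm = canopy_height_model(dsm, dtm)
print(local_maxima_heights(chm))  # [(2, 2, 5.0)]
```

On real UAV-derived rasters the same logic runs on the filtered DSM grids, with the window sized to the expected crown diameter.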
Show Figures

Figure 1
<p>National and local contextualization of the study areas located in the South of Sardinia, Italy. (<b>a</b>) refers to Area 1 (olive grove site); (<b>b</b>) refers to Area 2 (plantation of poplar, lentisk, and oleander). Basemap: Google Satellite. The map was generated using Qgis software (version 3.28.15) and the coordinates are aligned to the ETRS89-ETRF2000-UTM32 reference system (EPSG: 6707).</p>
Full article ">Figure 2
<p>Tree rows and UAV campaigns in the study areas: (<b>a</b>) refers to Area 1; (<b>b</b>) refers to Area 2, during the GNSS-NRTK survey of the GCPs.</p>
Full article ">Figure 3
<p>Example of a circular target used as a ground control point (GCP).</p>
Full article ">Figure 4
<p>Dense point cloud of Flight 1 (Area 1).</p>
Full article ">Figure 5
<p>Schematic representation of the workflow for tree height extraction.</p>
Full article ">Figure 6
<p>Frequency distribution histograms of the residuals between the extracted and measured tree heights: (<b>a</b>) Area 1—Flights 1, 2, and 3; (<b>b</b>) Area 2—Flight a.</p>
Full article ">Figure 7
<p>Box and whisker plots showing the variance of the measured and estimated tree heights for the three flights in Area 1.</p>
Full article ">Figure 8
<p>Relationship between estimated and measured tree heights, linear regression, and related parameters for the three UAV campaigns in Area 1.</p>
Full article ">
17 pages, 1208 KiB  
Article
The Sound of Surveillance: Enhancing Machine Learning-Driven Drone Detection with Advanced Acoustic Augmentation
by Sebastian Kümmritz
Drones 2024, 8(3), 105; https://doi.org/10.3390/drones8030105 - 19 Mar 2024
Cited by 3 | Viewed by 2587
Abstract
In response to the growing challenges in drone security and airspace management, this study introduces an advanced drone classifier, capable of detecting and categorizing Unmanned Aerial Vehicles (UAVs) based on acoustic signatures. Utilizing a comprehensive database of drone sounds across EU-defined classes (C0 to C3), this research leverages machine learning (ML) techniques for effective UAV identification. The study primarily focuses on the impact of data augmentation methods—pitch shifting, time delays, harmonic distortion, and ambient noise integration—on classifier performance. These techniques aim to mimic real-world acoustic variations, thus enhancing the classifier’s robustness and practical applicability. Results indicate that moderate levels of augmentation significantly improve classification accuracy. However, excessive application of these methods can negatively affect performance. The study concludes that sophisticated acoustic data augmentation can substantially enhance ML-driven drone detection, providing a versatile and efficient tool for managing drone-related security risks. This research contributes to UAV detection technology, presenting a model that not only identifies but also categorizes drones, underscoring its potential for diverse operational environments. Full article
(This article belongs to the Special Issue Advances in Detection, Security, and Communication for UAV)
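The four augmentation families the study evaluates (noise at a target SNR, time delays, harmonic distortion, and pitch shifting) can each be expressed in a few lines of numpy. The sketch below is a generic illustration on a synthetic tone, not the study's implementation or parameter ranges:

```python
import numpy as np

def add_noise(x, snr_db, rng):
    """Mix in white noise at a target signal-to-noise ratio (dB)."""
    p_sig = np.mean(x ** 2)
    p_noise = p_sig / (10 ** (snr_db / 10))
    return x + rng.normal(0.0, np.sqrt(p_noise), x.shape)

def time_delay(x, delay_samples):
    """Shift the signal right, zero-padding the front (length preserved)."""
    return np.concatenate([np.zeros(delay_samples), x[:len(x) - delay_samples]])

def harmonic_distortion(x, drive=2.0):
    """Soft clipping via a tanh waveshaper adds odd harmonics."""
    return np.tanh(drive * x) / np.tanh(drive)

def pitch_shift(x, semitones):
    """Crude pitch shift by resampling; note it also changes duration
    (production pipelines use a phase vocoder to keep length fixed)."""
    factor = 2 ** (semitones / 12)
    idx = np.arange(0, len(x) - 1, factor)
    return np.interp(idx, np.arange(len(x)), x)

rng = np.random.default_rng(0)
fs = 16000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 440 * t)   # stand-in for a drone recording
aug = harmonic_distortion(time_delay(add_noise(sig, 20, rng), 160))
print(aug.shape, pitch_shift(sig, 1.4).shape)
```

Chaining the transforms, as done here, mirrors how an augmentation pipeline would randomize several acoustic variations per training clip.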
Show Figures

Figure 1
<p>Visualization of a drone’s flight path over the Fraunhofer IVI test oval, with color-coded altitude indicators and microphone positions.</p>
Full article ">Figure 2
<p>Confusion matrices for four different classifiers (without augmentation), trained under identical conditions.</p>
Full article ">Figure 3
<p>Top: Spectrogram of the audio signal capturing a drone’s acoustic signature (drone model: ‘HP-X4 2020’; drone class: C3) during the outdoor experiment. Bottom: Classification results over time, showing the classifier’s predictions (based on the 2nd classifier with seed 1 in <a href="#drones-08-00105-t003" class="html-table">Table 3</a>).</p>
Full article ">Figure 4
<p>Classification results over time for a C3 (‘HP-X4 2020’) drone showing the classifiers predictions, trained with different degrees of noise amplitude.</p>
Full article ">Figure 5
<p>Classification results over time, showing the classifier’s predictions with an augmentation with a pitching of about +/−1.4 semitones (blue circles) and no augmentation (black dots).</p>
Full article ">Figure 6
<p>Classification results over time, showing the classifier’s predictions with a random delay augmentation (blue circles) and no augmentation (black dots).</p>
Full article ">Figure 7
<p>FFT analysis of drone acoustic signatures: effects of rotor change and comparison between different drone models.</p>
Full article ">
19 pages, 4884 KiB  
Article
Improved YOLOv7 Target Detection Algorithm Based on UAV Aerial Photography
by Zhen Bai, Xinbiao Pei, Zheng Qiao, Guangxin Wu and Yue Bai
Drones 2024, 8(3), 104; https://doi.org/10.3390/drones8030104 - 19 Mar 2024
Cited by 7 | Viewed by 2415
Abstract
With the rapid development of remote sensing technology, remote sensing target detection faces many problems; for example, there is still no good solution for small targets with complex backgrounds and simple features. In response, we have added dynamic snake convolution (DSC) to YOLOv7. In addition, SPPFCSPC is used instead of the original spatial pyramid pooling structure, and the original loss function was replaced with the EIoU loss function. The algorithm was evaluated on UAV image data (VisDrone2019) and compared with mainstream algorithms, and the experiments showed that it achieves good average accuracy. Compared to the original algorithm, the mAP0.5 of the present algorithm is improved by 4.3%. The experiments proved that this algorithm outperforms the compared algorithms. Full article
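For reference, the EIoU loss mentioned above extends the IoU loss with center-distance, width, and height penalty terms, each normalized by the smallest enclosing box. A standalone sketch for two axis-aligned boxes:

```python
def eiou_loss(box_a, box_b):
    """EIoU loss for axis-aligned boxes given as (x1, y1, x2, y2):
    1 - IoU + center-distance term + width term + height term."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection / union
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    iou = inter / union
    # Smallest enclosing box
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw ** 2 + ch ** 2
    # Center distance and width/height differences
    rho2 = ((ax1 + ax2) / 2 - (bx1 + bx2) / 2) ** 2 \
         + ((ay1 + ay2) / 2 - (by1 + by2) / 2) ** 2
    dw2 = ((ax2 - ax1) - (bx2 - bx1)) ** 2
    dh2 = ((ay2 - ay1) - (by2 - by1)) ** 2
    return 1 - iou + rho2 / c2 + dw2 / cw ** 2 + dh2 / ch ** 2

print(eiou_loss((0, 0, 2, 2), (0, 0, 2, 2)))  # 0.0 for identical boxes
print(eiou_loss((0, 0, 2, 2), (1, 1, 3, 3)))  # > 0 for offset boxes
```

Penalizing width and height errors directly, instead of the aspect ratio as in CIoU, is what speeds up box regression for small targets.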
Show Figures

Figure 1
<p>YOLOv7 network structure diagram.</p>
Full article ">Figure 2
<p>Schematic diagram of the dynamic serpentine convolution kernel coordinates computation and optional receptive fields.</p>
Full article ">Figure 3
<p>Receptive fields for standard convolution, deformable convolution, and dynamic serpentine convolution.</p>
Full article ">Figure 4
<p>Improved SPP network.</p>
Full article ">Figure 5
<p>Loss function.</p>
Full article ">Figure 6
<p>Improved YOLOv7 overall network.</p>
Full article ">Figure 7
<p>Comparison curve of mAP between the benchmark algorithm and the improved algorithm during the training process.</p>
Full article ">Figure 8
<p>Effect of the improved algorithm in different scenarios.</p>
Full article ">Figure 8 Cont.
<p>Effect of the improved algorithm in different scenarios.</p>
Full article ">Figure 9
<p>Small targets (baseline algorithm).</p>
Full article ">Figure 10
<p>Small targets (improved algorithm).</p>
Full article ">Figure 11
<p>Complex background (baseline algorithm).</p>
Full article ">Figure 12
<p>Complex background (improved algorithm).</p>
Full article ">Figure 13
<p>Target occlusion (baseline algorithm).</p>
Full article ">Figure 14
<p>Target occlusion (improved algorithm).</p>
Full article ">Figure 15
<p>Dark background (baseline algorithm).</p>
Full article ">Figure 16
<p>Dark background (improved algorithm).</p>
Full article ">
25 pages, 9288 KiB  
Article
Modeling, Guidance, and Robust Cooperative Control of Two Quadrotors Carrying a “Y”-Shaped-Cable-Suspended Payload
by Erquan Wang, Jinyang Sun, Yuanyuan Liang, Boyu Zhou, Fangfei Jiang and Yang Zhu
Drones 2024, 8(3), 103; https://doi.org/10.3390/drones8030103 - 19 Mar 2024
Cited by 2 | Viewed by 1872
Abstract
This paper investigates the problem of cooperative payload delivery by two quadrotors with a novel “Y”-shaped cable that improves payload carrying and dropping efficiency. Compared with the existing “V”-shaped suspension, the proposed suspension method adds another payload swing degree of freedom to the quadrotor–payload system, making the modeling and control of such a system more challenging. In the modeling, the payload swing motion is decomposed into a forward–backward process and a lateral process, and the swing motion is then transmitted to the dynamics of the two quadrotors by converting it into disturbance cable pulling forces. A novel guidance and control framework is proposed: a guidance law is designed that not only achieves formation transformation but also generates a local reference for the quadrotor that has no access to the global reference. Based on this local reference, a cooperative controller is developed by incorporating an uncertainty and disturbance estimator to actively compensate for the payload swing disturbance and achieve the desired formation trajectory tracking performance. A singular perturbation theory-based analysis shows that the proposed parameter mapping method, which unifies the parameter tuning of different control channels, allows us to tune a single parameter, ε, to quantitatively enhance both the formation control performance and system robustness. Simulation results verify the effectiveness of the proposed approach in different scenarios. Full article
(This article belongs to the Special Issue UAV Trajectory Generation, Optimization and Cooperative Control)
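The uncertainty and disturbance estimator (UDE) idea, compensating a lumped disturbance through a low-pass filter of the plant residual, can be illustrated on a scalar toy plant. This is a hypothetical 1-D analogue, not the paper's quadrotor formation controller; the plant x' = u + d, the gain k, and the filter constant T_f are all invented for illustration:

```python
import math

def simulate(T_f=0.05, dt=0.001, steps=5000, k=5.0):
    """Scalar plant x' = u + d with unknown disturbance d(t).
    Control u = x_r' - k*e - d_hat; the UDE estimate d_hat is a
    first-order low-pass filter (time constant T_f) of the plant
    residual x' - u, which equals the true disturbance."""
    x, d_hat = 0.0, 0.0
    errs = []
    for n in range(steps):
        t = n * dt
        d = 1.0 + 0.5 * math.sin(2 * t)       # unknown lumped disturbance
        x_r, x_r_dot = math.sin(t), math.cos(t)  # reference trajectory
        e = x - x_r
        u = x_r_dot - k * e - d_hat
        x_new = x + dt * (u + d)              # Euler plant step
        x_dot_est = (x_new - x) / dt          # measured derivative
        d_hat += (dt / T_f) * (x_dot_est - u - d_hat)
        x = x_new
        errs.append(abs(e))
    return errs

errs = simulate()
print(errs[-1])   # residual tracking error once the estimator converges
```

Shrinking T_f tightens the disturbance estimate but amplifies measurement noise, which is the same performance/robustness trade the paper folds into its single tuning parameter ε.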
Show Figures

Figure 1
<p>A comparison among the “V”-shaped [<a href="#B4-drones-08-00103" class="html-bibr">4</a>,<a href="#B5-drones-08-00103" class="html-bibr">5</a>,<a href="#B6-drones-08-00103" class="html-bibr">6</a>], trapezoid-shaped [<a href="#B7-drones-08-00103" class="html-bibr">7</a>,<a href="#B8-drones-08-00103" class="html-bibr">8</a>,<a href="#B9-drones-08-00103" class="html-bibr">9</a>], and proposed “Y”-shaped suspension method.</p>
Full article ">Figure 2
<p>Frames for the modeling.</p>
Full article ">Figure 3
<p>The forward–backward and lateral processes of the “Y”-shaped-cable-suspended payload swing motion.</p>
Full article ">Figure 4
<p>The communication network topology.</p>
Full article ">Figure 5
<p>The proposed guidance and robust cooperative control scheme for the quadrotor formation.</p>
Full article ">Figure 6
<p>Scenario 1: flight trajectories of the formation and other states of the quadrotors and the payload.</p>
Full article ">Figure 7
<p>Scenario 1: UDE disturbance estimation signals.</p>
Full article ">Figure 8
<p>Scenario 2: flight trajectories of the formation and other states of the quadrotors and the payload.</p>
Full article ">Figure 9
<p>Scenario 2: UDE disturbance estimation signals.</p>
Full article ">Figure 10
<p>Scenario 3: flight trajectories of the formation and other states of the quadrotors and the payload.</p>
Full article ">Figure 11
<p>Scenario 3: UDE disturbance estimation signals.</p>
Full article ">Figure 12
<p>Scenario 4: formation size and UDE disturbance estimation signals under different <math display="inline"><semantics> <mi>ε</mi> </semantics></math> parameters.</p>
Full article ">
22 pages, 10049 KiB  
Article
Design, Modeling, and Control of a Composite Tilt-Rotor Unmanned Aerial Vehicle
by Zhuang Liang, Li Fan, Guangwei Wen and Zhixiong Xu
Drones 2024, 8(3), 102; https://doi.org/10.3390/drones8030102 - 16 Mar 2024
Cited by 3 | Viewed by 5030
Abstract
Tilt-rotor unmanned aerial vehicles combine the advantages of multirotor and fixed-wing aircraft, offering features like rapid takeoff and landing, extended endurance, and wide flight conditions. This article provides a summary of the design, modeling, and control of a composite tilt-rotor. During the modeling process, aerodynamic modeling was performed on the tilting and non-tilting parts based on the subcomponent modeling method, and CFD simulation analysis was conducted on the entire unmanned aerial vehicle to obtain its accurate aerodynamic characteristics. In modeling the motor propeller, the reduction of motor thrust and torque due to forward flow and tilt angular velocity is thoroughly examined; this effect is usually ignored in most tilt-UAV propeller models. In the controller design, this paper proposes a fusion ADRC control strategy suitable for vertical takeoff and landing of this type of tiltrotor. The control system framework is built using Simulink, and the control algorithm’s efficiency has been verified through simulation testing. Through the proposed control scheme, the composite tiltrotor unmanned aerial vehicle can smoothly transition between multirotor and fixed-wing flight modes. Full article
(This article belongs to the Special Issue A UAV Platform for Flight Dynamics and Control System)
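At the heart of the ADRC strategy mentioned above is an extended state observer (ESO) that estimates the lumped "total disturbance" as an extra state and cancels it in the control law. A minimal linear-ESO sketch on a first-order toy plant (the plant, bandwidth, and disturbance below are illustrative assumptions, not the tilt-rotor model):

```python
import math

def eso_demo(omega=50.0, dt=0.001, steps=4000):
    """Linear extended state observer (the core of ADRC) for a
    first-order plant x' = u + d: state z1 tracks x, extended
    state z2 tracks the lumped 'total disturbance' d."""
    b1, b2 = 2 * omega, omega ** 2        # gains from bandwidth tuning
    x, z1, z2 = 0.0, 0.0, 0.0
    for n in range(steps):
        t = n * dt
        d = 0.8 + 0.3 * math.sin(3 * t)   # unknown disturbance
        u = math.cos(t)                   # known control input
        e = x - z1                        # observer innovation
        z1 += dt * (z2 + u + b1 * e)
        z2 += dt * (b2 * e)
        x += dt * (u + d)                 # plant step
    return d, z2

d, z2 = eso_demo()
print(abs(d - z2))  # small once the observer has locked on
```

Placing both observer poles at −omega (bandwidth tuning) is the usual way to reduce ESO design to a single parameter.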
Show Figures

Figure 1
<p>Overview of Target Aircraft Design.</p>
Full article ">Figure 2
<p>Mission profile.</p>
Full article ">Figure 3
<p>Tilting rotor drive mechanism and drive process.</p>
Full article ">Figure 4
<p>Coordinate System Definition.</p>
Full article ">Figure 5
<p>Force analysis of tilting propeller.</p>
Full article ">Figure 6
<p>Definition of distance between tilting propeller and center of gravity.</p>
Full article ">Figure 7
<p>Design block diagram of hierarchical control system.</p>
Full article ">Figure 8
<p>The structure of the ADRC controller [<a href="#B29-drones-08-00102" class="html-bibr">29</a>].</p>
Full article ">Figure 9
<p>Tilt-rotor UAV simulation system in MATLAB Simulink.</p>
Full article ">Figure 10
<p>CFD simulation values for (<b>a</b>) lift coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>L</mi> </msub> </mrow> </semantics></math>, (<b>b</b>) drag coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>D</mi> </msub> </mrow> </semantics></math>, and (<b>c</b>) sideslip coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>Y</mi> </msub> </mrow> </semantics></math> of the fixed parts of the target tiltrotor.</p>
Full article ">Figure 11
<p>CFD simulation values for (<b>a</b>) pitch moment coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>m</mi> </msub> </mrow> </semantics></math>, (<b>b</b>) sideslip moment coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>n</mi> </msub> </mrow> </semantics></math>, and (<b>c</b>) rolling moment coefficient <math display="inline"><semantics> <mrow> <msub> <mi>C</mi> <mi>l</mi> </msub> </mrow> </semantics></math> of the fixed parts of the target tiltrotor.</p>
Full article ">Figure 12
<p>Flight state response curve of attitude in velocity response control simulation without disturbance.</p>
Full article ">Figure 13
<p>Flight state response curve of velocity in velocity response control simulation without disturbance.</p>
Full article ">Figure 14
<p>Flight state response curve of Rotors’ RPM in velocity response control simulation without disturbance.</p>
Full article ">Figure 15
<p>Flight state response curve of attitude in velocity response control simulation with disturbance.</p>
Full article ">Figure 16
<p>3D trajectory tracking in UAV position mode.</p>
Full article ">Figure 17
<p>Flight state response curve of attitude in position mode control simulation.</p>
Full article ">Figure 18
<p>Flight state response curve of velocity in position mode control simulation.</p>
Full article ">Figure 19
<p>Flight state response curve of position in position mode control simulation.</p>
Full article ">
25 pages, 15207 KiB  
Article
Design of Pseudo-Command Restricted Controller for Tailless Unmanned Aerial Vehicles Based on Attainable Moment Set
by Linxiao Han, Jianbo Hu, Yingyang Wang, Jiping Cong and Peng Zhang
Drones 2024, 8(3), 101; https://doi.org/10.3390/drones8030101 - 15 Mar 2024
Viewed by 1350
Abstract
This work investigates the pseudo-command restricted problem for tailless unmanned aerial vehicles with snake-shaped maneuver flight missions. The main challenge of designing such a pseudo-command restricted controller lies in the fact that the necessity of control allocation means it will be difficult to provide a precise envelope of pseudo-command to the flight controller; designing a compensation system to deal with insufficient capabilities beyond this envelope is another challenge. The envelope of pseudo-command can be expressed by attainable moment sets, which leave some open problems, such as how to obtain the attainable moment sets online, how to reduce the computational complexity of the algorithm, and how to ensure independent control allocation and the convexity of attainable moment sets. In this article, an innovative algorithm is proposed for the calculation of attainable moment sets, which can be implemented by fitting wind tunnel data into a function to solve the problems presented above. Furthermore, the algorithm is independent of control allocation and can be executed online. Moreover, based on the above attainable moment sets algorithm, a flight performance assurance system is designed, which not only guarantees that the command is constrained within the envelope so that its behavior is more predictable, but also supports adaptive compensation for the pseudo-command restricted controller. Finally, the effectiveness of the AMS algorithm and the advantages of the pseudo-command restricted control system are validated through two sets of independent simulations. Full article
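As a simplified illustration of attainable moment sets: under a linear effectiveness model m = B u with box-constrained effectors, the per-axis attainable range follows from picking each effector's extreme independently. This is a hypothetical linear sketch (the B matrix and deflection limits are invented); the paper's algorithm instead builds the AMS from functions fitted to wind-tunnel data:

```python
import numpy as np

def per_axis_moment_bounds(B, u_min, u_max):
    """Per-axis attainable moment interval for m = B @ u with
    box constraints u_min <= u <= u_max: each effector contributes
    its own extreme independently per axis."""
    lo = np.where(B > 0, B * u_min, B * u_max).sum(axis=1)
    hi = np.where(B > 0, B * u_max, B * u_min).sum(axis=1)
    return lo, hi

# Hypothetical 3-axis x 4-effector effectiveness matrix (L, M, N rows)
B = np.array([[ 0.8, -0.8,  0.2, -0.2],
              [ 0.5,  0.5, -0.3, -0.3],
              [ 0.1, -0.1,  0.6, -0.6]])
u_min, u_max = np.full(4, -0.5), np.full(4, 0.5)   # deflection limits (rad)
lo, hi = per_axis_moment_bounds(B, u_min, u_max)
print(lo, hi)   # symmetric bounds here, since the limits are symmetric
```

The full AMS is a convex polytope in (L, M, N) space; this per-axis interval is only its axis-aligned bounding box, which is why the paper needs a dedicated algorithm for the actual envelope.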
Show Figures

Figure 1
<p>Top view of ICE.</p>
Full article ">Figure 2
<p>AMB of <math display="inline"><semantics> <mi>L</mi> </semantics></math> that changes in real-time with flight states.</p>
Full article ">Figure 3
<p>Comparison of constraint effects between Equations (18) and (19).</p>
Full article ">Figure 4
<p>FPA-NDI control system.</p>
Full article ">Figure 5
<p>Simulation analysis of AMB.</p>
Full article ">Figure 6
<p>Calculation of <math display="inline"><semantics> <mrow> <msub> <mi>L</mi> <mrow> <mi>max</mi> <mo>/</mo> <mi>min</mi> </mrow> </msub> </mrow> </semantics></math> and comparison with <math display="inline"><semantics> <mi>L</mi> </semantics></math>, <math display="inline"><semantics> <mrow> <msub> <mi>L</mi> <mi mathvariant="normal">c</mi> </msub> </mrow> </semantics></math>.</p>
Full article ">Figure 7
<p>Calculation of <math display="inline"><semantics> <mrow> <msub> <mi>M</mi> <mrow> <mi>max</mi> <mo>/</mo> <mi>min</mi> </mrow> </msub> </mrow> </semantics></math> and comparison with <math display="inline"><semantics> <mi>M</mi> </semantics></math>, <math display="inline"><semantics> <mrow> <msub> <mi>M</mi> <mi mathvariant="normal">c</mi> </msub> </mrow> </semantics></math>.</p>
Full article ">Figure 8
<p>Calculation of <math display="inline"><semantics> <mrow> <msub> <mi>N</mi> <mrow> <mi>max</mi> <mo>/</mo> <mi>min</mi> </mrow> </msub> </mrow> </semantics></math> and comparison with <math display="inline"><semantics> <mi>N</mi> </semantics></math>, <math display="inline"><semantics> <mrow> <msub> <mi>N</mi> <mi mathvariant="normal">c</mi> </msub> </mrow> </semantics></math>.</p>
Full article ">Figure 9
<p>Curves of <math display="inline"><semantics> <mi>μ</mi> </semantics></math>, <math display="inline"><semantics> <mi>α</mi> </semantics></math>, and <math display="inline"><semantics> <mi>β</mi> </semantics></math>.</p>
Full article ">Figure 10
<p>Comparative simulation results of <math display="inline"><semantics> <mi>μ</mi> </semantics></math> and <math display="inline"><semantics> <mover accent="true"> <mi>μ</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>.</p>
Full article ">Figure 11
<p>Comparative simulation results of <math display="inline"><semantics> <mi>α</mi> </semantics></math> and <math display="inline"><semantics> <mover accent="true"> <mi>α</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>.</p>
Full article ">Figure 12
<p>Comparative simulation results of <math display="inline"><semantics> <mi>β</mi> </semantics></math> and <math display="inline"><semantics> <mover accent="true"> <mi>β</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>.</p>
Full article ">Figure 13
<p>Comparative simulation results of roll rate <math display="inline"><semantics> <mi>p</mi> </semantics></math>.</p>
Full article ">Figure 14
<p>Comparative simulation results of pitch rate <math display="inline"><semantics> <mi>q</mi> </semantics></math>.</p>
Full article ">Figure 15
<p>Comparative simulation results of yaw rate <math display="inline"><semantics> <mi>r</mi> </semantics></math>.</p>
Full article ">Figure 16
<p>Comparative simulation results of tracking errors <math display="inline"><semantics> <mover accent="true"> <mi>p</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>, <math display="inline"><semantics> <mover accent="true"> <mi>q</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>, <math display="inline"><semantics> <mover accent="true"> <mi>r</mi> <mo stretchy="false">˜</mo> </mover> </semantics></math>.</p>
Full article ">Figure 17
<p>Comparative simulation results between <math display="inline"><semantics> <mi>L</mi> </semantics></math> and AMB.</p>
Full article ">Figure 18
<p>Comparative simulation results between <math display="inline"><semantics> <mi>M</mi> </semantics></math> and AMB.</p>
Full article ">Figure 19
<p>Comparative simulation results between <math display="inline"><semantics> <mi>N</mi> </semantics></math> and AMB.</p>
Full article ">Figure 20
<p>FPA system error compensation term <math display="inline"><semantics> <mstyle mathvariant="bold-italic" mathsize="normal"> <mi>ς</mi> </mstyle> </semantics></math>.</p>
Full article ">Figure 21
<p>Comparative simulation results of <math display="inline"><semantics> <mi>υ</mi> </semantics></math>.</p>
Full article ">Figure 22
<p>Snake-shaped maneuver flight trajectory with 4 perspectives.</p>
Full article ">Figure 23
<p>Comparative simulation results of <math display="inline"><semantics> <mi>ϕ</mi> </semantics></math>.</p>
Full article ">Figure 24
<p>Comparative simulation results of <math display="inline"><semantics> <mi>θ</mi> </semantics></math>.</p>
Full article ">Figure 25
<p>Comparative simulation results of <math display="inline"><semantics> <mi>Ψ</mi> </semantics></math>.</p>
Full article ">Figure 26
<p>Comparative simulation results of LEF rudder deviation.</p>
Full article ">Figure 27
<p>Comparative simulation results of SSD &amp; PF rudders deviation.</p>
Full article ">Figure 28
<p>Comparative simulation results of AMT &amp; ELE rudders deviation.</p>
Full article ">
18 pages, 9809 KiB  
Article
Design and Demonstration of a Tandem Dual-Rotor Aerial–Aquatic Vehicle
by Sihuan Wu, Maosen Shao, Sifan Wu, Zhilin He, Hui Wang, Jinxiu Zhang and Yue You
Drones 2024, 8(3), 100; https://doi.org/10.3390/drones8030100 - 15 Mar 2024
Cited by 4 | Viewed by 2607
Abstract
Aerial–aquatic vehicles (AAVs) hold great promise for marine applications, offering adaptability to diverse environments by seamlessly transitioning between underwater and aerial operations. Nevertheless, the design of AAVs poses inherent challenges, owing to the distinct characteristics of different fluid media. This article introduces a novel solution in the form of a tandem dual-rotor aerial–aquatic vehicle, strategically engineered to overcome these challenges. The proposed vehicle boasts a slender and streamlined body, enhancing its underwater mobility while utilizing a tandem rotor for aerial maneuvers. Outdoor scene tests were conducted to assess the tandem dual-rotor AAV’s diverse capabilities, including flying, hovering, and executing repeated cross-media locomotion. Notably, its versatility was further demonstrated through swift surface swimming on water. In addition to aerial evaluations, an underwater experiment was undertaken to evaluate the AAV’s ability to traverse narrow underwater passages. This capability was successfully validated through the creation of a narrow underwater gap. The comprehensive exploration of the tandem dual-rotor AAV’s potential is presented in this article, encompassing its foundational principles, overall design, simulation analysis, and avionics system design. The preliminary research and design outlined herein offer a proof of concept for the tandem dual-rotor AAV, establishing a robust foundation for AAVs seeking optimal performance in both water and air environments. This contribution serves as a valuable reference solution for the advancement of AAV technology. Full article
Show Figures

Figure 1
<p>Schematic diagram of tandem dual-rotor aerial–aquatic vehicle process and application scenario.</p>
Full article ">Figure 2
<p>A tandem twin-rotor AAV design layout display and size annotation. (<b>a</b>) Overall 3D structure and external device location display. (<b>b</b>) Schematic diagram of internal structure and equipment installation. (<b>c</b>) Top view and size display. (<b>d</b>) Rear view and size display.</p>
Full article ">Figure 3
<p>Model of the tandem twin-rotor AAV with the marks on the body coordinate systems and inertial coordinate system.</p>
Full article ">Figure 4
<p>Principles of tandem dual-rotor AAV motion control. (<b>a</b>) Principle of roll motion. (<b>b</b>) Principle of yaw motion. (<b>c</b>) Principle of pitch motion.</p>
Full article ">Figure 5
<p>Principles of underwater motion control.</p>
Full article ">Figure 6
<p>Control block diagram of the tandem twin-rotor AAV.</p>
Full article ">Figure 7
<p>PID control simulation of AAV flight and hover process of the AAV.</p>
Full article ">Figure 8
<p>Control simulation result of the tandem twin-rotor AAV. (<b>a</b>) PID control simulation of AAV three-dimensional trajectory tracking (<b>b</b>) trajectory tracking error.</p>
Full article ">Figure 9
<p>Prototype and mass distribution of the tandem twin-rotor AAV. (<b>a</b>) No buoyancy material housing prototype; the shell of the design was replaced by a block buoyant material, and two states of propeller folding and unfolding are shown. (<b>b</b>) Mass distribution and mass proportion.</p>
Full article ">Figure 10
<p>Composition of integrated electronic system of the tandem twin-rotor AAV.</p>
Full article ">Figure 11
<p>(<b>a</b>) Static thrust and power consumption of the propulsion system at different PWM inputs. (<b>b</b>) Actual PWM output of the two motors during surface takeoff and air flight. (<b>c</b>) Schematic diagram of the effective area of two non-overlapping twin rotors.</p>
Full article ">Figure 12
<p>Outdoor hover test of the tandem twin-rotor AAV. (<b>a</b>) Hover flight test over water. (<b>b</b>) Latitude, longitude, and altitude data graph, and hover attitude data graph, roll, pitch and yaw.</p>
Full article ">Figure 13
<p>Full-mission experiment. (<b>a</b>) The process of water entry. (<b>b</b>) Depth and attitude data diagram of the water entry and water exit process; blue represents the process underwater. (<b>c</b>) The process of water exit.</p>
Full article ">Figure 14
<p>Outdoor test of the tandem twin-rotor AAV. (Ⅰ) Take off from rocks on the ground. (Ⅱ) Flight and flight perspective. (Ⅲ) Water entry and underwater perspective. (Ⅳ) Water exit and underwater perspective. (Ⅴ) Return.</p>
Full article ">Figure 15
<p>(<b>a</b>) The process of an “S”-shaped path swimming on the water surface. (<b>b</b>) Data diagram of the tandem twin-rotor AAV attitude and tail thruster tilting angle.</p>
Full article ">Figure 16
<p>The process of moving underwater and crossing the gap. (<b>a</b>) The process of through the narrow gap and circular ring. (<b>b</b>) A data diagram of AAV depth information during underwater motion, and a data diagram of attitude when crossing the narrow gap.</p>
Full article ">
15 pages, 2235 KiB  
Article
Unmanned Aircraft Systems in Road Assessment: A Novel Approach to the Pavement Condition Index and VIZIR Methodologies
by Diana Marcela Ortega Rengifo, Jose Capa Salinas, Javier Alexander Perez Caicedo and Manuel Alejandro Rojas Manzano
Drones 2024, 8(3), 99; https://doi.org/10.3390/drones8030099 - 14 Mar 2024
Viewed by 2410
Abstract
This paper presents an innovative approach to road assessment, focusing on enhancing the Pavement Condition Index (PCI) and Vision Inspection de Zones et Itinéraires À Risque (VIZIR) methodologies by integrating Unmanned Aircraft System (UAS) technology. The research was conducted in an urban setting, utilizing a UAS to capture high-resolution imagery, which was subsequently processed to generate detailed orthomosaics of road surfaces. This study critically analyzed the discrepancies between traditional field measurements and UAS-derived data in pavement condition assessment. The study findings demonstrate that UAS-derived photogrammetric data provide similar or, in some cases, improved information for assessing the comprehensive condition of roadways, particularly on local and collector roads. Furthermore, this study proposed key modifications to the existing methodologies, including dividing the road network into segments for more precise and relevant data collection. These enhancements aim to address the limitations of current practices in capturing the diverse and dynamic conditions of urban infrastructure. Integrating UAS technology improves pavement condition measurement and offers a more efficient, cost-effective, and scalable approach to urban infrastructure management. The implications of this study are significant for urban planners and policymakers, providing a robust framework for future infrastructure assessment and maintenance strategies. Full article
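The level of detail recoverable from the orthomosaics hinges on the ground sample distance (GSD) of the UAS imagery, which for a nadir-pointing camera follows directly from flight altitude and camera geometry. A minimal sketch of the standard relation (the camera parameters in the example are illustrative assumptions, not the exact specification of the DJI Mavic Air used in the study):

```python
def ground_sample_distance(altitude_m, focal_length_mm,
                           sensor_width_mm, image_width_px):
    """Ground sample distance (metres per pixel) for a nadir-pointing
    camera: the ground footprint of one pixel at the given altitude."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

# Illustrative numbers only: 5 mm focal length, 5 mm-wide sensor,
# 1000 px image width, flown at the study's 5.00 m altitude.
print(ground_sample_distance(5.0, 5.0, 5.0, 1000))  # 0.005 m/px = 5 mm/px
```

Halving the altitude (the study's 3.00 m flight plan) halves the GSD, which is why the lower flight yields finer distress measurements.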
Show Figures

Figure 1
<p>Methodology followed in this study for phases I, II, and III.</p>
Full article ">Figure 2
<p>Summary of PCI methodology.</p>
Full article ">Figure 3
<p>Summary of VIZIR methodology.</p>
Full article ">Figure 4
<p>DJI Mavic Air used in this study.</p>
Full article ">Figure 5
<p>Flight plan over the study area at an altitude of 5.00 m (16.40 ft).</p>
Full article ">Figure 6
<p>Flight plan over the study area at an altitude of 3.00 m (9.84 ft).</p>
Full article ">Figure 7
<p>Pavement section captured from traditional visual inspection.</p>
Full article ">Figure 8
<p>Three-dimensional models of the area of study generated by images captured by UAS.</p>
Full article ">Figure 9
<p>Measurement comparison using traditional tools (<b>left</b>) and computational tools (<b>right</b>). The measurements to the right show a perimeter of 50.40 m and area of 41.20 m<sup>2</sup>.</p>
Full article ">Figure 10
<p>Defect comparison between visual and UAS inspection measurements.</p>
Full article ">
21 pages, 7888 KiB  
Article
Toward Virtual Testing of Unmanned Aerial Spraying Systems Operating in Vineyards
by Manuel Carreño Ruiz, Nicoletta Bloise, Giorgio Guglieri and Domenic D’Ambrosio
Drones 2024, 8(3), 98; https://doi.org/10.3390/drones8030098 - 13 Mar 2024
Viewed by 2111
Abstract
In recent times, the objective of reducing the environmental impact of the agricultural industry has led to the mechanization of the sector. One of the consequences of this is the everyday increasing use of Unmanned Aerial Systems (UAS) for different tasks in agriculture, such as spraying operations, mapping, or diagnostics, among others. Aerial spraying presents an inherent problem associated with the drift of small droplets caused by their entrainment in vortical structures such as tip vortices produced at the tip of rotors and wings. This problem is aggravated by other dynamic physical phenomena associated with the actual spray operation, such as liquid sloshing in the tank, GPS inaccuracies, wind gusts, and autopilot corrections, among others. This work focuses on analyzing the impact of nozzle position and liquid sloshing on droplet deposition through numerical modeling. To achieve this, the paper presents a novel six degrees of freedom numerical model of a DJI Matrice 600 equipped with a spray system. The spray is modeled using Lagrangian particles and the liquid sloshing is modeled with an interface-capturing method known as Volume of Fluid (VOF) approach. The model is tested in a spraying operation at a constant velocity of 2 m/s in a virtual vineyard. The maneuver is achieved using a PID controller that drives the angular rates of the rotors. This spraying mission simulator was used to obtain insights into optimal nozzle selection and positioning by quantifying the amount of droplet deposition. Full article
(This article belongs to the Special Issue Feature Papers for Drones in Agriculture and Forestry Section)
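The abstract describes a PID controller driving the rotor angular rates to hold the 2 m/s spraying maneuver. As a rough illustration of that building block, here is a minimal single-loop PID sketch; the gains and the controlled quantity are placeholders, not the paper's tuned cascade controller:

```python
class PID:
    """Minimal discrete PID controller sketch."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, setpoint, measurement, dt):
        """One control update: returns the command (e.g., a rotor
        angular-rate correction) for the current tracking error."""
        err = setpoint - measurement
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hypothetical use: velocity loop tracking the 2 m/s spraying speed.
vel_loop = PID(kp=1.0, ki=0.1, kd=0.05)
cmd = vel_loop.step(setpoint=2.0, measurement=1.8, dt=0.01)
```

In the paper this structure is cascaded (position loop feeding attitude loop), which the single loop above does not capture.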
Show Figures

Figure 1
<p>Top view of DJI Matrice 600 (Body-Fixed axes) with nozzles aligned with wind direction.</p>
Full article ">Figure 2
<p>Hydraulic circuit of spraying system.</p>
Full article ">Figure 3
<p>Hexarotor sketch in the body frame.</p>
Full article ">Figure 4
<p>CAD model of the DJI equipped with spray system.</p>
Full article ">Figure 5
<p>Robust cascade PID position and attitude control.</p>
Full article ">Figure 6
<p>Sketch of a standard vineyard in which we based our numerical model.</p>
Full article ">Figure 7
<p>Geometry used in our numerical model.</p>
Full article ">Figure 8
<p>Computational domain and boundary conditions.</p>
Full article ">Figure 9
<p>Computational grid.</p>
Full article ">Figure 10
<p>Lagrangian particles injected by an 80 degrees hollow cone nozzle at 2 bar.</p>
Full article ">Figure 11
<p>Comparison of: (<b>a</b>) x-velocity and (<b>b</b>) <math display="inline"><semantics> <mi>θ</mi> </semantics></math> for the two models.</p>
Full article ">Figure 12
<p>Configuration of the virtual UAS testing environment.</p>
Full article ">Figure 13
<p>Comparative response between sloshing and non-sloshing simulation for different PID gain scheduling strategies.</p>
Full article ">Figure 14
<p>Snapshots of the mission at different times. Velocity magnitude is shown on the UAS symmetry plane, and the particles are colored depending on their diameter. The volume fraction of water inside the tank is also presented.</p>
Full article ">Figure 15
<p>Ground droplet deposition colored by diameters for nozzles: (<b>a</b>) 40° under leading rotor with standard orientation, (<b>b</b>) 40° under leading rotor with normal orientation, (<b>c</b>) 80° under leading rotor with standard orientation, (<b>d</b>) 40° under rear rotor with standard orientation, and (<b>e</b>) 80° under rear rotor with standard orientation. Top view.</p>
Full article ">Figure 16
<p>Ground droplet deposition volume for nozzles: (<b>a</b>) 40° under leading rotor with standard orientation, (<b>b</b>) 40° under leading rotor with normal orientation, (<b>c</b>) 80° under leading rotor with standard orientation, (<b>d</b>) 40° under rear rotor with standard orientation, and (<b>e</b>) 80° under rear rotor with standard orientation.</p>
Full article ">
24 pages, 8099 KiB  
Article
Species-Level Classification of Peatland Vegetation Using Ultra-High-Resolution UAV Imagery
by Gillian Simpson, Caroline J. Nichol, Tom Wade, Carole Helfter, Alistair Hamilton and Simon Gibson-Poole
Drones 2024, 8(3), 97; https://doi.org/10.3390/drones8030097 - 13 Mar 2024
Cited by 4 | Viewed by 2560
Abstract
Peatland restoration projects are being employed worldwide as a form of climate change mitigation due to their potential for long-term carbon sequestration. Monitoring these environments (e.g., cover of keystone species) is therefore essential to evaluate success. However, existing studies have rarely examined peatland vegetation at fine scales due to its strong spatial heterogeneity and seasonal canopy development. The present study collected centimetre-scale multispectral Uncrewed Aerial Vehicle (UAV) imagery with a Parrot Sequoia camera (2.8 cm resolution; Parrot Drones SAS, Paris, France) in a temperate peatland over a complete growing season. Supervised classification algorithms were used to map the vegetation at the single-species level, and the Maximum Likelihood classifier was found to perform best at the site level (69% overall accuracy). The classification accuracy increased with the spatial resolution of the input data, and a large reduction in accuracy was observed when employing imagery of >11 cm resolution. Finally, the most accurate classifications were produced using imagery collected during the peak (July–August) or early growing season (start of May). These findings suggest that despite the strong heterogeneity of peatlands, these environments can be mapped at the species level using UAVs. Such an approach would benefit studies estimating peatland carbon emissions or using the cover of keystone species to evaluate restoration projects. Full article
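The 69% overall accuracy reported above is the usual confusion-matrix summary: the share of validation points whose predicted class matches the reference class. A minimal sketch (the example matrix is invented for illustration, not the study's data):

```python
def overall_accuracy(confusion):
    """Overall accuracy from a square confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical two-class example: 69 of 100 points correctly classified.
print(overall_accuracy([[50, 10], [21, 19]]))  # 0.69
```

Per-class producer/user accuracies follow the same pattern using row and column sums instead of the full total.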
Show Figures

Figure 1
<p>UAV surveying at Auchencorth Moss. Shown in (<b>a</b>) is the custom-built Tarot T680 Pro (Wenzhou Tarot Aviation Technology Co., Wenzhou, China) and mounted Parrot Sequoia camera positioned over a spectral calibration panel (photo credit: G. Simpson). Shown on the right are photographs taken with the Mavic 2 Pro (SZ DJI Technology Co., Ltd., Shenzhen, China): (<b>b</b>) is an aerial photograph of the survey area; and (<b>c</b>) shows one of the individual images acquired during the RGB survey, which highlights the strong spatial heterogeneity of the site.</p>
Full article ">Figure 2
<p>Diurnal overview of the solar conditions on the multispectral survey days. Shown are: the period of UAV image acquisition (yellow shaded areas); half-hourly incoming photosynthetically active radiation (PAR) measured at the site (black dots; SKP215, Skye Instruments Ltd., Llandrindod Wells, UK), and solar zenith angle (blue line) calculated using the NOAA Solar Calculator (<a href="https://gml.noaa.gov/grad/solcalc/" target="_blank">https://gml.noaa.gov/grad/solcalc/</a>, accessed on 21 January 2022). From top to bottom are the survey days: 14 May (overcast), 2 June (clear), 23 July (clear), 3 August (overcast), 25 August (clear), 20 September (clear), and 15 October 2021 (clear).</p>
Full article ">Figure 3
<p>Photographs of the eleven dominant upper-canopy species identified by the ground survey. Photo credit: G. Simpson.</p>
Full article ">Figure 4
<p>Outline of the data used to explore the impact of the spatial and temporal resolutions of the UAV imagery on the classification accuracy.</p>
Full article ">Figure 5
<p>RGB (true-colour composite, (<b>a</b>)) and multispectral (false-colour composite, (<b>b</b>); RGB = b4-b2-b1) orthomosaics of the study area taken on 3 August 2021. The RGB image on the left (<b>a</b>) shows the location of training (red) and validation (grey) data points used in the classification analysis.</p>
Full article ">Figure 6
<p>Spectral signatures of the dominant vegetation species at Auchencorth Moss. The reflectance data shown are from the field spectroscopy measurements conducted with the ASD on 23 July 2021. The gaps in the presented spectra result from the removal of bands affected by noise and atmospheric water absorption features. The wavelengths sampled by the four Parrot Sequoia bands (green, red, red-edge, and NIR, respectively) are depicted by the vertical grey bars towards the left-hand side of the plot.</p>
Full article ">Figure 7
<p>Impact of the spatial resolution on the image classification. Shown are false-colour composites of the multispectral imagery (left; RGB = b4-b2-b1) used as input for the Maximum Likelihood (ML) classification (right; see legend). A coarsening image resolution is shown from top to bottom (2.8 cm GSD to 22.4 cm GSD). Note, these classifications were conducted using the GNSS survey point data for ground validation.</p>
Full article ">Figure 8
<p>Multispectral imagery of Auchencorth Moss over the 2021 growing season. Shown are false-colour composites (RGB = b4-b2-b1) of the survey area acquired on three dates: 14 May (<b>a</b>); 3 August (<b>b</b>); and 15 October (<b>c</b>).</p>
Full article ">Figure 9
<p>Temporal vegetation classifications of Auchencorth Moss over the 2021 growing season. Shown are the results of the Maximum Likelihood (ML) classifier run with imagery acquired on 14 May (<b>a</b>); 3 August (<b>b</b>); and 15 October (<b>c</b>). Note, the presented classifications varied in accuracy and were conducted using the GNSS survey point data for ground validation.</p>
Full article ">
21 pages, 841 KiB  
Article
Generalized Labeled Multi-Bernoulli Filter-Based Passive Localization and Tracking of Radiation Sources Carried by Unmanned Aerial Vehicles
by Jun Zhao, Renzhou Gui and Xudong Dong
Drones 2024, 8(3), 96; https://doi.org/10.3390/drones8030096 - 12 Mar 2024
Viewed by 1565
Abstract
This paper discusses a key technique for passive localization and tracking of radiation sources, which obtains the motion trajectory of radiation sources carried by unmanned aerial vehicles (UAVs) by continuously or periodically localizing it without the active participation of the radiation sources. However, the existing methods have some limitations in complex signal environments and non-stationary wireless propagation that impact the accuracy of localization and tracking. To address these challenges, this paper extends the δ-generalized labeled multi-Bernoulli (GLMB) filter to the scenario of passive localization and tracking based on the random finite-set (RFS) framework and provides the extended Kalman filter (EKF) and unscented Kalman filter (UKF) implementations of the δ-GLMB filter, which fully take into account the nonlinear motion of the radiation source. By modeling the “obstacle scenario” and the influence of external factors (e.g., weather, terrain), our proposed GLMB filter can accurately track the target and capture its motion trajectory. Simulation results verify the effectiveness of the GLMB filter in target identification and state tracking. Full article
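The simulations in this paper are evaluated with the OSPA (Optimal Sub-Pattern Assignment) distance, the standard multi-target tracking metric that jointly penalizes localization error and cardinality mismatch. A brute-force sketch for small target sets (the parameters p = 1 and cutoff c = 100 m match the figure captions; the implementation itself is ours, not the paper's):

```python
from itertools import permutations
import math

def ospa(X, Y, c=100.0, p=1):
    """OSPA distance between two finite point sets X and Y
    (lists of (x, y) tuples). Brute-force over assignments,
    which is fine for the small target counts considered here."""
    if len(X) > len(Y):
        X, Y = Y, X                       # ensure |X| <= |Y|
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0                        # both sets empty
    best = min(
        sum(min(c, math.dist(X[i], Y[j])) ** p for i, j in enumerate(perm))
        for perm in permutations(range(n), m)
    )
    # Unassigned targets are charged the cutoff c (cardinality penalty).
    return ((best + (c ** p) * (n - m)) / n) ** (1.0 / p)

# A missed target costs the full cutoff distance:
print(ospa([], [(0.0, 0.0)]))             # 100.0
```

The factorial cost of the permutation search is why practical implementations replace it with the Hungarian algorithm.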
Show Figures

Graphical abstract
Full article ">Figure 1
<p>The passive localization model between the moving BS and multi-radiation sources carried by UAVs.</p>
Full article ">Figure 2
<p>The directed graph comprises nodes <math display="inline"><semantics> <mrow> <msub> <mi>l</mi> <mn>1</mn> </msub> <mo>,</mo> <mo>⋯</mo> <mo>,</mo> <msub> <mi>l</mi> <mfenced open="|" close="|"> <mi>I</mi> </mfenced> </msub> </mrow> </semantics></math> along with their associated cost function values <math display="inline"><semantics> <mrow> <msup> <mi>c</mi> <mfenced separators="" open="(" close=")"> <mrow> <mi>I</mi> <mo>,</mo> <mi>ϑ</mi> </mrow> </mfenced> </msup> <mfenced separators="" open="(" close=")"> <msub> <mi>l</mi> <mn>1</mn> </msub> </mfenced> <mo>,</mo> <mo>⋯</mo> <mo>,</mo> <msup> <mi>c</mi> <mfenced separators="" open="(" close=")"> <mrow> <mi>I</mi> <mo>,</mo> <mi>ϑ</mi> </mrow> </mfenced> </msup> <mfenced separators="" open="(" close=")"> <msub> <mi>l</mi> <mfenced open="|" close="|"> <mi>I</mi> </mfenced> </msub> </mfenced> </mrow> </semantics></math>. Here, <math display="inline"><semantics> <mi mathvariant="bold">S</mi> </semantics></math> and <math display="inline"><semantics> <mi mathvariant="bold">E</mi> </semantics></math> denote the starting and ending nodes, respectively.</p>
Full article ">Figure 3
<p>Multi-target trajectories. The start/stop positions of each trajectory are denoted as <math display="inline"><semantics> <mrow> <mi mathvariant="normal">o</mi> <mo>/</mo> </mrow> </semantics></math>△, and the stationery BS is denoted as □ in red.</p>
Full article ">Figure 4
<p>Estimated and truth trajectories.</p>
Full article ">Figure 5
<p>The OSPA distance, localization, and cardinality (<math display="inline"><semantics> <mrow> <mi>p</mi> <mo>=</mo> <mn>1</mn> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mi>c</mi> <mo>=</mo> <mn>100</mn> </mrow> </semantics></math> m) for the <math display="inline"><semantics> <mi>δ</mi> </semantics></math>-GLMB-UKF, CPHD-UKF, and CPHD-UKF filters. (<b>a</b>) OSPA distance; (<b>b</b>) OSPA localization; (<b>c</b>) OSPA cardinality.</p>
Full article ">Figure 6
<p>Cardinality for the <math display="inline"><semantics> <mi>δ</mi> </semantics></math>-GLMB-UKF, CPHD-UKF, and CPHD-UKF filters versus time.</p>
Full article ">Figure 7
<p>Multi-target trajectories on the plane. The start/stop positions of each trajectory are denoted as <math display="inline"><semantics> <mrow> <mi mathvariant="normal">o</mi> <mo>/</mo> </mrow> </semantics></math>△, and the moving BS is denoted as □ in red.</p>
Full article ">Figure 8
<p>Estimated and truth trajectories for <math display="inline"><semantics> <mrow> <mi>x</mi> <mo>−</mo> <mi>y</mi> </mrow> </semantics></math> coordinates versus time.</p>
Full article ">Figure 9
<p>Cardinality for the <math display="inline"><semantics> <mi>δ</mi> </semantics></math>-GLMB-UKF, CPHD-UKF, and CPHD-UKF filters versus time.</p>
Full article ">Figure 10
<p>The OSPA distance, localization, and cardinality (<math display="inline"><semantics> <mrow> <mi>p</mi> <mo>=</mo> <mn>1</mn> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mi>c</mi> <mo>=</mo> <mn>100</mn> </mrow> </semantics></math> m) for the <math display="inline"><semantics> <mi>δ</mi> </semantics></math>-GLMB-UKF, CPHD-UKF, and CPHD-UKF filters. (<b>a</b>) OSPA distance; (<b>b</b>) OSPA localization; (<b>c</b>) OSPA cardinality.</p>
Full article ">
17 pages, 3793 KiB  
Article
Deep Deterministic Policy Gradient (DDPG) Agent-Based Sliding Mode Control for Quadrotor Attitudes
by Wenjun Hu, Yueneng Yang and Zhiyang Liu
Drones 2024, 8(3), 95; https://doi.org/10.3390/drones8030095 - 12 Mar 2024
Cited by 4 | Viewed by 2190
Abstract
A novel sliding mode control approach based on a deep deterministic policy gradient reinforcement learning agent (DDPG-SMC) is proposed to suppress the chattering phenomenon in quadrotor attitude control in the presence of external disturbances. First, the attitude dynamics model of the quadrotor under study is derived, and the attitude control problem is formulated mathematically. Second, a sliding mode controller, including its sliding mode surface and reaching law, is chosen for the nonlinear dynamic system. The stability of the designed SMC system is validated through the Lyapunov stability theorem. Third, a reinforcement learning (RL) agent based on deep deterministic policy gradient (DDPG) is trained to adaptively adjust the switching control gain. During the training process, the input signals for the agent are the actual and desired attitude angles, while the output action is the time-varying control gain. Finally, the trained agent mentioned above is utilized in the SMC as a parameter regulator to facilitate the adaptive adjustment of the switching control gain associated with the reaching law. The simulation results validate the robustness and effectiveness of the proposed DDPG-SMC method. Full article
(This article belongs to the Special Issue Advances in Quadrotor Unmanned Aerial Vehicles)
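The structure described in the abstract, a sliding surface s = ė + λe plus a switching term whose gain k is tuned online by the DDPG agent, can be sketched for a single axis. This is a minimal illustration for a unit-inertia double integrator with a boundary-layer saturation standing in for sign(s); the gains are illustrative, and a fixed k replaces the trained agent:

```python
def smc_torque(angle, rate, angle_des, rate_des, lam=2.0, k=0.5, phi=0.05):
    """Single-axis sliding mode attitude control for theta_ddot = u.
    s = edot + lam*e is the sliding surface; -k*sat(s/phi) is the
    switching (reaching-law) term whose gain k is, in the paper,
    adjusted by the trained DDPG agent to suppress chattering."""
    e = angle - angle_des
    edot = rate - rate_des
    s = edot + lam * e
    sat = max(-1.0, min(1.0, s / phi))  # boundary-layer version of sign(s)
    return -lam * edot - k * sat        # equivalent control + switching control
```

A larger k rejects disturbances faster but amplifies chattering near s = 0, which is exactly the trade-off the DDPG agent is trained to manage.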
Show Figures

Figure 1
<p>Attitude motion of the quadrotor in coordinate frames.</p>
Full article ">Figure 2
<p>Block diagram of SMC.</p>
Full article ">Figure 3
<p>Block diagram of DDPG-SMC.</p>
Full article ">Figure 4
<p>The architecture of the DDPG-based parameter regulator.</p>
Full article ">Figure 5
<p>The structure of the Critic network.</p>
Full article ">Figure 6
<p>The structure of the Actor network.</p>
Full article ">Figure 7
<p>Each episode reward for gain-learning with the DDPG agent.</p>
Full article ">Figure 8
<p>Simulation results of SMC.</p>
Full article ">Figure 8 Cont.
<p>Simulation results of SMC.</p>
Full article ">Figure 9
<p>Simulation results of AFGS-SMC.</p>
Full article ">Figure 10
<p>Simulation results of DDPG-SMC.</p>
Full article ">
26 pages, 2043 KiB  
Article
Towards mmWave Altimetry for UAS: Exploring the Potential of 77 GHz Automotive Radars
by Maaz Ali Awan, Yaser Dalveren, Ali Kara and Mohammad Derawi
Drones 2024, 8(3), 94; https://doi.org/10.3390/drones8030094 - 11 Mar 2024
Cited by 6 | Viewed by 2960
Abstract
Precise altitude data are indispensable for flight navigation, particularly during the autonomous landing of unmanned aerial systems (UASs). Conventional light and barometric sensors employed for altitude estimation are limited by poor visibility and temperature conditions, respectively, whilst global positioning system (GPS) receivers provide the altitude from the mean sea level (MSL) marred with a slow update rate. To cater to the landing safety requirements, UASs necessitate precise altitude information above ground level (AGL) impervious to environmental conditions. Radar altimeters, a mainstay in commercial aviation for at least half a century, realize these requirements through minimum operational performance standards (MOPSs). More recently, the proliferation of 5G technology and interference with the universally allocated band for radar altimeters from 4.2 to 4.4 GHz underscores the necessity to explore novel avenues. Notably, there is no dedicated MOPS tailored for radar altimeters of UASs. To gauge the performance of a radar altimeter offering for UASs, existing MOPSs are the de facto choice. Historically, frequency-modulated continuous wave (FMCW) radars have been extensively used in a broad spectrum of ranging applications including radar altimeters. Modern monolithic millimeter wave (mmWave) automotive radars, albeit designed for automotive applications, also employ FMCW for precise ranging with a cost-effective and compact footprint. Given the technology maturation with excellent size, weight, and power (SWaP) metrics, there is a growing trend in industry and academia to explore their efficacy beyond the realm of the automotive industry. To this end, their feasibility for UAS altimetry remains largely untapped. While the literature on theoretical discourse is prevalent, a specific focus on mmWave radar altimetry is lacking. Moreover, clutter estimation with hardware specifications of a pure look-down mmWave radar is unreported. 
This article examines the applicability of commercial-aviation MOPSs for adaptation to a UAS use case. The work is a tutorial built on a simplified mathematical and theoretical discussion of performance metrics and their inherent intricacies. A systems engineering approach for deriving waveform specifications from the operational requirements of a UAS is offered. Lastly, future research directions and insights are proposed. Full article
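The FMCW relationships the tutorial builds on (beat frequency to range, sweep bandwidth to range resolution) can be sketched as follows. This is a generic illustration, not the article's implementation; the function names are ours, and the 200 MHz sweep in the usage note simply reflects the width of the 4.2–4.4 GHz altimeter band.

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, bandwidth_hz, chirp_time_s):
    """Range from the beat (IF) frequency of an FMCW chirp.
    The chirp sweeps bandwidth_hz in chirp_time_s, so the sweep slope is
    S = B / Tc and the target range is R = c * f_b / (2 * S)."""
    slope = bandwidth_hz / chirp_time_s
    return C * beat_freq_hz / (2.0 * slope)

def range_resolution(bandwidth_hz):
    """Best-case FMCW range resolution: dR = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)
```

For the 200 MHz sweep available in the 4.2–4.4 GHz band, the best-case resolution is c/2B = 0.75 m, whereas an automotive mmWave chip sweeping several GHz resolves a few centimeters, which is one reason such chips are attractive for UAS altimetry.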
Show Figures
Figure 1. System-level diagram of an FMCW radar [31].
Figure 2. (a) Slope of f(t), and (b) FMCW chirp in time domain [28].
Figure 3. IF signal generation [28].
Figure 4. Area illuminated by radar antenna [57].
Figure 5. FMCW chirp frame [28].
Figure 6. (a) Concept of AoA estimation and (b) maximum angular FOV [28].
Figure 7. Rationale for update rate requirement.
24 pages, 6562 KiB  
Article
Hybrid Mode: Routinization of the Transition Mode as the Third Common Mode for Compound VTOL Drones
by Jiahao Hu, Jingbo Wei, Kun Liu, Xiaobin Yu, Mingzhi Cao and Zijie Qin
Drones 2024, 8(3), 93; https://doi.org/10.3390/drones8030093 - 8 Mar 2024
Cited by 2 | Viewed by 2536
Abstract
Fixed-wing Vertical Takeoff and Landing (VTOL) drones have been widely researched and applied because they combine the advantages of both rotorcraft and fixed-wing drones. However, research on the transition mode of this type of drone has mainly focused on completing the process quickly and stably, and the application potential of this mode has received little attention. The objective of this paper is to routinize the transition mode of compound VTOL drones, i.e., to let this mode work continuously for longer periods as a third common mode besides the multi-rotor and fixed-wing modes, referred to here as the hybrid mode. For this purpose, we perform detailed dynamics modeling of the drone in this mode and use saturated PID controllers to control the altitude, velocity, and attitude of the drone. In addition, for more stable altitude control in hybrid mode, we identify the relevant parameters for the lift of the fixed wings and the thrust of the actuators. Simulation and experimental results show that the designed control method can effectively control the compound VTOL drone in hybrid mode. Moreover, it is proven that flight in hybrid mode can reduce flight energy consumption to some extent. Full article
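The abstract's altitude, velocity, and attitude loops rely on saturated PID controllers. A minimal sketch of one such controller is below, with a simple anti-windup scheme (the integrator is frozen while the output saturates); the class name, gains, and limits are illustrative, not the paper's values.

```python
class SaturatedPID:
    """PID controller with output saturation and anti-windup:
    the integral term only accumulates while the output is unsaturated."""

    def __init__(self, kp, ki, kd, out_min, out_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = None

    def update(self, err, dt):
        # Derivative of the error (zero on the first call).
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        # Candidate output including the would-be integral contribution.
        u = self.kp * err + self.ki * (self.integral + err * dt) + self.kd * deriv
        if self.out_min < u < self.out_max:
            self.integral += err * dt  # commit the integral only when unsaturated
            return u
        return max(self.out_min, min(self.out_max, u))
```

Freezing the integrator during saturation prevents the large transition-phase errors from winding up the integral and causing overshoot when the actuator limit is released.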
Show Figures
Figure 1. Hybrid mode control logic: (a) Vertical motion; (b) Forward motion; (c) Lateral motion; (d) Yaw motion.
Figure 2. (a) Coordinate system and forces; (b) Multi-rotor actuators configuration.
Figure 3. (a) The multi-rotor actuator model; (b) Velocity controller of fixed-wing lift identification.
Figure 4. Control architecture.
Figure 5. (a) Position controller architecture; (b) Attitude controller architecture; (c) Heading frame.
Figure 6. Simulation platform.
Figure 7. Simulation results for Case 1: (a) Attitude; (b) Airspeed and altitude; (c) Multi-rotor throttle, tail-thrust throttle, and aerodynamic lift.
Figure 8. Simulation results for Case 2: (a) Attitude; (b) Airspeed and vertical velocity.
Figure 9. Simulation results for Case 3: (a) Attitude; (b) Airspeed, lateral velocity, and altitude.
Figure 10. Simulation results for Case 4: (a) Attitude; (b) Airspeed and altitude.
Figure 11. (a) Compound VTOL drone prototype; (b) Real flight experiment.
Figure 12. (a) Thrust test platform; (b) Identification results of actuator model.
Figure 13. (a) Identification result of fixed-wing lift; (b) Example of height control result.
Figure 14. (a) Multi-rotor actuator inclination; (b) Example of enhanced yaw torque result.
Figure 15. Experiment results for Case 1: (a) Attitude; (b) Attitude error; (c) Airspeed and altitude; (d) Airspeed and altitude error; (e) Multi-rotor and tail-thrust throttle.
Figure 16. Experiment results for Case 2: (a) Attitude; (b) Attitude error; (c) Airspeed and vertical velocity; (d) Airspeed and vertical velocity error.
Figure 17. Experiment results for Case 3: (a) Attitude; (b) Attitude error; (c) Airspeed, lateral velocity, and altitude; (d) Airspeed, lateral velocity, and altitude error.
Figure 18. Experiment results for Case 4: (a) Attitude; (b) Attitude error; (c) Airspeed and altitude; (d) Airspeed and altitude error.
Figure 19. Energy consumption at different velocities.
24 pages, 8258 KiB  
Article
Quantity Monitor Based on Differential Weighing Sensors for Storage Tank of Agricultural UAV
by Junhao Huang, Weizhuo He, Deshuai Yang, Jianqin Lin, Yuanzhen Ou, Rui Jiang and Zhiyan Zhou
Drones 2024, 8(3), 92; https://doi.org/10.3390/drones8030092 - 7 Mar 2024
Cited by 1 | Viewed by 1969
Abstract
Nowadays, unmanned aerial vehicles (UAVs) play a pivotal role in agricultural production. In scenarios involving the release of particulate materials, the precision of the quantity monitor for a UAV's storage tank directly impacts the operational accuracy. Therefore, this paper introduces a novel noise-mitigation design for agricultural UAVs' quantity monitors, utilizing differential weighing sensors. The design effectively addresses three primary noise sources: sensor-intrinsic noise, vibration noise, and weight-loading uncertainty. Additionally, two comprehensive data processing methods are proposed for noise reduction: the first combines the Butterworth low-pass filter, the Kalman filter, and the moving average filter (BKM), while the second integrates the Least Mean Squares (LMS) adaptive filter, the Kalman filter, and the moving average filter (LKM). Rigorous data processing has been conducted, and the monitor's performance has been assessed in three typical UAV states: static, hovering, and in flight. Specifically, compared to the BKM, the LKM's maximum relative error ranges between 1.24% and 2.74%, with an average relative error of 0.31%~0.58% when the UAV was in a hovering state. In flight mode, the LKM's maximum relative error varies from 1.68% to 10.06%, while the average relative error ranges between 0.74% and 2.54%. Furthermore, the LKM can effectively suppress noise interference near 75 Hz and 150 Hz. The results reveal that the LKM demonstrates superior adaptability to noise and effectively mitigates its impact in quantity monitoring for the storage tanks of agricultural UAVs. Full article
(This article belongs to the Special Issue Drones in Sustainable Agriculture)
Show Figures
Figure 1. The UAV shot seeding device for test.
Figure 2. Two data processing schemes: (a) scheme 1 with Butterworth low-pass filter and (b) scheme 2 with LMS adaptive filter.
Figure 3. The installation method of differential weighing sensors: (a) the installation of the main sensors and (b) the installation of the noise monitoring sensor.
Figure 4. The test sites: (a) Mark Village, Dongchong Town, Nansha, Guangzhou, China; (b) Dagang Village, Zhucun Town, Zengcheng, Guangzhou, China; and (c) Zengcheng Teaching and Research Base of South China Agricultural University, Guangzhou, China.
Figure 5. (a) Data comparison and (b) FFT spectrum of UAV in different states.
Figure 6. Butterworth low-pass filtering data.
Figure 7. Weight variation in different step size factors.
Figure 8. Data comparison after adding Kalman filter.
Figure 9. Data comparison after adding moving average filtering.
Figure 10. Static filtering data comparison.
Figure 11. Comparison of processing data under different loads: (a) 5 kg, (b) 10 kg, (c) 15 kg, (d) 20 kg, and (e) 25 kg.
Figure 12. FFT spectrum of each processing data under different loads: (a) 5 kg, (b) 10 kg, (c) 15 kg, (d) 20 kg, and (e) 25 kg.
Figure 13. Data comparison under different loads and different flight speeds: (a–c) 5 kg (1~3 m/s), (d–f) 10 kg (1~3 m/s), (g–i) 15 kg (1~3 m/s), (j–l) 20 kg (1~3 m/s), and (m–o) 25 kg (1~3 m/s).
Figure 14. Data of field tests: (a) test 1, (b) test 2, and (c) test 3; and results of FFT analysis: (d) test 1, (e) test 2, and (f) test 3.
15 pages, 17419 KiB  
Article
Assessment of the Health Status of Old Trees of Platycladus orientalis L. Using UAV Multispectral Imagery
by Daihao Yin, Yijun Cai, Yajing Li, Wenshan Yuan and Zhong Zhao
Drones 2024, 8(3), 91; https://doi.org/10.3390/drones8030091 - 7 Mar 2024
Cited by 2 | Viewed by 2085
Abstract
Assessing the health status of old trees is crucial for their effective protection and health management. In this study, we utilized an unmanned aerial vehicle (UAV) equipped with multispectral cameras to capture images for the rapid assessment of the health status of old trees. All trees were classified by health status into three classes based on the above-ground parts of the trees: healthy, declining, and severely declining. Two traditional machine learning algorithms, Support Vector Machines (SVM) and Random Forest (RF), were employed to assess their health status. Both algorithms incorporated selected variables, as well as additional variables (aspect and canopy area). The results indicated that the inclusion of these additional variables improved the overall accuracy of the models by 8.3% to 13.9%, with kappa values ranging from 0.166 to 0.233. Among the models tested, the A-RF model (RF with aspect and canopy area variables) demonstrated the highest overall accuracy (75%) and kappa (0.571), making it the optimal choice for assessing the health condition of old trees. Overall, this research presents a novel and cost-effective approach to assessing the health status of old trees. Full article
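The overall accuracy and kappa figures quoted in the abstract are standard agreement metrics for a classifier. A minimal sketch of how both are derived from a confusion matrix (rows are reference classes, columns predicted classes); the function name is ours:

```python
def accuracy_and_kappa(conf):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.
    kappa = (po - pe) / (1 - pe), where po is the observed agreement and
    pe the agreement expected by chance from the row/column marginals."""
    n = sum(sum(row) for row in conf)
    po = sum(conf[i][i] for i in range(len(conf))) / n
    pe = sum(
        (sum(conf[i]) / n) * (sum(row[i] for row in conf) / n)
        for i in range(len(conf))
    )
    return po, (po - pe) / (1 - pe)
```

Kappa discounts agreement attributable to chance, which is why a model can score 75% accuracy yet only 0.571 kappa on an imbalanced three-class problem.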
Show Figures
Figure 1. Location of the study area.
Figure 2. Health status of old trees and individual tree segments. (a) The distribution of health status of old trees; (b) individual tree segments.
Figure 3. Schematic diagram of aspect division [43].
Figure 4. Boruta result plot for data. Blue boxplots correspond to the minimal, average, and maximum Z score of a shadow attribute. Red, yellow, and green boxplots represent Z scores of, respectively, rejected, tentative, and confirmed attributes.
Figure 5. Spatial distribution of old trees with different health status.
23 pages, 5781 KiB  
Article
Multi-Level Hazard Detection Using a UAV-Mounted Multi-Sensor for Levee Inspection
by Shan Su, Li Yan, Hong Xie, Changjun Chen, Xiong Zhang, Lyuzhou Gao and Rongling Zhang
Drones 2024, 8(3), 90; https://doi.org/10.3390/drones8030090 - 6 Mar 2024
Cited by 3 | Viewed by 2210
Abstract
This paper introduces a developed multi-sensor integrated system comprising a thermal infrared camera, an RGB camera, and a LiDAR sensor, mounted on a lightweight unmanned aerial vehicle (UAV). This system is applied to the inspection tasks of levee engineering, enabling the real-time, rapid, all-day, all-round, and non-contact acquisition of multi-source data for levee structures and their surrounding environments. Our aim is to address the inefficiencies, high costs, limited data diversity, and potential safety hazards associated with traditional methods, particularly concerning the structural safety of dam bodies. In the preprocessing stage of multi-source data, techniques such as thermal infrared data enhancement and multi-source data alignment are employed to enhance data quality and consistency. Subsequently, a multi-level approach to detecting and screening suspected risk areas is implemented, facilitating the rapid localization of potential hazard zones and assisting in assessing the urgency of addressing these concerns. The reliability of the developed multi-sensor equipment and the multi-level suspected hazard detection algorithm is validated through on-site levee engineering inspections conducted during flood disasters. The application reliably detects and locates suspected hazards, significantly reducing the time and resource costs associated with levee inspections. Moreover, it mitigates safety risks for personnel engaged in levee inspections. Therefore, this method provides reliable data support and technical services for levee inspection, hazard identification, flood control, and disaster reduction. Full article
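Among the preprocessing steps, the thermal infrared enhancement is described as contrast stretching. A generic percentile-clipped linear stretch, working on a flat list of intensities, looks like the following; the 2–98% clip points and 8-bit output range are assumptions for illustration, not the authors' parameters.

```python
def contrast_stretch(pixels, lo_pct=2.0, hi_pct=98.0, out_max=255):
    """Percentile-based linear contrast stretch: values below the lower
    percentile map to 0, above the upper percentile to out_max, and
    everything between is scaled linearly."""
    s = sorted(pixels)

    def pct(p):
        # Nearest-rank percentile lookup into the sorted samples.
        return s[min(len(s) - 1, int(p / 100.0 * len(s)))]

    lo, hi = pct(lo_pct), pct(hi_pct)
    if hi == lo:  # flat image: nothing to stretch
        return [0 for _ in pixels]
    return [
        round(out_max * (min(max(v, lo), hi) - lo) / (hi - lo))
        for v in pixels
    ]
```

Clipping a small percentile at each end keeps a few hot or cold outlier pixels from compressing the dynamic range of the rest of the scene, which is the point of the enhancement before low-temperature screening.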
Show Figures
Figure 1. Establishing multi-sensor equipment on UAV platform. (a,d) AlphaAir 450 pocket LiDAR system; (b) FLIR VUE Pro R camera; (c) BB4 mini UAV; (e) multi-sensor equipment on UAV platform.
Figure 2. The proposed workflow for levee inspection and risk assessment.
Figure 3. Alignment error caused by camera trigger time delay.
Figure 4. Dense Nested Interactive Module with a U-shape.
Figure 5. Feature extraction using SuperPoint and feature matching based on slope consistency. (a) Result of feature extraction from the RGB image; (b) result of feature extraction from the thermal infrared image; (c) result of feature matching based on slope consistency.
Figure 6. The first set of experiments on the temperature resolution of the thermal infrared camera. (a) Temperature measurement of the water in the three paper bowls during takeoff; (b) thermal infrared images captured during takeoff; (c) RGB image captured during landing; (d) thermal infrared images captured during landing; (e) temperature measurement of the water in the three paper bowls during landing.
Figure 7. The second set of experiments on the temperature resolution of the thermal infrared camera. (a) Manually arranged experimental scene; (b) RGB image captured at 30 m altitude; (c) thermal infrared image captured at 30 m altitude.
Figure 8. Thermal infrared image enhancement results. (a) Original thermal infrared image; (b) thermal infrared image after contrast stretching.
Figure 9. Alignment results of thermal infrared, RGB, and point cloud images. (a) Fusion result of RGB and thermal infrared images; (b) thermal infrared image; (c) RGB image; (d) point cloud image.
Figure 10. The on-site situation and inspection planning flight segment. (a–d) Inspection site; (e) approximate positions of the planned flight segment and data collection points.
Figure 11. DNA-Net screening results for low-temperature regions. (a,c) Contrast-stretched thermal infrared image; (b,d) low-temperature region mask obtained through DNA-Net screening.
Figure 12. Multi-level inspection results. (a) Contrast-stretched thermal infrared image; (b) conditionally dilated low-temperature region mask; (c) projection of the conditionally dilated mask region in the point cloud image; (d) projection of the conditionally dilated mask region in the RGB image.
Figure 13. The presentation of some abnormal results on partial DOM.
Figure 14. Manual on-site inspection and confirmation. (a) Manual inspection and confirmation of the on-site thermal infrared image; (b) manual inspection and confirmation of the on-site RGB image; (c–e) inspection site.
19 pages, 8294 KiB  
Article
Research of Slamming Load Characteristics during Trans-Media Aircraft Entry into Water
by Xinyu Liu, Liguo Tan, Xinbin Zhang and Liang Li
Drones 2024, 8(3), 89; https://doi.org/10.3390/drones8030089 - 6 Mar 2024
Cited by 2 | Viewed by 1443
Abstract
The trans-media aircraft water entry process generates strong slamming loads that seriously affect the stability and safety of the aircraft. To address this problem, we design a fixed-wing aircraft configuration and employ numerical simulations with the volume of fluid (VOF) multiphase flow model, the standard k-epsilon turbulence model, and the dynamic mesh technique. We explore the characteristics of aircraft subjected to slamming loads under different conditions. The results show the following: the pressure load on the aircraft surface increases with higher water entry velocity; larger entry angles lead to more drastic changes in the aircraft's drag coefficient, demonstrating strong nonlinear characteristics; the greater the angle of attack into the water, the greater the pressure load on the root underneath the wing, with little effect on the pressure load on the head; and the water entry drag coefficient and average pressure load follow an increasing order of conical head, hemispherical head, and flat head. These findings provide theoretical references for studying the load characteristics during trans-media water entry of various flying bodies and optimizing fuselage structural strength. Full article
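The head-shape comparison rests on the standard drag-coefficient normalization Cd = 2F / (ρ V² A), which makes forces comparable across entry velocities and reference areas. A one-line sketch (the water density constant and all example values are illustrative, not taken from the paper):

```python
RHO_WATER = 998.0  # kg/m^3, fresh water at roughly 20 degrees C

def drag_coefficient(force_n, velocity_ms, ref_area_m2, rho=RHO_WATER):
    """Dimensionless drag coefficient Cd = 2F / (rho * V^2 * A)."""
    return 2.0 * force_n / (rho * velocity_ms ** 2 * ref_area_m2)
```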
Show Figures
Figure 1. Aircraft model diagram: (a) three-dimensional diagram; (b) top view; (c) front view.
Figure 2. Computational domain: (a) three-dimensional diagram; (b) two-dimensional diagram.
Figure 3. Model grid: (a) overall grid; (b) refined mesh for aircraft fuselage.
Figure 4. Pressure maps obtained for different numbers of grids.
Figure 5. Computational model.
Figure 6. Diagram of the position of the ball falling into the water: (a) test t = 0.210 s; (b) test t = 0.341 s; (c) simulation t = 0.319 s; (d) simulation t = 0.453 s.
Figure 7. Comparison of the results of the simulation with previous work [27].
Figure 8. Vehicle surface pressure cloud at different moments (V = 45 m/s, α = 30°, β = 0°): (a) T = 0.011 s; (b) T = 0.019 s; (c) T = 0.037 s.
Figure 9. Map of monitoring sites.
Figure 10. Pressure time curve at each monitoring point (V = 45 m/s, α = 30°, β = 0°).
Figure 11. Pressure time-course curves at different speeds for monitoring point 1 (α = 30°, β = 0°).
Figure 12. Aircraft velocity change curve (α = 30°, β = 0°).
Figure 13. Pressure time-course curves for monitoring point 1 at different entry angles (V = 45 m/s, β = 0°).
Figure 14. Velocity change curves at different entry angles (V = 45 m/s, β = 0°).
Figure 15. Curves of variation in drag coefficient at different water entry angles (V = 45 m/s, β = 0°).
Figure 16. Volume fractions 2D plot (V = 45 m/s, α = 60°): (a) angle of attack β = 0°; (b) angle of attack β = 45°.
Figure 17. Aircraft surface pressure cloud at different moments (V = 45 m/s, α = 60°): (a) T = 0.007 s; (b) T = 0.01 s.
Figure 18. Pressure–time curves for monitoring point 7 at different angles of attack into the water (V = 45 m/s, α = 60°).
Figure 19. Pressure–time curves for monitoring point 8 at different angles of attack into the water (V = 45 m/s, α = 60°).
Figure 20. Models of different head types of aircraft: (a) flat; (b) conical; (c) hemispherical.
Figure 21. Pressure clouds for different head types (V = 45 m/s, α = 60°, β = 0°): (a) flat; (b) conical; (c) hemispherical.
Figure 22. Pressure time course curves for different head types (V = 45 m/s, α = 60°, β = 0°).
Figure 23. Resistance coefficient curves for different head types (V = 45 m/s, α = 60°, β = 0°).
20 pages, 3010 KiB  
Article
Yield Prediction Using NDVI Values from GreenSeeker and MicaSense Cameras at Different Stages of Winter Wheat Phenology
by Sándor Zsebő, László Bede, Gábor Kukorelli, István Mihály Kulmány, Gábor Milics, Dávid Stencinger, Gergely Teschner, Zoltán Varga, Viktória Vona and Attila József Kovács
Drones 2024, 8(3), 88; https://doi.org/10.3390/drones8030088 - 5 Mar 2024
Cited by 6 | Viewed by 3922
Abstract
This work aims to compare and statistically analyze Normalized Difference Vegetation Index (NDVI) values provided by GreenSeeker handheld crop sensor measurements and NDVI values calculated from the MicaSense RedEdge-MX Dual Camera, to predict in-season winter wheat (Triticum aestivum L.) yield, improving a yield prediction model with cumulative growing degree days (CGDD) and days from sowing (DFS) data. The study area was located in Mosonmagyaróvár, Hungary. A small-scale field trial in winter wheat was constructed as a randomized block design including Environmental: N-135.3, P2O5-77.5, K2O-0; Balance: N-135.1, P2O5-91, K2O-0; Genezis: N-135, P2O5-75, K2O-45; and Control: N, P, K 0 kg/ha. Crop growth was monitored every second week between April and June of 2022 and 2023, respectively. NDVI measurements recorded by GreenSeeker were taken at three pre-defined GPS points for each plot; NDVI values based on the MicaSense camera's Red and NIR bands were calculated for the same points. Results showed a significant difference (p ≤ 0.05) between the Control and treated areas in both the GreenSeeker measurements and the MicaSense-based calculated NDVI values throughout the growing season, except for the heading stage. At the heading stage, significant differences could be measured by GreenSeeker; however, the remotely sensed images did not show significant differences between the treated and Control parcels. Nevertheless, both sensors were found suitable for yield prediction, and 226 DAS was the most appropriate date for predicting winter wheat yield in treated plots based on NDVI values and meteorological data. Full article
(This article belongs to the Special Issue Advances of UAV in Precision Agriculture)
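The two quantities driving the prediction model, NDVI and CGDD, are simple to compute. The first function is the standard NDVI formula applied to the MicaSense Red and NIR band reflectances; in the second, the base temperature (0 °C here, consistent with the abstract's GDD > 0 condition) is our assumption, since the paper's exact base is not stated in the listing.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and Red reflectance."""
    return (nir - red) / (nir + red)

def cgdd(daily_min_c, daily_max_c, base_c=0.0):
    """Cumulative growing degree days from sowing to sensing:
    sum over days of max(0, (Tmin + Tmax) / 2 - Tbase)."""
    return sum(
        max(0.0, (tmin + tmax) / 2.0 - base_c)
        for tmin, tmax in zip(daily_min_c, daily_max_c)
    )
```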
Show Figures
Figure 1. Research field at Széchenyi István University, Mosonmagyaróvár, Hungary. The four treatments (Environmental, Balance, Genezis, and Control) are in different colors.
Figure 2. Average air temperature (°C) and monthly precipitation (mm) at the experimental field from October to June in the 2021–2022 and 2022–2023 growing seasons.
Figure 3. (a) Winter wheat yields between the 2021–2022 and 2022–2023 periods; (b) winter wheat yields between treatments (Control, Environmental, Balance, Genezis) in the 2021–2022 and 2022–2023 growing seasons. a–significant difference (p ≥ 0.05), b–no significant difference.
Figure 4. Relationship between winter wheat yield and NDVI measurements for all treatments (Control, Environmental, Balance, and Genezis) in the 2021–2022 and 2022–2023 periods: (a) GreenSeeker linear, (b) MicaSense linear, (c) GreenSeeker exponential, (d) MicaSense exponential, (e) GreenSeeker quadratic, (f) MicaSense quadratic equations.
Figure 5. Modification of linear (a,b), exponential (c,d), and quadratic (e,f) yield prediction equations using CGDD and DFS data based on GreenSeeker (GS) measurements and MicaSense-derived (MS) NDVI values. The figures depict the measurement results for the 2021–2022 and 2022–2023 periods for both sensors. DFS and CGDD values represent the number of days from sowing to sensing where GDD (growing degree days) > 0 and the cumulative growing degree days from sowing to sensing, respectively.
Figure 6. Relationship between the observed and predicted yields of the linear yield prediction model for estimating in-season yield of winter wheat based on (a) GreenSeeker data and (b) MicaSense camera data.
4 pages, 154 KiB  
Editorial
Special Issue on Intelligent Image Processing and Sensing for Drones
by Seokwon Yeom
Drones 2024, 8(3), 87; https://doi.org/10.3390/drones8030087 - 4 Mar 2024
Viewed by 1896
Abstract
Recently, the use of drones or unmanned aerial vehicles (UAVs) for various purposes has been increasing [...] Full article
(This article belongs to the Special Issue Intelligent Image Processing and Sensing for Drones)
19 pages, 2558 KiB  
Article
Collision Avoidance Capabilities in High-Density Airspace Using the Universal Access Transceiver ADS-B Messages
by Coulton Karch, Jonathan Barrett, Jaron Ellingson, Cameron K. Peterson and V. Michael Contarino
Drones 2024, 8(3), 86; https://doi.org/10.3390/drones8030086 - 1 Mar 2024
Cited by 3 | Viewed by 2079
Abstract
The safe integration of a large number of unmanned aircraft systems (UASs) into the National Airspace System (NAS) is essential for advanced air mobility. This requires reliable air-to-air transmission systems and robust collision avoidance algorithms. Automatic Dependent Surveillance-Broadcast (ADS-B) is a potential solution for a dependable air-to-air messaging system, but its reliability when stressed with hundreds to thousands of vehicles operating simultaneously is in question. This paper presents an ADS-B model and analyzes the capabilities of the Universal Access Transceiver (UAT), which operates at a frequency of 978 MHz. We use a probabilistic collision avoidance algorithm to examine the impact of varying parameters, including the number of vehicles and the transmission power of the UAT, on the overall safety of the vehicles. Additionally, we investigate the root causes of co-channel interference, proposing enhancements for safe operations in environments with a high density of UAS. Simulation results show message success and collision rates. With our proposed enhancements, UAT ADS-B can provide a decentralized air traffic system that operates safely in high-density situations. Full article
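Co-channel interference on the UAT arises when two transmitters pick the same Message Start Opportunity (MSO) in a frame. Under the simplifying assumption of independent, uniform MSO selection over the roughly 3200 ADS-B MSOs of a one-second UAT frame (the default below is our assumption; the real protocol uses a pseudo-random schedule), the chance that a given message lands in a clear slot can be sketched as:

```python
def prob_clear_mso(n_vehicles, n_msos=3200):
    """Probability that a given vehicle's randomly chosen MSO is not
    also chosen by any of the other (n_vehicles - 1) transmitters,
    assuming independent uniform selection each frame."""
    return (1.0 - 1.0 / n_msos) ** (n_vehicles - 1)
```

The probability decays exponentially with fleet size, which is why the paper's Monte Carlo results show message collisions becoming the dominant failure mode at thousands of vehicles and why the proposed enhancements (shorter messages per MSO, power-capture decoding) matter.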
Show Figures
Figure 1. Architecture diagram of the full UAT ADS-B model.
Figure 2. Successful message decode structure.
Figure 3. The trajectories of 5 UASs (colored dots) overlaid with their message collisions (colored diamonds).
Figure 4. Success probabilities for message decoding, contingent on the density of UAS and the transmission power of the UAT, using an enhanced model where the message lengths are adjusted to fit within a single MSO.
Figure 5. Success probabilities for message decoding, contingent on the density of UAS and the transmission power of the UAT, using an enhanced model where the UAT can decode simultaneously arriving messages provided one has a 4 dB advantage over the others.
Figure 6. Success probabilities for message decoding, contingent on the density of UAS and the transmission power of the UAT, using an enhanced model where both the message lengths are adjusted and a simultaneously arriving message can be decoded in an MSO collision.
Figure 7. Average vehicle collisions per Monte Carlo simulation with 5000 vehicles in a radius of 2.3032 km, 0.01 W transmit power, a max acceleration of 300 m/s², and a time penalty of 200.
Figure 8. The contents of one hardware unit.
Figure 9. Racetrack flight path used in hardware tests.