UAV-Based Smart Sensor Systems and Applications

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: closed (30 September 2020) | Viewed by 56351

Special Issue Editors


Guest Editor
Department of Systems Engineering and Automation, Universidad Carlos III de Madrid, 28118 Madrid, Spain
Interests: autonomous aerial and ground vehicles; environment perception; navigation

Guest Editor
Intelligent Systems Lab, Universidad Carlos III de Madrid, Calle Butarque 15, Leganés, 28911 Madrid, Spain
Interests: real-time perception systems; computer vision; sensor fusion; autonomous ground vehicles; unmanned aerial vehicles; navigation

Guest Editor
Autonomous Mobile & Perception Lab AMPL, Universidad Carlos III de Madrid, 28911 Madrid, Spain
Interests: autonomous vehicles; computer vision; drones

Special Issue Information

Dear Colleagues,

Research on Unmanned Aerial Vehicles (UAVs) has grown rapidly in both popularity and output in recent years. The falling cost of the vehicles themselves and of the sensors needed to perceive the environment, improved inter-vehicle communications, and the growing computing capacity of embedded computers have made it possible to address an ever wider and more diverse range of applications with UAVs.

The aim of this Special Issue is to cover theoretical, experimental, and operational aspects of Unmanned Aerial Vehicles (UAVs), in order to advance and promote current research and to present to the scientific community novel techniques for emerging UAV platforms and applications.

We therefore invite authors to review the recent advances in sensor systems and applications for UAVs and call for innovative works that explore the frontiers and challenges of the field.

The topics of interest include but are not limited to the following:

  • Embedded/onboard systems
  • Navigation and localization systems
  • Perception and sensing
  • Data processing and analysis methods for sensor data acquired on board UAVs
  • Decision making in real aerial scenarios
  • Multi-UAV coordination and cooperation
  • New applications for UAVs
  • New UAV software architectures
  • Sensor fusion for environment perception
  • UAV design
  • UAV mission planning

Please contact the Guest Editors if you have any questions about whether your proposed article would fit the scope of this Special Issue.

Prof. Dr. Arturo de la Escalera Hueso
Dr. David Martín Gómez
Dr. Abdulla Al-Kaff
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • computer vision-inspired solutions
  • control
  • cyber-security
  • deep learning for aerial environment understanding
  • embedded systems
  • mechatronics
  • multi-UAS coordination
  • navigation
  • obstacle detection and avoidance
  • perception
  • ROS-based architectures for UAVs
  • sensor fusion
  • Unmanned Aerial Vehicles
  • Unmanned Aircraft Systems

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (11 papers)


Research

22 pages, 6390 KiB  
Article
Cooperative UAV–UGV Autonomous Power Pylon Inspection: An Investigation of Cooperative Outdoor Vehicle Positioning Architecture
by Alvaro Cantieri, Matheus Ferraz, Guido Szekir, Marco Antônio Teixeira, José Lima, André Schneider Oliveira and Marco Aurélio Wehrmeister
Sensors 2020, 20(21), 6384; https://doi.org/10.3390/s20216384 - 9 Nov 2020
Cited by 32 | Viewed by 5528
Abstract
Realizing autonomous inspection, such as that of power distribution lines, through unmanned aerial vehicle (UAV) systems is a key research domain in robotics. In particular, the use of autonomous and semi-autonomous vehicles to execute the tasks of an inspection process can enhance the efficacy and safety of the operation; however, many technical problems, such as those pertaining to the precise positioning and path following of the vehicles, robust obstacle detection, and intelligent control, must be addressed. In this study, an innovative architecture involving an unmanned aircraft vehicle (UAV) and an unmanned ground vehicle (UGV) was examined for detailed inspections of power lines. In the proposed strategy, each vehicle provides its position information to the other, which ensures a safe inspection process. The results of real-world experiments indicate a satisfactory performance, thereby demonstrating the feasibility of the proposed approach. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
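The cooperative scheme described in this abstract hinges on each vehicle expressing the other's pose in a common frame (RTK-GPS on the UAV side, an AR-Tag detected by the UAV camera on the UGV side). The snippet below is a minimal, hypothetical sketch of that kind of frame composition with homogeneous transforms; the frame names and all numerical values are illustrative assumptions, not data from the paper.

```python
import numpy as np

def pose_to_T(R, t):
    """Build a 4x4 homogeneous transform from rotation matrix R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative values: UAV pose in a local world frame (e.g., from RTK-GPS + heading)
R_world_uav = np.eye(3)                      # level flight, heading aligned with the frame (assumed)
t_world_uav = np.array([10.0, 5.0, 8.0])     # metres in a local ENU frame (assumed)

# AR-Tag (mounted on the UGV) pose as detected in the UAV camera frame
R_cam_tag = np.eye(3)
t_cam_tag = np.array([0.2, -0.1, 7.9])       # tag roughly 8 m below the camera (assumed)

# Fixed camera mounting on the UAV body (downward-looking camera, small offset assumed)
T_uav_cam = pose_to_T(np.eye(3), np.array([0.0, 0.0, -0.05]))

# Compose world <- UAV <- camera <- tag, giving the UGV (tag) position in the world frame
T_world_tag = pose_to_T(R_world_uav, t_world_uav) @ T_uav_cam @ pose_to_T(R_cam_tag, t_cam_tag)
print("Estimated UGV (tag) position in world frame:", T_world_tag[:3, 3])
```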
18 pages, 6299 KiB  
Article
New Approach of UAV Movement Detection and Characterization Using Advanced Signal Processing Methods Based on UWB Sensing
by Angela Digulescu, Cristina Despina-Stoian, Denis Stănescu, Florin Popescu, Florin Enache, Cornel Ioana, Emanuel Rădoi, Iulian Rîncu and Alexandru Șerbănescu
Sensors 2020, 20(20), 5904; https://doi.org/10.3390/s20205904 - 19 Oct 2020
Cited by 16 | Viewed by 4995
Abstract
In the last years, the commercial drone/unmanned aerial vehicles market has grown due to their technological performances (provided by the multiple onboard available sensors), low price, and ease of use. Being very attractive for an increasing number of applications, their presence represents a major issue for public or classified areas with a special status, because of the rising number of incidents. Our paper proposes a new approach for the drone movement detection and characterization based on the ultra-wide band (UWB) sensing system and advanced signal processing methods. This approach characterizes the movement of the drone using classical methods such as correlation, envelope detection, time-scale analysis, but also a new method, the recurrence plot analysis. The obtained results are compared in terms of movement map accuracy and required computation time in order to offer a future starting point for the drone intrusion detection. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
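The recurrence plot analysis (RPA) named in this abstract builds a time-delay embedding of the received signal and thresholds the pairwise distances between embedded vectors. The following is a minimal sketch of that construction under assumed parameters (embedding dimension m and delay d, matching the paper's notation); the synthetic signal and threshold heuristic are placeholders, not the authors' UWB data or settings.

```python
import numpy as np

def recurrence_plot(x, m=5, d=1, eps=None):
    """Binary recurrence matrix of signal x with embedding dimension m and delay d."""
    n = len(x) - (m - 1) * d
    # Time-delay embedding: each row is [x[i], x[i+d], ..., x[i+(m-1)d]]
    emb = np.stack([x[i:i + n] for i in range(0, m * d, d)], axis=1)
    # Pairwise Euclidean distances between embedded states
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dists.max()          # simple heuristic threshold (assumption)
    return (dists <= eps).astype(np.uint8)

# Synthetic example: a short pulse standing in for a UWB backscatter scan
t = np.linspace(0, 1, 400)
x = np.sin(2 * np.pi * 25 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
R = recurrence_plot(x, m=5, d=1)
print("Recurrence matrix shape:", R.shape, "recurrence rate:", R.mean())
```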
16 pages, 2412 KiB  
Article
Power Control and Clustering-Based Interference Management for UAV-Assisted Networks
by Jinxi Zhang, Gang Chuai and Weidong Gao
Sensors 2020, 20(14), 3864; https://doi.org/10.3390/s20143864 - 10 Jul 2020
Cited by 10 | Viewed by 3337
Abstract
Unmanned Aerial Vehicle (UAV) has been widely used in various applications of wireless network. A system of UAVs has the function of collecting data, offloading traffic for ground Base Stations (BSs) and illuminating coverage holes. However, inter-UAV interference is easily introduced because of the huge number of LoS paths in the air-to-ground channel. In this paper, we propose an interference management framework for UAV-assisted networks, consisting of two main modules: power control and UAV clustering. The power control is executed first to adjust the power levels of UAVs. We model the problem of power control for UAV networks as a non-cooperative game which is proved to be an exact potential game and the Nash equilibrium is reached. Next, to further improve system user rate, coordinated multi-point (CoMP) technique is implemented. The cooperative UAV sets are established to serve users and thus transforming the interfering links into useful links. Affinity propagation is applied to build clusters of UAVs based on the interference strength. Simulation results show that the proposed algorithm integrating power control with CoMP can effectively reduce the interference and improve system sum-rate, compared to Non-CoMP scenario. The law of cluster formation is also obtained where the average cluster size and the number of clusters are affected by inter-UAV distance. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
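As the abstract notes, affinity propagation is used to group interfering UAVs into cooperative CoMP clusters based on interference strength. A hedged sketch of that clustering step is shown below using scikit-learn's AffinityPropagation on a precomputed similarity matrix; the distance-based interference model and all parameter values are placeholder assumptions, not the paper's channel model or system parameters.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
positions = rng.uniform(0, 1000, size=(12, 2))   # 12 UAVs in a 1 km x 1 km area (assumed)

# Toy interference strength: stronger for closer UAV pairs (free-space-like decay)
dist = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
interference = 1.0 / (1.0 + dist ** 2)

# Affinity propagation on the precomputed similarity (= interference strength) matrix:
# strongly interfering UAVs end up in the same cooperative (CoMP) cluster.
ap = AffinityPropagation(affinity="precomputed", random_state=0)
labels = ap.fit_predict(interference)
print("Cluster label per UAV:", labels)
print("Number of clusters:", len(set(labels)))
```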
18 pages, 1154 KiB  
Article
An Efficient Distributed Area Division Method for Cooperative Monitoring Applications with Multiple UAVs
by José Joaquín Acevedo, Ivan Maza, Anibal Ollero and Begoña C. Arrue
Sensors 2020, 20(12), 3448; https://doi.org/10.3390/s20123448 - 18 Jun 2020
Cited by 13 | Viewed by 3268
Abstract
This article addresses the area division problem in a distributed manner providing a solution for cooperative monitoring missions with multiple UAVs. Starting from a sub-optimal area division, a distributed online algorithm is presented to accelerate the convergence of the system to the optimal solution, following a frequency-based approach. Based on the “coordination variables” concept and on a strict neighborhood relation to share information (left, right, above and below neighbors), this technique defines a distributed division protocol to determine coherently the size and shape of the sub-area assigned to each UAV. Theoretically, the convergence time of the proposed solution depends linearly on the number of UAVs. Validation results, comparing the proposed approach with other distributed techniques, are provided to evaluate and analyze its performance following a convergence time criterion. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
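The abstract describes UAVs that repeatedly meet their grid neighbours and re-negotiate sub-area sizes until the partition converges. The toy sketch below illustrates that idea in one dimension only: neighbouring agents move their shared boundary toward equal shares at each "meeting". It is a simplification for intuition under assumed equal capabilities, not the paper's coordination-variables protocol.

```python
# 1-D illustration: n UAVs patrol segments of a line of length L; at each pairwise
# "meeting" the shared boundary is moved to balance the two neighbours' workloads.
L = 100.0
n = 5
boundaries = [0.0, 12.0, 20.0, 55.0, 80.0, L]   # initial, deliberately unbalanced split

def segment_lengths(b):
    return [b[i + 1] - b[i] for i in range(len(b) - 1)]

for meeting_round in range(50):
    for i in range(1, n):                        # boundary i is shared by UAV i-1 and UAV i
        boundaries[i] = 0.5 * (boundaries[i - 1] + boundaries[i + 1])

print("Final segment lengths:", [round(s, 2) for s in segment_lengths(boundaries)])
# Converges toward L/n = 20 for every UAV.
```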
22 pages, 6506 KiB  
Article
Internet of Unmanned Aerial Vehicles: QoS Provisioning in Aerial Ad-Hoc Networks
by Kirshna Kumar, Sushil Kumar, Omprakash Kaiwartya, Ajay Sikandar, Rupak Kharel and Jaime Lloret Mauri
Sensors 2020, 20(11), 3160; https://doi.org/10.3390/s20113160 - 2 Jun 2020
Cited by 37 | Viewed by 4964
Abstract
Aerial ad-hoc networks have the potential to enable smart services while maintaining communication between the ground system and unmanned aerial vehicles (UAV). Previous research has focused on enabling aerial data-centric smart services while integrating the benefits of aerial objects such as UAVs in hostile and non-hostile environments. Quality of service (QoS) provisioning in UAV-assisted communication is a challenging research theme in aerial ad-hoc networks environments. Literature on aerial ad hoc networks lacks cooperative service-oriented modeling for distributed network environments, relying on costly static base station-oriented centralized network environments. Towards this end, this paper proposes a quality of service provisioning framework for a UAV-assisted aerial ad hoc network environment (QSPU) focusing on reliable aerial communication. The UAV’s aerial mobility and service parameters are modelled considering highly dynamic aerial ad-hoc environments. UAV-centric mobility models are utilized to develop a complete aerial routing framework. A comparative performance evaluation demonstrates the benefits of the proposed aerial communication framework. It is evident that QSPU outperforms the state-of-the-art techniques in terms of a number of service-oriented performance metrics in a UAV-assisted aerial ad-hoc network environment. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
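Among the QoS mechanisms in the QSPU framework is a leaky-bucket strategy for UAV-centric traffic shaping (see the article's Figure 4). The sketch below is a generic leaky-bucket rate limiter included only to illustrate the idea; the bucket size and drain rate are arbitrary assumptions and do not reflect the paper's parameters.

```python
import time

class LeakyBucket:
    """Generic leaky-bucket shaper: traffic is queued up to `capacity` bytes and
    drained at a constant `rate` (bytes/s); arrivals that would overflow are dropped."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.level = 0.0
        self.last = time.monotonic()

    def _drain(self):
        now = time.monotonic()
        self.level = max(0.0, self.level - self.rate * (now - self.last))
        self.last = now

    def offer(self, nbytes):
        """Return True if the packet is admitted, False if it must be dropped."""
        self._drain()
        if self.level + nbytes <= self.capacity:
            self.level += nbytes
            return True
        return False

bucket = LeakyBucket(rate=50_000, capacity=20_000)   # 50 kB/s drain, 20 kB burst (assumed)
admitted = sum(bucket.offer(1500) for _ in range(100))
print(f"Admitted {admitted} of 100 back-to-back 1500-byte packets")
```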
35 pages, 16938 KiB  
Article
Design of Airport Obstacle-Free Zone Monitoring UAV System Based on Computer Vision
by Liang Wang, Jianliang Ai, Li Zhang and Zhenlin Xing
Sensors 2020, 20(9), 2475; https://doi.org/10.3390/s20092475 - 27 Apr 2020
Cited by 14 | Viewed by 4947
Abstract
In recent years, a rising number of incidents between Unmanned Aerial Vehicles (UAVs) and planes have been reported at airports and airfields. A design scheme for an airport obstacle-free zone monitoring UAV system based on computer vision is proposed. The system integrates the functions of identification, tracking, and expelling and is mainly used for low-cost control of balloon airborne objects and small aircrafts. First, a quadcopter dynamic model and 2-Degrees of Freedom (2-DOF) Pan/Tilt/Zoom (PTZ) model are analyzed, and an attitude back-stepping controller based on disturbance compensation is designed. Second, a low and slow small-target self-identification and tracking technology is constructed against a complex environment. Based on the You Only Look Once (YOLO) and Kernel Correlation Filter (KCF) algorithms, an autonomous target recognition and high-speed tracking plan with great robustness and high reliability is designed. Third, a PTZ controller and automatic aiming strategy based on Anti-Windup Proportional Integral Derivative (PID) algorithm is designed, and a simplified, automatic-aiming expelling device, the environmentally friendly gel ball blaster, which features high speed and high accuracy, is built. The feasibility and stability of the project can be verified through prototype experiments. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
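The PTZ aiming loop described in this abstract uses an anti-windup PID with back-calculation, in which the integrator is corrected by the difference between the saturated and unsaturated command. Below is a minimal sketch of that controller structure; the gains, output limits, back-calculation coefficient, and toy plant are placeholder assumptions, not the values tuned in the paper.

```python
class AntiWindupPID:
    """Discrete PID with back-calculation anti-windup: when the output saturates,
    the integrator is bled off in proportion to the saturation error."""

    def __init__(self, kp, ki, kd, kb, u_min, u_max, dt):
        self.kp, self.ki, self.kd, self.kb = kp, ki, kd, kb
        self.u_min, self.u_max, self.dt = u_min, u_max, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / self.dt
        u_unsat = self.kp * error + self.ki * self.integral + self.kd * derivative
        u = min(self.u_max, max(self.u_min, u_unsat))
        # Back-calculation: feed the saturation error back into the integrator
        self.integral += (error + self.kb * (u - u_unsat)) * self.dt
        self.prev_error = error
        return u

pid = AntiWindupPID(kp=2.0, ki=1.0, kd=0.1, kb=0.5, u_min=-1.0, u_max=1.0, dt=0.01)
angle = 0.0
for _ in range(1000):                      # drive a toy first-order pan axis to 30 degrees
    cmd = pid.step(30.0, angle)
    angle += 5.0 * cmd * 0.01              # assumed plant: angle_dot = 5 * command
print("Final pan angle (deg):", round(angle, 2))
```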
32 pages, 1281 KiB  
Article
Latency Compensated Visual-Inertial Odometry for Agile Autonomous Flight
by Kyuman Lee and Eric N. Johnson
Sensors 2020, 20(8), 2209; https://doi.org/10.3390/s20082209 - 14 Apr 2020
Cited by 3 | Viewed by 4138
Abstract
In visual-inertial odometry (VIO), inertial measurement unit (IMU) dead reckoning acts as the dynamic model for flight vehicles while camera vision extracts information about the surrounding environment and determines features or points of interest. With these sensors, the most widely used algorithm for estimating vehicle and feature states for VIO is an extended Kalman filter (EKF). The design of the standard EKF does not inherently allow for time offsets between the timestamps of the IMU and vision data. In fact, sensor-related delays that arise in various realistic conditions are at least partially unknown parameters. A lack of compensation for unknown parameters often leads to a serious impact on the accuracy of VIO systems and systems like them. To compensate for the uncertainties of the unknown time delays, this study incorporates parameter estimation into feature initialization and state estimation. Moreover, computing cross-covariance and estimating delays in online temporal calibration correct residual, Jacobian, and covariance. Results from flight dataset testing validate the improved accuracy of VIO employing latency compensated filtering frameworks. The insights and methods proposed here are ultimately useful in any estimation problem (e.g., multi-sensor fusion scenarios) where compensation for partially unknown time delays can enhance performance. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
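A core ingredient of the latency-compensated filter is aligning a delayed vision measurement with the buffered IMU-propagated state history, using linear interpolation for position and spherical linear interpolation (slerp) for attitude. The snippet below sketches only that interpolation step on a pair of buffered states; the quaternion convention (w, x, y, z) and the example timestamps are assumptions for illustration.

```python
import numpy as np

def slerp(q0, q1, alpha):
    """Spherical linear interpolation between unit quaternions q0 and q1 (w, x, y, z)."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the short way around
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to normalized lerp
        q = q0 + alpha * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - alpha) * theta) * q0 + np.sin(alpha * theta) * q1) / np.sin(theta)

def state_at(t_meas, t0, p0, q0, t1, p1, q1):
    """Interpolate position (linear) and attitude (slerp) at the delayed measurement time."""
    alpha = (t_meas - t0) / (t1 - t0)
    return (1 - alpha) * p0 + alpha * p1, slerp(q0, q1, alpha)

# Two buffered states bracketing a vision frame whose (estimated) timestamp is 10.042 s
p, q = state_at(10.042,
                10.00, np.array([1.0, 2.0, 3.0]), np.array([1.0, 0.0, 0.0, 0.0]),
                10.10, np.array([1.2, 2.1, 3.0]), np.array([0.996, 0.0, 0.0, 0.087]))
print("Interpolated position:", p, "attitude quaternion:", q)
```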
24 pages, 956 KiB  
Article
Robust Outlier-Adaptive Filtering for Vision-Aided Inertial Navigation
by Kyuman Lee and Eric N. Johnson
Sensors 2020, 20(7), 2036; https://doi.org/10.3390/s20072036 - 4 Apr 2020
Cited by 9 | Viewed by 4231
Abstract
With the advent of unmanned aerial vehicles (UAVs), a major area of interest in the research field of UAVs has been vision-aided inertial navigation systems (V-INS). In the front-end of V-INS, image processing extracts information about the surrounding environment and determines features or points of interest. With the extracted vision data and inertial measurement unit (IMU) dead reckoning, the most widely used algorithm for estimating vehicle and feature states in the back-end of V-INS is an extended Kalman filter (EKF). An important assumption of the EKF is Gaussian white noise. In fact, measurement outliers that arise in various realistic conditions are often non-Gaussian. A lack of compensation for unknown noise parameters often leads to a serious impact on the reliability and robustness of these navigation systems. To compensate for uncertainties of the outliers, we require modified versions of the estimator or the incorporation of other techniques into the filter. The main purpose of this paper is to develop accurate and robust V-INS for UAVs, in particular, those for situations pertaining to such unknown outliers. Feature correspondence in image processing front-end rejects vision outliers, and then a statistic test in filtering back-end detects the remaining outliers of the vision data. For frequent outliers occurrence, variational approximation for Bayesian inference derives a way to compute the optimal noise precision matrices of the measurement outliers. The overall process of outlier removal and adaptation is referred to here as “outlier-adaptive filtering”. Even though almost all approaches of V-INS remove outliers by some method, few researchers have treated outlier adaptation in V-INS in much detail. Here, results from flight datasets validate the improved accuracy of V-INS employing the proposed outlier-adaptive filtering framework. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
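The "statistic test in filtering back-end" mentioned in the abstract is typically a Mahalanobis (chi-square) gate on the EKF innovation: a vision measurement whose normalized innovation squared exceeds a chi-square threshold is flagged as an outlier. Below is a minimal sketch of such a gate under assumed dimensions and a 99% confidence level; it does not reproduce the paper's variational noise-precision adaptation.

```python
import numpy as np
from scipy.stats import chi2

def innovation_gate(z, z_pred, H, P, R, confidence=0.99):
    """Return (is_outlier, nis): chi-square test on the normalized innovation squared."""
    y = z - z_pred                          # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    nis = float(y @ np.linalg.solve(S, y))
    threshold = chi2.ppf(confidence, df=len(z))
    return nis > threshold, nis

# Toy 2-D feature measurement with a 6-D (position/velocity) state
H = np.hstack([np.eye(2), np.zeros((2, 4))])
P = 0.01 * np.eye(6)
R = 0.5 * np.eye(2)
z_pred = np.array([120.0, 80.0])            # predicted pixel coordinates (assumed)
print(innovation_gate(np.array([120.4, 79.6]), z_pred, H, P, R))   # inlier
print(innovation_gate(np.array([150.0, 40.0]), z_pred, H, P, R))   # gross outlier
```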
20 pages, 4653 KiB  
Article
Weak Knock Characteristic Extraction of a Two-Stroke Spark Ignition UAV Engine Burning RP-3 Kerosene Fuel Based on Intrinsic Modal Functions Energy Method
by Jing Sheng, Rui Liu and Guoman Liu
Sensors 2020, 20(4), 1148; https://doi.org/10.3390/s20041148 - 19 Feb 2020
Cited by 2 | Viewed by 3307
Abstract
To solve the problem of the weak knock characteristic extraction for a port-injected two-stoke spark ignition (SI) unmanned aerial vehicle (UAV) engine burning aviation kerosene fuel, which is also known as the Rocket Propellant 3 (RP-3), the Intrinsic modal Functions Energy (IMFE) method is proposed according to the orthogonality of the intrinsic modal functions (IMFs). In this method, engine block vibration signals of the two-stroke SI UAV engine are decomposed into a finite number of intrinsic modal function (IMF) components. Then, the energy weight value of each IMF component is calculated, and the IMF component with the largest energy weight value is selected as the dominant characteristic component. The knock characteristic frequency of the two-stroke SI UAV engine is obtained by analyzing the frequency spectrum of the dominant characteristic component. A simulation experiment is designed and the feasibility of the algorithm is verified. The engine block vibration signals of the two-stroke SI UAV engine at 5100 rpm and 5200 rpm were extracted by this method. The results showed that the knock characteristic frequencies of engine block vibration signals at 5100 rpm and 5200 rpm were 3.320 kHz and 3.125 kHz, respectively. The Wavelet Packet Energy method was used to extract the characteristics of the same engine block vibration signal at 5200 rpm, and the same result as the IMFE method is obtained, which verifies the effectiveness of the IMFE method. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
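The IMFE procedure summarized above reduces to: decompose the vibration signal into IMFs, weight each IMF by its share of the total energy, keep the dominant one, and read the knock frequency from its spectrum. The sketch below implements the energy-weighting and spectral-peak steps with NumPy, taking the IMF matrix as given (in practice it would come from an EMD implementation); the synthetic signal and sample rate are assumptions, not the paper's 5100/5200 rpm measurements.

```python
import numpy as np

def dominant_imf_frequency(imfs, fs):
    """Pick the IMF with the largest energy weight and return its spectral peak (Hz)."""
    energies = np.sum(imfs ** 2, axis=1)
    weights = energies / energies.sum()               # energy weight of each IMF
    dominant = imfs[np.argmax(weights)]
    spectrum = np.abs(np.fft.rfft(dominant))
    freqs = np.fft.rfftfreq(dominant.size, d=1.0 / fs)
    return freqs[np.argmax(spectrum)], weights

# Synthetic stand-in for decomposed vibration IMFs: a 3.3 kHz "knock" tone plus two
# lower-frequency components, sampled at 25 kHz (all values assumed for illustration).
fs = 25_000
t = np.arange(0, 0.2, 1.0 / fs)
imfs = np.vstack([
    1.5 * np.sin(2 * np.pi * 3300 * t),    # knock-band component (largest energy)
    0.8 * np.sin(2 * np.pi * 900 * t),
    0.3 * np.sin(2 * np.pi * 120 * t),
])
f_knock, w = dominant_imf_frequency(imfs, fs)
print("Energy weights:", np.round(w, 3), "-> knock characteristic frequency:", f_knock, "Hz")
```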
22 pages, 11307 KiB  
Article
A Semi-Physical Platform for Guidance and Formations of Fixed-Wing Unmanned Aerial Vehicles
by Jun Yang, Arun Geo Thomas, Satish Singh, Simone Baldi and Ximan Wang
Sensors 2020, 20(4), 1136; https://doi.org/10.3390/s20041136 - 19 Feb 2020
Cited by 16 | Viewed by 6386
Abstract
Unmanned Aerial Vehicles (UAVs) have multi-domain applications, fixed-wing UAVs being a widely used class. Despite the ongoing research on the topics of guidance and formation control of fixed-wing UAVs, little progress is known on implementation of semi-physical validation platforms (software-in-the-loop or hardware-in-the-loop) for such complex autonomous systems. A semi-physical simulation platform should capture not only the physical aspects of UAV dynamics, but also the cybernetics aspects such as the autopilot and the communication layers connecting the different components. Such a cyber-physical integration would allow validation of guidance and formation control algorithms in the presence of uncertainties, unmodelled dynamics, low-level control loops, communication protocols and unreliable communication: These aspects are often neglected in the design of guidance and formation control laws for fixed-wing UAVs. This paper describes the development of a semi-physical platform for multi-fixed wing UAVs where all the aforementioned points are carefully integrated. The environment adopts Raspberry Pi’s programmed in C++, which can be interfaced to standard autopilots (PX4) as a companion computer. Simulations are done in a distributed setting with a server program designed for the purpose of routing data between nodes, handling the user inputs and configurations of the UAVs. Gazebo-ROS is used as a 3D visualization tool. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
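The platform relies on a server process that routes state and command messages between the UAV nodes (companion computers) and the user. The listing below is a deliberately simplified stand-in for such a router: a UDP relay that remembers every sender and forwards each datagram to all other known nodes. The port number and message handling are assumptions for illustration, not the platform's actual protocol (which interfaces to the PX4 autopilot via a companion computer).

```python
import socket

def run_relay(host="0.0.0.0", port=14560):
    """Minimal UDP relay: forward every datagram to all other endpoints seen so far."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    peers = set()
    print(f"Relay listening on {host}:{port}")
    while True:
        data, addr = sock.recvfrom(4096)
        peers.add(addr)                       # register the sender as a node
        for peer in peers:
            if peer != addr:                  # route to every node except the origin
                sock.sendto(data, peer)

if __name__ == "__main__":
    run_relay()
```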
21 pages, 8136 KiB  
Article
Single Neural Adaptive PID Control for Small UAV Micro-Turbojet Engine
by Wei Tang, Lijian Wang, Jiawei Gu and Yunfeng Gu
Sensors 2020, 20(2), 345; https://doi.org/10.3390/s20020345 - 8 Jan 2020
Cited by 32 | Viewed by 9578
Abstract
The micro-turbojet engine (MTE) is especially suitable for unmanned aerial vehicles (UAVs). Because the rotor speed is proportional to the thrust force, the accurate speed tracking control is indispensable for MTE. Thanks to its simplicity, the proportional–integral–derivative (PID) controller is commonly used for rotor speed regulation. However, the PID controller cannot guarantee superior performance over the entire operation range due to the time-variance and strong nonlinearity of MTE. The gain scheduling approach using a family of linear controllers is recognized as an efficient alternative, but such a solution heavily relies on the model sets and pre-knowledge. To tackle such challenges, a single neural adaptive PID (SNA-PID) controller is proposed herein for rotor speed control. The new controller featuring with a single-neuron network is able to adaptively tune the gains (weights) online. The simple structure of the controller reduces the computational load and facilitates the algorithm implementation on low-cost hardware. Finally, the proposed controller is validated by numerical simulations and experiments on the MTE in laboratory conditions, and the results show that the proposed controller achieves remarkable effectiveness for speed tracking control. In comparison with the PID controller, the proposed controller yields 54% and 66% reductions on static tracking error under two typical cases. Full article
(This article belongs to the Special Issue UAV-Based Smart Sensor Systems and Applications)
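The controller summarized above is a PID whose three gains are the normalized weights of a single neuron, updated online. The sketch below follows one common single-neuron adaptive PID formulation (incremental PID inputs with a supervised-Hebb-type weight update); the learning rates, output gain K, initial weights, and toy first-order plant are all assumptions, and the paper's exact update law may differ.

```python
import numpy as np

class SingleNeuronAdaptivePID:
    """Incremental PID whose weights (roughly Kp, Ki, Kd) are adapted online by a single neuron."""

    def __init__(self, K=0.2, eta=(0.2, 0.1, 0.05)):
        self.K = K                          # overall neuron output gain (assumed)
        self.eta = np.array(eta)            # learning rates for the three weights (assumed)
        self.w = np.array([0.3, 0.3, 0.3])  # initial weights (assumed)
        self.e1 = self.e2 = self.u = 0.0    # previous errors and last control value

    def step(self, setpoint, measurement):
        e = setpoint - measurement
        x = np.array([e - self.e1, e, e - 2 * self.e1 + self.e2])   # P-, I-, D-style inputs
        w_norm = self.w / np.sum(np.abs(self.w))                    # normalized weights
        self.u += self.K * float(w_norm @ x)                        # incremental control law
        self.w += self.eta * e * self.u * x                         # supervised Hebb update
        self.e2, self.e1 = self.e1, e
        return self.u

# Toy speed-tracking loop on an assumed first-order "rotor speed" plant
ctrl, speed = SingleNeuronAdaptivePID(), 0.0
for _ in range(300):
    u = ctrl.step(1.0, speed)               # normalized speed setpoint
    speed += 0.1 * (u - speed)              # assumed plant: dv = 0.1 * (u - v)
print("Tracked normalized speed after 300 steps:", round(speed, 3))
```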