Search Results (111)

Search Parameters:
Keywords = aerial inertial navigation system

17 pages, 4331 KiB  
Article
A Method for Measuring the Error Rules in Visual Inertial Odometry Based on Scene Matching Corrections
by Haiqiao Liu, Zichao Gong, Jinxu Shen, Ya Li and Qing Long
Micromachines 2024, 15(11), 1362; https://doi.org/10.3390/mi15111362 - 11 Nov 2024
Viewed by 449
Abstract
To address problems in the integrated navigation error law of unmanned aerial vehicles (UAVs), this paper proposes a method for measuring the error rule in visual inertial odometry based on scene matching corrections. The method is built in several steps. First, separate models were constructed for visual navigation, Micro-Electromechanical System (MEMS) navigation, and scene matching correction. Second, an integrated navigation error measurement model based on scene matching corrections and MEMS navigation was established (the MEMS+SM model). Finally, an integrated navigation error measurement model based on scene matching corrections, visual navigation, and MEMS navigation was constructed (the VN+MEMS+SM model). In the experimental part, the paper first calculates the average error of the VN+MEMS+SM and MEMS+SM models under different scene matching accuracies, scene matching times, and MEMS accuracies. The results indicate that, when the scene matching accuracy is less than 10 m and the scene matching time is less than 10 s, the errors of the VN+MEMS+SM and MEMS+SM models are approximately equal. Furthermore, the relationship between scene matching time and scene matching accuracy in the MEMS+SM model was calculated. The results show that, when the scene matching time is 10 s, the critical values of the image matching accuracy required to achieve average errors of 10 m, 30 m, and 50 m are approximately 160 m, 240 m, and 310 m. Additionally, when the MEMS accuracy is 150, the scene matching accuracy is 50 m, and the scene matching time exceeds 135 s, the average error of the VN+MEMS+SM model is smaller than that of the MEMS+SM model.
Figures: (1) total flow chart; (2) errors of different models; (3) average error of the MEMS+SM model under different working conditions; (4) critical value of the scene matching error at different scene matching times; (5) average error of the VN+MEMS+SM model under different working conditions; (6) average error and one standard deviation of the VN+MEMS+SM and MEMS+SM models; (7) threshold of the VN+MEMS+SM and MEMS+SM models; (8) variation of error with flight time.
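The error-rule question in the abstract above (how the average position error depends on scene matching accuracy and correction interval) can be illustrated with a toy Monte Carlo simulation. The sketch below is not the paper's VN/MEMS/SM models; the drift model and all parameter names and values (drift_rate, sigma_sm, t_sm) are illustrative assumptions.

```python
# Minimal sketch (not the paper's models): average position error of a
# dead-reckoning solution that drifts between scene-matching corrections.
# All parameters (drift_rate, sigma_sm, t_sm) are illustrative assumptions.
import numpy as np

def average_error(drift_rate, sigma_sm, t_sm, duration=600.0, dt=1.0, runs=200):
    """Monte-Carlo time-averaged horizontal error [m] over the flight duration."""
    rng = np.random.default_rng(0)
    errors = []
    for _ in range(runs):
        err = 0.0
        for t in np.arange(dt, duration + dt, dt):
            err += drift_rate * dt            # unaided drift accumulation
            if t % t_sm < dt:                 # scene-matching fix arrives
                err = abs(rng.normal(0.0, sigma_sm))
            errors.append(err)
    return float(np.mean(errors))

# Example: 0.5 m/s drift, 10 m matching accuracy, a fix every 10 s.
print(average_error(drift_rate=0.5, sigma_sm=10.0, t_sm=10.0))
```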
13 pages, 3686 KiB  
Communication
A Novel Robust Position Integration Optimization-Based Alignment Method for In-Flight Coarse Alignment
by Xiaoge Ning, Jixun Huang and Jianxun Li
Sensors 2024, 24(21), 7000; https://doi.org/10.3390/s24217000 - 31 Oct 2024
Viewed by 424
Abstract
In-flight alignment is a critical milestone for inertial navigation system/global navigation satellite system (INS/GNSS) applications in unmanned aerial vehicles (UAVs). The traditional position integration formula for in-flight coarse alignment requires the GNSS velocity data to be valid throughout the alignment period, which greatly limits the engineering applicability of the method. In this paper, a new robust position integration optimization-based alignment (OBA) method for in-flight coarse alignment is presented to solve the problem of in-flight alignment when the GNSS is ineffective for prolonged periods. To achieve higher alignment accuracy when the GNSS is not effective throughout the alignment period, the integration of GNSS velocity into the local-level navigation frame is replaced by the GNSS position in the Earth-centered, Earth-fixed frame, which avoids the need for complete GNSS velocity data. The simulation and flight test results show that the proposed robust position integration method achieves higher stability and robustness than the conventional position integration OBA method and can reach an alignment accuracy of 0.2° even when the GNSS data are invalid for part of the alignment period. This greatly extends the applicability of the OBA method for in-flight alignment.
Figures: (1) x^e at different moments; (2) diagram of the robust position integration formula method; (3) simulation velocity, attitude, and trajectory position; (4) alignment attitude error of the two methods for the first simulation condition; (5) alignment attitude error of the two methods for the second simulation condition; (6) flight attitude, velocity, and trajectory; (7) alignment attitude error of the TPIF and RPIF methods on flight data.
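OBA-style coarse alignment is commonly reduced to an attitude-determination (Wahba) problem between vectors accumulated in the reference frame and in the body frame; the sketch below shows only that final step, solved with the standard SVD solution. How the vector pairs are constructed (GNSS velocity versus GNSS position integration) is exactly what the paper changes and is not reproduced here; the data in the example are synthetic.

```python
# Minimal sketch of the attitude-determination step that OBA-style coarse
# alignment reduces to: solve Wahba's problem between reference-frame and
# body-frame integrated vectors with the standard SVD solution.
import numpy as np

def wahba_svd(ref_vecs, body_vecs, weights=None):
    """Return rotation matrix R such that body ~= R @ ref in the least-squares sense."""
    ref = np.asarray(ref_vecs, dtype=float)
    body = np.asarray(body_vecs, dtype=float)
    w = np.ones(len(ref)) if weights is None else np.asarray(weights, float)
    B = sum(wi * np.outer(bi, ri) for wi, bi, ri in zip(w, body, ref))
    U, _, Vt = np.linalg.svd(B)
    M = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ M @ Vt

# Example with a known 30-degree yaw rotation and two exact vector observations.
ang = np.radians(30.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
refs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
bodies = [R_true @ v for v in refs]
print(np.allclose(wahba_svd(refs, bodies), R_true))  # True
```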
46 pages, 13038 KiB  
Review
A Review on Deep Learning for UAV Absolute Visual Localization
by Andy Couturier and Moulay A. Akhloufi
Drones 2024, 8(11), 622; https://doi.org/10.3390/drones8110622 - 29 Oct 2024
Viewed by 1192
Abstract
In the past few years, the use of Unmanned Aerial Vehicles (UAVs) has expanded and has now reached mainstream levels for applications such as infrastructure inspection, agriculture, transport, security, entertainment, real estate, environmental conservation, search and rescue, and even insurance. This surge in adoption can be attributed to the maturation of the UAV ecosystem, which has not only made these devices more accessible and cost-effective but has also significantly enhanced their operational capabilities in terms of flight duration and embedded computing power. In conjunction with these developments, research on Absolute Visual Localization (AVL) has seen a resurgence driven by the introduction of deep learning to the field. These new approaches have significantly improved localization solutions in comparison with the previous generation of approaches based on traditional computer vision feature extractors. This paper conducts an extensive review of the literature on deep learning-based methods for UAV AVL, covering significant advancements since 2019. It retraces key developments that have led to the rise of learning-based approaches and provides an in-depth analysis of related localization sources such as Inertial Measurement Units (IMUs) and Global Navigation Satellite Systems (GNSSs), highlighting their limitations and advantages for more effective integration with AVL. The paper concludes with an analysis of current challenges and proposes future research directions to guide further work in the field.
Figures: (1) simulation of IMU displacement error over time; (2) types of GNSS signal reception; (3) GNSS spoofing; (4) state-of-the-art Absolute Visual Localization (AVL) framework; (5) AVL methods and associated concepts.
19 pages, 9520 KiB  
Article
Study of Global Navigation Satellite System Receivers’ Accuracy for Unmanned Vehicles
by Rosen Miletiev, Peter Z. Petkov, Rumen Yordanov and Tihomir Brusev
Sensors 2024, 24(18), 5909; https://doi.org/10.3390/s24185909 - 12 Sep 2024
Viewed by 656
Abstract
The development of unmanned ground vehicles and unmanned aerial vehicles requires high-precision navigation due to autonomous motion and higher traffic intensity. Existing L1-band GNSS receivers are a good and inexpensive choice for smartphones, vehicle navigation, fleet management systems, etc., but their accuracy is not sufficient for many civilian purposes. At the same time, real-time kinematic (RTK) navigation allows for position precision in the sub-centimeter range, but the system cost confines this type of navigation to a very limited set of applications, such as geodesy. A practical solution integrates dual-band GNSS receivers and inertial sensors to solve high-precision navigation tasks, but GNSS position accuracy may significantly affect IMU performance because it strongly influences Kalman filter performance in unmanned vehicles. The estimation of dilution-of-precision (DOP) parameters is essential for filter performance, as the optimality of the estimation in the filter is closely connected to the quality of a priori information about the process and measurement noise covariances. In this regard, the current paper analyzes the DOP parameters of the latest generation of dual-band GNSS receivers and compares the results with those of L1 receivers. The study was carried out using two types of antennas, an L1/L5-band patch antenna and a wideband helix antenna, both designed and assembled by the authors. In addition, the study is extended with a comparison of GNSS receivers from different generations sold by one of the world's leading GNSS manufacturers. The analysis of the DOP parameters shows that the introduction of dual-band receivers may significantly increase navigation precision to the sub-meter range, in addition to multi-constellation signal reception. The fast advances in the performance of the CPUs integrated into GNSS receivers allow the number of correlators and tracked satellites to be increased from 8–10 to 24–30, which also significantly improves the position accuracy even of L1-band receivers.
(This article belongs to the Special Issue GNSS Signals and Precise Point Positioning)
Figures: (1) PRN codes in the GPS system; (2) power spectrum of GPS codes; (3) schematic of the system design; (4) outline mechanical design of the GNSS helix antenna; (5–6) three-dimensional radiation patterns, RHCP at 1.2 GHz and 1.575 GHz; (7) cross-polar properties of the helix antenna; (8) return loss of the antenna matched to a 75 Ohm port; (9–10) phase center position estimation from the ground plane at 1.2 GHz and 1.575 GHz; (11) ceramic patch GNSS L1/L5 antenna; (12–17) results for the U-blox MAX M10S, MinewSemi MS32SN1, ATGM336H, U-blox MAX M8Q, MinewSemi ME32GR01, and U-blox NEO F10T receivers with the patch and helix antennas.
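The DOP figures discussed in this abstract are computed from the geometry matrix of the tracked satellites. The sketch below shows the standard GDOP/PDOP/HDOP/VDOP/TDOP computation from azimuth/elevation angles in a local East-North-Up frame; the satellite geometry in the example is invented for illustration and is not from the paper.

```python
# Minimal sketch: dilution-of-precision figures from satellite line-of-sight
# unit vectors expressed in a local East-North-Up frame.
import numpy as np

def dop(az_el_deg):
    """az_el_deg: iterable of (azimuth, elevation) pairs in degrees."""
    rows = []
    for az, el in np.radians(np.asarray(az_el_deg, dtype=float)):
        e = np.cos(el) * np.sin(az)            # east component of the unit LOS
        n = np.cos(el) * np.cos(az)            # north component
        u = np.sin(el)                         # up component
        rows.append([-e, -n, -u, 1.0])         # geometry-matrix row (incl. clock)
    G = np.array(rows)
    Q = np.linalg.inv(G.T @ G)                 # cofactor matrix
    return {
        "GDOP": np.sqrt(np.trace(Q)),
        "PDOP": np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2]),
        "HDOP": np.sqrt(Q[0, 0] + Q[1, 1]),
        "VDOP": np.sqrt(Q[2, 2]),
        "TDOP": np.sqrt(Q[3, 3]),
    }

# Example: six satellites spread in azimuth at mixed elevations (illustrative).
print(dop([(0, 60), (60, 30), (120, 45), (180, 20), (240, 50), (300, 35)]))
```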
15 pages, 5689 KiB  
Article
Modelling Water Availability in Livestock Ponds by Remote Sensing: Enhancing Management in Iberian Agrosilvopastoral Systems
by Francisco Manuel Castaño-Martín, Álvaro Gómez-Gutiérrez and Manuel Pulido-Fernández
Remote Sens. 2024, 16(17), 3257; https://doi.org/10.3390/rs16173257 - 2 Sep 2024
Viewed by 656
Abstract
Extensive livestock farming plays a crucial role in the economy of the agrosilvopastoral systems of the southwestern Iberian Peninsula (known as dehesas and montados in Spanish and Portuguese, respectively), as well as providing essential ecosystem services. Keeping livestock in these areas relies heavily on the effective management of natural resources (annual pastures and water stored in ponds built ad hoc). The present work aims to assess the water availability in these ponds by developing equations to estimate the water volume from the surface area, which can be quantified by means of remote sensing techniques. For this purpose, field surveys were carried out in September 2021, 2022, and 2023 at ponds located in representative farms, using unmanned aerial vehicles (UAVs) equipped with RGB sensors and survey-grade global navigation satellite system and inertial measurement units (GNSS-IMU). These datasets were used to produce high-resolution 3D models by means of Structure-from-Motion and Multi-View Stereo photogrammetry, facilitating the estimation of the stored water volume within a Geographic Information System (GIS). The Volume–Area–Height relationships were calibrated to allow conversions between these parameters. Regression analyses were performed using the maximum volume and area data to derive mathematical models (power and quadratic functions) that resulted in significant statistical relationships (r² > 0.90, p < 0.0001). The root mean square error (RMSE) varied from 1.59 to 17.06 m³ and 0.16 to 3.93 m³ for the power and quadratic functions, respectively. Both general equations (power and quadratic) were then applied to estimate the water stored in similar water bodies using available aerial or satellite imagery for the period from 1984 to 2021.
Figures: (1) geographical location of the pilot farms and their watering ponds in Extremadura: Parapuños de Doña María (PAR4 and PAR6), La Brava (BRA8), La Barrosa (BAR6); (2) example workflow output for pond no. 8 of La Brava: orthophotography, DEM, and RGB point cloud; (3) illustrative scatterplots of the measured V-A-h parameters with power and quadratic regression lines; (4) scatterplots of the V-A-h parameters and fitted power and quadratic relationships for the whole dataset (11 ponds); (5) comparison of RMSE values for the different general equations; (6) errors as a function of stored water volume for the V-A, V-h, and A-h relationships; (7) errors as a function of stored water volume using the general V-A equations on all available data; (8) temporal evolution of the average water-storing capacity of each farm; (9) power and quadratic functions applied to Vmax for all ponds, with the real maximum volume shown where available.
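The Volume-Area calibration described above amounts to fitting power and quadratic functions and comparing their RMSE. The sketch below shows that workflow with scipy; the (area, volume) pairs are synthetic placeholders, not the survey data, and the coefficients are therefore meaningless beyond illustration.

```python
# Minimal sketch: calibrate power (V = a*A**b) and quadratic
# (V = a*A**2 + b*A + c) Volume-Area functions and compare their RMSE.
# The (area, volume) pairs below are synthetic placeholders, not survey data.
import numpy as np
from scipy.optimize import curve_fit

area = np.array([120.0, 260.0, 480.0, 750.0, 1100.0, 1600.0])   # m^2
volume = np.array([55.0, 160.0, 390.0, 720.0, 1250.0, 2100.0])  # m^3

def power_fn(A, a, b):
    return a * np.power(A, b)

def quad_fn(A, a, b, c):
    return a * A**2 + b * A + c

def rmse(y, y_hat):
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

p_pow, _ = curve_fit(power_fn, area, volume, p0=(0.1, 1.2), maxfev=10000)
p_quad, _ = curve_fit(quad_fn, area, volume)

print("power RMSE:", rmse(volume, power_fn(area, *p_pow)))
print("quad  RMSE:", rmse(volume, quad_fn(area, *p_quad)))
```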
24 pages, 1413 KiB  
Article
Loop Detection Method Based on Neural Radiance Field BoW Model for Visual Inertial Navigation of UAVs
by Xiaoyue Zhang, Yue Cui, Yanchao Ren, Guodong Duan and Huanrui Zhang
Remote Sens. 2024, 16(16), 3038; https://doi.org/10.3390/rs16163038 - 19 Aug 2024
Viewed by 681
Abstract
The loop closure detection (LCD) methods used in Unmanned Aerial Vehicle (UAV) Visual Inertial Navigation Systems (VINS) are often affected by issues such as insufficient image texture information and limited observational perspectives, resulting in constrained UAV positioning accuracy and a reduced capability to perform complex tasks. This study proposes a Bag-of-Words (BoW) LCD method based on Neural Radiance Fields (NeRF), which estimates camera poses from existing images and achieves rapid scene reconstruction through NeRF. A method is designed to select virtual viewpoints and render images along the flight trajectory using a specific sampling approach, expanding the limited observational angles, mitigating the impact of image blur and insufficient texture information at specific viewpoints, and enlarging the set of loop closure candidate frames to improve the accuracy and success rate of LCD. Additionally, a BoW vector construction method that incorporates the importance of similar visual words, together with an adapted virtual-image filtering and comprehensive scoring method, is designed to determine loop closures. Applied to VINS-Mono and ORB-SLAM3 and compared with the advanced BoW-model LCDs of the two systems, the results indicate that the NeRF-based BoW LCD method detects more than 48% additional accurate loop closures, while the mean navigation positioning error is reduced by over 46%, validating the effectiveness and superiority of the proposed method and demonstrating its significance for improving the navigation accuracy of VINS.
Figures: (1) framework of the NeRF-based BoW LCD method and its position in VINS; (2) positions of the central and surrounding pixels; (3) example Instant-NGP virtual-view camera pose; (4) quadtree uniform feature point extraction; (5) comparison of reconstructed data for three pose estimation schemes; (6) feature matching between a real and a synthetic image; (7–8) loop closure frame detection results for the two approaches in VINS-Mono and ORB-SLAM3; (9) example of additional loop matching results; (10–14) ground truth and trajectories, APE statistics, and APE distribution in VINS-Mono; (15–19) ground truth and trajectories, APE statistics, and APE distribution in ORB-SLAM3; (20) distribution of detected loop closures as a function of the threshold r.
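For readers unfamiliar with BoW loop closure scoring, the sketch below shows the generic pipeline: quantize local descriptors against a visual vocabulary, build normalized tf-idf histograms, and compare image pairs with the L1-based score commonly used in DBoW-style systems. The toy vocabulary and descriptors are random placeholders, and the paper's NeRF-rendered virtual views and word-importance weighting are not reproduced here.

```python
# Minimal sketch of generic BoW loop-closure scoring (not the paper's method).
import numpy as np

def bow_vector(descriptors, vocabulary, idf):
    # Nearest visual word per descriptor (brute-force quantization).
    words = np.argmin(
        np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2),
        axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    v = hist * idf                                # tf-idf style weighting
    s = np.abs(v).sum()
    return v / s if s > 0 else v                  # L1-normalized BoW vector

def l1_score(v1, v2):
    """Similarity in [0, 1] for L1-normalized BoW vectors (DBoW-style)."""
    return 1.0 - 0.5 * np.abs(v1 - v2).sum()

rng = np.random.default_rng(1)
vocab = rng.normal(size=(50, 32))                 # 50 visual words, 32-D descriptors
idf = np.ones(50)
query = rng.normal(size=(120, 32))
candidate = query + 0.05 * rng.normal(size=query.shape)   # near-duplicate view
unrelated = rng.normal(size=(120, 32))

vq = bow_vector(query, vocab, idf)
print(l1_score(vq, bow_vector(candidate, vocab, idf)))    # high score: loop candidate
print(l1_score(vq, bow_vector(unrelated, vocab, idf)))    # lower score
```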
18 pages, 7322 KiB  
Article
Aerial Map-Based Navigation by Ground Object Pattern Matching
by Youngjoo Kim, Seungho Back, Dongchan Song and Byung-Yoon Lee
Drones 2024, 8(8), 375; https://doi.org/10.3390/drones8080375 - 5 Aug 2024
Viewed by 1114
Abstract
This paper proposes a novel approach to map-based navigation for unmanned aircraft. The proposed approach employs pattern matching of ground objects, not feature-to-feature or image-to-image matching, between an aerial image and a map database. Deep learning-based object detection converts the ground objects into labeled points, and the objects' configuration is used to find the corresponding location in the map database. Using the deep learning technique as a tool for extracting high-level features reduces the image-based localization problem to a pattern-matching problem. The pattern-matching algorithm proposed in this paper does not require altitude information or a camera model to estimate the horizontal geographical coordinates of the vehicle. Moreover, it requires significantly less storage because the map database is represented as a set of tuples, each consisting of a label, latitude, and longitude. Probabilistic data fusion with the inertial measurements via a Kalman filter is incorporated to deliver a comprehensive navigational solution. Flight experiments demonstrate the effectiveness of the proposed system in real-world environments. The map-based navigation system provides position estimates with RMSEs within 3.5 m at heights over 90 m without the aid of GNSS.
Figures: (1) block diagram of the proposed map-based navigation system; (2) configuration of labeled ground objects used to match the aerial image with the database; (3) examples of the training dataset for ground object recognition at 25 cm/pixel (512×512 and 1024×1024) and 12 cm/pixel (512×512); (4) examples of instance segmentation in the validation dataset; (5) exemplary image processing results on aerial images (green dots: buildings; blue dots: greenhouses); (6) polar coordinates of the image objects with respect to the image center; (7) graphical representation of the circle intersection in Algorithm 2; (8) multicopter drone used in the flight experiments; (9) software architecture and data flow of the companion computer and flight controller; (10) database generation for a test area, converting the aerial image to a meta image; (11–13) true and estimated drone trajectories over Areas 1–3; (14–16) position estimates over time compared with the RTK GNSS/INS estimates in Areas 1–3; (17) velocity estimates over time compared with the RTK GNSS/INS estimates in Area 1.
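The tuple-style map database described in the abstract can be illustrated with a deliberately naive grid search: each candidate camera position is scored by how many detected objects find a same-label map object nearby. This is not the paper's matching algorithm or its Kalman fusion; labels, coordinates, and the tolerance value are all illustrative.

```python
# Minimal sketch (not the paper's algorithm): a map database stored as
# (label, x, y) tuples and a naive grid search over candidate camera positions.
import numpy as np

map_db = [  # (label, x, y) in a local metric frame (illustrative)
    ("building", 10.0, 40.0), ("building", 55.0, 42.0),
    ("greenhouse", 30.0, 15.0), ("greenhouse", 80.0, 20.0),
    ("building", 95.0, 70.0),
]

# Detections: (label, offset east, offset north) relative to the image centre.
detections = [("building", -20.0, 10.0), ("greenhouse", 0.0, -15.0),
              ("building", 25.0, 12.0)]

def match_position(detections, map_db, extent=(0, 100, 0, 100),
                   step=1.0, tol=3.0):
    best, best_score = None, -1
    xs = np.arange(extent[0], extent[1] + step, step)
    ys = np.arange(extent[2], extent[3] + step, step)
    for cx in xs:
        for cy in ys:
            score = 0
            for label, dx, dy in detections:
                px, py = cx + dx, cy + dy          # predicted map location
                if any(l == label and np.hypot(x - px, y - py) <= tol
                       for l, x, y in map_db):
                    score += 1
            if score > best_score:
                best, best_score = (cx, cy), score
    return best, best_score

print(match_position(detections, map_db))   # best candidate lies within tol of (30, 30)
```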
24 pages, 4644 KiB  
Article
An Adaptive Cooperative Localization Method for Heterogeneous Air-to-Ground Robots Based on Relative Distance Constraints in a Satellite-Denial Environment
by Shidong Han, Zhi Xiong and Chenfa Shi
Sensors 2024, 24(14), 4543; https://doi.org/10.3390/s24144543 - 13 Jul 2024
Viewed by 605
Abstract
Cooperative localization (CL) for air-to-ground robots in satellite-denied environments has become a current research hotspot. The traditional distance-based heterogeneous multi-robot CL method requires at least four unmanned aerial vehicles (UAVs) with known positions, so when the number of known-position UAVs in the cluster collaborative network is insufficient, it is no longer applicable. A novel adaptive CL method for air-to-ground robots based on relative distance constraints is proposed in this paper. Based on the dynamically changing number of known-position UAVs in the cluster collaborative network, an adaptive fusion estimation threshold is set. When the number of known-position UAVs in the cluster cooperative network is large, the real-time dynamic topology characteristics of the robots' spatial geometric configurations are considered, and the optimal spatial geometric configuration between the UAVs and the unmanned ground vehicles (UGVs) is utilized to achieve a high-precision CL solution for the UGVs. Otherwise, when the number of known-position UAVs in the cluster collaborative network is insufficient, distance observation constraint information between UAVs and UGVs is retained in real time, and position observation equations for the UGVs' inertial navigation system (INS) are constructed using inertial-based high-precision relative position constraints and relative distance constraints from historical to current times. The experimental results show that the proposed method achieves adaptive fusion estimation with a dynamically changing number of known-position UAVs in the cluster collaborative network, verifying its effectiveness.
(This article belongs to the Section Sensors and Robotics)
Figures: (1) target-driven cooperative localization scenario for air-to-ground cluster robots based on relative distance constraints; (2) principle schematic of the proposed method; (3) spatial position relationship between the UGV and two known-position UAVs at adjacent times; (4) time sequence of the inertial sensor and relative distance observations between UAV and UGV; (5) spatial position relationship between the UGV and a single known-position UAV over three consecutive relative-distance sampling periods; (6) navigation sensors mounted on the UAV and UGV; (7) cooperative localization scenario on an outdoor campus playground; (8) three-dimensional motion trajectories of the UAV and UGV nodes; (9) GDOP values between the UGV and UAV nodes; (10–14) UGV positioning error curves for the traditional distance-based, proposed, and mainstream CL methods under dynamic cooperative networks with two or one known-position UAVs.
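The classical distance-based fix that this paper builds on (four or more known-position UAVs) reduces to a multilateration problem. The sketch below solves it with a few Gauss-Newton iterations; anchor positions, ranges, and the initial guess (e.g., the last INS solution) are illustrative. With fewer than four anchors the problem is under-determined, which is the regime the paper addresses with historical relative-distance constraints.

```python
# Minimal sketch: Gauss-Newton multilateration of a UGV position from ranges
# to four known-position UAVs. All numbers are illustrative.
import numpy as np

def multilaterate(anchors, ranges, x0, iters=10):
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                      # (N, 3)
        dists = np.linalg.norm(diffs, axis=1)    # predicted ranges
        J = diffs / dists[:, None]               # Jacobian of range w.r.t. x
        r = ranges - dists                       # range residuals
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x = x + dx
    return x

uavs = [[0, 0, 50], [100, 0, 60], [0, 100, 55], [100, 100, 80]]   # non-coplanar
ugv_true = np.array([40.0, 35.0, 0.0])
meas = [np.linalg.norm(ugv_true - np.array(a, float)) for a in uavs]
print(multilaterate(uavs, meas, x0=[50.0, 50.0, 0.0]))   # ~[40, 35, 0]
```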
25 pages, 5745 KiB  
Article
Research on Kalman Filter Fusion Navigation Algorithm Assisted by CNN-LSTM Neural Network
by Kai Chen, Pengtao Zhang, Liang You and Jian Sun
Appl. Sci. 2024, 14(13), 5493; https://doi.org/10.3390/app14135493 - 25 Jun 2024
Viewed by 843
Abstract
In response to the challenge that single navigation methods fail to meet the high-precision requirements of unmanned aerial vehicle (UAV) navigation in complex environments, a novel algorithm that integrates Global Navigation Satellite System/Inertial Navigation System (GNSS/INS) navigation information is proposed to enhance the positioning accuracy and robustness of UAV navigation systems. First, the fundamental principles of Kalman filtering and its application in navigation are introduced. Second, the basic principles of Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks and their applications in the navigation domain are elaborated. Subsequently, a CNN- and LSTM-assisted Kalman filtering fusion navigation algorithm is proposed. Finally, the feasibility and effectiveness of the proposed algorithm are validated through experiments. The experimental results demonstrate that the CNN- and LSTM-assisted Kalman filtering fusion navigation algorithm significantly improves the positioning accuracy and robustness of UAV navigation systems in complex environments with strong interference.
(This article belongs to the Special Issue Advances in Unmanned Aerial Vehicle (UAV) System)
Figures: (1) architecture of the loosely coupled closed-loop system; (2) architecture of the tightly coupled system; (3) model of an artificial neuron; (4) model of a multilayer feedforward neural network; (5) ReLU function; (6) LSTM neural network structure; (7) neural network framework combining CNN and LSTM; (8) experimental results of the loosely coupled closed-loop and tightly coupled systems; (9) two-dimensional and Z-direction error analysis for the two systems; (10–11) CNN training process and loss/RMSE curves; (12–13) training process and loss/RMSE curves of the combined CNN-LSTM architecture; (14) comparison before and after navigation with neural-network-assisted tightly coupled Kalman filtering; (15) two-dimensional and Z-direction error analysis before and after signal correction.
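To make the CNN+LSTM aiding idea concrete, the sketch below builds a small Keras network that maps a short window of navigation features to a 3D correction that could be fed to the Kalman filter as a pseudo-measurement. The layer sizes, window length, and feature set are illustrative guesses, not the paper's architecture, and the training data are random placeholders.

```python
# Minimal sketch (layer sizes and input features are illustrative, not the
# paper's architecture): a CNN + LSTM network that maps a window of navigation
# features to a position correction used to aid the Kalman filter.
import numpy as np
import tensorflow as tf

WINDOW, N_FEATURES = 50, 9       # e.g. 50 epochs of accel/gyro/velocity features

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.Conv1D(32, kernel_size=5, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3),    # predicted 3D position correction (m)
])
model.compile(optimizer="adam", loss="mse")

# Toy training data standing in for logged INS/GNSS sequences.
x = np.random.randn(256, WINDOW, N_FEATURES).astype("float32")
y = np.random.randn(256, 3).astype("float32")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

correction = model.predict(x[:1], verbose=0)   # fed to the KF as a pseudo-measurement
print(correction.shape)                        # (1, 3)
```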
15 pages, 4630 KiB  
Article
An Aeromagnetic Compensation Strategy for Large UAVs
by Liwei Ye, Zhentao Yu, Yaxun Zhang, Cheng Chi, Pu Cheng and Jie Chen
Sensors 2024, 24(12), 3775; https://doi.org/10.3390/s24123775 - 10 Jun 2024
Viewed by 888
Abstract
Aeromagnetic surveys are widely used in geological exploration, mineral resource assessment, environmental monitoring, military reconnaissance, and other areas, and magnetic compensation for platform interference is necessary in these applications. In recent years, large unmanned aerial vehicles (UAVs) have become more suitable for magnetic detection missions because of the greater loads they can carry. This article proposes methods for the magnetic compensation of large multi-load UAVs. Because of the interference of the large platform and instrument noise, the standard deviations (stds) of the compensation data used in this paper are comparatively large. First, using the traditional T-L (Tolles-Lawson) model, we work around the limited immunity of triaxial fluxgate magnetometers to magnetic interference; the direction cosine information is obtained using an inertial navigation system, the global positioning system, and a triaxial fluxgate magnetometer. Then, we increase the amplitude of the maneuvers in the compensation process; this reduces the multicollinearity of the compensation matrix to a certain extent, but it also results in greater magnetic field interference. Lastly, we employ a Lasso-regularized Newton iteration method (LRNM). Compared to the traditional methods of least squares (LS) and singular value decomposition (SVD), LRNM provides improvements of 34% and 27%, respectively. In summary, this series of schemes enables effective compensation for large multi-load UAVs and improves their practical use, making aeromagnetic survey measurements more accurate.
(This article belongs to the Section Vehicular Sensing)
Figures: (1) coordinate system of the T-L model; (2) geo-air coordinate variations; (3) overall aeromagnetic compensation flow chart; (4) flow chart of LRNM; (5) flight compensation circles, including the calibration circle (A) and test circle (B); (6–7) data A and B compensated by LRNM; (8) comparison of the three methods (LS, SVD, LRNM); (9) flat flight data; (10) flat flight data compared across the three methods.
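The sketch below contrasts ordinary least squares with Lasso regression when fitting Tolles-Lawson-style compensation coefficients from direction-cosine features, which is the kind of ill-conditioned regression the abstract discusses. The feature set is a reduced stand-in for the full T-L model, the data are synthetic, and the paper's Lasso-regularized Newton iteration is not reproduced.

```python
# Minimal sketch: LS vs. Lasso fit of Tolles-Lawson-style compensation
# coefficients from direction-cosine features (synthetic data, reduced model).
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 200, n)
# Direction cosines of the Earth field in the aircraft frame during maneuvers.
u1 = np.cos(0.05 * t) + 0.02 * rng.normal(size=n)
u2 = np.sin(0.05 * t) + 0.02 * rng.normal(size=n)
u3 = np.sqrt(np.clip(1 - u1**2 - u2**2, 0, None))

# Permanent- and induced-style terms (a subset of the full T-L model).
X = np.column_stack([u1, u2, u3, u1*u1, u1*u2, u1*u3, u2*u2, u2*u3, u3*u3])
true_c = np.array([5.0, -3.0, 2.0, 0.0, 1.5, 0.0, 0.0, -0.8, 0.0])
measured = X @ true_c + 0.3 * rng.normal(size=n)      # interference + sensor noise

ls = LinearRegression().fit(X, measured)
lasso = Lasso(alpha=0.01, max_iter=50000).fit(X, measured)

for name, mdl in [("LS", ls), ("Lasso", lasso)]:
    resid = measured - mdl.predict(X)
    print(f"{name}: compensated std = {resid.std():.3f} (arb. units)")
```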
22 pages, 6864 KiB  
Article
Position Estimation Method for Small Drones Based on the Fusion of Multisource, Multimodal Data and Digital Twins
by Shaochun Qu, Jian Cui, Zijian Cao, Yongxing Qiao, Xuemeng Men and Yanfang Fu
Electronics 2024, 13(11), 2218; https://doi.org/10.3390/electronics13112218 - 6 Jun 2024
Cited by 1 | Viewed by 1507
Abstract
In response to the low positioning accuracy and insufficient robustness of small UAVs (unmanned aerial vehicles) caused by sensor noise and cumulative motion errors during flight in complex environments, this paper proposes a multisource, multimodal data fusion method. It first performs a multimodal fusion of data from various sensors, including GPS (global positioning system), an IMU (inertial measurement unit), and visual sensors, so that the strengths of each hardware component compensate for the weaknesses of the others, thereby mitigating motion errors and enhancing accuracy. To mitigate the impact of sudden changes in sensor data, a high-fidelity UAV model is established in the digital twin based on the real UAV's parameters, providing a robust reference for data fusion. Using the extended Kalman filter algorithm, data from both the real UAV and its digital twin are fused, and the filtered positional information is fed back into the control system of the real UAV. This enables the real-time correction of UAV position deviations caused by sensor noise and environmental disturbances. The proposed multisource, multimodal fusion Kalman filter method significantly improves the positioning accuracy of UAVs in complex scenarios and the overall stability of the system. It holds significant value for maintaining high-precision positioning in variable environments and has important practical implications for enhancing UAV navigation and application efficiency.
(This article belongs to the Section Computer Science & Engineering)
Figures: (1) schematic of the multisource, multimodal data fusion positioning method; (2) architecture of the multisource, multimodal data fusion system; (3) real drone and its geometric model; (4) digital twin drone model; (5) double closed-loop cascade PID control loop; (6) IMU pre-integration schematic; (7) visual-inertial fusion process; (8) multisensor data fusion time alignment; (9) pose graph schematic; (10) UAV hardware system; (11) realistic image and twin scene image; (12) data fusion experiment frame diagram; (13) error curves of integrated navigation in the x, y, and z directions for visual+IMU, GPS+IMU, and GPS+IMU+visual fusion; (14) visual positioning error and covariance; (15) multimodal data fusion navigation error and covariance; (16) error comparison of multimodal versus multisource, multimodal fusion incorporating the digital twin; (17) actual flight trajectories near tall buildings, comparing the PX4 and fusion-based trajectories in 3D and on the 2D ground-station plane, with the segment where fewer than six satellites were visible highlighted.
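At the core of this kind of position fusion is the Kalman filter predict/update cycle. The sketch below shows that cycle for a linear constant-velocity model with a 3D position measurement (for example, the fused real/twin estimate); the extended Kalman filter in the paper reduces to exactly this form when the models are linear. The state layout, noise levels, and measurement values are illustrative.

```python
# Minimal sketch of a Kalman filter predict/update cycle on a constant-velocity
# state [x, y, z, vx, vy, vz] corrected with a 3D position measurement.
# Noise levels and the linear model are illustrative.
import numpy as np

def kf_step(x, P, z, dt, q=0.5, r=1.0):
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # position += velocity * dt
    Q = q * np.eye(6)                             # process noise (illustrative)
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # position-only measurement
    R = r * np.eye(3)                             # measurement noise

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                                 # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
x, P = np.zeros(6), np.eye(6) * 10.0
for k in range(5):                                # five noisy position fixes
    z = np.array([1.0 * k, 0.5 * k, 10.0]) + rng.normal(0, 0.5, 3)
    x, P = kf_step(x, P, z, dt=1.0)
print(x[:3])                                      # filtered position estimate
```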
22 pages, 4067 KiB  
Article
A Sensor Fusion Approach to Observe Quadrotor Velocity
by José Ramón Meza-Ibarra, Joaquín Martínez-Ulloa, Luis Alfonso Moreno-Pacheco and Hugo Rodríguez-Cortés
Sensors 2024, 24(11), 3605; https://doi.org/10.3390/s24113605 - 3 Jun 2024
Viewed by 829
Abstract
The growing use of Unmanned Aerial Vehicles (UAVs) raises the need to improve their autonomous navigation capabilities. Visual odometry makes it possible to dispense with positioning systems such as GPS, especially on indoor flights. This paper reports an effort toward UAV autonomous navigation by proposing a translational velocity observer for a quadrotor based on inertial and visual measurements. The proposed observer complementarily fuses the available measurements from different domains and is synthesized following the Immersion and Invariance observer design technique. A formal Lyapunov-based proof of observer error convergence to zero is provided. The proposed observer algorithm is evaluated through numerical simulations in the Parrot Mambo Minidrone App in Simulink/MATLAB.
(This article belongs to the Collection Navigation Systems and Sensors)
Figures: (1) Mambo Parrot's coordinate frames; (2) pinhole camera principle; (3) sensor fusion; (4) measured and reconstructed accelerations; (5) optical flow block design; (6–7) aircraft tracking of circular- and square-type trajectories; (8–9) specific force measured along the 0X^b axis for the circular and square-type trajectories; (10–11) computed optical flow along the 0X^b axis for the circular and square-type trajectories; (12–15) observed speeds compared with the speeds computed by the Parrot Mambo simulator algorithm; (16–17) observed speed errors for the circular and square-type trajectories; (18) eigenvalues for different combinations of the gains; (19) eigenvalues versus the observer's estimation error along the 0X^b axis.
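The code below is not the paper's Immersion-and-Invariance observer; it is a plain complementary velocity observer that blends integrated body-axis acceleration with an optical-flow-derived velocity, shown only to illustrate the inertial/visual fusion idea the abstract describes. The gain, the toy signals, and the noise levels are all illustrative.

```python
# NOT the paper's Immersion-and-Invariance observer: a plain complementary
# velocity observer fusing integrated specific force with optical-flow velocity.
import numpy as np

def complementary_velocity(acc, v_flow, dt, k=2.0, v0=0.0):
    """acc: body-axis acceleration [m/s^2]; v_flow: optical-flow velocity [m/s]."""
    v_hat = v0
    est = []
    for a, v_meas in zip(acc, v_flow):
        v_hat += dt * (a + k * (v_meas - v_hat))   # integrate + pull toward vision
        est.append(v_hat)
    return np.array(est)

# Toy data: true velocity ramps up to 2 m/s; accel is its noisy derivative,
# optical flow measures velocity with noise.
rng = np.random.default_rng(0)
dt, n = 0.01, 1000
t = np.arange(n) * dt
v_true = np.minimum(t, 2.0)
acc = np.gradient(v_true, dt) + 0.2 * rng.normal(size=n)
v_flow = v_true + 0.1 * rng.normal(size=n)
v_est = complementary_velocity(acc, v_flow, dt)
print(float(np.abs(v_est[-200:] - v_true[-200:]).mean()))  # small steady-state error
```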
28 pages, 7296 KiB  
Article
Autonomous Full 3D Coverage Using an Aerial Vehicle, Performing Localization, Path Planning, and Navigation towards Indoors Inventorying for the Logistics Domain
by Kosmas Tsiakas, Emmanouil Tsardoulias and Andreas L. Symeonidis
Robotics 2024, 13(6), 83; https://doi.org/10.3390/robotics13060083 - 23 May 2024
Viewed by 1151
Abstract
Over the last few years, a rapid evolution of unmanned aerial vehicle (UAV) usage in various applications has been observed. Their use in indoor environments requires a precise perception of the surrounding area, an immediate response to its changes and, consequently, a robust position estimate. This paper provides an implementation of navigation algorithms for fast, reliable, and low-cost inventorying in the logistics industry. Drone localization is achieved with a particle filter algorithm that uses an array of distance sensors and an inertial measurement unit (IMU). Navigation is based on a proportional–integral–derivative (PID) position controller that ensures an obstacle-free path within the known 3D map. For full 3D coverage, coverage targets are extracted and then ordered into a final sequence that approaches optimal coverage. Finally, a series of experiments is carried out to examine the robustness of the positioning system using different motion patterns and velocities. At the same time, various ways of traversing the environment are examined using different configurations of the sensor that performs the area coverage.
(This article belongs to the Special Issue Autonomous Navigation of Mobile Robots in Unstructured Environments)
Figures: (1) quadcopter format; (2) PID operation; (3) particle filter process; (4) range finder used by the UAV; (5) measurement model description; (6) path smoothing with B-splines; (7) drone path planning and obstacle avoidance; (8–9) ongoing and full coverage; (10) point P remains visible from both heights A and B; (11) direct visibility between nodes; (12) path transformation from 2D to 3D, horizontal vs. vertical method; (13) omitting redundant targets in path processing; (14) environments used for the full-coverage experiments; (15–17) moving in a straight line, a spiral, and a meander; (18–19) localization errors in the corridor/empty space and in the warehouse; (20–21) coverage percentage over time for the corridor and warehouse environments.
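The particle-filter loop mentioned in the abstract follows the usual predict/weight/resample pattern. The skeleton below is not the paper's implementation: the "map" is a trivial axis-aligned room, the range model returns distances to the walls, floor, and ceiling, and all noise parameters are illustrative.

```python
# Skeleton of the particle-filter localization loop: predict with IMU/odometry,
# weight particles by agreement between measured and map-predicted ranges,
# resample. The room/range model is a trivial stand-in for the real 3D map.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                    # number of particles
particles = rng.uniform([0, 0, 0], [10, 10, 3], size=(N, 3))   # x, y, z in a 10x10x3 m room
weights = np.full(N, 1.0 / N)

def expected_ranges(p):
    """Stand-in map model: distances to the 4 walls and the floor/ceiling."""
    x, y, z = p
    return np.array([x, 10 - x, y, 10 - y, z, 3 - z])

def pf_step(particles, weights, odom_delta, measured_ranges,
            motion_sigma=0.05, range_sigma=0.3):
    # Predict: apply the IMU/odometry displacement plus diffusion noise.
    particles = particles + odom_delta + rng.normal(0, motion_sigma, particles.shape)
    # Update: Gaussian likelihood of the range measurements.
    for i, p in enumerate(particles):
        err = measured_ranges - expected_ranges(p)
        weights[i] *= np.exp(-0.5 * np.sum((err / range_sigma) ** 2))
    weights = weights + 1e-300
    weights = weights / weights.sum()
    # Resample (simple multinomial resampling keeps the particle count fixed).
    idx = np.searchsorted(np.cumsum(weights), rng.uniform(0, 1, len(weights)))
    return particles[idx], np.full(len(weights), 1.0 / len(weights))

true_pose = np.array([4.0, 6.0, 1.5])
z = expected_ranges(true_pose) + rng.normal(0, 0.05, 6)
particles, weights = pf_step(particles, weights, np.zeros(3), z)
print(particles.mean(axis=0))              # estimate drawn toward [4, 6, 1.5]
```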
18 pages, 6001 KiB  
Article
Improving Target Geolocation Accuracy with Multi-View Aerial Images in Long-Range Oblique Photography
by Chongyang Liu, Yalin Ding, Hongwen Zhang, Jihong Xiu and Haipeng Kuang
Drones 2024, 8(5), 177; https://doi.org/10.3390/drones8050177 - 30 Apr 2024
Cited by 2 | Viewed by 1535
Abstract
Target geolocation in long-range oblique photography (LOROP) is challenging because measurement errors become more pronounced with increasing shooting distance, significantly affecting the calculation results. This paper introduces a novel high-accuracy target geolocation method based on multi-view observations. Unlike common target geolocation methods, which depend heavily on the accuracy of GNSS (Global Navigation Satellite System) and INS (Inertial Navigation System) measurements, the proposed method overcomes these limitations and demonstrates enhanced effectiveness by utilizing multiple aerial images captured at different locations, without any supplementary information. To achieve this, camera optimization is performed to mitigate the errors in the GNSS and INS measurements. We first use feature matching between the images to obtain matched keypoints, which determine the pixel coordinates of the landmarks in the different images. A map-building process is then performed to obtain the spatial positions of these landmarks. With initial guesses for the landmarks, bundle adjustment is used to optimize the camera parameters and the landmarks' spatial positions. After the camera optimization, a line-of-sight (LOS)-based geolocation method is used to calculate the target location from the optimized camera parameters. The proposed method is validated through simulation and an experiment using unmanned aerial vehicle (UAV) images, demonstrating its efficiency, robustness, and ability to achieve high-accuracy target geolocation.
(This article belongs to the Section Drone Design and Development)
Figures: (1) framework of the proposed method; (2) intersection of the LOS vector and the ellipsoidal Earth model; (3) coordinate transformation; (4) multiple-view camera optimization; (5) track generation; (6) multiple-view midpoint method; (7) reprojection error; (8) flight path on the geographic map, with targets at slant ranges of about 50 km, 100 km, and 150 km; (9) one simulation result evaluated as CEP95 for different numbers of images at slant ranges of 50 km, 100 km, and 150 km; (10) simulation results for different numbers of images at the three slant ranges; (11) airborne camera; (12) example of multi-view UAV images of the same target; (13) ground measurement of the targets; (14) CEP95 results of the UAV image experiment for the three targets.
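The multiple-view midpoint step referenced in the figure list has a closed-form solution: the point minimizing the summed squared distance to several line-of-sight rays. The sketch below computes it in a local Cartesian frame; the camera positions and directions are toy numbers, and the geodetic conversion, ellipsoidal-Earth intersection, and bundle adjustment from the paper are not reproduced.

```python
# Minimal sketch of the multiple-view midpoint: least-squares intersection of
# several LOS rays, each given by a camera centre and a unit direction.
import numpy as np

def multiview_midpoint(origins, directions):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)      # projector onto the ray's normal space
        A += M
        b += M @ np.asarray(o, float)
    return np.linalg.solve(A, b)

# Three cameras looking at the same ground target at the origin (toy numbers).
target = np.array([0.0, 0.0, 0.0])
origins = [np.array([50.0, 0.0, 30.0]),
           np.array([0.0, 60.0, 35.0]),
           np.array([-40.0, -40.0, 32.0])]
directions = [target - o for o in origins]
print(multiview_midpoint(origins, directions))   # ~[0, 0, 0]
```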
23 pages, 8365 KiB  
Article
Resilient Multi-Sensor UAV Navigation with a Hybrid Federated Fusion Architecture
by Sorin Andrei Negru, Patrick Geragersian, Ivan Petrunin and Weisi Guo
Sensors 2024, 24(3), 981; https://doi.org/10.3390/s24030981 - 2 Feb 2024
Cited by 2 | Viewed by 2047
Abstract
Future UAV (unmanned aerial vehicle) operations in urban environments demand a PNT (position, navigation, and timing) solution that is both robust and resilient. While a GNSS (global navigation satellite system) can provide an accurate position under open-sky assumptions, the complexity of urban operations leads to NLOS (non-line-of-sight) and multipath effects, which in turn degrade the accuracy of the PNT data. A key research question for the community is which hybrid fusion architecture can ensure the resilience and continuity of UAV operations in urban environments while minimizing significant degradations of the PNT data. In this context, we present a novel federated fusion architecture that integrates data from a GNSS receiver, an IMU (inertial measurement unit), a monocular camera, and a barometer to cope with GNSS multipath and positioning performance degradation. Within the federated fusion architecture, the local filters are implemented as EKFs (extended Kalman filters), while the master filter takes the form of a GRU (gated recurrent unit) block. Data collection is performed by setting up a virtual environment in AirSim for the visual odometry aid and barometer data, while Spirent GSS7000 hardware is used to collect the GNSS and IMU data. The hybrid fusion architecture is compared to a classic federated architecture (formed only of EKFs) and tested under different light and weather conditions to assess its resilience, including multipath and GNSS outages. The proposed solution demonstrates improved resilience and robustness in a range of degraded conditions while maintaining a good level of positioning performance, with a 95th percentile error of 0.54 m for the square scenario and 1.72 m for the survey scenario.
(This article belongs to the Special Issue New Methods and Applications for UAVs)
Figures: (1) proposed hybrid federated multi-sensor fusion architecture; (2) INS/GRU corrections and GRU diagram; (3) UAV in a NED frame; (4) master filter architecture; (5) HIL set-up; (6) waypoint survey and square trajectories; (7) Toulouse in Unreal Engine during the afternoon and evening; (8) the UAV during fog and dust operations; (9) calibration process for the monocular camera used in Unreal Engine; (10) FF architecture with EKFs; (11) horizontal error comparisons under different light conditions for the square trajectory; (12) horizontal error comparisons under different light and weather conditions for the survey trajectory; (13–14) horizontal error over time for the survey and square trajectories with multipath and outages over EKF1; (A1) UAV body-frame definition in AirSim.
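In a classical federated architecture, the master filter combines the local-filter outputs by information (inverse-covariance) weighting; the GRU block in this paper replaces that combination step. The sketch below shows the classical step for two local filters, assuming independent estimates and ignoring the information-sharing factors and cross-correlations of a full federated filter; all numbers are illustrative.

```python
# Minimal sketch of the classical federated master-filter combination that the
# GRU block replaces: information-weighted fusion of local-filter estimates.
import numpy as np

def fuse(estimates, covariances):
    infos = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(infos))
    x_fused = P_fused @ sum(I @ x for I, x in zip(infos, estimates))
    return x_fused, P_fused

# Local filter 1: GNSS/INS position; local filter 2: visual-odometry/INS position.
x1, P1 = np.array([10.2, 5.1, 120.4]), np.diag([1.0, 1.0, 4.0])
x2, P2 = np.array([10.0, 5.4, 119.8]), np.diag([0.5, 0.5, 2.0])
x_f, P_f = fuse([x1, x2], [P1, P2])
print(x_f)            # weighted toward the lower-covariance (visual) solution
print(np.diag(P_f))   # fused covariance is tighter than either input
```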