Search Results (438)

Search Parameters:
Keywords = IMU navigation

23 pages, 5276 KiB  
Article
An Improved LKF Integrated Navigation Algorithm Without GNSS Signal for Vehicles with Fixed-Motion Trajectory
by Haosu Zhang, Zihao Wang, Shiyin Zhou, Zhiying Wei, Jianming Miao, Lingji Xu and Tao Liu
Electronics 2024, 13(22), 4498; https://doi.org/10.3390/electronics13224498 - 15 Nov 2024
Abstract
Without a GNSS (global navigation satellite system) signal, the integrated navigation system in vehicles with a fixed trajectory (e.g., railcars) is limited to the use of a micro-electromechanical system inertial navigation system (MEMS-INS) and an odometer (ODO). Due to the significant measurement error of the MEMS inertial device and the inability of the ODO to output attitude, the positioning error is generally large. To address this problem, this paper presents a new integrated navigation algorithm based on a dynamically constrained Kalman model. By analyzing the dynamics of a railcar, several new observations are investigated, including the errors of the up and lateral velocities, the centripetal acceleration, the centripetal D-value (difference value), and the up-gyro bias. The state transition matrix and observation matrix of the error-state model are presented. To improve navigation accuracy, virtual noise technology is applied to correct the errors of the up and lateral velocities. A 240 s vehicle-running experiment demonstrates that the positioning error rate of the dead-reckoning method based on MEMS-INS is 83.5%, whereas the proposed method exhibits a rate of 4.9%, so positioning accuracy is significantly enhanced.
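To illustrate the constrained-observation idea in this abstract, the sketch below feeds zero-valued lateral and up velocity pseudo-measurements into a standard linear Kalman filter update, with the "virtual noise" entering as the measurement covariance R. The 9-element error-state layout, the numerical values, and the helper name kf_update are illustrative assumptions, not the paper's actual model.

import numpy as np

def kf_update(x, P, z, H, R):
    """Standard linear Kalman filter measurement update."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected error state
    P = (np.eye(len(x)) - K @ H) @ P         # corrected covariance
    return x, P

# Illustrative 9-element error state: [velocity errors, attitude errors, gyro biases].
x = np.zeros(9)
P = np.eye(9) * 0.1

# Pseudo-measurements: lateral (body-y) and up (body-z) velocity errors are ~0
# for a vehicle constrained to its track.
H = np.zeros((2, 9))
H[0, 1] = 1.0   # lateral velocity error
H[1, 2] = 1.0   # up velocity error
z = np.zeros(2)

# "Virtual noise": the measurement covariance reflects how strictly the constraint holds.
R = np.diag([0.05**2, 0.05**2])

x, P = kf_update(x, P, z, H, R)

Tightening or loosening R controls how strongly the motion constraint pulls the estimate, which is the role the abstract assigns to virtual noise technology.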
Figures: (1) schematic of the railcar's b frame; (2) railcar motion in a straight line and in a turn; (3) sketch of the angular speed of rotation about the railcar's OZb axis; (4) flow chart of the integrated navigation algorithm for the railcar in SNRE; (5) physical photo of the FOG-INS; (6) MEMS-INS schematic, photo, and structural composition; (7) output of the up-gyroscope in the MEMS-IMU; (8) speed curve from FOG-IMU/GNSS integrated navigation; (9) track graph with enlargements near the star marker, the curved road segment, and the end; (10) attitude and misalignment angles from MEMS-INS/GNSS combination navigation.
14 pages, 7441 KiB  
Article
Construction of a Wi-Fi System with a Tethered Balloon in a Mountainous Region for the Teleoperation of Vehicular Forestry Machines
by Gyun-Hyung Kim, Hyeon-Seung Lee, Ho-Seong Mun, Jae-Heun Oh and Beom-Soo Shin
Forests 2024, 15(11), 1994; https://doi.org/10.3390/f15111994 - 12 Nov 2024
Viewed by 293
Abstract
In this study, a Wi-Fi system with a tethered balloon is proposed for the teleoperation of vehicular forestry machines. The system was developed to establish Wi-Fi communication for stable teleoperation at a timber-harvesting site. It consists of a helium balloon, Wi-Fi nodes, a measurement system, a global navigation satellite system (GNSS) antenna, and a wind speed sensor; the measurement system includes a GNSS module, an inertial measurement unit (IMU), a data logger, and an altitude sensor. While the helium balloon carrying the Wi-Fi system was 60 m in the air, the received signal strength indicator (RSSI) was measured by moving a Wi-Fi receiver on the ground, and a second GNSS set collected the latitude and longitude of the receiver as it traveled. The developed system can create a Wi-Fi zone of up to 1.9 ha at an average wind speed of 2.2 m/s and supports the teleoperation of vehicular forestry machines with a maximum latency of 185.7 ms.
(This article belongs to the Section Forest Operations and Engineering)
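As a rough illustration of the line-of-sight (LOS) geometry behind a tethered-balloon Wi-Fi relay, the sketch below computes the slant distance from an airborne node to a ground receiver and the area of a circular coverage zone in hectares. The altitude, radius, and receiver height are assumed example values, not the study's measurements.

import math

def los_distance(balloon_alt_m, ground_dist_m, receiver_height_m=1.5):
    """Slant (line-of-sight) distance from an airborne Wi-Fi node to a ground receiver."""
    dh = balloon_alt_m - receiver_height_m
    return math.hypot(dh, ground_dist_m)

def circle_area_ha(radius_m):
    """Area of a circular coverage zone in hectares (1 ha = 10,000 m^2)."""
    return math.pi * radius_m**2 / 10_000.0

print(f"LOS distance: {los_distance(60.0, 75.0):.1f} m")
print(f"Coverage for a 78 m radius: {circle_area_ha(78.0):.2f} ha")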
Figures: (1) concept of forest-machine teleoperation using Wi-Fi on a tethered balloon; (2) helium balloon overview, front and bottom views; (3) lower jig, Wi-Fi nodes under the lower jig, and upper jig; (4) data-acquisition logic of the developed data logger; (5) developed mobile mooring and console station; (6) data collection and analysis; (7) study site; (8) wind velocity and balloon coordinates under wind; (9) roll, pitch, and yaw versus balloon altitude; (10) installation of the developed system; (11) latency and Wi-Fi roaming between nodes; (12) LOS distance calculation method; (13) traveled path converted to planar coordinates; (14) created Wi-Fi zone; (15) overall latency versus RSSI.
26 pages, 9809 KiB  
Article
Tightly Coupled LIDAR/IMU/UWB Fusion via Resilient Factor Graph for Quadruped Robot Positioning
by Yujin Kuang, Tongfei Hu, Mujiao Ouyang, Yuan Yang and Xiaoguo Zhang
Remote Sens. 2024, 16(22), 4171; https://doi.org/10.3390/rs16224171 - 8 Nov 2024
Viewed by 563
Abstract
Continuous, accurate positioning in global navigation satellite system (GNSS)-denied environments is essential for robot navigation. Significant advances have been made with light detection and ranging (LiDAR)-inertial measurement unit (IMU) techniques, especially in challenging environments with varying lighting and other complexities. However, the LiDAR/IMU method relies on a recursive positioning principle, so errors gradually accumulate and diverge over time. To address these challenges, this study proposes a tightly coupled LiDAR/IMU/UWB fusion approach that integrates an ultra-wideband (UWB) positioning technique. First, a lightweight point cloud segmentation and constraint algorithm is designed to minimize elevation errors and reduce computational demands. Second, a multi-decision non-line-of-sight (NLOS) recognition module based on information entropy is employed to mitigate NLOS errors. Finally, a tightly coupled framework with a resilient mechanism is proposed to achieve reliable position estimation for quadruped robots. Experimental results demonstrate that the system provides robust positioning even in LiDAR-limited and NLOS conditions while maintaining low time costs.
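One way to picture the information-entropy decision used for NLOS recognition: compute the Shannon entropy of a recent window of a ranging criterion (for example, signal-strength differences per anchor) over fixed histogram bins and flag the anchor as NLOS when the window is too disordered. The window length, bin edges, and threshold below are assumptions for illustration only.

import numpy as np

def shannon_entropy(samples, bin_edges):
    """Shannon entropy (bits) of a sample window over fixed histogram bins."""
    hist, _ = np.histogram(samples, bins=bin_edges)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def is_nlos(window, bin_edges, threshold=2.0):
    """Flag an anchor as NLOS when its ranging-criterion window is too disordered."""
    return shannon_entropy(window, bin_edges) > threshold

rng = np.random.default_rng(0)
edges = np.linspace(-1.0, 1.0, 21)          # fixed bins make entropies comparable
los_window = rng.normal(0.0, 0.02, 200)     # stable, LOS-like residuals
nlos_window = rng.normal(0.0, 0.40, 200)    # scattered, NLOS-like residuals
print(is_nlos(los_window, edges), is_nlos(nlos_window, edges))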
Figures: (1) overview of the proposed tightly coupled LiDAR/IMU/UWB positioning system; (2) triangular-based ground point cloud exclusion; (3) traditional versus improved concentric-circle ground-segmentation models; (4) LiDAR/IMU/UWB factor graph model of the quadruped robot; (5) experimental equipment (quadruped robot with sensors, UWB hardware module, total station); (6) UWB anchor deployment and key points of the mobile experiment; (7) experiment layout with UWB anchors, robot tag, and reference track; (8) per-frame point cloud processing time across algorithms and datasets; (9) positioning-error comparison across algorithms and datasets; (10) point cloud segmentation results in a complex environment with steps and curbs, traditional versus proposed algorithm; (11) IMU Allan variance analysis for accelerometer and gyroscope; (12) distance-measurement error optimization result; (13) signal-strength and distance-difference curves for each anchor; (14) LOS/NLOS discrimination based on information entropy; (15) two-dimensional trajectory comparison in a parking lot (LS, EKF, LIU, LIUT, and the proposed method); (16) positioning-error comparison of the algorithms; (17) three-dimensional trajectory comparison (LIU, LIUT, and the proposed method); (18) z-axis comparison of the algorithms; (19) indoor complex environment and corresponding positioning results.
21 pages, 2764 KiB  
Article
Dual Foot-Mounted Localisation Scheme Employing a Minimum-Distance-Constraint Kalman Filter Under Coloured Measurement Noise
by Yuan Xu, Jingwen Yu, Xiangpeng Wang, Teng Li and Mingxu Sun
Micromachines 2024, 15(11), 1346; https://doi.org/10.3390/mi15111346 - 31 Oct 2024
Viewed by 478
Abstract
This study proposes a dual foot-mounted localisation scheme with a minimum-distance-constraint (MDC) Kalman filter (KF) for human localisation under coloured measurement noise (CMN). The scheme employs two inertial measurement units (IMUs), one on each foot, and is intended for human navigation. The KF under CMN (cKF) is then derived from the data-fusion model of the proposed navigation scheme. Finally, the MDC condition is designed and an MDC–cKF model is proposed to reduce the error in the IMUs. Empirical results show that the proposed method effectively improves the navigation accuracy over that of MDC–KF, which neglects the effect of CMN.
(This article belongs to the Special Issue MEMS Inertial Device, 2nd Edition)
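The distance constraint between the two foot-mounted estimates can be pictured as a simple projection step: whenever the estimated separation between the feet leaves a plausible range, both estimates are pulled back onto the nearest admissible separation. The bounds and the equal split of the correction below are illustrative assumptions, not the paper's MDC formulation.

import numpy as np

def constrain_feet(p_left, p_right, d_min=0.1, d_max=1.0):
    """Project two foot position estimates so their separation lies in [d_min, d_max]."""
    diff = p_right - p_left
    d = np.linalg.norm(diff)
    if d_min <= d <= d_max or d == 0.0:
        return p_left, p_right
    target = np.clip(d, d_min, d_max)
    correction = (d - target) / 2.0 * (diff / d)   # split the violation equally
    return p_left + correction, p_right - correction

left = np.array([0.0, 0.0, 0.0])
right = np.array([1.6, 0.0, 0.0])                  # implausibly far apart
print(constrain_feet(left, right))                 # separation clipped back to 1.0 m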
Figures: (1) structure of the dual foot-mounted IMU-based localisation scheme with the minimum-distance-constraint Kalman filter under coloured measurement noise; (2) planned paths 1-3; (3) acceleration values for paths 1-3; (4, 5) left- and right-foot gyroscope values for paths 1-3; (6) structure of the testbed; (7) test subject; (8-10) foot positions, position-error CDFs, and RMSEs of the MDC–KF and MDC–cKF methods along planned path 1; (11-13) the same along planned path 2; (14-16) the same along planned path 3; (17) planned path 4.
46 pages, 13038 KiB  
Review
A Review on Deep Learning for UAV Absolute Visual Localization
by Andy Couturier and Moulay A. Akhloufi
Drones 2024, 8(11), 622; https://doi.org/10.3390/drones8110622 - 29 Oct 2024
Viewed by 935
Abstract
In the past few years, the use of Unmanned Aerial Vehicles (UAVs) has expanded and now reached mainstream levels for applications such as infrastructure inspection, agriculture, transport, security, entertainment, real estate, environmental conservation, search and rescue, and even insurance. This surge in adoption can be attributed to the maturation of the UAV ecosystem, which has not only made these devices more accessible and cost-effective but has also significantly enhanced their operational capabilities in terms of flight duration and embedded computing power. In conjunction with these developments, research on Absolute Visual Localization (AVL) has seen a resurgence driven by the introduction of deep learning to the field. These new approaches have significantly improved localization solutions in comparison to the previous generation of approaches based on traditional computer vision feature extractors. This paper conducts an extensive review of the literature on deep learning-based methods for UAV AVL, covering significant advancements since 2019. It retraces the key developments that led to the rise of learning-based approaches and provides an in-depth analysis of related localization sources such as Inertial Measurement Units (IMUs) and Global Navigation Satellite Systems (GNSSs), highlighting their limitations and advantages for more effective integration with AVL. The paper concludes with an analysis of current challenges and proposes future research directions to guide further work in the field.
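The IMU drift behaviour that this review highlights (and that its Figure 1 simulates) can be reproduced with a minimal double integration of accelerometer error: a small uncompensated bias plus white noise grows roughly quadratically in position. The sample rate, bias, and noise level below are arbitrary illustrative values.

import numpy as np

dt, duration = 0.01, 60.0                       # 100 Hz for 60 s
n = int(duration / dt)
rng = np.random.default_rng(42)

bias = 0.002                                    # uncompensated accelerometer bias [m/s^2]
noise = rng.normal(0.0, 0.02, n)                # accelerometer white noise [m/s^2]
accel_error = bias + noise

vel_error = np.cumsum(accel_error) * dt         # first integration: velocity error
pos_error = np.cumsum(vel_error) * dt           # second integration: position error

print(f"position error after {duration:.0f} s: {pos_error[-1]:.2f} m")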
Figures: (1) simulation of IMU displacement error over time; (2) types of GNSS signal reception; (3) GNSS spoofing; (4) state-of-the-art Absolute Visual Localization (AVL) framework; (5) AVL methods and associated concepts.
20 pages, 15485 KiB  
Article
Integrated Navigation Method for Orchard-Dosing Robot Based on LiDAR/IMU/GNSS
by Wang Wang, Jifeng Qin, Dezhao Huang, Furui Zhang, Zhijie Liu, Zheng Wang and Fuzeng Yang
Agronomy 2024, 14(11), 2541; https://doi.org/10.3390/agronomy14112541 - 28 Oct 2024
Viewed by 434
Abstract
To enhance the localization reliability and obstacle-avoidance performance of the dosing robot in complex orchards, this study proposed an integrated navigation method using LiDAR, IMU, and GNSS. Firstly, the tightly coupled LIO-SAM algorithm was used to construct an orchard grid map for path planning and obstacle avoidance. Then, a global localization model based on RTK-GNSS was developed to achieve accurate and efficient initial localization of the robot's coordinates and heading, and a Kalman filter was applied to integrate GNSS and IMU to improve robustness. Next, an improved A* algorithm was introduced to ensure the global operational path maintained a safe distance from obstacles, while the DWA algorithm handled dynamic obstacle avoidance. Field tests showed that the global localization model achieved an accuracy of 2.215 cm with a standard deviation of 1 cm, demonstrating stable positioning performance. The global path maintained an average safe distance of 50.75 cm from obstacles in the map, and the robot exhibited a maximum absolute lateral deviation of 9.82 cm (4.16 cm on average) while keeping a 1 m safe distance from dynamic obstacles. Overall, the robot demonstrated smooth and reliable autonomous navigation and successfully completed its tasks.
(This article belongs to the Special Issue Unmanned Farms in Smart Agriculture)
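The "keep a safe distance from obstacles" idea behind the improved A* can be sketched by adding a clearance penalty to the usual grid path cost, so cells close to obstacles become expensive without being forbidden. The grid, the penalty weight, and the 4-connected moves below are illustrative assumptions rather than the paper's algorithm.

import heapq
import numpy as np
from scipy.ndimage import distance_transform_edt

def a_star_with_clearance(grid, start, goal, w_clear=5.0):
    """A* on a 0/1 occupancy grid with an extra cost for small obstacle clearance."""
    clearance = distance_transform_edt(grid == 0)          # distance to the nearest obstacle
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), start)]
    came_from = {start: None}
    g_cost = {start: 0.0}
    while open_set:
        _, node = heapq.heappop(open_set)
        if node == goal:
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1] and grid[r, c] == 0:
                step = 1.0 + w_clear / (1.0 + clearance[r, c])   # penalize low clearance
                new_g = g_cost[node] + step
                if new_g < g_cost.get((r, c), float("inf")):
                    g_cost[(r, c)] = new_g
                    came_from[(r, c)] = node
                    heapq.heappush(open_set, (new_g + h((r, c)), (r, c)))
    return None

grid = np.zeros((20, 20), dtype=int)
grid[5:15, 10] = 1                                         # a wall of obstacles (e.g., a tree row)
print(a_star_with_clearance(grid, (10, 2), (10, 18)))      # path detours around the wall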
Figures: (1) environment of the test field; (2) hardware platform structure of the dosing robot; (3) hardware coordinate-system distribution; (4) architecture of the LIO-SAM algorithm; (5) pass-through and radius filtering of the 3D point cloud; (6) coordinate-system model for the LIO-SAM point cloud map; (7) coordinate-system transformation for GNSS global localization; (8) improved A* algorithm based on the forward-looking pre-sight frame; (9) orchard environment analysis and 3D point cloud map; (10) orchard grid map, panorama and detail; (11) initial global localization at the test sites; (12) simulated paths of the traditional versus improved A* algorithm; (13) field-test paths of the traditional versus improved A* algorithm; (14) navigation field test with dynamic obstacle avoidance and arrival at the dosing point; (15) autonomous obstacle-avoidance field-test results; (16) autonomous navigation deviation test results (v = 0.5 m/s).
16 pages, 13038 KiB  
Article
Underwater Gyros Denoising Net (UGDN): A Learning-Based Gyros Denoising Method for Underwater Navigation
by Chun Cao, Can Wang, Shaoping Zhao, Tingfeng Tan, Liang Zhao and Feihu Zhang
J. Mar. Sci. Eng. 2024, 12(10), 1874; https://doi.org/10.3390/jmse12101874 - 18 Oct 2024
Viewed by 551
Abstract
Autonomous Underwater Vehicles (AUVs) are widely used for hydrological monitoring, underwater exploration, and geological surveys. However, AUVs face limitations in underwater navigation due to the high cost of a Strapdown Inertial Navigation System (SINS) and Doppler Velocity Log (DVL), which hinders the development of low-cost vehicles. Micro-Electro-Mechanical System Inertial Measurement Units (MEMS IMUs) are widely used in industry because of their low cost and their ability to output acceleration and angular velocity, making them suitable as an Attitude Heading Reference System (AHRS) for low-cost vehicles. However, a poorly calibrated MEMS IMU provides inaccurate angular velocity, leading to rapid drift in orientation. In underwater environments where AUVs cannot use GPS for position correction, this drift can have severe consequences. To address this issue, this paper proposes the Underwater Gyros Denoising Net (UGDN), a method based on dilated convolutions and LSTM that learns the spatiotemporal features of IMU sequences and dynamically compensates the gyroscope's angular velocity measurements, reducing attitude and heading errors. In the experimental section, the method is deployed on a dataset collected in field trials and achieves significant results: the accuracy of MEMS IMU data denoised by UGDN approaches that of a fiber-optic SINS, and when integrated with DVL it can serve as a low-cost underwater navigation solution.
(This article belongs to the Special Issue Autonomous Marine Vehicle Operations—2nd Edition)
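A skeletal version of the dilated-convolution-plus-LSTM idea for gyroscope correction might look like the PyTorch module below, which maps a window of raw 6-axis IMU samples to per-step corrections added to the measured angular rates. The layer sizes, dilation rates, and additive-correction formulation are assumptions for illustration, not the UGDN architecture.

import torch
import torch.nn as nn

class GyroDenoiser(nn.Module):
    """Dilated Conv1d feature extractor + LSTM that predicts gyro corrections."""
    def __init__(self, in_ch=6, hidden=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, hidden, kernel_size=3, dilation=1, padding=1), nn.GELU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=2, padding=2), nn.GELU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=4, padding=4), nn.GELU(),
        )
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)          # correction for the 3 gyro axes

    def forward(self, imu):                       # imu: (batch, time, 6) = [gyro, accel]
        feats = self.conv(imu.transpose(1, 2))    # (batch, hidden, time)
        seq, _ = self.lstm(feats.transpose(1, 2)) # (batch, time, hidden)
        correction = self.head(seq)               # (batch, time, 3)
        return imu[..., :3] + correction          # denoised angular rates

model = GyroDenoiser()
denoised = model(torch.randn(2, 200, 6))          # two windows of 200 IMU samples
print(denoised.shape)                             # torch.Size([2, 200, 3])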
Figures: (1) overview of the proposed method, where the network denoises raw IMU data to compensate the gyroscope output in real time and the denoised orientation is integrated with DVL measurements for position estimation; (2) network architecture; (3) proposed dilated CNN-LSTM model structure; (4) data-collecting platform; (5) experimental environment, satellite photo and natural scene; (6) one dataset sequence starting from the dock and heading along the embankment into the canyon; (7) orientation estimation and SO(3) orientation error against ground truth for seq01, seq03, and seq08; (8) relative-error comparison with other methods for seq02, seq05, seq06, and seq09.
22 pages, 1654 KiB  
Article
A New Scene Sensing Model Based on Multi-Source Data from Smartphones
by Zhenke Ding, Zhongliang Deng, Enwen Hu, Bingxun Liu, Zhichao Zhang and Mingyang Ma
Sensors 2024, 24(20), 6669; https://doi.org/10.3390/s24206669 - 16 Oct 2024
Viewed by 493
Abstract
Smartphones with integrated sensors play an important role in people's lives, and in advanced multi-sensor fusion navigation systems, the use of individual sensor information is crucial. Because sensor reliability differs across environments, the weights assigned to each sensor differ as well, which in turn affects the method and results of multi-source fusion positioning. Based on multi-source data from smartphone sensors, this study explores five types of information, from the Global Navigation Satellite System (GNSS), Inertial Measurement Units (IMUs), cellular networks, optical sensors, and Wi-Fi sensors, characterizing the temporal, spatial, and statistical features of the data, and it constructs a multi-scale, multi-window, context-connected scene sensing model to accurately detect the environmental scene in indoor, semi-indoor, outdoor, and semi-outdoor spaces, thus providing a good basis for multi-sensor positioning in a multi-sensor navigation system. Detecting environmental scenes in this way provides an environmental basis for multi-sensor fusion localization. The model is divided into four main parts: multi-sensor-based data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network combined with contextual information, and a meta-heuristic optimization algorithm.
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
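The multi-scale CNN plus BiLSTM structure described here can be sketched as two parallel 1-D convolution branches with different kernel sizes over a sensor-feature sequence, concatenated and passed through a bidirectional LSTM and a four-class head. The feature dimensionality and layer sizes below are illustrative assumptions.

import torch
import torch.nn as nn

class SceneClassifier(nn.Module):
    """Two-scale Conv1d branches + BiLSTM for indoor/semi-indoor/semi-outdoor/outdoor."""
    def __init__(self, n_features=16, hidden=32, n_classes=4):
        super().__init__()
        self.branch_small = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.branch_large = nn.Conv1d(n_features, hidden, kernel_size=7, padding=3)
        self.lstm = nn.LSTM(2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, n_features)
        x = x.transpose(1, 2)                   # (batch, n_features, time)
        multi = torch.cat([torch.relu(self.branch_small(x)),
                           torch.relu(self.branch_large(x))], dim=1)
        seq, _ = self.lstm(multi.transpose(1, 2))
        return self.head(seq[:, -1])            # class logits from the last time step

logits = SceneClassifier()(torch.randn(4, 50, 16))   # 4 windows of 50 time steps
print(logits.shape)                                  # torch.Size([4, 4])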
Figures: (1) four scene classes, outdoor, semi-outdoor, semi-indoor, and indoor; (2) satellite zenith views for indoor rooms neighbouring windows and an open outdoor site; (3) indoor and outdoor DOP change graphs; (4) number and rate of change of visible satellites in different windows; (5) satellite signal quality, CNR and DCNR variation; (6) motion state versus acceleration; (7) indoor and outdoor Wi-Fi channel spectrum scans; (8) visible Wi-Fi AP number and signal-strength distributions; (9) 24 h indoor/outdoor light-intensity variation and number of base stations received; (10) algorithmic model for classifying complex indoor and outdoor scenes from spatio-temporal features; (11) Pearson correlation feature map; (12) two-scale convolutional neural network schematic; (13) BiLSTM network structure; (14) ablation experiment structure; (15) confusion matrices before and after WOA optimization; (16) accuracy comparison of different models; (17) accuracy comparison in different scenarios.
19 pages, 12228 KiB  
Article
Sky-GVIO: Enhanced GNSS/INS/Vision Navigation with FCN-Based Sky Segmentation in Urban Canyon
by Jingrong Wang, Bo Xu, Jingnan Liu, Kefu Gao and Shoujian Zhang
Remote Sens. 2024, 16(20), 3785; https://doi.org/10.3390/rs16203785 - 11 Oct 2024
Viewed by 1108
Abstract
Accurate, continuous, and reliable positioning is critical to achieving autonomous driving. However, in complex urban canyon environments, the vulnerability of stand-alone sensors and non-line-of-sight (NLOS) reception caused by high buildings, trees, and elevated structures seriously affect positioning results. To address these challenges, a sky-view image segmentation algorithm based on a fully convolutional network (FCN) is proposed for NLOS detection in global navigation satellite systems (GNSSs). Building upon this, a novel NLOS detection and mitigation algorithm (named S-NDM) is integrated into a tightly coupled GNSS, inertial measurement unit (IMU), and visual feature system, called Sky-GVIO, with the aim of achieving continuous and accurate positioning in urban canyon environments. Furthermore, the system combines single-point positioning (SPP) with real-time kinematic (RTK) methodologies to bolster its operational versatility and resilience. The positioning performance of the S-NDM algorithm is evaluated in urban canyons under different tightly coupled SPP-related and RTK-related models. The results show that the Sky-GVIO system achieves meter-level accuracy under the SPP mode and sub-decimeter precision with RTK positioning, surpassing GNSS/INS/Vision frameworks without S-NDM. Additionally, the sky-view image dataset, including training and evaluation subsets, has been made publicly accessible for scholarly exploration.
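A simplified version of the sky-mask NLOS check behind S-NDM: project each satellite's azimuth and elevation onto the sky-view image with an equidistant fisheye model and test whether the pixel falls inside the segmented sky region. The projection model, image geometry, and toy mask below are simplifying assumptions, not the paper's calibration.

import numpy as np

def satellite_to_pixel(az_deg, el_deg, cx, cy, radius):
    """Equidistant fisheye projection: zenith at the image centre, horizon at the rim."""
    r = radius * (90.0 - el_deg) / 90.0
    az = np.radians(az_deg)
    return int(round(cx + r * np.sin(az))), int(round(cy - r * np.cos(az)))

def classify_satellites(sky_mask, sats, cx=320, cy=240, radius=230):
    """sky_mask: binary image, 1 = open sky. sats: list of (prn, az_deg, el_deg)."""
    labels = {}
    for prn, az, el in sats:
        u, v = satellite_to_pixel(az, el, cx, cy, radius)
        in_image = 0 <= v < sky_mask.shape[0] and 0 <= u < sky_mask.shape[1]
        labels[prn] = "LOS" if in_image and sky_mask[v, u] else "NLOS"
    return labels

mask = np.zeros((480, 640), dtype=np.uint8)
mask[:240, :] = 1                                   # toy mask: upper half is open sky
print(classify_satellites(mask, [("G01", 10.0, 45.0), ("G07", 180.0, 20.0)]))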
Graphical abstract and figures: (1) system structure of the proposed Sky-GVIO; (2) FCN-based sky-view image segmentation algorithm; (3) overall flow of the S-NDM algorithm; (4) experimental hardware platform; (5) experimental route and scenes in the urban canyon, with NLOS (red) and LOS (blue) satellites marked on sky-view images; (6) number of LOS satellites and PDOP values of all satellites; (7) examples of semantically annotated sky-view images (raw and semantic); (8) segmentation results of Otsu, K-means, region growing, and the proposed method against ground truth, with mis-segmented regions highlighted; (9) detection visibility of LOS/NLOS satellites; (10) position errors of tightly coupled SPP/INS/Vision versus Sky-SPP/INS/Vision (SPP-related Sky-GVIO); (11) position errors of RTK/INS/Vision versus Sky-RTK/INS/Vision (RTK-related Sky-GVIO); (12) position errors of VINS-Mono in the E, N, and U directions; (13) position errors of GVINS in the E, N, and U directions.
30 pages, 2683 KiB  
Article
Seal Pipeline: Enhancing Dynamic Object Detection and Tracking for Autonomous Unmanned Surface Vehicles in Maritime Environments
by Mohamed Ahmed, Bader Rasheed, Hadi Salloum, Mostafa Hegazy, Mohammad Reza Bahrami and Mikhail Chuchkalov
Drones 2024, 8(10), 561; https://doi.org/10.3390/drones8100561 - 8 Oct 2024
Viewed by 593
Abstract
This study addresses the dynamic object detection problem for Unmanned Surface Vehicles (USVs) in marine environments, which is complicated by boat tilting and camera illumination sensitivity. A novel pipeline named "Seal" is proposed to enhance detection accuracy and reliability. The approach begins with a preprocessing stage that integrates data from the Inertial Measurement Unit (IMU) with LiDAR sensors to correct tilt-induced distortions in the LiDAR point cloud and to reduce ripple effects around objects. The adjusted data are then grouped using clustering algorithms and bounding boxes for precise object localization. Additionally, a Kalman filter tailored for maritime environments mitigates object discontinuities between successive frames and addresses the data sparsity caused by boat tilting. The methodology was evaluated in the VRX simulator, with experiments conducted on the Volga River using real USVs. Preprocessing effectiveness was assessed with the Root Mean Square Error (RMSE), and tracking accuracy was evaluated through detection-rate metrics. The results demonstrate a 25% to 30% improvement in detection accuracy and show that the pipeline remains useful even when objects are sparsely represented across frames. This study highlights the potential of integrating sensor fusion with specialized tracking for accurate dynamic object detection in maritime settings, establishing a new benchmark for the accuracy and reliability of USV navigation systems.
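The tilt-correction step in the preprocessing stage can be sketched as rotating each LiDAR point by the roll and pitch reported by the IMU, so the point cloud is expressed in a gravity-levelled frame before clustering. The rotation convention and axis order below are assumptions for illustration.

import numpy as np

def level_point_cloud(points, roll_rad, pitch_rad):
    """Rotate body-frame LiDAR points into a gravity-levelled frame using IMU roll/pitch."""
    cr, sr = np.cos(roll_rad), np.sin(roll_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about the x axis
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about the y axis
    R = Ry @ Rx                                             # body attitude (yaw ignored)
    return points @ R.T                                     # apply R to every point (N x 3)

cloud = np.random.default_rng(1).uniform(-10.0, 10.0, size=(1000, 3))
levelled = level_point_cloud(cloud, roll_rad=np.radians(5.0), pitch_rad=np.radians(-3.0))
print(levelled.shape)                                       # (1000, 3)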
Figures: (1) abstract overview of the Seal pipeline; (2) sensor-derived data, a LiDAR point cloud of a large vessel with noise from the boat's trailing wake and the RS-LiDAR-32 laser distribution pattern; (3) clustered objects in the boat's environment; (4) the boat's forward orientation versus the point cloud's backward orientation in the LiDAR frame coordinate system; (5) point cloud before and after preprocessing; (6) the USV used in the Volga River experiments, equipped with LiDAR, three FLIR cameras, radar, IMU, GNSS, and sonar; (7) the VRX environment in the Gazebo simulator with rviz2 point cloud visualization; (8, 9) object position errors after the preprocessing (stabilization) step relative to ground truth; (10) position errors after preprocessing and Kalman filter tracking.
19 pages, 2861 KiB  
Article
Autonomous Lunar Rover Localization while Fully Scanning a Bounded Obstacle-Rich Workspace
by Jonghoek Kim
Sensors 2024, 24(19), 6400; https://doi.org/10.3390/s24196400 - 2 Oct 2024
Viewed by 554
Abstract
This article addresses a scanning path-plan strategy for a team of three rovers exploring unknown, dark outer-space environments. In such environments, a rover must turn on its light and camera simultaneously to measure the limited space in front of it. The rover team is deployed from a symmetric base station, and its mission is to scan a bounded obstacle-rich workspace so that no detection hole remains. In the team, only one rover, the hauler, can localize itself using stereo cameras and an Inertial Measurement Unit (IMU); every other rover follows the hauler without localizing itself. Since a Global Navigation Satellite System (GNSS) is not available in outer space, the hauler's localization error grows over time. To correct its location estimate, the hauler occasionally homes to the base station, whose shape and global position are known in advance. Once the rover is near the station, it uses its Lidar to measure the relative position of the base station, thereby fixing its localization error at every homing. In this research, the rover team fully scans a bounded obstacle-rich workspace without detection holes, while the localization error is kept bounded by occasional homing to the base station. To the best of our knowledge, this article is novel in addressing a scanning path-plan strategy in which a rover team fully scans a bounded obstacle-rich workspace without detection holes while occasionally correcting the accumulated localization error. The efficacy of the proposed scanning and localization strategy is demonstrated using MATLAB-based simulations.
(This article belongs to the Special Issue Intelligent Control and Robotic Technologies in Path Planning)
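The "home occasionally to bound the error" logic can be pictured as a travel-budget rule: the hauler accumulates travelled distance since its last fix and, once that exceeds a threshold Δ, returns to the base station, where a Lidar measurement of the known station resets the estimate. The drift model, the threshold, and the idealized reset below are illustrative assumptions.

import numpy as np

DELTA = 100.0                  # travel budget between homing fixes (distance units)
DRIFT_PER_UNIT = 0.01          # assumed localization drift per unit distance travelled
rng = np.random.default_rng(7)

def step_and_maybe_home(est_pos, true_pos, step, travelled):
    """Advance one motion step with odometric drift; home and reset once the budget is spent."""
    true_pos = true_pos + step
    drift = rng.normal(0.0, DRIFT_PER_UNIT * np.linalg.norm(step), size=2)
    est_pos = est_pos + step + drift
    travelled += np.linalg.norm(step)
    if travelled > DELTA:
        # Near the station, Lidar measures the relative position of the known-shaped base;
        # here that fix is idealized as a perfect reset of the estimate.
        est_pos = true_pos.copy()
        travelled = 0.0
    return est_pos, true_pos, travelled

est, true, travelled = np.zeros(2), np.zeros(2), 0.0
for _ in range(300):
    est, true, travelled = step_and_maybe_home(est, true, np.array([1.0, 0.0]), travelled)
print(f"final localization error: {np.linalg.norm(est - true):.2f}")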
Figures: (1) the NASA Space Research Challenge rover team, scouter, hauler, and excavator; (2) the bounded obstacle-rich workspace with rocks as obstacles; (3) the symmetric base station whose global position and shape are known in advance; (4) the set I as η changes, with three edges for η = 1.1 and six edges for η = 3; (5) the hauler R forming a new guidance sensor, showing obstacle boundaries, the hauler's path, guidance-sensor footprints, the frontier of the most recent sensor, and two FrontierVertices; (6) ScanCirclePnts along the ScanCircle of radius r_s/√3, with D = 6 footprintPnts and a rectangular obstacle; (7, 8) Scenario 1 without localization error (Δ = ∞): the scanned workspace, the hauler's 2D coordinates over time, and its distance to the nearest obstacle boundary, which always exceeds rM = 0.5; (9, 10) Scenario 1 with localization error (Δ = 100): occasional homing to the base station, a considerably longer scanning time, and the localization error ||R(k) − R^p(k)|| over time; (11, 12) Scenario 1 with localization error under a random maneuver strategy, which fails to fully scan the workspace within 150 s; (13, 14) Scenario 2 with localization error (Δ = 100), with the corresponding coordinates, obstacle distance, and localization error; (15, 16) Scenario 2 with localization error under the random maneuver strategy, which again fails to fully scan the workspace.
26 pages, 13744 KiB  
Article
When-to-Loop: Enhanced Loop Closure for LiDAR SLAM in Urban Environments Based on SCAN CONTEXT
by Xu Xu, Lianwu Guan, Jianhui Zeng, Yunlong Sun, Yanbin Gao and Qiang Li
Micromachines 2024, 15(10), 1212; https://doi.org/10.3390/mi15101212 - 29 Sep 2024
Viewed by 2662
Abstract
Global Navigation Satellite Systems (GNSSs) frequently encounter challenges in providing reliable navigation and positioning within urban canyons due to signal obstruction. Micro-Electro-Mechanical System (MEMS) Inertial Measurement Units (IMUs) offer an alternative for autonomous navigation, but they are susceptible to accumulating errors. To mitigate these influences, a LiDAR-based Simultaneous Localization and Mapping (SLAM) system is often employed; however, such systems still face drift and error accumulation over time. This paper presents a novel approach to loop closure detection within LiDAR-based SLAM, focusing on the identification of previously visited locations to correct time-accumulated errors. Specifically, the proposed method leverages the vehicular drivable area and the IMU trajectory to identify significant environmental changes in keyframe selection, in contrast to conventional methods that rely only on distance or time intervals. Furthermore, the proposed method extends the SCAN CONTEXT algorithm by incorporating the overall distribution of point clouds within a region rather than relying solely on maximum height, establishing more robust loop closure constraints. Finally, the effectiveness of the proposed method is validated through experiments on the KITTI dataset, with accuracy improved by 6%, while local scenarios exhibit a further improvement of 17%, demonstrating improved robustness in loop closure detection for LiDAR-based SLAM.
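A bare-bones SCAN CONTEXT-style descriptor divides the horizontal plane around the sensor into rings and sectors and stores one statistic per bin; the original formulation keeps the maximum point height, whereas the extension described here uses the overall height distribution, approximated below by the bin mean. The bin counts and range are illustrative assumptions.

import numpy as np

def scan_context(points, n_rings=20, n_sectors=60, max_range=80.0):
    """Ring x sector descriptor of a LiDAR scan; each bin holds a height statistic."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rng_xy = np.hypot(x, y)
    keep = rng_xy < max_range
    ring = np.minimum((rng_xy[keep] / max_range * n_rings).astype(int), n_rings - 1)
    sector = ((np.arctan2(y[keep], x[keep]) + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    desc = np.zeros((n_rings, n_sectors))
    counts = np.zeros((n_rings, n_sectors))
    np.add.at(desc, (ring, sector), z[keep])     # accumulate heights per bin
    np.add.at(counts, (ring, sector), 1.0)
    return np.divide(desc, counts, out=np.zeros_like(desc), where=counts > 0)   # mean height

cloud = np.random.default_rng(3).uniform([-80, -80, -2], [80, 80, 6], size=(20000, 3))
print(scan_context(cloud).shape)                 # (20, 60)

Two scans can then be compared column-wise (with a circular shift over sectors to absorb heading changes) to decide whether a loop closure candidate should be accepted.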
Show Figures

Figure 1

Figure 1
<p>Illustration of Navigation Error Divergence over Time.</p>
Full article ">Figure 2
<p>Overview Workflow of Proposed Algorithm.</p>
Full article ">Figure 3
<p>Potential Scenarios for Vehicular Navigation in Urban Environments from KITTI Datasets.</p>
Full article ">Figure 4
<p>Ground and Obstacle Points in LiDAR Point Cloud and Image from Left Camera.</p>
Full article ">Figure 5
<p>Ground Points Extracted from Point Cloud <math display="inline"><semantics> <mrow> <mover accent="true"> <mrow> <mi>P</mi> </mrow> <mo>^</mo> </mover> </mrow> </semantics></math>.</p>
Full article ">Figure 6
<p>Point Cloud Processing for a Scene in Overhead Occlusion. (<b>a</b>) Front View Camera Image. (<b>b</b>) Original LiDAR Point Cloud Scan with Overhead Occlusion Highlighted in White. (<b>c</b>) Filtered Point Cloud with Distractions like Pedestrians Removed. (<b>d</b>) Extracted Ground Points.</p>
Full article ">Figure 7
<p>Extracted Ground and Obstacle Point Clouds in Different Scenarios. (<b>a</b>) Straight Road. (<b>b</b>) T-Junction on a Circular Road. (<b>c</b>) T-Junction. (<b>d</b>) Four-Way Intersection.</p>
Figure 7 Cont.">
Full article ">Figure 8
<p>SCAN CONTEXT Bins.</p>
Full article ">Figure 9
<p>Keyframe Determination Flowchart.</p>
Full article ">Figure 10
<p>MEMS Gyroscope and Accelerometer Data Variations During Experiment.</p>
Full article ">Figure 11
<p>Improved SCAN CONTEXT Point Cloud Compression.</p>
Full article ">Figure 12
<p>Camera Image, LiDAR Point Cloud, and Re-Encoding Result of the Crossroad. (<b>a</b>) View of the Crossroad from the Camera. (<b>b</b>) LiDAR Point Cloud of the Crossroad. (<b>c</b>) Encoding Result of the LiDAR Point Cloud.</p>
Full article ">Figure 13
<p>Use of Traditional MEMS/GNSS-Integrated Navigation Systems to Redefine LiDAR-Integrated Navigation Systems.</p>
Full article ">Figure 14
<p>Algorithm Overview of SINS-Based 3D LiDAR Tightly Integrated with SLAM.</p>
Full article ">Figure 15
<p>The sensor configuration on the platform.</p>
Full article ">Figure 16
<p>Camera Image and Point Clouds from KITTI Dataset 00.</p>
Full article ">Figure 17
<p>Comparison Results of KITTI Dataset 00.</p>
Full article ">Figure 18
<p>Experimental Hardware. (<b>a</b>) 3D LiDAR and GNSS Antenna. (<b>b</b>) GNSS Processing, Fiber Optic Gyroscope, and MEMS.</p>
Full article ">Figure 19
<p>Result of LiDAR SLAM in Data_1. (<b>a</b>) MEMS Data from Data_1. (<b>b</b>) Map Built by LOAM. (<b>c</b>) Map Built by SC-LeGO-LOAM. (<b>d</b>) Mapping Result of the Proposed Method. (<b>e</b>) Direct Comparison of the Algorithms. (<b>f</b>) Positioning Errors of the Algorithms in Meters.</p>
Figure 19 Cont.">
Full article ">Figure 20
<p>Camera Image and Point Clouds Near Parking Lot. (<b>a</b>) Camera Image. (<b>b</b>) Point Clouds.</p>
Full article ">Figure 21
<p>Result of LiDAR SLAM in Data_2. (<b>a</b>) MEMS Data from Data_2. (<b>b</b>) Map Built by LOAM. (<b>c</b>) Map Built by SC-LeGO-LOAM. (<b>d</b>) Mapping Result of the Proposed Method. (<b>e</b>) Direct Comparison of the Algorithms. (<b>f</b>) Positioning Errors of the Algorithms in Meters.</p>
Figure 21 Cont.">
Full article ">
17 pages, 1797 KiB  
Article
Central Difference Variational Filtering Based on Conjugate Gradient Method for Distributed Imaging Application
by Wen Ye, Fubo Zhang and Hongmei Chen
Remote Sens. 2024, 16(18), 3541; https://doi.org/10.3390/rs16183541 - 23 Sep 2024
Viewed by 504
Abstract
The airborne distributed position and orientation system (ADPOS), which integrates multiple inertial measurement units (IMUs), a data-processing computer, and a Global Navigation Satellite System (GNSS), serves as a key sensor in new higher-resolution airborne remote sensing applications, such as array SAR and multi-node imaging loads. ADPOS can provide reliable, high-precision, and high-frequency spatio-temporal reference information to realize multi-node motion compensation with various nonlinear filter estimation methods, such as Central Difference Kalman Filtering (CDKF) and modified CDKF. Although these nonlinear models demonstrate good performance, their noise estimation, which relies on a linear minimum variance estimation criterion, is limited for ADPOS. For this reason, this paper presents Central Difference Variational Filtering (CDVF) based on a variational optimization process. The method adopts the conjugate gradient algorithm to enhance the estimation performance of the mean correction in the filtering update stage. On one hand, the proposed method achieves adaptability by estimating the noise covariance through variational optimization; on the other hand, robustness is achieved under the minimum variance estimation criterion via the conjugate gradient algorithm, which suppresses measurement noise. A real ADPOS flight test shows that the accuracy of the slave motion parameters is significantly improved compared to the existing CDKF, and the compensation performance is clearly enhanced. Full article
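The abstract's core mechanism, embedding a conjugate gradient solve in the filter update, can be sketched as follows. This is not the authors' CDVF (the variational noise-covariance estimation is omitted); it is a minimal sketch, written in EKF-style notation with an assumed linear measurement matrix H, showing how every product involving the inverse innovation covariance can be obtained from conjugate-gradient solves instead of an explicit matrix inverse. The function names and signatures are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A without forming A^{-1}."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    p = r.copy()
    rs_old = float(r @ r)
    if rs_old == 0.0:
        return x
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / float(p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = float(r @ r)
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

def cg_measurement_update(x_pred, P_pred, z, H, R):
    """Kalman-style update where all S^{-1}(.) products come from CG solves."""
    S = H @ P_pred @ H.T + R                      # innovation covariance
    y = conjugate_gradient(S, z - H @ x_pred)     # y = S^{-1} (z - H x)
    x_upd = x_pred + P_pred @ H.T @ y             # mean correction
    HP = H @ P_pred
    # Columns of K^T solve S K^T[:, j] = (H P)[:, j], i.e. K = P H^T S^{-1}.
    Kt = np.column_stack([conjugate_gradient(S, HP[:, j]) for j in range(HP.shape[1])])
    P_upd = (np.eye(len(x_pred)) - Kt.T @ H) @ P_pred
    return x_upd, P_upd
```

Because S is symmetric positive definite, the conjugate gradient solve converges in at most as many iterations as the measurement dimension, which is attractive when many measurements are fused per update.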
(This article belongs to the Special Issue Array and Signal Processing for Radar)
Show Figures

Graphical abstract

Graphical abstract
Full article ">Figure 1
<p>The CDVF flowchart.</p>
Full article ">Figure 2
<p>Flight airplane.</p>
Full article ">Figure 3
<p>Flight experiment trajectory (the red lines represent the imaging area).</p>
Full article ">Figure 4
<p>Attitude estimation comparison.</p>
Full article ">Figure 5
<p>Velocity estimation comparison.</p>
Full article ">Figure 6
<p>Position estimation comparison.</p>
Full article ">Figure 7
<p>2D imaging.</p>
Full article ">Figure 8
<p>3D imaging after compensation.</p>
Full article ">
19 pages, 9520 KiB  
Article
Study of Global Navigation Satellite System Receivers’ Accuracy for Unmanned Vehicles
by Rosen Miletiev, Peter Z. Petkov, Rumen Yordanov and Tihomir Brusev
Sensors 2024, 24(18), 5909; https://doi.org/10.3390/s24185909 - 12 Sep 2024
Viewed by 612
Abstract
The development of unmanned ground vehicles and unmanned aerial vehicles requires high-precision navigation due to their autonomous motion and higher traffic intensity. Existing L1-band GNSS receivers are a good and inexpensive choice for smartphones, vehicle navigation, fleet management systems, etc., but their accuracy is not sufficient for many civilian purposes. At the same time, real-time kinematic (RTK) navigation allows position precision in the sub-centimeter range, but the system cost confines this kind of navigation to a very limited range of applications, such as geodesy. A practical solution is the integration of dual-band GNSS receivers and inertial sensors for high-precision navigation tasks, but GNSS position accuracy may significantly affect IMU performance because it has a strong impact on Kalman filter performance in unmanned vehicles. The estimation of dilution-of-precision (DOP) parameters is essential for filter performance, as the optimality of the filter estimate is closely connected to the quality of the a priori information about the noise covariance matrix and the measurement noise covariance. In this regard, the current paper analyzes the DOP parameters of the latest-generation dual-band GNSS receivers and compares the results with L1-only receivers. The study was carried out using two types of antennas, an L1/L5 band patch antenna and a wideband helix antenna, both designed and assembled by the authors. In addition, the study is extended with a comparison of GNSS receivers from different generations sold on the market by one of the world’s leading GNSS manufacturers. The analysis of the DOP parameters shows that the introduction of dual-band receivers, together with multi-constellation signal reception, may significantly increase navigation precision into the sub-meter range. The rapid advances in the performance of the CPUs integrated into GNSS receivers allow the number of correlators and tracked satellites to be increased from 8–10 to 24–30, which also significantly improves the position accuracy even of L1-band receivers. Full article
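For readers unfamiliar with DOP, the sketch below shows how the dilution-of-precision values analyzed in the paper follow from satellite geometry alone: line-of-sight unit vectors form the geometry matrix G, and the DOP values are read from the cofactor matrix (G^T G)^{-1}. This is a generic textbook computation, not code from the study, and the satellite coordinates are made-up illustrative values.

```python
import numpy as np

def dop_values(sat_enu):
    """DOP metrics from satellite positions in a local ENU frame,
    with the receiver assumed at the origin."""
    u = sat_enu / np.linalg.norm(sat_enu, axis=1, keepdims=True)  # unit line-of-sight vectors
    G = np.hstack([-u, np.ones((len(sat_enu), 1))])               # geometry matrix
    Q = np.linalg.inv(G.T @ G)                                    # DOP cofactor matrix
    return {
        "GDOP": float(np.sqrt(np.trace(Q))),
        "PDOP": float(np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])),
        "HDOP": float(np.sqrt(Q[0, 0] + Q[1, 1])),
        "VDOP": float(np.sqrt(Q[2, 2])),
        "TDOP": float(np.sqrt(Q[3, 3])),
    }

# Made-up constellation: six satellites spread in azimuth and elevation.
sats = np.array([
    [20e6,   1e6, 10e6],
    [-18e6,  5e6, 12e6],
    [3e6,   21e6,  9e6],
    [-4e6, -19e6, 11e6],
    [1e6,    2e6, 24e6],
    [12e6, -12e6, 15e6],
])
print(dop_values(sats))
```

Tracking more satellites (and hence a better-spread geometry) tends to shrink every entry of the cofactor matrix, which is why the jump from 8–10 to 24–30 tracked satellites noted above improves accuracy even for single-band receivers.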
(This article belongs to the Special Issue GNSS Signals and Precise Point Positioning)
Show Figures

Figure 1

Figure 1
<p>PRN codes in GPS system.</p>
Full article ">Figure 2
<p>Power spectrum of GPS codes.</p>
Full article ">Figure 3
<p>Schematic of the system design.</p>
Full article ">Figure 4
<p>Outline mechanical design of the GNSS helix antenna.</p>
Full article ">Figure 5
<p>Three-dimensional radiation pattern, RHCP @ 1.2 GHz.</p>
Full article ">Figure 6
<p>Three-dimensional radiation pattern, RHCP @ 1.575 GHz.</p>
Full article ">Figure 7
<p>Cross-polar properties of the helix antenna.</p>
Full article ">Figure 8
<p>Return loss of the antenna, matched to 75 Ohm port.</p>
Full article ">Figure 9
<p>Phase center position estimation from the ground plane @ 1.2 GHz.</p>
Full article ">Figure 10
<p>Phase center position estimation from the ground plane @ 1.575 GHz.</p>
Full article ">Figure 11
<p>Ceramic patch GNSS L1/L5 antenna.</p>
Full article ">Figure 12
<p>U-blox MAX M10S (patch and helix antenna, respectively).</p>
Full article ">Figure 13
<p>MinewSemi MS32SN1 receiver (patch and helix antenna, respectively).</p>
Full article ">Figure 14
<p>ATGM336H receiver (patch and helix antenna, respectively).</p>
Full article ">Figure 15
<p>U-blox MAX M8Q (patch and helix antenna, respectively).</p>
Full article ">Figure 16
<p>MinewSemi ME32GR01 receiver (patch and helix antenna, respectively).</p>
Figure 16 Cont.">
Full article ">Figure 17
<p>U-blox NEO F10T receiver (patch and helix antenna, respectively).</p>
Figure 17 Cont.">
Full article ">
17 pages, 28105 KiB  
Article
Application of IMU/GPS Integrated Navigation System Based on Adaptive Unscented Kalman Filter Algorithm in 3D Positioning of Forest Rescue Personnel
by Shengli Pang, Bohan Zhang, Jintian Lu, Ruoyu Pan, Honggang Wang, Zhe Wang and Shiji Xu
Sensors 2024, 24(18), 5873; https://doi.org/10.3390/s24185873 - 10 Sep 2024
Viewed by 740
Abstract
Utilizing reliable and accurate positioning and navigation systems is crucial for saving the lives of rescue personnel and accelerating rescue operations. However, Global Navigation Satellite Systems (GNSSs), such as GPS, may not provide stable signals in dense forests. Therefore, integrating multiple sensors like GPS and Inertial Measurement Units (IMUs) becomes essential to enhance the availability and accuracy of positioning systems. To accurately estimate rescuers’ positions, this paper employs the Adaptive Unscented Kalman Filter (AUKF) algorithm with measurement noise variance matrix adaptation, integrating IMU and GPS data alongside barometric altitude measurements for precise three-dimensional positioning in complex environments. The AUKF enhances estimation robustness through the adaptive adjustment of the measurement noise variance matrix, particularly excelling when GPS signals are interrupted. This study conducted tests on two-dimensional and three-dimensional road scenarios in forest environments, confirming that the AUKF-algorithm-based integrated navigation system outperforms the traditional Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF), and Adaptive Extended Kalman Filter (AEKF) in emergency rescue applications. The tests further evaluated the system’s navigation performance on rugged roads and during GPS signal interruptions. The results demonstrate that the system achieves higher positioning accuracy on rugged forest roads, notably reducing errors by 18.32% in the north direction, 8.51% in the up direction, and 3.85% in the east direction compared to the EKF. Furthermore, the system exhibits good adaptability during GPS signal interruptions, ensuring continuous and accurate personnel positioning during rescue operations. Full article
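The measurement-noise adaptation that distinguishes the AUKF from a plain UKF can be illustrated with a common innovation-based rule, sketched below. This is a minimal sketch of one generic adaptation law, written in EKF-style notation with an explicit measurement matrix H for brevity (a UKF would use the sigma-point predicted measurement covariance instead); the paper's exact adaptation law may differ, and the function name and forgetting factor are illustrative assumptions.

```python
import numpy as np

def adapt_measurement_noise(R_prev, innovation, H, P_pred, forgetting=0.98):
    """Innovation-based adaptation of the measurement noise covariance R.

    Blends the previous estimate with the latest innovation outer product
    minus the predicted innovation contribution H P H^T, then projects the
    result back onto the set of valid covariance matrices.
    """
    d = 1.0 - forgetting
    candidate = np.outer(innovation, innovation) - H @ P_pred @ H.T
    R_new = (1.0 - d) * R_prev + d * candidate
    R_new = 0.5 * (R_new + R_new.T)               # enforce symmetry
    w, V = np.linalg.eigh(R_new)                  # clip negative eigenvalues
    return V @ np.diag(np.clip(w, 1e-9, None)) @ V.T
```

The adapted R is fed back into the next measurement update, which is how such filters can down-weight GPS measurements when the signal degrades or drops out.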
(This article belongs to the Section Navigation and Positioning)
Show Figures

Figure 1

Figure 1
<p>Physical diagram of wearable device hardware.</p>
Full article ">Figure 2
<p>Data collection techniques in rescue operations: (<b>a</b>) location of rescue personnel wearable devices; (<b>b</b>) high-precision RTK equipment; (<b>c</b>) pose when data are collected using a high-precision RTK device.</p>
Full article ">Figure 3
<p>Comparison of 2D planar trajectories between GPS and IMU: (<b>a</b>) 2D trajectories; (<b>b</b>) actual 2D trajectories.</p>
Full article ">Figure 4
<p>Comparison of AUKF combined navigation, GPS, and IMU trajectories with ground truth: (<b>a</b>) global trajectory; (<b>b</b>) detailed local trajectory.</p>
Full article ">Figure 5
<p>Comparison of AUKF combined navigation, GPS, and IMU errors with ground truth: (<b>a</b>) east distance error; (<b>b</b>) north distance error.</p>
Full article ">Figure 6
<p>Comparison of 2D planar trajectories among AUKF integrated navigation, GPS, IMU, and ground truth for long distances: (<b>a</b>) full GPS signal; (<b>b</b>) partial GPS signal failure.</p>
Full article ">Figure 7
<p>Data collection by rescue personnel walking on rough terrain and roads with GPS signal interruptions: (<b>a</b>) rugged terrain, actual view; (<b>b</b>) road with GPS signal interruptions, actual view; (<b>c</b>) rugged terrain, satellite view; (<b>d</b>) road with partial GPS signal interruptions, satellite view.</p>
Full article ">Figure 8
<p>Comparison of 3D trajectories using AUKF, AEKF, UKF, EKF, and RTK on rugged terrain.</p>
Full article ">Figure 9
<p>Distance errors on rugged terrain: (<b>a</b>) eastward position error; (<b>b</b>) northward position error; (<b>c</b>) upward position error.</p>
Full article ">Figure 10
<p>Comparison of 3D trajectories using AUKF, AEKF, GPS, and RTK with partial GPS coverage.</p>
Full article ">Figure 11
<p>Distance errors with AUKF and AEKF during GPS outages: (<b>a</b>) eastward position error; (<b>b</b>) northward position error; (<b>c</b>) upward position error.</p>
Full article ">Figure 12
<p>Distance errors using AUKF and AEKF algorithms with and without GPS coverage: (<b>a</b>) Eastward position error; (<b>b</b>) Northward position error; (<b>c</b>) Upward position error.</p>
Full article ">