
CN114646993A - A data fusion algorithm for precise positioning based on GNSS, vision and IMU - Google Patents

A data fusion algorithm for precise positioning based on GNSS, vision and IMU

Info

Publication number
CN114646993A
CN114646993A
Authority
CN
China
Prior art keywords
gnss
data
imu
positioning
local
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210256763.7A
Other languages
Chinese (zh)
Inventor
王睿
侯治
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202210256763.7A priority Critical patent/CN114646993A/en
Publication of CN114646993A publication Critical patent/CN114646993A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/35Constructional details or hardware or software details of the signal processing chain
    • G01S19/37Hardware or software details of the signal processing chain
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention discloses a data fusion algorithm for precise positioning based on GNSS, vision and an IMU. GNSS positioning is easily limited by external observation conditions: in occluded areas such as viaducts and urban canyons, satellite signal reception weakens sharply and positioning and navigation requirements cannot be met. Purely visual position estimation cannot recover scale, and in weakly textured or fast-motion scenes it cannot establish feature correspondences and is prone to tracking loss. The IMU can reflect dynamic changes over short intervals (milliseconds), but its accumulated error grows over longer runs (seconds). Exploiting the complementarity of these data sources, the invention uses the local data sources (vision and IMU) to improve GNSS global positioning accuracy, uses the global GNSS data source to eliminate the local accumulated error, and relies on the local vision and IMU data to maintain short-term, high-precision global positioning when satellite signals are blocked, improving the performance and robustness of the positioning and navigation system as a whole.

Description

A data fusion algorithm for precise positioning based on GNSS, vision and IMU

Technical Field

The present invention relates to the field of navigation and positioning, and in particular to a navigation and positioning system based on GNSS, vision and an IMU.

Background

High-precision, highly robust positioning technology is crucial to high-tech fields such as national security, artificial intelligence and the Internet of Things. Continuous research and development in the industry has produced a variety of positioning systems, including global navigation satellite systems (GNSS), inertial navigation systems (INS), visual odometry (VO) and celestial navigation systems (CNS).

In recent years, with the rapid development of the location-based-services industry, a single positioning system can no longer satisfy users' demand for reliable positioning results. GNSS can provide positioning trajectories in the real-world (global) frame, but it is easily limited by external observation conditions: in occluded areas such as viaducts, urban canyons, underground garages and tunnels, satellite signal reception weakens sharply, so positioning accuracy degrades significantly or no position fix can be computed at all. Visual odometry estimates position purely from successive camera frames, but the resulting trajectory has an unknown scale relative to the real world, and in weakly textured or fast-motion scenes feature correspondences cannot be established, so tracking is easily lost. Visual-inertial odometry estimates position from a camera and an IMU with relatively high accuracy, but accumulated error degrades that accuracy over long runs, and its trajectory differs from the real-world trajectory by a heading-angle offset. Because of the limitations of single-sensor positioning methods and the complementarity of multi-sensor data sources, high-precision, highly robust multi-source fusion positioning has become a pressing scientific problem.

Summary of the Invention

The purpose of the present invention is to improve the positioning performance of GNSS navigation and positioning systems and to solve the problem of GNSS satellite signal loss in complex scenarios.

The present invention proposes a data fusion algorithm for precise positioning based on GNSS, vision and an IMU. The implementation comprises the following steps:

Step S1: Mount a GNSS satellite receiver, a monocular camera and an IMU module on the mobile device, and collect GNSS signals, image data and IMU data while moving, as the input of the entire positioning algorithm.

Step S2: Initialize and align the local vision and IMU data. Perform front-end visual tracking and SfM pose estimation on the image data, pre-integrate the IMU data, and then align the two so as to roughly estimate the visual scale, the gravitational acceleration and the initial velocity of the system at the initial instant.

Step S3: Perform nonlinear optimization on the local data. For the keyframes in a set sliding window, construct a visual cost function and an IMU cost function, and obtain the local positioning result by minimizing the overall cost function.

Step S4: Perform loosely coupled joint initialization of the local and global data sources. Based on nonlinear optimization, fuse the local positioning result with the GNSS global positioning result obtained under the SPP algorithm, and roughly estimate the global position of the system at the initial instant and the heading-angle offset between the local and global coordinate frames.

Step S5: Construct a tightly coupled GNSS global cost function to constrain the positioning estimate. If the number of satellites received at the current instant is zero, globalize the local positioning result directly; if it is greater than zero, use the raw GNSS satellite measurements to construct a GNSS cost function that constrains the positioning estimate.

Step S6: Adjust the covariance matrix of the Doppler-shift cost function according to the current motion speed of the system, changing how much the positioning estimate trusts the Doppler-shift information.

Step S7: Perform tightly coupled joint nonlinear optimization of local and global data. Solve for the final global positioning estimate of the system by minimizing the overall cost function composed of the visual constraints, the IMU constraints and the raw GNSS constraints.

Beneficial effects of the present invention:

(1) By fusing the global GNSS data source with the high-precision local vision and IMU data sources, the present invention improves the positioning performance of the navigation and positioning system. Even in outdoor scenes with GNSS reception, solving the receiver's unique coordinates in the global frame with the SPP algorithm alone is not accurate enough to meet the positioning and navigation requirements of many applications.

(2) The proposed data fusion algorithm based on GNSS, vision and an IMU allows the system to seamlessly provide short-term, high-precision global positioning estimates even in scenarios where the GNSS satellite signal is completely lost. When the satellite signal is reacquired, the system switches positioning modes seamlessly, uses the GNSS global information to eliminate the locally accumulated error, and continues to provide high-precision global positioning.

(3) With the proposed method of constructing cost functions from raw GNSS measurements (including the pseudorange, the Doppler shift and the clock) to constrain the positioning estimate, the GNSS global information can still be used to eliminate the locally accumulated error and benefit the positioning estimate even when the number of received satellites is greater than zero but fewer than four.

Description of Drawings

Figure 1 is a flowchart of the data fusion algorithm for precise positioning based on GNSS, vision and IMU.

Figure 2 is a schematic diagram of the relationship between system speed and Doppler confidence in the present invention.

Detailed Description

The present invention is described further below with reference to the accompanying drawings.

The invention discloses a data fusion algorithm for precise positioning based on GNSS, vision and an IMU. The flowchart of a specific embodiment of the invention is shown in Figure 1.

Step S1: Mount a GNSS satellite signal receiver, a monocular camera and an IMU module on the mobile device, and collect GNSS signals at 10 Hz, image data at 30 fps and IMU data at 200 Hz while moving, as the input of the entire positioning algorithm. The raw GNSS data include the ephemeris, the pseudorange, the Doppler shift, the clock and related information; the camera input is a continuous sequence of image frames; and the IMU input is a stream of linear acceleration α and angular velocity ω measurements of the platform.
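For illustration only, the three asynchronous input streams of step S1 could be represented as follows before fusion; the class and field names below are assumptions made for this sketch, not part of the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GnssEpoch:                                 # ~10 Hz
    t: float                                     # receiver timestamp [s]
    pseudorange: List[float]                     # one entry per tracked satellite [m]
    doppler: List[float]                         # Doppler shift per satellite [Hz]
    sat_pos: List[Tuple[float, float, float]]    # satellite ECEF positions from the ephemeris [m]
    sat_clock_err: List[float]                   # satellite clock errors [s]

@dataclass
class ImuSample:                                 # ~200 Hz
    t: float
    accel: Tuple[float, float, float]            # linear acceleration alpha [m/s^2]
    gyro: Tuple[float, float, float]             # angular velocity omega [rad/s]

@dataclass
class CameraFrame:                               # ~30 fps
    t: float
    keypoints: List[Tuple[float, float]]         # tracked feature points (u, v) in pixels
```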

Step S2: Initialize and align the local vision and IMU data. A front-end feature-point tracking algorithm summarizes each image as a set of sparse feature points, and a purely visual pose estimate is obtained with structure-from-motion (SfM). Since the IMU samples faster than the camera, the IMU measurements are pre-integrated and aligned in time with each camera frame. The vision and IMU data are then jointly calibrated to roughly estimate the system's initial velocity, the gravitational acceleration (which fixes the initial attitude) and the visual scale.
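As an illustration of the pre-integration mentioned above, the following sketch accumulates the IMU samples falling between two camera frames into a single relative-motion term. It is a simplified first-order integration that ignores gravity, bias and noise modelling, so it only approximates what a full implementation would do:

```python
import numpy as np

def preintegrate(imu_samples, R0=np.eye(3)):
    """Accumulate IMU samples between two camera frames.

    imu_samples: list of (dt, accel, gyro) tuples, accel in m/s^2, gyro in rad/s.
    Returns the accumulated rotation, velocity delta and position delta
    expressed in the frame of the first sample (gravity and biases ignored).
    """
    R, dv, dp = R0.copy(), np.zeros(3), np.zeros(3)
    for dt, accel, gyro in imu_samples:
        a = R @ np.asarray(accel)            # rotate acceleration into the start frame
        dp += dv * dt + 0.5 * a * dt ** 2    # position delta
        dv += a * dt                         # velocity delta
        wx, wy, wz = gyro                    # first-order rotation update from angular velocity
        skew = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
        R = R @ (np.eye(3) + skew * dt)
    return R, dv, dp
```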

Step S3: Perform nonlinear optimization on the local data. For the keyframes within the set sliding window, construct a visual cost function and an IMU cost function. The visual cost function is built from the reprojection errors between two frames obtained by bundle adjustment; the IMU cost function is built from the acceleration and angular velocity between the two frames and the IMU biases.
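A minimal sketch of a single visual (reprojection) residual of the kind described above, assuming an undistorted pinhole camera model; the function and parameter names are illustrative:

```python
import numpy as np

def reprojection_residual(landmark_w, R_cw, t_cw, observed_uv, fx, fy, cx, cy):
    """Residual between a landmark projected into a keyframe and its tracked pixel.

    landmark_w: 3-D landmark in the world frame; R_cw, t_cw: world-to-camera pose;
    observed_uv: tracked pixel; fx, fy, cx, cy: pinhole intrinsics.
    """
    p_c = R_cw @ np.asarray(landmark_w) + np.asarray(t_cw)   # landmark in the camera frame
    u = fx * p_c[0] / p_c[2] + cx                            # pinhole projection
    v = fy * p_c[1] / p_c[2] + cy
    return np.array([u, v]) - np.asarray(observed_uv)
```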

Step S4: Perform loosely coupled joint initialization of the local and global data sources. Based on nonlinear optimization, fuse the local positioning result with the GNSS global positioning result obtained under the SPP algorithm in a loosely coupled manner, and roughly estimate the global position of the system at the initial instant and the heading-angle offset between the local and global coordinate frames, as in formula (1). Formulas (1)-(3) appear only as images in the original document and are not reproduced here: formula (1) is the joint minimization, r_L in formula (2) is the cost function of the local data, and r_G in formula (3) is the loosely coupled global-data cost function.

In these formulas, the subscript k denotes the k-th frame, δx denotes the cost in the x direction, δy the cost in the y direction, and x and y are the positioning coordinates to be optimized. The superscripts local and global denote the positioning results of the local and global data sources, respectively, and θ is the heading-angle offset between the local and global coordinate frames, which can be obtained from the angular offset between the local and global positioning trajectories. A sketch of such a residual is given below.
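Since formulas (1)-(3) are available only as images, the following is a hedged reconstruction of what the per-frame loosely coupled residual could look like based on the symbol definitions above (global SPP position minus the yaw-aligned local position); it is not the patent's exact expression:

```python
import numpy as np

def loose_residual(xy_local, xy_global, theta):
    """Per-frame loosely coupled residual: the global SPP fix minus the
    local (VIO) position rotated by the heading offset theta, stacked as
    (delta_x, delta_y)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])                 # planar rotation by theta
    return np.asarray(xy_global) - R @ np.asarray(xy_local)
```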

Step S5: Construct the GNSS global cost function to constrain the positioning estimate. If the number of satellites received at the current instant is zero, globalize the local positioning result by compensating for the heading-angle offset between the local and global coordinate frames. If the number of satellites is greater than zero, use the raw GNSS satellite measurements, including the pseudorange, the Doppler shift and the clock information, to construct a GNSS cost function that constrains the positioning estimate. The pseudorange represents the distance between the receiver and the satellite, and its cost function is given by formula (4) (available only as an image in the original document).

In formula (4), R_z(θ) denotes the rotation matrix for a rotation by angle θ about the z-axis of the ECI frame, ω_E denotes the angular velocity of the Earth's rotation, t_f is the time of flight of the GNSS satellite signal, p denotes a position, the superscript s denotes the satellite and the subscript r the receiver, E denotes the ECEF frame at the instant the receiver receives the signal and e′ the ECEF frame at the instant the satellite transmits it, a four-dimensional indicator vector (three entries equal to 0 and one equal to 1) selects the particular satellite navigation system within GNSS, δt denotes the receiver clock bias, the satellite clock error also enters, c denotes the speed of light, and T and I denote the tropospheric and ionospheric delays, respectively; the measured pseudorange closes the residual. The Doppler shift reflects the motion of the satellite relative to the receiver along the signal propagation path, and its cost function is given by formula (5). A simplified version of the pseudorange residual is sketched below.
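A simplified pseudorange residual in the spirit of formula (4), omitting the Earth-rotation (Sagnac) correction and the per-constellation selection vector for brevity; a sketch under those assumptions, not the patent's exact model:

```python
import numpy as np

C_LIGHT = 299_792_458.0  # speed of light [m/s]

def pseudorange_residual(p_r, p_s, dt_r, dt_s, trop, iono, measured_pr):
    """Predicted minus measured pseudorange for one satellite.

    p_r, p_s: receiver and satellite ECEF positions [m];
    dt_r, dt_s: receiver clock bias and satellite clock error [s];
    trop, iono: tropospheric and ionospheric delays [m].
    """
    geometric = np.linalg.norm(np.asarray(p_s) - np.asarray(p_r))
    predicted = geometric + C_LIGHT * (dt_r - dt_s) + trop + iono
    return predicted - measured_pr
```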

Formula (5) is likewise available only as an image in the original document. In it, v denotes velocity, λ the carrier wavelength and κ the unit vector from the receiver to the satellite; the remaining symbols are the satellite clock error drift rate, the Doppler shift and the receiver clock drift rate. The way the receiver resolves the satellite signal is tied to the receiver clock bias and clock drift rate, which give rise to two further cost functions, shown in formula (6). A simplified Doppler residual is sketched below.
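Similarly, a simplified Doppler residual corresponding to formula (5): the relative velocity is projected onto the receiver-to-satellite line of sight and converted to a frequency shift through the carrier wavelength. The sign convention and the handling of clock drift here are assumptions made for illustration:

```python
import numpy as np

C_LIGHT = 299_792_458.0  # speed of light [m/s]

def doppler_residual(v_r, v_s, p_r, p_s, clk_drift_r, clk_drift_s,
                     wavelength, measured_doppler):
    """Predicted minus measured Doppler shift for one satellite.

    v_r, v_s: receiver and satellite velocities [m/s]; p_r, p_s: positions [m];
    clk_drift_r, clk_drift_s: receiver and satellite clock drift rates [s/s];
    wavelength: carrier wavelength [m].
    """
    kappa = np.asarray(p_s) - np.asarray(p_r)
    kappa = kappa / np.linalg.norm(kappa)                       # unit line-of-sight vector
    range_rate = np.dot(np.asarray(v_s) - np.asarray(v_r), kappa)
    predicted = -(range_rate + C_LIGHT * (clk_drift_r - clk_drift_s)) / wavelength
    return predicted - measured_doppler
```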

Formula (6) consists of the two clock constraints and is available only as an image in the original document. In it, δt denotes the receiver clock bias, 1_{m×n} denotes an m-by-n matrix whose entries are all 1, and the remaining symbol denotes the time interval between frame k and frame k-1. A plausible form of these constraints is sketched below.
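The two clock constraints of formula (6) tie the receiver clock bias and clock drift rate of consecutive frames together. Since the formula itself is only available as an image, the constant-drift form below is an assumption used purely for illustration:

```python
def clock_residuals(dt_k, dt_km1, drift_k, drift_km1, interval):
    """Two residuals linking frames k-1 and k:
    (a) the clock bias propagated by the drift over the frame interval,
    (b) the drift assumed (nearly) constant between frames."""
    r_bias = dt_k - (dt_km1 + drift_km1 * interval)
    r_drift = drift_k - drift_km1
    return r_bias, r_drift
```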

Step S6: Adjust the covariance matrix of the Doppler-shift cost function according to the current motion speed of the system, changing how much the positioning estimate trusts the Doppler-shift information. When the receiver moves at low speed, the signal-to-noise ratio of the Doppler shift drops and the accuracy of the corresponding constraint degrades. Figure 2 shows the relationship between system speed and Doppler confidence in this algorithm. When the speed is below v_min, the confidence in the Doppler information is 0, i.e. the Doppler-shift information provided by the satellites is ignored entirely. Between v_min and v_max the confidence grows linearly with speed, and above v_max it no longer changes. v_min and v_max can be adjusted according to the characteristics of the particular sensor suite and the test conditions.
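The speed-dependent Doppler confidence of Figure 2 can be written as a simple piecewise-linear weight; the resulting value can then be used to inflate the Doppler cost's covariance when confidence is low. v_min and v_max are the tuning parameters mentioned above, and the numeric values in the usage comment are purely illustrative:

```python
def doppler_confidence(speed, v_min, v_max):
    """Confidence in the Doppler constraint as a function of receiver speed:
    0 below v_min, linear between v_min and v_max, constant (1.0) above v_max."""
    if speed <= v_min:
        return 0.0
    if speed >= v_max:
        return 1.0
    return (speed - v_min) / (v_max - v_min)

# Example (illustrative values): inflate the Doppler covariance when confidence is low.
# cov_used = base_cov / max(doppler_confidence(current_speed, 0.5, 3.0), 1e-3)
```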

Step S7: Perform tightly coupled joint nonlinear optimization of local and global data. Using least squares, solve for the final global positioning estimate of the system by minimizing the overall cost function composed of the visual constraints, the IMU constraints and the raw GNSS constraints, as in formula (7) (available only as an image in the original document).

In formula (7), the first term is the prior information produced by marginalization, r_B is the IMU cost function, r_C is the camera cost function, and the remaining term is the tightly coupled GNSS cost function. Depending on actual needs, the system attitude and the visual point cloud information can also be output. A schematic of how the residual blocks can be stacked into one least-squares problem is given below.
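Finally, a schematic of how the residual blocks above could be stacked into a single least-squares problem for step S7, using SciPy as a stand-in for whatever solver the actual implementation uses; this is illustrative only:

```python
import numpy as np
from scipy.optimize import least_squares

def total_residual(x, visual_blocks, imu_blocks, gnss_blocks):
    """Stack visual, IMU and raw-GNSS residuals for the current sliding window.

    Each *_blocks entry is a callable taking the state vector x and returning
    a 1-D residual array, already whitened by its covariance."""
    parts = [blk(x) for blk in visual_blocks + imu_blocks + gnss_blocks]
    return np.concatenate(parts) if parts else np.zeros(0)

def solve_window(x0, visual_blocks, imu_blocks, gnss_blocks):
    """Minimize the stacked cost and return the optimized window state."""
    result = least_squares(total_residual, x0,
                           args=(visual_blocks, imu_blocks, gnss_blocks))
    return result.x
```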

In summary, by fusing GNSS, vision and IMU information according to the seven steps above, the strengths of the different sensors complement one another: the positioning and navigation system achieves accurate positioning in open outdoor scenes and maintains short-term, high-precision global positioning estimates in scenes where the satellite signal is lost, improving the performance and robustness of the positioning and navigation system as a whole.

Claims (7)

1. A data fusion algorithm for precise positioning based on GNSS, vision and an IMU, characterized in that it comprises the following steps:
Step S1: mounting a GNSS satellite receiver, a monocular camera and an IMU module on the mobile device, and collecting GNSS signals, image data and IMU data while moving, as the input of the entire positioning algorithm;
Step S2: initializing and aligning the local vision and IMU data: performing front-end visual tracking and SfM pose estimation on the image data, pre-integrating the IMU data, and then aligning the two so as to roughly estimate the visual scale, the gravitational acceleration and the initial velocity of the system at the initial instant;
Step S3: performing nonlinear optimization on the local data: for the keyframes in a set sliding window, constructing a visual cost function and an IMU cost function, and obtaining the local positioning result by minimizing the overall cost function;
Step S4: performing loosely coupled joint initialization of the local and global data sources: based on nonlinear optimization, fusing the local positioning result with the GNSS global positioning result obtained under the SPP algorithm, and roughly estimating the global position of the system at the initial instant and the heading-angle offset between the local and global coordinate frames;
Step S5: constructing a tightly coupled GNSS global cost function to constrain the positioning estimate: if the number of satellites received at the current instant is zero, globalizing the local positioning result directly; if it is greater than zero, using the raw GNSS satellite measurements to construct a GNSS cost function that constrains the positioning estimate;
Step S6: adjusting the covariance matrix of the Doppler-shift cost function according to the current motion speed of the system, changing how much the positioning estimate trusts the Doppler-shift information;
Step S7: performing tightly coupled joint nonlinear optimization of local and global data: solving for the final global positioning estimate of the system by minimizing the overall cost function composed of the visual constraints, the IMU constraints and the raw GNSS constraints.

2. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that, in step S1, the input of the positioning algorithm comprises the raw GNSS data, the camera input data and the IMU input data; the raw GNSS data include the ephemeris, the pseudorange, the Doppler shift, the clock and related information; the camera input data are a continuous sequence of image frames; and the IMU input data are a stream of linear acceleration α and angular velocity ω measurements of the platform.

3. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that, in step S1, GNSS signals are collected at 10 Hz, image data at a frame rate of 30 fps and IMU data at 200 Hz, as the input of the entire positioning algorithm.

4. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that step S4 is based on the loosely coupled optimization of formula (1), in which r_L is the cost function of the local data, given by formula (2), and r_G is the loosely coupled global-data cost function, given by formula (3) (formulas (1)-(3) appear only as images in the original document); the subscript k denotes the k-th frame, δx and δy denote the costs in the x and y directions, x and y are the positioning coordinates to be optimized, the superscripts local and global denote the positioning results of the local and global data sources, respectively, and θ is the heading-angle offset between the local and global coordinate frames, obtained from the angular offset between the local and global positioning trajectories.

5. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that, in step S5, if the number of satellites received at the current instant is zero, the local positioning result is globalized by compensating for the heading-angle offset between the local and global coordinate frames; if it is greater than zero, the raw GNSS satellite measurements, including the pseudorange, the Doppler shift and the clock information, are used to construct a GNSS cost function that constrains the positioning estimate.

6. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that, in step S6, when the receiver moves at low speed the signal-to-noise ratio of the Doppler shift drops and the accuracy of the corresponding constraint degrades; when the speed is below v_min the confidence in the Doppler information is 0, i.e. the Doppler-shift information provided by the satellites is not considered at all; between v_min and v_max the confidence grows linearly with speed, and above v_max it no longer changes; v_min and v_max are adjusted according to the characteristics of the particular sensor suite and the test conditions.

7. The data fusion algorithm for precise positioning based on GNSS, vision and an IMU of claim 1, characterized in that step S7 is based on the tightly coupled nonlinear optimization of formula (4) of the claims (shown only as an image in the original document), in which the first term is the prior information produced by marginalization, r_B is the IMU cost function, r_C is the camera cost function, and the remaining term is the tightly coupled GNSS cost function; depending on actual needs, the system attitude and the visual point cloud information may also be output.
CN202210256763.7A 2022-03-16 2022-03-16 A data fusion algorithm for precise positioning based on GNSS, vision and IMU Pending CN114646993A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210256763.7A CN114646993A (en) 2022-03-16 2022-03-16 A data fusion algorithm for precise positioning based on GNSS, vision and IMU

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210256763.7A CN114646993A (en) 2022-03-16 2022-03-16 A data fusion algorithm for precise positioning based on GNSS, vision and IMU

Publications (1)

Publication Number Publication Date
CN114646993A (en) 2022-06-21

Family

ID=81994295

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210256763.7A Pending CN114646993A (en) 2022-03-16 2022-03-16 A data fusion algorithm for precise positioning based on GNSS, vision and IMU

Country Status (1)

Country Link
CN (1) CN114646993A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116381760A (en) * 2023-06-05 2023-07-04 之江实验室 GNSS RTK/INS tightly coupled positioning method, device and medium
CN116381760B (en) * 2023-06-05 2023-08-15 之江实验室 GNSS RTK/INS tightly coupled positioning method, device and medium

Similar Documents

Publication Publication Date Title
EP3566021B1 (en) Systems and methods for using a global positioning system velocity in visual-inertial odometry
CN111880207B (en) Visual inertial satellite tight coupling positioning method based on wavelet neural network
US20080195316A1 (en) System and method for motion estimation using vision sensors
CN101344391A (en) Autonomous determination method of lunar rover position and orientation based on full-function solar compass
CN115523920B (en) A seamless positioning method based on visual-inertial GNSS tight coupling
CN109506660B (en) Attitude optimization resolving method for bionic navigation
CN115388884A (en) A joint initialization method for agent pose estimators
CN115435779A (en) A Pose Estimation Method for Agents Based on GNSS/IMU/Optical Flow Information Fusion
Song et al. R2-GVIO: A robust, real-time GNSS-visual-inertial state estimator in urban challenging environments
Wen et al. 3D LiDAR aided GNSS real-time kinematic positioning
CN114646993A (en) A data fusion algorithm for precise positioning based on GNSS, vision and IMU
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
Gao et al. A fast and stable GNSS-LiDAR-inertial state estimator from coarse to fine by iterated error-state Kalman filter
Afia et al. A low-cost gnss/imu/visual monoslam/wss integration based on federated kalman filtering for navigation in urban environments
Liu et al. 3D LiDAR aided GNSS real-time kinematic positioning via coarse-to-fine batch optimization for high accuracy mapping in dense urban canyons
Li et al. Accuracy-and simplicity-oriented self-calibration approach for in-vehicle GNSS/INS/vision system with observability analysis
CN116105729A (en) A multi-sensor fusion positioning method for wild cave forest environmental reconnaissance
CN117687059A (en) Vehicle positioning method and device, electronic equipment and storage medium
CN115542363B (en) Attitude measurement method suitable for vertical downward-looking aviation pod
Hou et al. A Loosely-Coupled GNSS-Visual-Inertial Fusion for State Estimation Based On Optimation
CN116753948A (en) A positioning method based on visual-inertial GNSS PPP coupling
ZHANG et al. RRVPE: a robust and real-time visual-inertial-GNSS pose estimator for aerial robot navigation
Luo et al. An imu/visual odometry integrated navigation method based on measurement model optimization
Tsao Observability Analysis and Performance Evaluation for a Graph-Based GNSS–Visual–Inertial Odometry on Matrix Lie Groups
Ronen et al. Development challenges and performance analysis of drone visual/inertial slam in a global reference system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination