CN114646993A - Data fusion algorithm for accurate positioning based on GNSS, vision and IMU - Google Patents
- Publication number
- CN114646993A (application CN202210256763.7A)
- Authority
- CN
- China
- Prior art keywords
- gnss
- data
- imu
- positioning
- local
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/35—Constructional details or hardware or software details of the signal processing chain
- G01S19/37—Hardware or software details of the signal processing chain
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
Abstract
The invention discloses a data fusion algorithm for accurate positioning based on GNSS, vision and IMU. GNSS positioning is easily limited by external observation conditions: in occluded areas such as viaducts and urban canyons, satellite signal reception weakens sharply and the requirements of positioning and navigation cannot be met. Pure visual position estimation cannot resolve scale, and feature matches cannot be established in weak-texture or fast-motion scenes, so tracking is easily lost; the IMU can reflect dynamic changes over short intervals (milliseconds), but its accumulated error grows over long runs (seconds). Exploiting the complementarity among the different sensor data sources, the invention uses the local data sources (vision and IMU) to improve the global positioning accuracy of the GNSS, uses the global GNSS data source to eliminate the local accumulated error, and relies on the local vision and IMU data to maintain short-term high-precision global positioning when satellite signals are blocked, improving the performance and robustness of the positioning and navigation system as a whole.
Description
Technical Field
The invention relates to the field of navigation positioning, in particular to a navigation positioning system based on GNSS, vision and IMU.
Background
High-precision, high-robustness positioning technology is vital to emerging high-technology fields such as national security, artificial intelligence and the Internet of Things. Continuous research and development in the industry has produced a number of different positioning systems, including the GNSS global navigation satellite system, the INS inertial navigation system, the VO visual odometer, the CNS astronomical navigation system, and others.
In recent years, with the rapid development of the location-service industry, a single positioning system can no longer meet users' demands for reliable positioning results. The GNSS global navigation satellite system can acquire a positioning track in the real world, but it is easily limited by external observation conditions: in occluded areas such as viaducts, urban canyons, underground garages and tunnels, satellite signal reception weakens sharply, so the positioning accuracy drops markedly or no positioning solution can be computed at all. The visual odometer performs pure visual position estimation from each camera frame, but the computed track differs from the real world by an unknown scale factor, and feature matches cannot be established in weak-texture or fast-motion scenes, so tracking is easily lost. The visual-inertial odometer estimates position with a camera and an IMU and is more accurate, but accumulated error degrades its precision continuously over long runs, and its positioning track carries a heading-angle deviation with respect to the real-world track. Given the limitations of single-sensor positioning methods and the complementarity of data sources across multiple sensors, high-precision, high-robustness multi-source fusion positioning has become a scientific problem in urgent need of a solution.
Disclosure of Invention
The invention aims to improve the positioning performance of a GNSS navigation positioning system and solve the problem of GNSS satellite signal loss in a complex scene.
The invention provides a data fusion algorithm for accurate positioning based on GNSS, vision and IMU, which comprises the following steps:
step S1: a GNSS satellite receiver, a monocular camera and an IMU module are mounted on a mobile device, and GNSS signals, image data and IMU data are collected during motion as the input of the whole positioning algorithm;
step S2: the local visual and IMU data are initialized and aligned: front-end visual tracking and SFM visual pose solving are performed on the image data, the IMU data are pre-integrated, and the two are aligned so as to roughly solve the visual scale, the gravitational acceleration and the initial system velocity at the initial moment;
step S3: nonlinear optimization is performed on the local data: for the key frames in a set sliding window, a visual cost function and an IMU cost function are constructed, and the positioning result of the local data is obtained by minimizing the total cost function;
step S4: loosely coupled joint initialization of the local and global data sources is performed: based on nonlinear optimization, the local positioning result is fused with the global GNSS positioning result from the SPP algorithm, and the global position of the system at the initial moment and the heading-angle deviation between the local and global coordinate systems are roughly solved;
step S5: the tightly coupled global cost function of the GNSS is constructed to constrain the positioning estimate: if the number of received satellites is 0 at this moment, the local positioning result is integrated directly; if the number of received satellites is greater than 0, a GNSS cost function is constructed from the raw GNSS satellite-signal data to constrain the positioning estimate;
step S6: the covariance matrix of the Doppler-shift cost function is adjusted according to the current motion speed of the system, changing the confidence the positioning estimate places in the Doppler-shift information;
step S7: local and global tightly coupled joint nonlinear optimization is performed: the final global positioning estimate of the system is solved by minimizing the total cost function consisting of the visual constraints, the IMU constraints and the GNSS raw-data constraints.
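The satellite-count branching of step S5 above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names (`FusionState`, `fuse_epoch`) and the tuple pose representation are assumptions, and the tightly coupled branch is stubbed as a mode label.

```python
import math
from dataclasses import dataclass

@dataclass
class FusionState:
    """Minimal running state for the step-S5 fallback logic (illustrative)."""
    heading_offset: float = 0.0            # theta between local and global frames
    global_pose: tuple = (0.0, 0.0, 0.0)

def fuse_epoch(state: FusionState, n_satellites: int, local_pose: tuple) -> str:
    """One epoch of the S5 branching: with zero visible satellites the local
    (vision/IMU) pose is rotated into the global frame by the stored heading
    offset; otherwise raw GNSS constraints would join the optimization
    (stubbed here as a mode label)."""
    if n_satellites == 0:
        c, s = math.cos(state.heading_offset), math.sin(state.heading_offset)
        x, y, z = local_pose
        state.global_pose = (c * x - s * y, s * x + c * y, z)
        return "local-only"
    return "tightly-coupled"
```

With zero satellites the local pose is simply rotated by the stored heading-angle deviation, which is how the description keeps a global estimate during complete signal loss.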
The invention has the beneficial effects that:
(1) The invention improves the positioning performance of the navigation and positioning system by fusing the global GNSS data source with the high-precision local vision and IMU data sources. Even in open outdoor scenes with GNSS reception, a positioning method that solves the receiver's unique coordinates in the global frame with the SPP algorithm alone is not accurate enough for many positioning and navigation scenarios.
(2) The proposed data fusion algorithm for accurate positioning based on GNSS, vision and IMU can seamlessly provide short-term high-precision global positioning estimates in scenes where GNSS satellite signals are completely lost. When the satellite signals are recaptured, the system's positioning mode switches seamlessly, the global GNSS information is used to eliminate the local accumulated error, and high-precision global positioning is obtained continuously.
(3) With the cost-function constraints built from raw GNSS data (pseudorange, Doppler shift, clock, etc.), the local accumulated error can be eliminated using global GNSS information even when the number of received satellites is greater than 0 but less than 4 (too few for a standalone fix), which benefits the positioning estimate.
Drawings
FIG. 1 is a flow chart of a data fusion algorithm for accurate positioning based on GNSS, vision and IMU.
FIG. 2 is a diagram illustrating the relationship between the system velocity and the Doppler confidence in the present invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
The invention discloses a data fusion algorithm for accurate positioning based on GNSS, vision and IMU. The flow chart of the specific embodiment of the invention is shown in figure 1:
step S1: a GNSS satellite-signal receiver, a monocular camera and an IMU module are mounted on the mobile device; GNSS signals at 10 Hz, image data at 30 fps and IMU data at 200 Hz are collected during motion and serve as the input of the whole positioning algorithm. The raw GNSS data comprise the ephemeris, pseudorange, Doppler shift, clock, etc.; the input data of the camera is a continuous sequence of image frames; and the input data of the IMU is a series of linear-acceleration α and rotational angular-velocity ω measurements of the object.
Step S2: initial alignment of the local visual and IMU data: the images are reduced to a series of sparse feature points by a front-end feature-point tracking algorithm, and pure visual pose estimation is performed based on SFM. Since the IMU samples at a higher frequency than the camera, its input data are pre-integrated so as to be aligned in time with each camera frame. The vision and IMU data are then jointly calibrated to roughly solve the system's initial velocity, the gravitational acceleration (which fixes the initial attitude), and the visual scale.
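The pre-integration mentioned above can be sketched as a simple Euler integrator that accumulates relative rotation, velocity and position between two camera frames. This is a hedged simplification: the IMU bias, noise terms and gravity handling of a full pre-integration formulation are omitted, and `preintegrate_imu` is an illustrative name, not the patent's code.

```python
import numpy as np

def preintegrate_imu(accels, gyros, dt):
    """Accumulate IMU samples between two camera frames into a relative
    rotation R, velocity change dv and position change dp, expressed in
    the body frame of the first sample. Euler integration; the bias and
    noise terms of a full pre-integration are intentionally omitted."""
    R = np.eye(3)          # relative rotation
    dv = np.zeros(3)       # relative velocity
    dp = np.zeros(3)       # relative position
    for a, w in zip(accels, gyros):
        dp = dp + dv * dt + 0.5 * (R @ a) * dt ** 2
        dv = dv + (R @ a) * dt
        # Small-angle first-order rotation update from angular velocity w.
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
    return R, dv, dp
```

Because these increments depend only on IMU samples, they can be computed once per frame pair and reused across optimization iterations, which is the point of pre-integrating the higher-rate IMU stream.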
Step S3: nonlinear optimization of the local data: a visual cost function and an IMU cost function are constructed for the key frames in the set sliding window. The visual cost function is built from the reprojection error obtained by bundle adjustment between two frames; the IMU cost function is built from the object's acceleration and angular velocity between the two frames and the IMU's zero bias.
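The visual cost term can be sketched as a pinhole reprojection residual; the function name and the (R, t, intrinsics) parametrization are illustrative assumptions, not the patent's exact formulation.

```python
import numpy as np

def reprojection_residual(p_world, R_cw, t_cw, uv_observed, fx, fy, cx, cy):
    """Project a 3-D landmark into a keyframe with a pinhole camera model
    and return the 2-D pixel error against the tracked feature location.
    Summing the squared norms of these residuals over the sliding window
    gives a bundle-adjustment-style visual cost."""
    p_cam = R_cw @ p_world + t_cw                 # world -> camera frame
    u = fx * p_cam[0] / p_cam[2] + cx             # perspective projection
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u - uv_observed[0], v - uv_observed[1]])
```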
Step S4: loosely coupled joint initialization of the local and global data sources: based on nonlinear optimization, the local positioning result and the global GNSS positioning result from the SPP algorithm are fused in a loosely coupled manner, and the global position of the system at the initial moment and the heading-angle deviation between the local and global coordinate systems are roughly solved, as shown in formula (1), where r_L is the cost function of the local data and r_G is the loosely coupled global-data cost function. The subscript k denotes the k-th frame, δx and δy denote the costs in the x and y directions, and x and y are the positioning coordinates to be optimized. The superscripts "local" and "global" denote the respective positioning results of the local and global data sources, and θ is the heading-angle deviation between the local and global coordinate systems, obtainable from the angular deviation between the local and global positioning tracks.
Step S5: constructing the global cost function of the GNSS to constrain the positioning estimate: if the number of received satellites is 0, the local positioning result is integrated by compensating for the heading-angle deviation between the local and global coordinate systems. If the number of received satellites is greater than 0, a GNSS cost function is constructed from the raw GNSS satellite-signal data, including pseudorange, Doppler-shift and clock information, to constrain the positioning estimate. The pseudorange represents the distance from the receiver to the satellite, and its cost function is given by equation (4).
where R_z(θ) denotes a rotation by angle θ about the z-axis of the ECI coordinate system, ω_E denotes the angular velocity of the Earth's rotation, t_f is the time of flight of the GNSS satellite signal, p denotes position coordinates, the superscript s denotes the satellite, the subscript r denotes the receiver, E denotes the ECEF coordinate system at the moment the receiver receives the signal, E' denotes the ECEF coordinate system at the moment the satellite sends the signal, a four-dimensional indicator vector (three entries 0, one entry 1) selects the particular satellite navigation system within the GNSS, δt denotes the clock bias, δt^s the clock error of the satellite, c the speed of light, T and I the tropospheric and ionospheric delays respectively, and ρ the pseudorange. The Doppler shift reflects the motion of the satellite relative to the receiver along the signal-propagation path, and its cost function is given by equation (5).
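A simplified pseudorange residual consistent with the terms listed above (geometric range, clock bias and error, tropospheric and ionospheric delays) can be sketched as follows. The Earth-rotation (Sagnac) correction R_z(ω_E t_f) and the per-constellation clock selection are deliberately omitted, so this is an illustrative reduction, not equation (4) itself.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def pseudorange_residual(p_sat, p_recv, dt_recv, dt_sat, trop, iono, rho_meas):
    """Predicted-minus-measured pseudorange: geometric range plus the
    receiver/satellite clock terms and atmospheric delays. Earth-rotation
    and constellation-specific clock terms are omitted for brevity."""
    geometric = np.linalg.norm(p_sat - p_recv)
    predicted = geometric + C * (dt_recv - dt_sat) + trop + iono
    return predicted - rho_meas
```

In the tightly coupled optimization, one such residual per visible satellite constrains the receiver position and clock states even when fewer than four satellites are in view.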
where v denotes velocity, λ the carrier-signal wavelength, κ the unit vector from the receiver to the satellite, δṫ^s the drift rate of the satellite clock error, f_D the Doppler shift frequency, and δṫ_r the receiver's clock drift rate. The receiver's process of resolving the satellite signal depends on its own clock bias and clock drift rate, so two further cost functions are constructed, as in equation (6).
Where δ t represents the clock bias, 1m×nA matrix representing m rows and n columns with all elements 1,representing the time interval of k frames from k-1 frames.
Step S6: the covariance matrix of the Doppler-shift cost function is adjusted according to the current motion speed of the system, changing the confidence the positioning estimate places in the Doppler-shift information. When the receiver moves at low speed, the signal-to-noise ratio of the Doppler shift decreases and its constraint accuracy degrades. Fig. 2 shows the relationship between system velocity and Doppler confidence in the present algorithm: when the velocity is below v_min, the confidence in the Doppler information is 0, i.e., the Doppler-shift information provided by the satellites is ignored entirely; between v_min and v_max the confidence grows linearly with velocity; and once the velocity rises above v_max, the confidence no longer changes. v_min and v_max can be adjusted according to the specific characteristics of the sensor suite and the test conditions.
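The piecewise-linear speed-to-confidence mapping described above and in Fig. 2 can be written directly; the threshold values below are placeholder assumptions, since the text says v_min and v_max are tuned per sensor suite.

```python
def doppler_confidence(speed, v_min=0.5, v_max=3.0):
    """Confidence placed in Doppler-shift information as a function of
    receiver speed: 0 below v_min, linear between v_min and v_max,
    saturated at 1 above v_max. Thresholds (m/s) are placeholders."""
    if speed <= v_min:
        return 0.0
    if speed >= v_max:
        return 1.0
    return (speed - v_min) / (v_max - v_min)
```

In step S6 this weight would scale the covariance of the Doppler cost term, so a near-stationary receiver contributes no Doppler constraint at all.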
Step S7: local and global tightly coupled joint nonlinear optimization: the final global positioning estimate of the system is solved by least squares, minimizing the total cost function consisting of the visual constraints, the IMU constraints and the GNSS raw-data constraints, as shown in equation (7), where the first term is the prior information produced by marginalization, r_B is the cost function of the IMU, r_C is the cost function of the camera, and the remaining term is the tightly coupled cost function of the GNSS. Whether to output the system's attitude and the visual point-cloud information is decided according to actual needs.
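The joint minimization of step S7 is typically handled by a sparse nonlinear least-squares solver in comparable systems; as a hedged toy illustration of the underlying iteration, a dense Gauss-Newton step over a stacked residual vector can look like this (`gauss_newton` and its numeric Jacobian are illustrative, not the patent's solver).

```python
import numpy as np

def gauss_newton(residual_fn, x0, iters=10):
    """Minimal dense Gauss-Newton loop: residual_fn stacks all cost terms
    (visual, IMU, GNSS) into one vector; each iteration solves the normal
    equations with a numerically differentiated Jacobian. Real systems use
    sparse analytic Jacobians and robust loss functions."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual_fn(x)
        J = np.zeros((r.size, x.size))   # numerical Jacobian, column by column
        eps = 1e-6
        for j in range(x.size):
            xp = x.copy()
            xp[j] += eps
            J[:, j] = (residual_fn(xp) - r) / eps
        x = x - np.linalg.solve(J.T @ J, J.T @ r)
    return x
```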
In summary, by following the seven steps above and fusing GNSS, visual and IMU information, the strengths of the different sensors complement one another: the positioning and navigation system achieves accurate positioning in open outdoor scenes, guarantees short-term high-precision global positioning estimates when satellite signals are lost, and gains overall performance and robustness.
Claims (7)
1. The data fusion algorithm for performing accurate positioning based on GNSS, vision and IMU is characterized by comprising the following steps:
step S1: a GNSS satellite receiver, a monocular camera and an IMU module are mounted on a mobile device, and GNSS signals, image data and IMU data are collected during motion as the input of the whole positioning algorithm;
step S2: the local visual and IMU data are initialized and aligned: front-end visual tracking and SFM visual pose solving are performed on the image data, the IMU data are pre-integrated, and the two are aligned so as to roughly solve the visual scale, the gravitational acceleration and the initial system velocity at the initial moment;
step S3: nonlinear optimization is performed on the local data: for the key frames in a set sliding window, a visual cost function and an IMU cost function are constructed, and the positioning result of the local data is obtained by minimizing the total cost function;
step S4: loosely coupled joint initialization of the local and global data sources is performed: based on nonlinear optimization, the local positioning result is fused with the global GNSS positioning result from the SPP algorithm, and the global position of the system at the initial moment and the heading-angle deviation between the local and global coordinate systems are roughly solved;
step S5: the tightly coupled global cost function of the GNSS is constructed to constrain the positioning estimate: if the number of received satellites is 0 at this moment, the local positioning result is integrated directly; if the number of received satellites is greater than 0, a GNSS cost function is constructed from the raw GNSS satellite-signal data to constrain the positioning estimate;
step S6: the covariance matrix of the Doppler-shift cost function is adjusted according to the current motion speed of the system, changing the confidence the positioning estimate places in the Doppler-shift information;
step S7: local and global tightly coupled joint nonlinear optimization is performed: the final global positioning estimate of the system is solved by minimizing the total cost function consisting of the visual constraints, the IMU constraints and the GNSS raw-data constraints.
2. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein in step S1 the input of the positioning algorithm comprises the raw data of the GNSS, the input data of the camera and the input data of the IMU; the raw GNSS data comprise the ephemeris, pseudorange, Doppler shift, clock, etc.; the input data of the camera is a continuous sequence of image frames; and the input data of the IMU is a series of linear-acceleration α and rotational angular-velocity ω measurements of the object.
3. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein in step S1, GNSS signals with a frequency of 10Hz, image data with a frame rate of 30fps and IMU data with a frequency of 200Hz are acquired as input to the whole positioning algorithm.
4. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein step S4 is based on a loosely coupled optimization method, as shown in equation (1), where r_L is the cost function of the local data and r_G is the loosely coupled global-data cost function; the subscript k denotes the k-th frame, δx and δy denote the costs in the x and y directions, and x and y are the positioning coordinates to be optimized; the superscripts "local" and "global" denote the respective positioning results of the local and global data sources, and θ is the heading-angle deviation between the local and global coordinate systems, obtained from the angular deviation between the local and global positioning tracks.
5. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein in step S5, if the number of received satellites at this moment is 0, the local positioning result is made global by compensating for the heading-angle deviation between the local and global coordinate systems; if the number of received satellites is greater than 0, a GNSS cost function is constructed from the raw GNSS satellite-signal data, including pseudorange, Doppler-shift and clock information, to constrain the positioning estimate.
6. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein in step S6, when the receiver moves at low speed the signal-to-noise ratio of the Doppler shift decreases, reducing its constraint accuracy; when the velocity is below v_min the confidence in the Doppler information is 0 and the Doppler-shift information provided by the satellites is ignored entirely; between v_min and v_max the confidence grows linearly with velocity; once the velocity rises above v_max the confidence no longer changes; and v_min and v_max are adjusted according to the specific characteristics of the sensor suite and the test conditions.
7. The GNSS, vision and IMU based data fusion algorithm for precise positioning according to claim 1, wherein step S7 is based on a tightly coupled nonlinear optimization method, as shown in equation (4), where the first term is the prior information produced by marginalization, r_B is the cost function of the IMU, r_C is the cost function of the camera, and the remaining term is the tightly coupled cost function of the GNSS; whether to output the system's attitude and the visual point-cloud information is decided according to actual needs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210256763.7A CN114646993A (en) | 2022-03-16 | 2022-03-16 | Data fusion algorithm for accurate positioning based on GNSS, vision and IMU |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210256763.7A CN114646993A (en) | 2022-03-16 | 2022-03-16 | Data fusion algorithm for accurate positioning based on GNSS, vision and IMU |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114646993A true CN114646993A (en) | 2022-06-21 |
Family
ID=81994295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210256763.7A Pending CN114646993A (en) | 2022-03-16 | 2022-03-16 | Data fusion algorithm for accurate positioning based on GNSS, vision and IMU |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114646993A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116381760A (en) * | 2023-06-05 | 2023-07-04 | 之江实验室 | GNSS RTK/INS tight coupling positioning method, device and medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116381760A (en) * | 2023-06-05 | 2023-07-04 | 之江实验室 | GNSS RTK/INS tight coupling positioning method, device and medium |
CN116381760B (en) * | 2023-06-05 | 2023-08-15 | 之江实验室 | GNSS RTK/INS tight coupling positioning method, device and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113203418B | GNSS/INS visual fusion positioning method and system based on sequential Kalman filtering | |
CN111947652A (en) | Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander | |
Sun et al. | An adaptive weighting strategy for multisensor integrated navigation in urban areas | |
CN109916394A (en) | Combined navigation algorithm fusing optical flow position and speed information | |
Li et al. | Multi-GNSS PPP/INS/Vision/LiDAR tightly integrated system for precise navigation in urban environments | |
CN110187375A (en) | A kind of method and device improving positioning accuracy based on SLAM positioning result | |
CN115523920B (en) | Seamless positioning method based on visual inertial GNSS tight coupling | |
CN114019552A (en) | Bayesian multi-sensor error constraint-based location reliability optimization method | |
CN113376669A (en) | Monocular VIO-GNSS fusion positioning algorithm based on dotted line characteristics | |
CN115388884A (en) | Joint initialization method for intelligent body pose estimator | |
CN113503872B (en) | Low-speed unmanned aerial vehicle positioning method based on fusion of camera and consumption-level IMU | |
CN108445519A (en) | A kind of pinpoint implementation methods of mobile terminal GPS | |
CN115435779A (en) | Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion | |
CN114690229A (en) | GPS-fused mobile robot visual inertial navigation method | |
CN114646993A (en) | Data fusion algorithm for accurate positioning based on GNSS, vision and IMU | |
Afia et al. | A low-cost gnss/imu/visual monoslam/wss integration based on federated kalman filtering for navigation in urban environments | |
CN117687059A (en) | Vehicle positioning method and device, electronic equipment and storage medium | |
CN116753948A (en) | Positioning method based on visual inertial GNSS PPP coupling | |
CN116105729A (en) | Multi-sensor fusion positioning method for reconnaissance of forest environment of field cave | |
CN116625359A (en) | Visual inertial positioning method and device for self-adaptive fusion of single-frequency RTK | |
Song et al. | R2-GVIO: A Robust, Real-Time GNSS-Visual-Inertial State Estimator in Urban Challenging Environments | |
Afia et al. | A GNSS/IMU/WSS/VSLAM hybridization using an extended kalman filter | |
Qian | Generic multisensor integration strategy and innovative error analysis for integrated navigation | |
Wu et al. | A Failure-Resistant, Lightweight, and Tightly Coupled GNSS/INS/Vision Vehicle Integration for Complex Urban Environments | |
CN118149854A (en) | Mileage metering method based on multi-sensor fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||