CN113790738A - Data compensation method based on intelligent cradle head IMU - Google Patents
- Publication number
- CN113790738A (application CN202110931666.9A)
- Authority
- CN
- China
- Prior art keywords
- imu
- data
- intelligent
- intelligent holder
- collector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
Abstract
The invention provides a data compensation method based on an intelligent cradle head IMU, comprising the following steps: calibrating the coordinates of the IMU and the collectors of the intelligent cradle head; determining the relative position of the IMU and each collector according to the calibrated coordinates; constructing a time synchronization relation between the IMU of the intelligent cradle head and each collector; and performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head. The method improves the precision and accuracy of the data acquired by the sensors and can compensate the data acquired by multiple kinds of sensors in real time.
Description
Technical Field
The invention relates to the technical field of sensor data compensation, and in particular to a data compensation method based on an intelligent cradle head IMU.
Background
An Inertial Measurement Unit (IMU) is a device that measures the three-axis attitude angles and acceleration of an object. During data acquisition, the intelligent cradle head carrying a sensor is affected by environmental factors such as vibration and easily develops angular and displacement deviations, which introduce errors and distortion into the data the sensor collects. The collected data therefore need correction and compensation, and the existing correction and compensation technologies mainly include the following:
First, correction and compensation of data acquired by a camera sensor, realizing IMU-based image stabilization: a time synchronization model is established between the IMU and the camera sensor, the relative pose is determined, and a mapping between the two is built; the IMU acquires real-time angle data or motion velocity and acceleration data, an instantaneous compensation transformation matrix is constructed, and this matrix is multiplied by the raw camera data to obtain corrected data.

Second, correction and compensation of data acquired by a lidar sensor, realizing IMU-based distortion compensation of the laser point cloud: a synchronization model is established between the IMU and the lidar, the corresponding transformation matrix is determined, a compensation transformation matrix is computed from the real-time IMU angular velocity and position coordinate data, and corrected data are obtained after multiplying it by the lidar data.

Third, correction and compensation of data acquired by a millimeter-wave radar sensor, realizing IMU-based compensation of the millimeter-wave radar: a synchronization model is established between the IMU and the millimeter-wave radar, the corresponding transformation matrix is determined, a compensation transformation matrix is computed from the real-time IMU angular velocity and position coordinate data, and corrected data are obtained after multiplying it by the radar data.
However, when existing sensor data are compensated with an IMU, the IMU itself exhibits random errors that accumulate over time, the IMU and sensor parameters must be calibrated before use, and the compensation transformation matrix obtained at each IMU instant differs, so the dimensionality is high and the demand on computing performance is heavy. In addition, each prior-art scheme compensates the data of only one sensor; the camera, lidar, and millimeter-wave radar are not compensated together in parallel. Moreover, existing data compensation technology is mainly applied to vehicle-mounted sensing equipment, while roadside sensing sensors work under different conditions and lack a corresponding compensation technology.
Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide a data compensation method based on an intelligent cradle head IMU that improves the precision and accuracy of data acquired by a sensor and compensates the data acquired by multiple kinds of sensors in real time and simultaneously.
In order to solve the problems, the technical scheme of the invention is as follows:
a data compensation method based on an intelligent cradle head IMU comprises the following steps:

calibrating the coordinates of the IMU and the collectors of the intelligent cradle head;

determining the relative position of the IMU and each collector according to the calibrated coordinates;

constructing a time synchronization relation between the IMU of the intelligent cradle head and each collector; and

performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head.
Optionally, the collectors include a camera, a lidar, and a millimeter-wave radar.
Optionally, the step of calibrating the coordinates of the IMU and the collectors of the intelligent cradle head specifically includes: for the camera, the two-dimensional point coordinate on the imaging plane is m_b = [u, v]^T and the corresponding three-dimensional point coordinate in space is M_b = [X, Y, Z]^T; the transformation matrix H is calculated using a checkerboard calibration method or a two-dimensional-code calibration method, and the corresponding intrinsic and extrinsic parameters are then derived.
Optionally, the step of determining the relative position of the IMU and the collectors according to the calibrated coordinates specifically includes: for the camera, the IMU parameter data and the camera intrinsic and extrinsic parameter data acquired under the same attitude are taken as coordinate data, the transformation matrix relating them is calculated, and the objective function is minimized by least squares to obtain the optimal compensation transformation matrix.
Optionally, the step of determining the relative position of the IMU and the collectors according to the calibrated coordinates specifically includes: for the lidar and the millimeter-wave radar, compensation transformation matrices between the corresponding velocity and acceleration are established, so that the positional relation between the radar coordinates at acquisition time and the coordinate point at the initial moment is determined.
Optionally, the step of constructing the time synchronization relation between the IMU of the intelligent cradle head and the collectors specifically includes: for the camera, the IMU of the intelligent cradle head and the camera have different sampling frequencies; the least common multiple of the two frequencies is taken as the output sampling frequency, and linear interpolation is used to upsample both the IMU stream and the camera stream to this common frequency, thereby synchronizing the acquired data.
Optionally, the step of constructing the time synchronization relation between the IMU of the intelligent cradle head and the collectors specifically includes: the time synchronization process between the IMU of the intelligent cradle head and the lidar or millimeter-wave radar is as follows: using the built-in timestamps of both devices, the difference between the timestamp of each lidar or radar frame and the timestamp of the nearest IMU sample is computed; if the difference is smaller than a threshold, the pair is output as synchronized data; otherwise the frame is discarded and the next sampled frame is taken as input to repeat the process.
Optionally, the step of performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head specifically includes: for the camera, each frame of raw sampled data is multiplied by the obtained compensation transformation matrix to produce corrected output data, and continuously outputting the corrected data yields a stabilized image.
Optionally, the step of performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head specifically includes: for the lidar and the millimeter-wave radar, the compensation transformation matrix obtained from the velocity and acceleration is multiplied by the raw data point coordinates to obtain the corrected data point coordinates.
Compared with the prior art, the invention compensates roadside sensor data in real time based on the attitude data of the intelligent cradle head IMU, improving the precision and accuracy of the collected data, and one intelligent cradle head IMU can compensate and fuse the data collected by multiple kinds of sensors such as a camera, a lidar, and a millimeter-wave radar.

In addition, the invention realizes real-time compensation of the collectors in a roadside working environment, meeting the requirements of intelligent networked equipment for attitude stability and data timeliness.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a flow chart of a data compensation method based on an intelligent cradle head IMU according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific embodiments. The following embodiments will help those skilled in the art to further understand the invention but do not limit it in any way. It should be noted that those skilled in the art can make various changes and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
Specifically, fig. 1 is a flow chart of a data compensation method based on an intelligent cradle head IMU according to an embodiment of the present invention. As shown in fig. 1, the method comprises the following steps:
s1: calibrating coordinates of an IMU and a collector of the intelligent holder;
specifically, in this embodiment, the collector includes a camera, a laser radar sensor, and a millimeter wave radar sensor, whereinFor a camera, the two-dimensional point coordinate of the imaging plane is mb=[u v]TThe coordinate of the three-dimensional point in the corresponding space is Mb=[X Y Z]The default camera is a camera model for pinhole imaging, the corresponding projection relationship is sm-HM, s is an arbitrary number, and H is a corresponding conversion matrix.
For the lidar sensor and the millimeter-wave radar sensor, the collected data are directly the distance and velocity of each point, so no intrinsic or extrinsic parameters are involved.
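The projection relation s·m = H·M above can be sketched numerically. The snippet below is a minimal illustration, not the patent's implementation: it assumes planar calibration points (a checkerboard plane, so M reduces to [X, Y, 1]^T) and a hypothetical matrix H, then recovers H from point correspondences with a direct linear transform, which is the core of checkerboard-style calibration.

```python
import numpy as np

def project(H, M):
    """Apply s*m = H*M: map a planar world point M = [X, Y, 1] to pixel m = [u, v]."""
    p = H @ M
    return p[:2] / p[2]  # dividing by p[2] removes the arbitrary scale s

def estimate_homography(world_pts, pixel_pts):
    """Direct linear transform: recover H (up to scale) from >= 4 correspondences."""
    rows = []
    for (X, Y), (u, v) in zip(world_pts, pixel_pts):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.array(rows, dtype=float))
    return Vt[-1].reshape(3, 3)  # null-space vector of the stacked constraints

# Hypothetical ground-truth H; the values are illustrative, not from the patent.
H_true = np.array([[800.0,   2.0, 320.0],
                   [  0.0, 810.0, 240.0],
                   [  0.1,   0.2,   1.0]])
world = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)]  # checkerboard corners on the Z = 0 plane
pixels = [project(H_true, np.array([X, Y, 1.0])) for X, Y in world]
H_est = estimate_homography(world, pixels)
H_est /= H_est[2, 2]  # fix the free scale so H_est matches H_true's normalization
```

With noise-free correspondences the recovered H matches the true one up to numerical precision; intrinsic and extrinsic parameters would then be decomposed from H.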
S2: determining the relative position of the IMU and each collector according to the calibrated coordinates;
Specifically, for the camera, the IMU parameter data {imu_i} and the camera intrinsic and extrinsic parameter data {cam_i} acquired under the same attitude are taken as coordinate data, and the corresponding transformation matrix R is computed such that cam = R·imu + t. Minimizing the objective function Σ_i ‖cam_i − (R·imu_i + t)‖² by least squares yields the optimal compensation transformation matrix R.
The calibration of the lidar and the millimeter-wave radar is similar to that of the camera, except that, in addition to the coordinate points, compensation transformation matrices between the corresponding velocity and acceleration must also be established, so that at data-acquisition time the positional relation between the radar coordinates and the coordinate point at the initial moment is determined.
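The least-squares fit of cam = R·imu + t described above can be sketched with the orthogonal Procrustes (Kabsch) solution. The rotation, translation, and point data below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def fit_rigid_transform(imu_pts, cam_pts):
    """Least-squares fit of cam = R @ imu + t (orthogonal Procrustes / Kabsch)."""
    imu_c = imu_pts - imu_pts.mean(axis=0)   # center both point sets
    cam_c = cam_pts - cam_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(imu_c.T @ cam_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cam_pts.mean(axis=0) - R @ imu_pts.mean(axis=0)
    return R, t

# Illustrative ground truth (assumed values): a rotation about z plus a translation.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
imu = np.random.default_rng(0).normal(size=(20, 3))   # synthetic IMU-frame points
cam = imu @ R_true.T + t_true                         # the same points in the camera frame
R_est, t_est = fit_rigid_transform(imu, cam)
```

The SVD of the centered cross-covariance gives the rotation minimizing Σ_i ‖cam_i − (R·imu_i + t)‖²; on noise-free data R_est and t_est equal the assumed ground truth.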
S3: constructing a time synchronization relation between the IMU of the intelligent cradle head and each collector;
Specifically, for the camera, the IMU of the intelligent cradle head and the camera have different sampling frequencies; the least common multiple of the two frequencies is taken as the output sampling frequency, and linear interpolation is used to upsample both the IMU stream and the camera stream to this common frequency, thereby synchronizing the acquired data.
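The least-common-multiple resampling step can be illustrated as follows; the 200 Hz IMU rate, the 30 Hz camera rate, and the angle values are assumed example figures, not numbers from the patent:

```python
import math

def lerp_stream(timestamps, values, query_times):
    """Linearly interpolate a sampled signal onto new query timestamps
    (query times are assumed to lie within [timestamps[0], timestamps[-1]])."""
    out = []
    for t in query_times:
        i = 0
        while i + 1 < len(timestamps) and timestamps[i + 1] < t:
            i += 1  # advance to the bracketing sample pair
        t0, t1 = timestamps[i], timestamps[i + 1]
        v0, v1 = values[i], values[i + 1]
        w = (t - t0) / (t1 - t0) if t1 != t0 else 0.0
        out.append(v0 + w * (v1 - v0))
    return out

# Hypothetical rates: IMU at 200 Hz, camera at 30 Hz -> output rate lcm(200, 30) = 600 Hz.
f_imu, f_cam = 200, 30
f_out = math.lcm(f_imu, f_cam)

# Upsample a short IMU angle stream (illustrative linear ramp) onto the 600 Hz grid.
imu_t = [0.0, 0.005, 0.010]             # 200 Hz sample times in seconds
imu_angle = [0.0, 1.0, 2.0]             # degrees
out_t = [k / f_out for k in range(7)]   # 600 Hz grid covering [0, 0.010]
imu_on_grid = lerp_stream(imu_t, imu_angle, out_t)
```

The camera stream would be interpolated onto the same 600 Hz grid, after which each output tick carries one sample from both devices.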
The time synchronization process between the IMU of the intelligent cradle head and the lidar or millimeter-wave radar is as follows: using the built-in timestamps of both devices, the difference between the timestamp of each lidar or radar frame and the timestamp of the nearest IMU sample is computed; if the difference is smaller than a threshold, the pair is output as synchronized data; otherwise the frame is discarded and the next sampled frame is taken as input to repeat the process.
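The timestamp-matching rule can be sketched as a small function; the millisecond timestamps and the 10 ms threshold below are hypothetical:

```python
def pair_with_nearest_imu(sensor_ts, imu_ts, threshold):
    """Pair each lidar/radar frame timestamp with the nearest IMU timestamp;
    keep the pair only if the gap is below the threshold, else drop the frame."""
    pairs = []
    for ts in sensor_ts:
        nearest = min(imu_ts, key=lambda t: abs(t - ts))
        if abs(nearest - ts) < threshold:
            pairs.append((ts, nearest))   # output as synchronized data
        # otherwise the frame is discarded and the next frame is processed
    return pairs

# Hypothetical timestamps in milliseconds (IMU ~200 Hz, lidar ~10 Hz) and a 10 ms threshold.
imu_ts = [0, 5, 10, 95, 100, 105, 230]
lidar_ts = [2, 101, 160]   # the 160 ms frame has no IMU sample within 10 ms
pairs = pair_with_nearest_imu(lidar_ts, imu_ts, threshold=10)
```

Here the 2 ms and 101 ms frames are matched (gaps of 2 ms and 1 ms), while the 160 ms frame is dropped because its nearest IMU sample is 55 ms away.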
S4: performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head.
Specifically, for the camera, each frame of raw sampled data is multiplied by the obtained compensation transformation matrix to produce corrected output data, and continuously outputting the corrected data yields a stabilized image.
For the lidar and the millimeter-wave radar, the compensation transformation matrix obtained from the velocity and acceleration is multiplied by the raw data point coordinates to obtain the corrected data point coordinates.
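Applying the compensation transformation to raw point coordinates can be sketched as below; the rigid form corrected = R·p + t and the specific angle and offset are illustrative assumptions, not the patent's values:

```python
import numpy as np

def compensate_points(points, R, t):
    """Apply the compensation transform to raw point coordinates:
    corrected_p = R @ p + t for every row p (rigid form, an assumed model)."""
    return points @ R.T + t

# Illustrative compensation: undo a small roll of 0.1 rad plus a 2 cm vertical offset.
theta = 0.1
R = np.array([[1.0, 0.0,            0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
t = np.array([0.0, 0.0, 0.02])
raw = np.array([[1.0, 0.0, 0.0],    # two raw lidar/radar points (made-up values)
                [0.0, 1.0, 0.0]])
corrected = compensate_points(raw, R, t)
```

Because R is orthogonal, the correction is invertible: subtracting t and multiplying by R recovers the raw points exactly.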
Compared with the prior art, the invention compensates roadside sensor data in real time based on the attitude data of the intelligent cradle head IMU, improving the precision and accuracy of the collected data, and one intelligent cradle head IMU can compensate and fuse the data collected by multiple kinds of sensors such as a camera, a lidar, and a millimeter-wave radar.

In addition, the invention realizes real-time compensation of the collectors in a roadside working environment, meeting the requirements of intelligent networked equipment for attitude stability and data timeliness.
The foregoing has described specific embodiments of the present invention. It is to be understood that the invention is not limited to the specific embodiments described above; those skilled in the art can make various changes or modifications within the scope of the claims without departing from the spirit of the invention. The embodiments of the present application, and the features within them, may be combined with one another arbitrarily provided there is no conflict.
Claims (9)
1. A data compensation method based on an intelligent cradle head IMU, characterized by comprising the following steps:
calibrating the coordinates of the IMU and the collectors of the intelligent cradle head;

determining the relative position of the IMU and each collector according to the calibrated coordinates;

constructing a time synchronization relation between the IMU of the intelligent cradle head and each collector; and

performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head.
2. The data compensation method based on an intelligent cradle head IMU according to claim 1, wherein the collectors comprise a camera, a lidar, and a millimeter-wave radar.
3. The data compensation method based on an intelligent cradle head IMU according to claim 2, wherein the step of calibrating the coordinates of the IMU and the collectors of the intelligent cradle head specifically comprises: for the camera, the two-dimensional point coordinate on the imaging plane is m_b = [u, v]^T and the corresponding three-dimensional point coordinate in space is M_b = [X, Y, Z]^T; the transformation matrix H is calculated using a checkerboard calibration method or a two-dimensional-code calibration method, and the corresponding intrinsic and extrinsic parameters are then derived.
4. The data compensation method based on an intelligent cradle head IMU according to claim 2, wherein the step of determining the relative position of the IMU and the collectors according to the calibrated coordinates specifically comprises: for the camera, the IMU parameter data and the camera intrinsic and extrinsic parameter data acquired under the same attitude are taken as coordinate data, the transformation matrix relating them is calculated, and the objective function is minimized by least squares to obtain the optimal compensation transformation matrix.
5. The data compensation method based on an intelligent cradle head IMU according to claim 2, wherein the step of determining the relative position of the IMU and the collectors according to the calibrated coordinates specifically comprises: for the lidar and the millimeter-wave radar, compensation transformation matrices between the corresponding velocity and acceleration are established, so that the positional relation between the radar coordinates at acquisition time and the coordinate point at the initial moment is determined.
6. The data compensation method based on an intelligent cradle head IMU according to claim 2, wherein the step of constructing the time synchronization relation between the IMU of the intelligent cradle head and the collectors specifically comprises: for the camera, the IMU of the intelligent cradle head and the camera have different sampling frequencies; the least common multiple of the two frequencies is taken as the output sampling frequency, and linear interpolation is used to upsample both the IMU stream and the camera stream to this common frequency, thereby synchronizing the acquired data.
7. The data compensation method based on an intelligent cradle head IMU according to claim 2, wherein the step of constructing the time synchronization relation between the IMU of the intelligent cradle head and the collectors specifically comprises: the time synchronization process between the IMU of the intelligent cradle head and the lidar or millimeter-wave radar is as follows: using the built-in timestamps of both devices, the difference between the timestamp of each lidar or radar frame and the timestamp of the nearest IMU sample is computed; if the difference is smaller than a threshold, the pair is output as synchronized data; otherwise the frame is discarded and the next sampled frame is taken as input to repeat the process.
8. The data compensation method based on an intelligent cradle head IMU according to claim 4, wherein the step of performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head specifically comprises: for the camera, each frame of raw sampled data is multiplied by the obtained compensation transformation matrix to produce corrected output data, and continuously outputting the corrected data yields a stabilized image.
9. The data compensation method based on an intelligent cradle head IMU according to claim 5, wherein the step of performing data compensation on the data acquired by the collectors based on the IMU of the intelligent cradle head specifically comprises: for the lidar and the millimeter-wave radar, the compensation transformation matrix obtained from the velocity and acceleration is multiplied by the raw data point coordinates to obtain the corrected data point coordinates.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110931666.9A CN113790738A (en) | 2021-08-13 | 2021-08-13 | Data compensation method based on intelligent cradle head IMU |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113790738A true CN113790738A (en) | 2021-12-14 |
Family
ID=79181798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110931666.9A Pending CN113790738A (en) | 2021-08-13 | 2021-08-13 | Data compensation method based on intelligent cradle head IMU |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113790738A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117518196A (en) * | 2023-12-19 | 2024-02-06 | 中联重科股份有限公司 | Motion compensation method, device, system, equipment and medium for laser radar |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106251305A (en) * | 2016-07-29 | 2016-12-21 | 长春理工大学 | A kind of realtime electronic image stabilizing method based on Inertial Measurement Unit IMU |
CN106289275A (en) * | 2015-06-23 | 2017-01-04 | 沃尔沃汽车公司 | For improving unit and the method for positioning precision |
CN109975792A (en) * | 2019-04-24 | 2019-07-05 | 福州大学 | Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion |
CN110221302A (en) * | 2019-05-24 | 2019-09-10 | 上海高智科技发展有限公司 | Environmental detection device and its modification method, system, portable equipment and storage medium |
CN110873883A (en) * | 2019-11-29 | 2020-03-10 | 上海有个机器人有限公司 | Positioning method, medium, terminal and device integrating laser radar and IMU |
CN111882612A (en) * | 2020-07-21 | 2020-11-03 | 武汉理工大学 | A vehicle multi-scale localization method based on 3D laser detection of lane lines |
WO2020233443A1 (en) * | 2019-05-21 | 2020-11-26 | 菜鸟智能物流控股有限公司 | Method and device for performing calibration between lidar and camera |
CN112147599A (en) * | 2019-06-28 | 2020-12-29 | 浙江大学 | A spline function-based external parameter calibration method for 3D lidar and inertial sensors in continuous time |
KR20210051030A (en) * | 2019-10-29 | 2021-05-10 | 에스케이텔레콤 주식회사 | Apparatus and method for correcting bias in sensor |
CN112859051A (en) * | 2021-01-11 | 2021-05-28 | 桂林电子科技大学 | Method for correcting laser radar point cloud motion distortion |
CN113155154A (en) * | 2021-04-07 | 2021-07-23 | 扬州大学 | Error correction method based on attitude and mileage of sensor and camera |
- 2021-08-13: Application CN202110931666.9A filed in China; published as CN113790738A; status Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||