CN110345937A - Navigation pose-determination and positioning method and system based on two-dimensional codes
- Publication number: CN110345937A
- Application number: CN201910733720.1A
- Authority: CN (China)
- Prior art keywords: two-dimensional code; navigation; attitude; monocular camera
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
- G01B11/00 — Measuring arrangements characterised by the use of optical techniques
- G01B11/002 — Optical measuring arrangements for measuring two or more coordinates
- G01C1/00 — Measuring angles
- G01C11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
- G01C21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
- G06K17/0022 — Arrangements for transferring data from a sensing device to distant stations
- G06K17/0025 — Wireless interrogation device combined with a device for optically marking the record carrier
Abstract
The invention discloses a navigation pose-determination and positioning method and system based on two-dimensional codes. The method comprises the following steps: (1) a plurality of two-dimensional codes are arranged; (2) a monocular camera photographs a two-dimensional code; (3) inverse perspective mapping (IPM) is applied to the photographed two-dimensional code; (4) the position, attitude angle and geometric-size information of the two-dimensional code in the world coordinate system are obtained, and the position and attitude angle of the monocular camera in the two-dimensional-code coordinate system are calculated; (5) the position and attitude angle of the monocular camera in the world coordinate system are calculated, realizing positioning and attitude determination. By using two-dimensional codes and additionally extracting position and attitude-angle information from them, the invention not only realizes three-dimensional positioning and attitude determination for devices such as autonomous vehicles and mobile robots, but is also easy to implement and low in cost.
Description
Technical Field
The invention relates to the technical field of navigation pose-fixing positioning, in particular to a navigation pose-fixing positioning method and system based on two-dimensional codes.
Background
Autonomous vehicles and robots must know their own position and attitude angles to achieve controlled movement. At present, autonomous vehicles and mobile robots commonly use laser scanning radars, camera sensors and Simultaneous Localization and Mapping (SLAM) algorithms to realize positioning and attitude determination, supplemented by a Global Navigation Satellite System (GNSS) and inertial navigation. Laser scanning radar offers high precision but at a high cost, making it unsuitable for large-scale commercial application. SLAM algorithms are also unreliable because of the dynamic nature of the environment surrounding the vehicle or robot. Global navigation satellite systems are vulnerable to signal shadowing and are completely unable to locate indoors or in tunnels. Inertial navigation cannot maintain position and attitude information over long periods, and its positioning and attitude-determination accuracy degrades in complex environments.
Currently, two-dimensional-code positioning is already widely used for the indoor positioning and attitude determination of Automatic Guided Vehicles (AGVs). The two-dimensional code is usually pasted on the floor; when a camera installed at the bottom of the vehicle photographs the code, the AGV calculates its plane position (x, y) and yaw angle (theta) from the information on the code. This method is very practical for AGVs in closed indoor environments, but it cannot be extended to autonomous vehicles. Firstly, in an open environment such as an outdoor road, two-dimensional codes on the ground are easily soiled by wheels and human activity and then lose their navigation, attitude-determination and positioning capability. Secondly, the method can only determine the two-dimensional position coordinates and yaw angle of the AGV, whereas autonomous vehicles and mobile robots are not limited to travel on horizontal ground. Therefore, there is a need for a new two-dimensional-code-based navigation, attitude-determination and positioning method and system that can calculate the three-dimensional spatial position (x, y, z) and three attitude angles (pitch, yaw and roll) of an autonomous vehicle, robot or similar device at any time, thereby realizing three-dimensional positioning.
It is seen that improvements and enhancements to the prior art are needed.
Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide a navigation pose-determining positioning method and system based on a two-dimensional code, and aims to solve the technical problem that the navigation pose-determining positioning method based on the two-dimensional code can only perform two-dimensional positioning in the prior art.
In order to achieve the purpose, the invention adopts the following technical scheme:
a navigation attitude determination positioning method based on two-dimensional codes is characterized by comprising the following steps:
(1) setting a plurality of two-dimensional codes;
(2) shooting the two-dimensional code by a monocular camera;
(3) performing inverse perspective mapping on the shot two-dimensional code;
(4) acquiring the position, attitude angle and geometric dimension information of the two-dimensional code in a world coordinate system; calculating the position and attitude angle of the monocular camera in a two-dimensional code coordinate system;
(5) and calculating the position and attitude angle of the monocular camera in a world coordinate system to realize positioning and attitude determination.
Further, in the navigation, attitude-determination and positioning method based on the two-dimensional codes, each two-dimensional code is respectively provided with a position identifier, a timing identifier and an alignment identifier.
Further, in the navigation attitude-fixing positioning method based on the two-dimensional code, in the step (3), the two-dimensional pixel space and the world coordinate of the shot two-dimensional code meet the inverse perspective mapping relationship; the inverse perspective mapping is related to three-dimensional translation and three-axis rotation of world coordinates to monocular camera coordinates.
Further, in the navigation attitude-determining positioning method based on the two-dimensional code, the distance from the monocular camera to the two-dimensional code is calculated by utilizing the size of the two-dimensional code, the focal length and the imaging size of the two-dimensional code.
Further, in the navigation, attitude determination and positioning method based on the two-dimensional code, the three-dimensional position and the three-axis attitude angle of the monocular camera in the world coordinate system can be obtained by calculating the position and the attitude angle of the two-dimensional code in the physical space, and the translation and rotation angles from the monocular camera to the two-dimensional code.
The invention provides a navigation attitude determination positioning system based on two-dimensional codes, which comprises:
a plurality of two-dimensional codes, each of which contains the position, attitude-angle and geometric-size information of that two-dimensional code in a world coordinate system;
the monocular camera is used for shooting the two-dimensional code;
and the computer is used for carrying out IPM mapping on the two-dimensional code shot by the monocular camera and calculating the position and the attitude angle of the monocular camera in a world coordinate system.
Further, in the navigation, pose determination and positioning system based on the two-dimensional codes, the position, pose angle and geometric dimension information of each two-dimensional code in a world coordinate system is stored in the digital information of the two-dimensional code or stored in a computer.
Further, in the navigation, pose determination and positioning system based on the two-dimensional code, the two-dimensional code is used for being arranged on the ground or a building.
Furthermore, in the navigation and attitude determination positioning system based on the two-dimensional code, combined navigation is realized by using an SLAM algorithm and a laser scanning radar.
Furthermore, in the navigation attitude determination positioning system based on the two-dimensional code, the integrated navigation is realized by coupling with inertial navigation.
Advantageous effects: compared with the prior art, the invention uses two-dimensional codes and additionally extracts position and attitude-angle information from them, optionally coupling with inertial navigation and computer-vision positioning, to measure the three-dimensional position (x, y, z) and three-axis attitude angles (pitch, yaw and roll) of devices such as autonomous vehicles and mobile robots, thereby realizing positioning and attitude determination. Compared with the laser scanning radar of the prior art, the invention mainly uses a camera and one or more co-located two-dimensional codes; it is simple and cheap, and enables low-cost three-dimensional positioning and attitude determination for autonomous vehicles.
Drawings
Fig. 1 is a schematic structural diagram of a two-dimensional code applicable to the present invention.
Fig. 2 is a schematic structural diagram of another two-dimensional code applicable to the present invention.
Fig. 3 is a schematic diagram of triangulation with a monocular camera photographing an object of known size.
Fig. 4 is an image of a photographed two-dimensional code.
Fig. 5 is an image of the two-dimensional code restored by extracting IPM conversion parameters of the two-dimensional code shown in fig. 4.
Fig. 6 is a simplified flowchart of an embodiment of a navigation, pose determination and positioning method based on two-dimensional codes according to the present invention.
Fig. 7 is a simplified flowchart of another embodiment of a navigation, pose determination and positioning method based on two-dimensional codes according to the present invention.
Detailed Description
The invention provides a navigation attitude determination positioning method and a navigation attitude determination positioning system based on two-dimensional codes, and the invention is further explained in detail by combining with embodiments in order to make the purpose, technical scheme and effect of the invention clearer and more clear. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A navigation attitude determination positioning method based on two-dimensional codes comprises the following steps:
(1) setting a plurality of two-dimensional codes;
(2) shooting the two-dimensional code by a monocular camera;
(3) performing inverse perspective mapping (IPM) on the photographed two-dimensional code;
(4) acquiring the position, attitude angle and geometric dimension information of the two-dimensional code in a world coordinate system; calculating the position and attitude angle of the monocular camera in a two-dimensional code coordinate system;
(5) and calculating the position and attitude angle of the monocular camera in a world coordinate system to realize positioning and attitude determination.
The two-dimensional codes in step (1) are arranged along the navigation path; their specific number is not limited. The focal length, distortion parameters and the like of the monocular camera are all determined in advance. The "position, attitude angle and geometric-size information of the two-dimensional code in the world coordinate system" may be information such as longitude, latitude and height, the orientation of the normal vector of the code surface in the local north-east coordinate system, and the size of the two-dimensional code. In practical use, the monocular camera is installed on an autonomous vehicle (or another device requiring navigation, such as a mobile robot); once the position and attitude angle of the monocular camera in the world coordinate system are determined, positioning and attitude determination of the autonomous vehicle (or other device) are likewise realized.
Fig. 1 shows one form of two-dimensional code. Each code is respectively provided with position identifiers (Position Patterns), timing identifiers (Timing Patterns) and alignment identifiers (Alignment Patterns); these are used to extract the attitude angle (usually represented by rotational Euler angles) and the translation vector of the two-dimensional code relative to the camera.
In practical applications, the two-dimensional code can take various formats. Fig. 2 illustrates another form of two-dimensional code, with nested circular-ring position identifiers in the top-left, top-right and bottom-left corners and the letter S in the bottom-right corner; when the monocular camera views the two-dimensional code from different distances and angles, these can likewise be used to extract the relative distance and relative rotation angle between the monocular camera and the two-dimensional code.
In step (3), the two-dimensional pixel space of the photographed two-dimensional code and its world coordinates satisfy the inverse perspective mapping relationship; the inverse perspective mapping is determined by the three-dimensional translation and three-axis rotation from world coordinates to monocular-camera coordinates. Specifically, the IPM mapping between the two-dimensional pixel coordinates of a point P and its coordinates in the three-dimensional physical space of the two-dimensional code is:

Z [u, v, 1]^T = K (R Pw + t) = K T Pw    (1)

where Z is the z-axis position component of the point P in the monocular-camera coordinate system (i.e., the distance from the monocular camera to the center of the two-dimensional code); u and v are the two-dimensional pixel coordinates; fx and fy are the focal lengths along the x- and y-axes, and cx and cy the principal-point translations along the x- and y-axes, which together form the IPM transformation (intrinsic) matrix K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]; R is the three-dimensional rotation matrix from the world coordinate system to the monocular-camera coordinate system; Pw is the coordinate of the point P in the world coordinate system; t is the translation vector between the two coordinate systems along the three axes; and T is the combined rotation-and-translation matrix from the world coordinate system to the monocular-camera coordinate system.
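The IPM relation above (depth times homogeneous pixel coordinates equals the intrinsic matrix applied to the rotated and translated world point) can be checked numerically. The sketch below is illustrative only: the intrinsic values, pose and point are assumed, not taken from the patent.

```python
import numpy as np

# Assumed camera intrinsics (illustrative values, not from the patent).
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Assumed world-to-camera rotation R (identity here) and translation t.
R = np.eye(3)
t = np.array([0.1, -0.05, 2.0])  # camera roughly 2 m from the code plane

# A point P on the two-dimensional code, in world coordinates.
Pw = np.array([0.05, 0.05, 0.0])

# The IPM relation: Z * [u, v, 1]^T = K (R Pw + t)
Pc = R @ Pw + t          # point expressed in camera coordinates
Z = Pc[2]                # depth along the camera z-axis
u, v, _ = (K @ Pc) / Z   # homogeneous division yields pixel coordinates
print(Z, u, v)
```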
The distance from the monocular camera to the two-dimensional code (i.e., Z in formula (1)) is calculated from the geometric size of the two-dimensional code, the focal length, and the imaged size of the two-dimensional code. Specifically, Z can be obtained by triangulation (see fig. 3) according to:

Z = f · D / d    (2)

where f is the focal length of the monocular camera, D is the geometric size of the two-dimensional code, and d is the imaged size of the two-dimensional code. Since the size D of the two-dimensional code (e.g., the distance between two diagonally placed position identifiers) and the focal length of the monocular camera are determined in advance, Z can be calculated.
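The triangulation above (Z = f · D / d, by similar triangles) amounts to one line of arithmetic. The numbers below are illustrative assumptions, not values from the patent.

```python
def distance_to_code(f_pixels, code_size_m, imaged_size_pixels):
    """Similar-triangles range: Z = f * D / d."""
    return f_pixels * code_size_m / imaged_size_pixels

# Assumed: focal length 800 px, code 0.5 m across, imaged at 100 px.
Z = distance_to_code(800.0, 0.5, 100.0)
print(Z)  # 4.0 (meters)
```

As the code recedes, d shrinks and Z grows proportionally, which is why larger codes (as discussed later in the description) remain readable at longer ranges.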
In practical applications, when the two-dimensional code is not directly in front of the monocular camera or is not facing the camera squarely (as in the captured image shown in fig. 4), the position identifiers of the two-dimensional code can be used to extract the rotation vector and translation vector of the monocular camera relative to the two-dimensional code. These represent six IPM parameters: three Euler angles of rotation and three directions of translation of the monocular camera with respect to the two-dimensional code. The six parameters determine the IPM mapping (i.e., inverse perspective mapping) between the two-dimensional image coordinates and the three-dimensional two-dimensional-code coordinates. The IPM-mapped image recovers a front-facing two-dimensional code free of perspective distortion (such as the image shown in fig. 5), which can then be decoded to read the information in the code. It should be noted that quickly finding the position identifiers of a two-dimensional code in an image, fitting them, and using them to rectify the code itself is mature technology; the innovation of the present invention is to use these identifiers to extract the rotation and translation vectors between the two-dimensional code and the monocular camera, for calculating the displacement and attitude angle of the monocular camera with respect to the center of the two-dimensional code.
Further, the three-dimensional position and three-axis attitude angles of the monocular camera in the world coordinate system can be obtained from the position and attitude angle of the two-dimensional code in physical space together with the translation and rotation from the monocular camera to the two-dimensional code. Specifically, the position and attitude angle of the monocular camera in the world coordinate system can be calculated by:

T_cam^world = T_code^world · T_cam^code    (3)

where T_cam^world, T_cam^code and T_code^world are all matrices (preferably 4 x 4 matrices). The matrix T_cam^world represents the position and attitude angle of the monocular camera in the world coordinate system; the matrix T_cam^code represents the position and attitude angle of the monocular camera in the two-dimensional-code coordinate system; and the matrix T_code^world represents the position and attitude angle of the two-dimensional code in the world coordinate system.
In practical application, IPM mapping is performed on the photographed two-dimensional code, and the position and attitude angle of the monocular camera in the two-dimensional-code coordinate system are calculated, thereby constructing the matrix T_cam^code; the position, attitude-angle and geometric-size information of the two-dimensional code in the world coordinate system, which is contained in the two-dimensional code itself, is extracted and used to construct the matrix T_code^world; the matrix T_cam^world is then calculated.
When a 4 x 4 matrix is used to represent the rotation and translation of the monocular camera with respect to world coordinates, T_cam^world can be constructed as:

T_cam^world = [ R_3x3  t_3x1 ; 0_1x3  1 ]    (4)

where R_3x3 is a 3 x 3 orthogonal matrix representing the rotation attitude angles of the monocular camera relative to the world coordinate system, and t_3x1 contains the translation components of the monocular camera along the x, y and z directions relative to the world coordinate system.
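The pose composition above can be sketched with 4 x 4 homogeneous matrices. The helper names and the example pose below are mine, chosen for illustration: the code's pose in the world and the camera's pose in the code frame are each packed as a rotation-plus-translation matrix, and their product gives the camera's pose in the world.

```python
import numpy as np

def make_T(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(yaw):
    """Rotation about the z-axis by `yaw` radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed pose of the code in the world: 90-degree yaw, placed at (10, 5, 1).
T_code_in_world = make_T(rot_z(np.pi / 2), np.array([10.0, 5.0, 1.0]))

# Assumed pose of the camera in the code frame, as the IPM step would recover:
# 2 m along the code's z-axis, no relative rotation.
T_cam_in_code = make_T(np.eye(3), np.array([0.0, 0.0, 2.0]))

# Composition: camera-in-world = code-in-world @ camera-in-code.
T_cam_in_world = T_code_in_world @ T_cam_in_code
print(T_cam_in_world[:3, 3])  # camera position in world coordinates
```

The upper-left 3 x 3 block of the result holds the camera's three-axis attitude relative to the world, and the last column holds its (x, y, z) position, matching the two quantities the method sets out to measure.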
The invention provides a navigation attitude determination positioning system based on two-dimensional codes, which comprises:
the system comprises a plurality of two-dimensional codes, a plurality of image processing units and a plurality of image processing units, wherein each two-dimensional code respectively comprises position, attitude angle and geometric dimension information of the two-dimensional code in a world coordinate system;
the monocular camera is used for shooting the two-dimensional code;
and the computer is used for carrying out IPM mapping on the two-dimensional code shot by the monocular camera and calculating the position and the attitude angle of the monocular camera in a world coordinate system.
In the invention, the position, attitude angle and geometric dimension information of each two-dimensional code in a world coordinate system is stored in the digital information of the two-dimensional code or stored in a computer (or a server).
In practical application, the two-dimensional code is arranged on the ground or on a building; arrangement "on a building" as described here may be on a wall, on an overpass, beside a roadway, or in other locations.
Autonomous cars and mobile robots can move over a wide range. If the distance between the monocular camera and the two-dimensional code is large, the two-dimensional code with the small size cannot be identified. In order to solve the problem, two-dimensional codes with different sizes can be used in practical use, namely: setting a two-dimensional code with a large size at a place possibly far away from the monocular camera; and a two-dimensional code with a smaller size is arranged at a place closer to the monocular camera.
In practical application, the two-dimensional-code-based navigation, attitude-determination and positioning system can also be combined with other mature navigation technologies, such as inertial navigation, satellite navigation, visual or laser simultaneous localization and mapping (SLAM), and the wheel encoders of an autonomous vehicle, to further improve positioning precision and broaden the range of application. The extended two-dimensional-code positioning system provides measurement information to the other navigation, attitude-determination and positioning systems; this information can be used to correct the vehicle position and attitude angle with a Kalman filter, or to optimize the position and attitude-angle information nonlinearly, so that the measurement error is minimized and positioning of the attitude angle is achieved. Combinations include, but are not limited to, the following:
(1) the navigation attitude determination positioning system based on the two-dimensional code realizes integrated navigation by using a Kalman filter and other navigation systems.
(2) The navigation attitude-determining positioning system based on the two-dimensional code corrects the position and the attitude angle of a vehicle by using a nonlinear optimization method, and realizes combined navigation with other navigation systems.
(3) The navigation attitude determination positioning system based on the two-dimensional code is coupled with inertial navigation to realize integrated navigation.
(4) The navigation, attitude-determination and positioning system based on the two-dimensional code and a laser scanning radar using an SLAM algorithm realize combined navigation.
(5) The navigation, attitude-determination and positioning system based on the two-dimensional code and vision (or stereoscopic vision) using an SLAM algorithm realize combined navigation.
(6) The navigation, attitude-determination and positioning system based on the two-dimensional code is coupled with satellite navigation to realize combined navigation.
(7) The navigation attitude-fixing positioning system based on the two-dimensional code and the inertial navigation and laser scanning radar use the SLAM algorithm to realize combined navigation.
(8) The navigation attitude determination positioning system based on the two-dimensional code and inertial navigation, vision or stereoscopic vision use SLAM algorithm to realize combined navigation.
(9) The navigation attitude-determining positioning system based on the two-dimensional code, the inertial navigation and the vision using the SLAM algorithm or the stereoscopic vision and the satellite navigation realize the combined navigation.
(10) The navigation attitude-fixing positioning system based on the two-dimensional code, the inertial navigation, the laser scanning radar using the SLAM algorithm and the satellite navigation realize the combined navigation.
(11) The navigation attitude-determining positioning system based on the two-dimensional code, the inertial navigation, the laser scanning radar using the SLAM algorithm, and the vision or stereoscopic vision and satellite navigation using the SLAM algorithm realize combined navigation.
The navigation pose-fixing positioning method and system based on the two-dimensional code are applied to various combined navigations, and therefore, the navigation pose-fixing positioning method and system based on the two-dimensional code are all within the protection scope of the invention.
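The Kalman-filter correction mentioned for these combinations can be sketched in its simplest, one-dimensional form. This is an illustrative assumption of how such a fusion step might look, not the patent's implementation: a dead-reckoned position estimate (e.g. from inertial navigation) is corrected by a position fix derived from a two-dimensional code, weighted by their respective uncertainties.

```python
def kalman_update(x_pred, P_pred, z_meas, R_meas):
    """Scalar Kalman measurement update (direct measurement, H = 1):
    fuse a predicted position with a two-dimensional-code position fix."""
    K = P_pred / (P_pred + R_meas)        # Kalman gain
    x = x_pred + K * (z_meas - x_pred)    # corrected state
    P = (1.0 - K) * P_pred                # corrected covariance
    return x, P

# Assumed numbers: dead reckoning says x = 12.0 m with variance 4.0;
# the two-dimensional-code fix says x = 10.0 m with variance 1.0.
x, P = kalman_update(12.0, 4.0, 10.0, 1.0)
print(x, P)  # estimate pulled toward the more certain code fix
```

Because the code-derived fix is absolute, each such update also bounds the drift that pure inertial navigation accumulates, which is the role the combinations above assign to the two-dimensional-code measurements.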
Continuing with fig. 6, fig. 6 is a simplified workflow diagram of the two-dimensional-code-based navigation, attitude-determination and positioning system in which the position, attitude-angle and geometric-size information of each two-dimensional code in the world coordinate system is stored in the code's own digital information. The process is briefly as follows. Navigation starts; the navigation software continuously searches the images captured by the monocular camera for a two-dimensional code. When a code is found, the software attempts to extract the pose of the monocular camera, i.e., its rotation and displacement relative to the two-dimensional code (at this point the displacement parameters lack scale, so the displacement carries an unknown scale multiplier). If the six IPM parameters are successfully extracted, IPM mapping is applied to the photographed code, transforming it into a front-facing physical space and eliminating perspective error. The code is then decoded, extracting the spatial coordinates and attitude angles embedded in it as well as the code's scale. The scale information determines the scale multiplier of the earlier IPM transformation, so that all rotation and displacement parameters can be extracted and the vehicle is positioned. The software then searches for the next two-dimensional code captured by the monocular camera, until navigation ends.
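The scale-recovery step in the flow above can be sketched as follows. The function name and numbers are mine, chosen for illustration: the pose decomposition yields the camera's displacement relative to the code only up to an unknown scale multiplier, and the decoded geometric size of the code fixes that multiplier.

```python
import numpy as np

def rescale_translation(t_unit, code_size_m, code_size_units):
    """The pose decomposition gives translation up to scale; the decoded
    metric size of the code fixes the scale multiplier."""
    s = code_size_m / code_size_units
    return s * np.asarray(t_unit)

# Assumed: the decomposition measured the code as 2.0 internal units wide and
# placed the camera at (0.4, -0.2, 5.0) in those units; decoding the code
# reveals it is actually 0.5 m wide.
t_metric = rescale_translation([0.4, -0.2, 5.0], 0.5, 2.0)
print(t_metric)  # metric displacement of the camera relative to the code
```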
Referring to fig. 7, in another embodiment, the position, attitude angle and size information of each two-dimensional code in the world coordinate system is stored on the navigation computer of the autonomous vehicle, or on a navigation server accessed through the network. The workflow is similar to that of fig. 6, except that when the monocular camera shoots a two-dimensional code, the navigation software acquires the position, attitude angle and size information of that code in the world coordinate system from the navigation computer or the navigation server according to the unique identification number of the code.
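A minimal stand-in for this lookup, with hypothetical identifiers and values, is simply a table keyed by the code's unique identification number; in practice this would be a query to the navigation computer or navigation server:

```python
# Hypothetical per-code database: position and attitude angles in the world
# frame plus the physical side length, keyed by unique identification number.
CODE_DB = {
    "QR-0001": {"xyz": (12.5, 3.0, 1.8),   # position in the world frame (m)
                "rpy": (0.0, 0.0, 1.57),   # roll, pitch, yaw (rad)
                "side_m": 0.20},           # physical side length (m)
}

def lookup_code(code_id):
    """Return the stored world pose and size for a decoded identifier, or None."""
    return CODE_DB.get(code_id)
```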
The above-mentioned "navigation software" is per se prior art, and the invention is not limited to any particular implementation of it.
Through the above analysis, the invention uses two-dimensional codes, together with the position and attitude-angle information extracted from them, to measure the three-dimensional position (x, y, z) and three-axis attitude angles (pitch, yaw and roll) of devices such as autonomous vehicles and mobile robots; this can further be coupled with inertial navigation and computer-vision positioning to realize positioning and attitude determination. Compared with the laser scanning radar of the prior art, the invention mainly uses a camera and one or more two-dimensional codes at each location; it is simple and inexpensive, and realizes low-cost three-dimensional positioning and attitude determination for autonomous vehicles. In addition, if the navigation attitude-determination and positioning method extracts only the two-dimensional position coordinates (x, y) and a single Euler angle (the yaw angle), two-dimensional positioning on a plane can be realized.
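The core coordinate transformation described above — combining the code's stored world pose with the camera pose measured relative to the code — can be sketched as follows. Rotation-matrix representation and the function names are our own assumptions, not the patent's notation:

```python
import numpy as np

def yaw_matrix(yaw):
    """Rotation about the world z-axis (the planar, yaw-only case)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def camera_world_pose(R_code_world, t_code_world, R_cam_code, t_cam_code):
    """Compose the code's stored world pose (R_code_world, t_code_world) with
    the camera pose measured in the code's frame (R_cam_code, t_cam_code)
    to obtain the camera pose in the world frame."""
    R_cam_world = R_code_world @ R_cam_code
    t_cam_world = R_code_world @ t_cam_code + t_code_world
    return R_cam_world, t_cam_world
```

For example, a code stored at world position (10, 5, 0) with a 90° yaw, seen from a camera 2 m along the code's x-axis, places the camera at world position (10, 7, 0) — the same composition handles the full three-axis case when `R_code_world` carries pitch and roll as well.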
It should be understood that equivalents and modifications of the technical solution and inventive concept thereof may occur to those skilled in the art, and all such modifications and alterations should fall within the protective scope of the present invention.
Claims (10)
1. A two-dimensional-code-based navigation attitude-determination and positioning method, characterized by comprising the following steps:
(1) setting a plurality of two-dimensional codes;
(2) shooting the two-dimensional code by a monocular camera;
(3) performing inverse perspective mapping on the shot two-dimensional code;
(4) acquiring the position, attitude angle and geometric dimension information of the two-dimensional code in a world coordinate system; calculating the position and attitude angle of the monocular camera in a two-dimensional code coordinate system;
(5) calculating the position and attitude angle of the monocular camera in the world coordinate system, so as to realize positioning and attitude determination.
2. The two-dimensional-code-based navigation attitude-determination and positioning method according to claim 1, characterized in that each two-dimensional code is respectively provided with position detection patterns, timing patterns and alignment patterns.
3. The two-dimensional-code-based navigation attitude-determination and positioning method according to claim 1, characterized in that in step (3), the two-dimensional pixel space of the photographed two-dimensional code and its world coordinates satisfy an inverse perspective mapping relationship, and the inverse perspective mapping is related to the three-dimensional translation and three-axis rotation from world coordinates to monocular camera coordinates.
4. The two-dimensional-code-based navigation attitude-determination and positioning method according to claim 3, characterized in that the distance from the monocular camera to the two-dimensional code is calculated from the physical size of the two-dimensional code, the focal length, and the imaged size of the two-dimensional code.
5. The two-dimensional-code-based navigation attitude-determination and positioning method according to claim 1, characterized in that the three-dimensional position and three-axis attitude angles of the monocular camera in the world coordinate system are calculated from the position and attitude angle of the two-dimensional code in physical space, together with the translation and rotation from the monocular camera to the two-dimensional code.
6. A two-dimensional-code-based navigation attitude-determination and positioning system, characterized by comprising:
a plurality of two-dimensional codes and a plurality of image processing units, wherein each two-dimensional code respectively carries the position, attitude angle and geometric dimension information of that two-dimensional code in a world coordinate system;
the monocular camera is used for shooting the two-dimensional code;
and the computer is used for carrying out inverse perspective mapping on the two-dimensional code shot by the monocular camera and calculating the position and the attitude angle of the monocular camera in a world coordinate system.
7. The two-dimensional-code-based navigation attitude-determination and positioning system according to claim 6, wherein the position, attitude angle and geometric dimension information of each two-dimensional code in the world coordinate system is stored in the digital information of the two-dimensional code or stored in the computer.
8. The two-dimensional-code-based navigation attitude-determination and positioning system according to claim 6, wherein the two-dimensional codes are configured to be arranged on the ground or on buildings.
9. The two-dimensional-code-based navigation attitude-determination and positioning system according to claim 6, characterized in that integrated navigation is realized in combination with a laser scanning radar using the SLAM algorithm.
10. The two-dimensional-code-based navigation attitude-determination and positioning system according to claim 6, characterized in that integrated navigation is realized by coupling with inertial navigation.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910733720.1A CN110345937A (en) | 2019-08-09 | 2019-08-09 | Appearance localization method and system are determined in a kind of navigation based on two dimensional code |
PCT/CN2019/100692 WO2021026850A1 (en) | 2019-08-09 | 2019-08-15 | Qr code-based navigation attitude determining and positioning method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910733720.1A CN110345937A (en) | 2019-08-09 | 2019-08-09 | Appearance localization method and system are determined in a kind of navigation based on two dimensional code |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110345937A true CN110345937A (en) | 2019-10-18 |
Family
ID=68184501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910733720.1A Pending CN110345937A (en) | 2019-08-09 | 2019-08-09 | Appearance localization method and system are determined in a kind of navigation based on two dimensional code |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110345937A (en) |
WO (1) | WO2021026850A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105243366A (en) * | 2015-10-10 | 2016-01-13 | 北京微尘嘉业科技有限公司 | Two-dimensional code based vehicle positioning method |
CN106323294A (en) * | 2016-11-04 | 2017-01-11 | 新疆大学 | Positioning method and device for patrol robot of transformer substation |
US9969337B2 (en) * | 2014-09-03 | 2018-05-15 | Sharp Laboratories Of America, Inc. | Methods and systems for mobile-agent navigation |
CN108305264A (en) * | 2018-06-14 | 2018-07-20 | 江苏中科院智能科学技术应用研究院 | A kind of unmanned plane precision landing method based on image procossing |
CN109543489A (en) * | 2019-01-04 | 2019-03-29 | 广州广电研究院有限公司 | Localization method, device and storage medium based on two dimensional code |
CN109725645A (en) * | 2019-03-29 | 2019-05-07 | 中国人民解放军国防科技大学 | Nested unmanned aerial vehicle landing cooperation sign design and relative pose acquisition method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106507285A (en) * | 2016-11-22 | 2017-03-15 | 宁波亿拍客网络科技有限公司 | A kind of based on the localization method of position basic point, specific markers and relevant device method |
CN106969766A (en) * | 2017-03-21 | 2017-07-21 | 北京品创智能科技有限公司 | A kind of indoor autonomous navigation method based on monocular vision and Quick Response Code road sign |
CN106989746A (en) * | 2017-03-27 | 2017-07-28 | 远形时空科技(北京)有限公司 | Air navigation aid and guider |
CN107578078B (en) * | 2017-09-14 | 2020-05-12 | 哈尔滨工业大学 | Non-mirror symmetry two-dimensional code mark graph verification and layout method for monocular vision positioning |
CN107830854A (en) * | 2017-11-06 | 2018-03-23 | 深圳精智机器有限公司 | Vision positioning method based on sparse cloud of ORB and Quick Response Code |
- 2019-08-09 CN CN201910733720.1A patent/CN110345937A/en active Pending
- 2019-08-15 WO PCT/CN2019/100692 patent/WO2021026850A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
Shang Mingchao: "Research and Implementation of an Indoor Positioning Algorithm for Unmanned Vehicles Based on Two-dimensional Codes and Corner Labels", China Master's Theses Full-text Database * |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110702118A (en) * | 2019-10-25 | 2020-01-17 | 桂林电子科技大学 | AGV-based outdoor positioning navigation system and positioning method thereof |
CN110806749A (en) * | 2019-10-31 | 2020-02-18 | 成都四威高科技产业园有限公司 | Accurate positioning method and system for differential drive AGV |
CN111015654A (en) * | 2019-12-18 | 2020-04-17 | 深圳市优必选科技股份有限公司 | Visual positioning method and device for robot, terminal equipment and storage medium |
CN111426320A (en) * | 2020-05-18 | 2020-07-17 | 中南大学 | Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter |
CN111679671A (en) * | 2020-06-08 | 2020-09-18 | 南京聚特机器人技术有限公司 | Method and system for automatic docking of robot and charging pile |
CN111619388A (en) * | 2020-07-02 | 2020-09-04 | 孙旭阳 | Intelligent charging device for electric automobile |
CN111964681A (en) * | 2020-07-29 | 2020-11-20 | 中国安全生产科学研究院 | Real-time positioning system of inspection robot |
CN111964680A (en) * | 2020-07-29 | 2020-11-20 | 中国安全生产科学研究院 | Real-time positioning method of inspection robot |
CN112419403A (en) * | 2020-11-30 | 2021-02-26 | 海南大学 | Indoor unmanned aerial vehicle positioning method based on two-dimensional code array |
CN112684792A (en) * | 2020-12-01 | 2021-04-20 | 广东嘉腾机器人自动化有限公司 | Two-dimensional code array label detection method and storage device |
CN112684792B (en) * | 2020-12-01 | 2022-05-10 | 广东嘉腾机器人自动化有限公司 | Two-dimensional code array label detection method and storage device |
CN112596070A (en) * | 2020-12-29 | 2021-04-02 | 四叶草(苏州)智能科技有限公司 | Robot positioning method based on laser and vision fusion |
CN112596070B (en) * | 2020-12-29 | 2024-04-19 | 四叶草(苏州)智能科技有限公司 | Robot positioning method based on laser and vision fusion |
WO2022170855A1 (en) * | 2021-02-09 | 2022-08-18 | 灵动科技(北京)有限公司 | Method and device for controlling autonomous mobile robot |
CN112926712B (en) * | 2021-04-13 | 2023-09-22 | 西安美拓信息技术有限公司 | Four-way shuttle continuous positioning system and method |
CN112926712A (en) * | 2021-04-13 | 2021-06-08 | 西安美拓信息技术有限公司 | Continuous positioning system and method for four-way shuttle |
CN113256732A (en) * | 2021-04-19 | 2021-08-13 | 安吉智能物联技术有限公司 | Camera calibration and pose acquisition method |
CN113566827A (en) * | 2021-07-09 | 2021-10-29 | 中国能源建设集团安徽省电力设计院有限公司 | Transformer substation inspection robot indoor positioning method based on information fusion |
CN113566827B (en) * | 2021-07-09 | 2024-08-30 | 中国能源建设集团安徽省电力设计院有限公司 | Indoor positioning method for substation inspection robot based on information fusion |
CN113935356A (en) * | 2021-10-20 | 2022-01-14 | 广东新时空科技股份有限公司 | Three-dimensional positioning and attitude determining system and method based on two-dimensional code |
CN114136314A (en) * | 2021-11-30 | 2022-03-04 | 北京天兵科技有限公司 | Auxiliary attitude calculation method for aerospace vehicle |
CN115936029B (en) * | 2022-12-13 | 2024-02-09 | 湖南大学无锡智能控制研究院 | SLAM positioning method and device based on two-dimensional code |
CN115936029A (en) * | 2022-12-13 | 2023-04-07 | 湖南大学无锡智能控制研究院 | SLAM positioning method and device based on two-dimensional code |
CN115774265A (en) * | 2023-02-15 | 2023-03-10 | 江苏集萃清联智控科技有限公司 | Two-dimensional code and laser radar fusion positioning method and device for industrial robot |
CN116592876B (en) * | 2023-07-17 | 2023-10-03 | 北京元客方舟科技有限公司 | Positioning device and positioning method thereof |
CN116592876A (en) * | 2023-07-17 | 2023-08-15 | 北京元客方舟科技有限公司 | Positioning device and positioning method thereof |
CN117011387B (en) * | 2023-10-07 | 2024-01-26 | 湖州丽天智能科技有限公司 | Photovoltaic panel pose fitting method based on visual recognition and installation robot |
CN117011387A (en) * | 2023-10-07 | 2023-11-07 | 湖州丽天智能科技有限公司 | Photovoltaic panel pose fitting method based on visual recognition and installation robot |
Also Published As
Publication number | Publication date |
---|---|
WO2021026850A1 (en) | 2021-02-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110345937A (en) | Appearance localization method and system are determined in a kind of navigation based on two dimensional code | |
US10788830B2 (en) | Systems and methods for determining a vehicle position | |
JP5992184B2 (en) | Image data processing apparatus, image data processing method, and image data processing program | |
CN110411457B (en) | Positioning method, system, terminal and storage medium based on stroke perception and vision fusion | |
JP4232167B1 (en) | Object identification device, object identification method, and object identification program | |
US8744169B2 (en) | Voting strategy for visual ego-motion from stereo | |
CN109443348B (en) | Underground garage position tracking method based on fusion of look-around vision and inertial navigation | |
CN111121754A (en) | Mobile robot positioning navigation method and device, mobile robot and storage medium | |
KR101444685B1 (en) | Method and Apparatus for Determining Position and Attitude of Vehicle by Image based Multi-sensor Data | |
CN104501779A (en) | High-accuracy target positioning method of unmanned plane on basis of multi-station measurement | |
JP4978615B2 (en) | Target identification device | |
CN114459467B (en) | VI-SLAM-based target positioning method in unknown rescue environment | |
CN110458885B (en) | Positioning system and mobile terminal based on stroke perception and vision fusion | |
CN108007456A (en) | A kind of indoor navigation method, apparatus and system | |
JP4132068B2 (en) | Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus | |
Jende et al. | A fully automatic approach to register mobile mapping and airborne imagery to support the correction of platform trajectories in GNSS-denied urban areas | |
CN114037762B (en) | Real-time high-precision positioning method based on registration of image and high-precision map | |
Hoang et al. | 3D motion estimation based on pitch and azimuth from respective camera and laser rangefinder sensing | |
CN112862818A (en) | Underground parking lot vehicle positioning method combining inertial sensor and multi-fisheye camera | |
Khoshelham et al. | Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry | |
Xian et al. | Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach | |
Ji et al. | Comparison of two panoramic sensor models for precise 3d measurements | |
CN115930948A (en) | Orchard robot fusion positioning method | |
US20240200953A1 (en) | Vision based cooperative vehicle localization system and method for gps-denied environments | |
Liu et al. | Stereo-image matching using a speeded up robust feature algorithm in an integrated vision navigation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20191018 |