CN111145262A - Vehicle-mounted monocular calibration method - Google Patents
Vehicle-mounted monocular calibration method
- Publication number
- CN111145262A CN201910833525.6A
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- point
- calibration
- coordinates
- vehicle body
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to the field of image processing and discloses a vehicle-mounted automatic calibration control method comprising the following steps: acquiring image data, identifying calibration points, and calculating the pixel coordinates of the calibration points; acquiring the world coordinate system coordinates of the vehicle-mounted positioning points O' and O, calculating the vehicle body heading, and triggering calibration when the position relation between the vehicle-mounted positioning point and the calibration points is judged to meet a preset condition; establishing a vehicle body coordinate system with the vehicle-mounted positioning point O as the origin, and calculating the vehicle body coordinate system coordinates of the calibration points; calculating a transformation matrix M from the vehicle body coordinate system to the pixel coordinate system from the vehicle body coordinate system coordinates and the pixel coordinates of the calibration points; calculating a conversion function from the vehicle body coordinate system coordinates of any point P to its world coordinate system coordinates from the world coordinate system coordinates of the vehicle-mounted positioning points O' and O; and storing the result and ending the calibration. Some technical effects of the invention: automatic calibration of the monocular camera is realized.
Description
Technical Field
The invention relates to the field of image processing, in particular to a monocular calibration technology in the field of image processing.
Background
Vision is an important means by which human beings observe and come to know the world, and it accounts for 70% of the information humans obtain from the external environment. The eyes receive light reflected or emitted by surrounding objects; the light forms an image on the retina, is conveyed to the brain through nerve fibers, and the brain processes and interprets the visual information to finally form vision. Computer vision simulates the function of human vision: a camera acquires images of the surrounding environment and a computer processes them. Computer vision can also accomplish work that human vision cannot, such as accurate measurement of the size and distance of an object. Computer vision technology can be widely applied in fields such as surveying and mapping, visual inspection and automatic driving.
One of the basic tasks of computer vision is to compute the geometric information of objects in three-dimensional space from the image information acquired by a camera, and from this to reconstruct or recognize the objects and, further, the real world. Camera calibration is a necessary step in accomplishing this task.
The currently popular methods are usually completed with manual interaction. Because the degree of manual participation is high, these calibration methods lack automation and have low repeatability, and the manual steps must be repeated for every calibration.
Disclosure of Invention
In order to at least solve the problem of automating monocular camera calibration, the invention provides a vehicle-mounted automatic calibration method for a monocular camera, with the following technical scheme:
the method comprises the following steps: acquiring image data, identifying calibration points, and calculating the pixel coordinates of the calibration points; acquiring the world coordinate system coordinates of the vehicle-mounted positioning points O' and O, calculating the vehicle body heading, and triggering calibration when the position relation between the vehicle-mounted positioning point and the calibration points is judged to meet a preset condition; establishing a right-hand rectangular coordinate system, namely the vehicle body coordinate system, with the vehicle-mounted positioning point O as the origin, the vehicle body heading as the y-axis, the direction perpendicular to the ground as the z-axis and the direction perpendicular to the vehicle body heading as the x-axis; acquiring the world coordinate system coordinates of the calibration points and calculating their vehicle body coordinate system coordinates; calculating a transformation matrix M from the vehicle body coordinate system to the pixel coordinate system from the vehicle body coordinate system coordinates and the pixel coordinates of the calibration points; calculating a conversion function from the vehicle body coordinate system coordinates of any point P to its world coordinate system coordinates from the world coordinate system coordinates of the vehicle-mounted positioning points O' and O; and storing the result and ending the calibration.
Preferably, identifying the calibration points and calculating their pixel coordinates is performed through the following steps: processing the image data to generate a circumscribed rectangle of the calibration object in the image; extending the lower side of the circumscribed rectangle downward by a preset pixel coordinate length to generate a region of interest; traversing the region of interest to generate the marked region of the preset marker associated with the calibration object; and traversing the marked region to generate a centerline, the pixel coordinate of the upper endpoint of the centerline being the pixel coordinate of the calibration point.
Preferably, the vehicle-mounted positioning point is provided with an RTK positioning device to acquire world coordinate system coordinates; the coordinate of the world coordinate system of the vehicle-mounted positioning point O is the coordinate of the real-time world coordinate system of the RTK positioning device, and the coordinate of the world coordinate system of the vehicle-mounted positioning point O' is the coordinate of the world coordinate system of the RTK positioning device at any historical moment.
Preferably, the RTK positioning device is mounted in the very center of the roof.
Preferably, the vehicle body heading is the direction defined by the space vector O'O that indicates the advancing direction of the vehicle;
Preferably, the judgment is carried out by calculating the included angle α0 between the vehicle body heading and a line connecting calibration points, and calibration is triggered when one of the following preset conditions is met:
preset condition 1: the included angle α0 between the vehicle body heading and the line connecting any two same-side parallel calibration points satisfies α0 ≤ the first threshold value;
preset condition 2: the included angle α0 between the vehicle body heading and the line connecting any two opposite-side symmetric calibration points satisfies |α0 − 90°| ≤ the first threshold value.
Preferably, the first threshold is 5 °.
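As an illustration of how this trigger judgment could be implemented, the minimal Python sketch below computes the vehicle body heading from the two positioning fixes O' and O and checks the two preset conditions. The function names and the planar-coordinate inputs are assumptions made for the example, not part of the method itself.

```python
import math

def heading_vector(o_prev, o_now):
    # Vehicle body heading O'O: from the historical fix O' to the current fix O (planar part).
    return (o_now[0] - o_prev[0], o_now[1] - o_prev[1])

def included_angle_deg(u, v):
    # Included angle between two 2-D vectors, in degrees.
    dot = u[0] * v[0] + u[1] * v[1]
    norm = math.hypot(u[0], u[1]) * math.hypot(v[0], v[1])
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def calibration_triggered(o_prev, o_now, same_side_pair, symmetric_pair, first_threshold_deg=5.0):
    # Preset condition 1: angle to a line of two same-side parallel calibration points <= threshold.
    # Preset condition 2: |angle to a line of two opposite-side symmetric points - 90 deg| <= threshold.
    h = heading_vector(o_prev, o_now)
    (a, b), (c, d) = same_side_pair, symmetric_pair
    same_side_line = (b[0] - a[0], b[1] - a[1])
    symmetric_line = (d[0] - c[0], d[1] - c[1])
    cond1 = included_angle_deg(h, same_side_line) <= first_threshold_deg
    cond2 = abs(included_angle_deg(h, symmetric_line) - 90.0) <= first_threshold_deg
    return cond1 or cond2
```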
Preferably, the world coordinate system coordinates of the calibration point are obtained by either of the following methods: method 1: the world coordinate system coordinates of the calibration point are obtained by manual measurement; method 2: an RTK positioning device is arranged at the calibration point and transmits them.
Preferably, the vehicle body coordinate system coordinates of the calibration point are calculated as follows: take an arbitrary calibration point A1 and let the included angle between vector OA1 and the heading vector O'O be α1; then:
through point C1, draw a vertical line perpendicular to the ground plane with foot B1; |B1C1| equals the Z coordinate of point O in the world coordinate system minus the Z coordinate of point A1 in the world coordinate system;
the coordinates of A1 in the vehicle body coordinate system with O as the coordinate origin are then obtained,
where α, β and θ are the rotation angles about the x-axis, y-axis and z-axis respectively, and T_x, T_y, T_z are the translations along the x-axis, y-axis and z-axis directions respectively.
Preferably, the conversion function from the vehicle body coordinate system coordinates of point P to world coordinate system coordinates is:
let the coordinates of point O and point O' in the world coordinate system be (x_O, y_O, z_O) and (x_O', y_O', z_O'), and let the clockwise included angle between the vehicle heading and true north of the earth be α2; then:
with μ the included angle with the y-axis,
① x_O' > x_O, y_O' > y_O: α2 = 2π − μ;
② x_O' < x_O, y_O' > y_O: α2 = μ;
③ x_O' > x_O, y_O' < y_O: α2 = π + μ;
④ x_O' < x_O, y_O' < y_O: α2 = π − μ.
Let the coordinates of point P in the vehicle body coordinate system be (x_P, y_P, z_P); then the coordinates of point P in the world coordinate system are:
x = x_P·cos α2 − y_P·sin α2 + x_O
y = x_P·sin α2 + y_P·cos α2 + y_O
z = z_P + z_O.
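To make this conversion concrete, the following minimal Python sketch applies the quadrant cases and the rotation-plus-translation above. The expression for μ does not survive in the text, so atan(|Δx|/|Δy|) is used purely as an assumed stand-in, as flagged in the comments.

```python
import math

def heading_angle_alpha2(o_now, o_prev):
    # Clockwise angle alpha2 between the vehicle heading and true north, built from the
    # quadrant cases in the text. The expression for mu is not reproduced in the source,
    # so atan(|dx| / |dy|) is only an assumed stand-in for "the angle to the y-axis".
    (x_o, y_o, _), (x_op, y_op, _) = o_now, o_prev
    mu = math.atan2(abs(x_op - x_o), abs(y_op - y_o))
    if x_op > x_o and y_op > y_o:
        return 2 * math.pi - mu
    if x_op < x_o and y_op > y_o:
        return mu
    if x_op > x_o and y_op < y_o:
        return math.pi + mu
    return math.pi - mu  # x_op < x_o and y_op < y_o

def body_to_world(p_body, o_now, alpha2):
    # Rotation about the vertical axis by alpha2 plus translation by O, as given above.
    x_p, y_p, z_p = p_body
    x_o, y_o, z_o = o_now
    x = x_p * math.cos(alpha2) - y_p * math.sin(alpha2) + x_o
    y = x_p * math.sin(alpha2) + y_p * math.cos(alpha2) + y_o
    z = z_p + z_o
    return (x, y, z)
```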
The method at least provides a vehicle-mounted technical scheme for automatically calibrating a monocular camera, and realizes automatic calibration of the monocular camera well.
Drawings
For a better understanding of the technical solution of the present invention, reference is made to the following drawings, which are included to assist in describing the prior art or embodiments. These drawings will selectively demonstrate articles of manufacture or methods related to either the prior art or some embodiments of the invention. The basic information for these figures is as follows:
FIG. 1 is a flowchart of a vehicle-based monocular calibration method in one embodiment.
FIG. 2 is a schematic diagram illustrating an exemplary location of a calibration point.
Fig. 3 is a schematic diagram of an installation position of an RTK positioning point in one embodiment.
FIG. 4 is a diagram illustrating the calculation of coordinates of index points according to an embodiment.
FIG. 5 is a schematic view of the pavement arrangement of the calibration object in one embodiment.
FIG. 6 is a schematic diagram illustrating a predetermined marker placement in one embodiment.
Detailed Description
The technical means and technical effects of the present invention are further described below. Obviously, the examples provided are only some embodiments of the invention, not all of them. All other embodiments that a person skilled in the art can obtain without inventive effort, based on the embodiments of the present invention and what they explicitly or implicitly convey, fall within the protection scope of the present invention.
On the general idea, the invention discloses a vehicle-mounted monocular calibration method, which comprises the following steps: acquiring image data, identifying calibration points, and calculating the pixel coordinates of the calibration points; acquiring the world coordinate system coordinates of the vehicle-mounted positioning points O' and O, calculating the vehicle body heading, and triggering calibration when the position relation between the vehicle-mounted positioning point and the calibration points is judged to meet a preset condition; establishing a right-hand rectangular coordinate system, namely the vehicle body coordinate system, with the vehicle-mounted positioning point O as the origin, the vehicle body heading as the y-axis, the direction perpendicular to the ground as the z-axis and the direction perpendicular to the vehicle body heading as the x-axis; acquiring the world coordinate system coordinates of the calibration points and calculating their vehicle body coordinate system coordinates; calculating a transformation matrix M from the vehicle body coordinate system to the pixel coordinate system from the vehicle body coordinate system coordinates and the pixel coordinates of the calibration points; calculating a conversion function from the vehicle body coordinate system coordinates of any point P to its world coordinate system coordinates from the world coordinate system coordinates of the vehicle-mounted positioning points O' and O; and storing the result and ending the calibration.
Based on this general concept, those skilled in the art will understand that the vehicle referred to in the present invention is a vehicle driven or towed by a power device, the power generally coming from an internal combustion engine or an electric motor. Positioning information refers to position information provided by GNSS, including but not limited to world coordinate system coordinates. GNSS (global navigation satellite systems) include but are not limited to the United States' GPS, Russia's GLONASS, the European Union's Galileo and China's BDS.
Some technical effects of the invention are as follows: the method realizes automatic calibration, reduces manual intervention participation and has better repeatability.
In some embodiments, as shown in fig. 2 to 4, a camera or an apparatus having a camera function is mounted and fixed on the vehicle. Generally, the camera or apparatus having an image pickup function is mounted and fixed to the front of the vehicle, particularly the front windshield, so as to obtain a good working field of view. An apparatus having a camera function refers to a device capable of shooting and acquiring image data such as video or pictures, for example an acquisition terminal for collecting map data in the surveying and mapping field, or a terminal for visually identifying road conditions in the automatic driving field.
In some embodiments, as shown in fig. 2 to 4, vehicle-mounted positioning points are set on the vehicle, and the vehicle-mounted positioning points are used for acquiring positioning information of the vehicle in real time or non-real time. Generally, the vehicle-mounted positioning point can be provided with a navigation device existing in the vehicle, and other vehicle-mounted positioning points and positioning devices can be additionally selected and arranged at other positions. In one embodiment, the center of the selected roof is set as a vehicle-mounted positioning point, and a positioning device is installed.
In some embodiments, as shown in fig. 2 to 5, the calibration points are symmetrically disposed on two sides of the road surface, and a line connecting any two calibration points on each side is parallel to the median line of the road surface. The advantage of this arrangement is that subsequent data processing is facilitated.
In some embodiments, a positioning device is disposed on the calibration point, and the positioning device is used for receiving and sending positioning information of the calibration point.
In some embodiments, the positioning information of all positioning points may also be obtained through pre-measurement, and then used for subsequent use.
It is understood that the operations of the above embodiments may be one-off: once set for the first time, they need not be reset as long as the environment does not change, and when calibration is performed again these operation steps can be omitted.
In some embodiments, image data is acquired, calibration points are identified, and their pixel coordinates are calculated; the world coordinate system coordinates of the vehicle-mounted positioning points O' and O are acquired, the vehicle body heading is calculated, and calibration is triggered when the position relation between the vehicle-mounted positioning point and the calibration points is judged to meet a preset condition; a right-hand rectangular coordinate system, namely the vehicle body coordinate system, is established with the vehicle-mounted positioning point O as the origin, the vehicle body heading as the y-axis, the direction perpendicular to the ground as the z-axis and the direction perpendicular to the vehicle body heading as the x-axis; the world coordinate system coordinates of the calibration points are acquired and their vehicle body coordinate system coordinates are calculated; a transformation matrix M from the vehicle body coordinate system to the pixel coordinate system is calculated from the vehicle body coordinate system coordinates and the pixel coordinates of the calibration points; a conversion function from the vehicle body coordinate system coordinates of any point P to its world coordinate system coordinates is calculated from the world coordinate system coordinates of the vehicle-mounted positioning points O' and O; and the result is stored and the calibration ends.
In some embodiments, video image data is acquired, the number of calibration points is identified, and the positioning information of the vehicle-mounted positioning point is acquired; the positioning information of a first calibration position is acquired at a first calibration distance; the positioning information of a second calibration position is acquired at a second calibration distance; the vehicle body heading is calculated from the positioning information of the first and second calibration positions; the included angle between the vehicle body heading and the line connecting the first calibration point and the second calibration point is calculated; and calibration is triggered when the absolute value of the difference between the included angle and 90° is less than or equal to a first threshold value.
In some embodiments, a camera or a device having a camera function mounted in advance on a vehicle operates to generate image data. The image data may be any one or a combination of video and pictures.
In some embodiments, the image data further includes timestamp information of each frame, and positioning information of the vehicle-mounted positioning point at the shooting time.
In some embodiments, the image data is processed to identify the number of calibration points, and at the same time the positioning information of the vehicle-mounted positioning point is obtained. Identification here refers to the use of image processing techniques to resolve the calibration points in the image data.
In some embodiments, when all the calibration points are identified, the acquisition of the positioning information of the vehicle-mounted positioning point is started, so that the data volume can be reduced.
In some embodiments, the acquired positioning information of the vehicle-mounted positioning point is RTK positioning information. RTK (real-time kinematic) measurement is a relative positioning technology that achieves high-precision dynamic relative positioning mainly through a real-time data link between a reference station and a rover and fast carrier-phase relative positioning resolution.
In some embodiments, the vehicle advances, the calibration points enter the working radius of the camera, and image data are captured and processed; all calibration points are identified from the image data, and acquisition of the positioning information of the vehicle-mounted positioning point and of the calibration points begins. The distance from the vehicle-mounted positioning point to any calibration point is calculated from their positioning information. The vehicle continues to advance, and the calibration points enter the calibration working radius of the camera. All calibration points are identified at the first calibration distance, and the positioning information of the first calibration position at that moment is acquired and recorded. All calibration points are identified at the second calibration distance, and the positioning information of the second calibration position at that moment is acquired and recorded. The first calibration distance is less than or equal to the maximum calibration working radius, and the second calibration distance is greater than or equal to the minimum calibration working radius. The camera working radius refers to the maximum working distance at which the camera can acquire image data meeting the image processing requirements; the calibration working radius refers to the working distance at which the camera can acquire image data meeting the calibration requirements.
In some embodiments, the maximum calibrated working radius is 8m and the minimum calibrated working radius is 3 m.
In some embodiments, the positioning information of the calibration point can be obtained externally in advance and entered manually into the automatic calibration control system; alternatively, it can be transmitted by the positioning device at the calibration point and received as input by the automatic calibration control system.
In some embodiments, the vehicle body heading is calculated based on the positioning information of the first and second calibrated positions. The heading of the vehicle body here refers to a spatial vector characterizing the heading of the vehicle.
In some embodiments, the RTK positioning information of point O' at the first calibration position is acquired through the vehicle-mounted positioning point to obtain its precise world coordinates (x_O', y_O', z_O'); the RTK positioning information of point O at the second calibration position is acquired through the vehicle-mounted positioning point to obtain its precise world coordinates (x_O, y_O, z_O). Subtracting the two sets of world coordinates yields the heading vector O'O.
In some embodiments, the vehicle-mounted positioning point is provided with an RTK positioning device to acquire world coordinate system coordinates; the coordinate of the world coordinate system of the vehicle-mounted positioning point O is the coordinate of the real-time world coordinate system of the RTK positioning device, and the coordinate of the world coordinate system of the vehicle-mounted positioning point O' is the coordinate of the world coordinate system of the RTK positioning device at any historical moment.
In some embodiments, an angle is calculated between the heading of the vehicle body and a line connecting the first and second calibration points.
In some embodiments, the world coordinates of a first calibration point A1 and a second calibration point A2 are acquired to obtain the vector A1A2; from this vector and the heading vector O'O, the included angle between the vehicle body heading and the line connecting the first and second calibration points is calculated. Here the first calibration point A1 and the second calibration point A2 are symmetric calibration points located on the two sides of the road centerline.
In some embodiments, when it is detected that the absolute value of the difference between the included angle and 90° is less than or equal to the first threshold value, a calibration work instruction is issued to the calibration apparatus, triggering calibration.
In some embodiments, the world coordinates of the first calibration point A1 and a third calibration point A3 are acquired to obtain the vector A1A3; from this vector and the heading vector O'O, the included angle between the vehicle body heading and the line connecting the first and third calibration points is calculated. Here the first calibration point A1 and the third calibration point A3 are two calibration points located on the same side of the road surface, and the straight line A1A3 is parallel to the road centerline. In this case, calibration is triggered when the included angle is less than or equal to the first threshold value.
In some embodiments, the first threshold is set at 5 °.
In some embodiments, after calibration is triggered, the calibration parameters are calculated, including but not limited to the camera intrinsic parameters and pose parameters. Then, according to the calibration parameters, any one calibration point is identified by the calibration equipment and its theoretical world coordinates are calculated; when the absolute value of the error between the theoretical and actual world coordinates of the calibration point is smaller than a second threshold value, the calibration parameters are stored and the calibration is terminated.
The theoretical world coordinates of a calibration point are the calibrated theoretical coordinate values calculated by the present technical scheme. The actual world coordinates of the calibration point may be obtained externally in advance and entered manually into the automatic calibration control system, or may be transmitted by the positioning device at the calibration point and received as input by the automatic calibration control system. To compute the absolute error between the theoretical and actual world coordinates, let the theoretical world coordinates of the calibration point be (x_theory, y_theory, z_theory) and the actual coordinates be (x_actual, y_actual, z_actual); the per-axis absolute errors are then |x_theory − x_actual|, |y_theory − y_actual| and |z_theory − z_actual|. When |x_theory − x_actual|, |y_theory − y_actual| and |z_theory − z_actual| are all smaller than the second threshold value, the calibration parameters are stored and the calibration is completed and terminated.
In some embodiments, the second threshold is set at 20 cm.
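A trivial sketch of this acceptance check, with metre units assumed for the 20 cm threshold:

```python
def calibration_accepted(theoretical_xyz, actual_xyz, second_threshold=0.20):
    # Accept when every per-axis absolute error is below the second threshold
    # (20 cm in the embodiment above); metre units are assumed here.
    return all(abs(t - a) < second_threshold for t, a in zip(theoretical_xyz, actual_xyz))
```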
In some embodiments, an RTK device RTK0 placed on top of the vehicle body serves as the vehicle-mounted positioning point, so that the world coordinates of the positioning point can be acquired in real time. Six RTK devices RTK1, RTK2, RTK3, RTK4, RTK5, RTK6 are set on the two sides of the road 5-10 m in front of the vehicle, with world coordinates A_w1, A_w2, A_w3, A_w4, A_w5, A_w6 corresponding to the calibration points A1, A2, A3, A4, A5, A6 respectively. The line through the RTK1, RTK3, RTK5 points and the line through the RTK2, RTK4, RTK6 points are each parallel to the road centerline. The binocular camera is fixedly mounted on the front windshield, ensuring that the six points lie within the camera's field of view. The vehicle is started to acquire the vehicle body positioning point coordinates and is driven forward; the vector formed by two adjacent RTK0 positions, with point O as its head, determines the vehicle body heading, and a left-eye image is captured. A right-hand rectangular coordinate system OXYZ, namely the vehicle body coordinate system, is established with point O as the coordinate origin, the vehicle body heading O'O as the Y-axis, the direction perpendicular to the ground as the Z-axis, and the direction perpendicular to the vehicle body heading as the X-axis.
Through point C1, draw a vertical line perpendicular to the ground plane with foot B1; then |B1C1| equals the Z coordinate of point O in the world coordinate system minus the Z coordinate of point A1 in the world coordinate system. The coordinates of A1 in the vehicle body coordinate system with O as the coordinate origin are thus obtained.
In the same way, the coordinates of A2, A3, A4, A5 and A6 in the vehicle body coordinate system can be obtained. The corresponding image coordinates A1', A2', A3', A4', A5', A6' of the calibration points in the left-eye image are then identified.
Then, a transformation matrix M for converting the coordinates of the vehicle body into the coordinates of the pixel coordinate system is calculated.
Transformation matrix M: the matrix parameters f_x, f_y, c_x and c_y are the camera intrinsic parameters, R is the rotation matrix, and T is the three-dimensional translation vector. m denotes the parameter value at a given position in the matrix, the subscripts being the row and column numbers respectively.
Specifically, the transformation relationship is A = M⁻¹B, where A is a coordinate point in the vehicle body coordinate system and B is a pixel coordinate point.
Since the calibration points all lie on the same road plane, the z coordinate value is a constant. According to the calibration principle, the monocular calibration parameters consist of the intrinsic parameters f_x, f_y, c_x, c_y and the extrinsic rotation matrix R and translation matrix T, 10 parameters in total, where T is the translation vector containing three parameters (translations in the x, y and z directions). Four calibration points in the vehicle body coordinate system allow 12 equations to be listed to solve the intrinsic and extrinsic parameters. The projection matrix from a three-dimensional point in the vehicle body coordinate system to a two-dimensional point in the pixel coordinate system is a 3 × 4 non-invertible matrix, but since z is a constant two parameters are eliminated and the transformation matrix can be rewritten as a 3 × 3 invertible matrix. That is, a coordinate point in the pixel coordinate system can be obtained from a coordinate point in the vehicle body coordinate system, and conversely a coordinate point on the road plane (where z is constant) in the vehicle body coordinate system can be obtained from a coordinate point in the pixel coordinate system.
Namely:
where R_x, R_y and R_z are the rotation matrices about the x-axis, y-axis and z-axis respectively, and T_x, T_y and T_z are the translations along the x-axis, y-axis and z-axis directions respectively.
Assuming the rotation angles about the x-axis, y-axis and z-axis are α, β and θ respectively, then:
Since the z coordinate is equal to 0, the third column is eliminated resulting in the M matrix:
where α, β and θ are the rotation angles about the x-axis, y-axis and z-axis respectively, T_x, T_y and T_z are the translations along the x-axis, y-axis and z-axis directions respectively, f_x denotes the focal length of the camera in the x-axis direction, and f_y denotes the focal length in the y-axis direction.
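Because the road plane makes z constant, the mapping between body-plane points and pixels is a 3 × 3 invertible matrix, i.e. a planar homography. The sketch below estimates such a matrix from point correspondences with a standard DLT step and inverts it for the pixel-to-body direction; this is an illustrative alternative to the explicit f_x, f_y, c_x, c_y, α, β, θ parameterisation above, not the patent's own derivation.

```python
import numpy as np

def estimate_plane_to_pixel_matrix(body_xy, pixel_uv):
    # Estimate the 3x3 matrix mapping homogeneous (x, y, 1) points of the z = const road
    # plane (vehicle body frame) to homogeneous pixel coordinates, from >= 4 correspondences.
    assert len(body_xy) >= 4 and len(body_xy) == len(pixel_uv)
    rows = []
    for (x, y), (u, v) in zip(body_xy, pixel_uv):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    M = vt[-1].reshape(3, 3)
    return M / M[2, 2]

def pixel_to_body(M, u, v):
    # Because M is 3x3 and invertible, a pixel maps back to the road plane in the
    # vehicle body frame (A = M^-1 B in the text).
    p = np.linalg.inv(M) @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```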
And then calculating a conversion function from the coordinates of the coordinate system of the vehicle body to the coordinates of the coordinate system of the world:
Let the coordinates of point O and point O' in the world coordinate system be (x_O, y_O, z_O) and (x_O', y_O', z_O'), and let the clockwise included angle between the vehicle heading and true north of the earth be α2; then:
with μ the included angle with the y-axis,
① x_O' > x_O, y_O' > y_O: α2 = 2π − μ;
② x_O' < x_O, y_O' > y_O: α2 = μ;
③ x_O' > x_O, y_O' < y_O: α2 = π + μ;
④ x_O' < x_O, y_O' < y_O: α2 = π − μ.
Let the coordinates of point P in the vehicle body coordinate system be (x_P, y_P, z_P); then the coordinates of point P in the world coordinate system are:
x = x_P·cos α2 − y_P·sin α2 + x_O
y = x_P·sin α2 + y_P·cos α2 + y_O
z = z_P + z_O.
From this, the theoretical world coordinates of the calibration point can be calculated.
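Chaining the pieces together, a hedged sketch of how the theoretical world coordinates of a calibration point could be obtained from its pixel coordinates (reusing the hypothetical helpers pixel_to_body and body_to_world sketched earlier):

```python
def theoretical_world_coordinate(M, pixel, road_plane_z, o_now, alpha2):
    # pixel -> vehicle body coordinates on the road plane (pixel_to_body, sketched above)
    # -> world coordinates (body_to_world, sketched earlier); the result can then be
    # compared against the measured world coordinates of the calibration point.
    u, v = pixel
    x_b, y_b = pixel_to_body(M, u, v)
    return body_to_world((x_b, y_b, road_plane_z), o_now, alpha2)
```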
In some embodiments, the monochrome small cube of the three-dimensional calibration object is placed next to the calibration point in the conventional way. Conventional placement means that one edge of the monochrome small cube of the three-dimensional calibration object passes through the calibration point. Generally, conventional technical solutions strictly require that a particular right-angle vertex of the monochrome small cube of the three-dimensional calibration object coincide with the calibration point.
In some embodiments, the calibration point coincides with the midpoint of any one edge of the monochrome small cube of the three-dimensional calibration object.
In some embodiments, the monochrome small cube of the three-dimensional calibration object is parallel to the centerline of the camera's optical axis. The advantage of this arrangement is that it facilitates subsequent data processing.
In some embodiments, as shown in fig. 6, a rectangular strip is provided as the preset marker and is placed according to the following rules: the midpoint of one side of the rectangular strip coincides with the calibration point and abuts the calibrated edge of the monochrome small cube of the three-dimensional calibration object; the two side edges of the rectangular strip do not extend beyond the monochrome small cube of the three-dimensional calibration object. The rectangular strip can be made of ordinary monochrome paper, with an arbitrary aspect ratio, and its length and width are smaller than the edge length of the monochrome small cube of the three-dimensional calibration object. Specifically, the color of the rectangular strip may be chosen according to the environment in which the calibration point is to be recognized; black is generally selected. The rectangular strip can be fixed beside the calibration point by gluing or the like.
It can be understood that the preset marker can be set once, that is, set for the first time, and the rectangular strip can be retained later without resetting; when the camera calibration is performed again, the step of setting the rectangular strip can be omitted.
In some embodiments, the camera is operated to capture video or pictures of the front within the working radius to obtain image data.
In some embodiments, the image data is processed to generate a circumscribed rectangle of the calibration object in the image; the lower side of the circumscribed rectangle is extended downward by a preset pixel coordinate length to generate a region of interest; the region of interest is traversed to generate the marked region of the preset marker associated with the calibration object; and the marked region is traversed to generate a centerline, the pixel coordinate of the upper endpoint of the centerline being the pixel coordinate of the calibration point.
In some embodiments, the image data is processed by a semantic segmentation method of deep learning. In general, the following steps may be taken: collecting video data containing an identification target; converting the video data into picture data; performing target marking on the picture by using a marking tool to generate sample data; training by using sample data to generate a network model; and calling the model to identify the target.
In some embodiments, the deep-learning neural network model is built as follows: first, the pre-trained model obtained from VGG16 training is fine-tuned and a trained FCN-32s model is output; the FCN-32s model is then used as the pre-trained model, trained with new samples, and a trained FCN-16s model is output; the FCN-16s model is used as the pre-trained model, trained with new samples, and a trained FCN-8s model is output; the FCN-8s model is used as the pre-trained model, trained with new samples, and the trained FCN-4s model is output as the target model. Here, FCN stands for fully convolutional network, and VGG for Visual Geometry Group. It should be noted that models obtained by training other deep neural networks such as GoogLeNet can also be used.
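As a rough sketch of one such fine-tuning stage (PyTorch, with the loss and optimiser choices assumed rather than taken from the patent), each stage initialises from the model produced by the previous stage and trains on new samples:

```python
import torch
import torch.nn as nn

def finetune_stage(model, dataloader, num_epochs=10, lr=1e-4, device="cuda"):
    # One fine-tuning stage: the model produced by the previous stage is used as the
    # pre-trained model and trained on new samples (FCN-32s -> 16s -> 8s -> 4s in the text).
    # The model is assumed to return per-pixel class logits of shape (N, C, H, W);
    # the cross-entropy loss and Adam optimiser are assumptions, not taken from the patent.
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(num_epochs):
        for images, targets in dataloader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimizer.step()
    return model
```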
In some embodiments, the image data is processed with findContours to obtain all circumscribed quadrangles of the small-cube image of the three-dimensional calibration object, and the circumscribed rectangle with the minimum area is output.
In some embodiments, the region of interest is generated by extending the lower edge of the output bounding rectangle by a preset pixel coordinate length in pixel coordinates. Generally, the consideration factor of the preset pixel coordinate length value is mainly the pixel error size when the circumscribed rectangle corresponding to the three-dimensional calibration object small block image is obtained. The pixel error refers to the pixel difference between the theoretical value and the actual value of the three-dimensional calibration object small square image.
In some embodiments, the preset pixel coordinate length is in the range 10 ≤ length ≤ 20.
In some embodiments, the pixel coordinate length is 15, i.e., the lower edge of the circumscribed rectangle extends downward by 15 unit pixel coordinate lengths.
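An OpenCV sketch of this step, assuming the segmentation network outputs a binary mask of the calibration-object cube (the function name and the mask input are illustrative assumptions):

```python
import cv2

def calibration_object_roi(binary_mask, extend_px=15):
    # From a binary mask of the calibration object cube (e.g. the segmentation output),
    # take the minimum-area circumscribed rectangle and extend its lower edge downward
    # by extend_px pixels; the returned band (x, y, w, h) should contain the preset
    # rectangular-strip marker. OpenCV >= 4 is assumed for the findContours signature.
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    rects = [cv2.boundingRect(c) for c in contours]           # (x, y, w, h)
    x, y, w, h = min(rects, key=lambda r: r[2] * r[3])        # smallest-area rectangle
    return (x, y + h, w, extend_px)                           # band just below the cube
```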
In some embodiments, the predetermined marker associated with the calibration object is a rectangular strip; the upper edge of the rectangular strip is connected with the lower edge of the calibration object, the upper midpoint of the rectangular strip is overlapped with the calibration point, and the side length of the rectangular strip is smaller than that of the calibration object, so that the left side and the right side of the image of the rectangular strip are not more than the left side and the right side of the image of the calibration object in the image.
In some embodiments, the three-dimensional volumetric calibration object is parallel to the optical axis of the camera. And traversing the region of interest to generate a marking region of the preset marker related to the marker.
In some embodiments, the marked region of the preset marker associated with the calibration object is generated as follows. Let I(x, y) be any pixel point in the region of interest, and let I(x − δ, y) and I(x + δ, y) be the two pixel points symmetric to I(x, y) in the horizontal direction, where δ is the pixel coordinate length of the preset marker in the image;
then,
d1 = I(x, y) − I(x − δ, y)
d2 = I(x, y) − I(x + δ, y)
where d1 and d2 are the pixel differences between the pixel point and its two symmetric pixel points;
D = d1 + d2 − |I(x + δ, y) − I(x − δ, y)|
where D is the sum of the pixel differences between the pixel point and its two symmetric points, minus the absolute pixel difference between the two symmetric points;
let L(x, y) be the binarized pixel value of the pixel point: when d1 > 0, d2 > 0 and D > L are all satisfied, L(x, y) = 255; otherwise L(x, y) = 0;
here the binarization threshold is L = α × I(x, y), where α is the threshold coefficient.
In some embodiments, the threshold coefficient α is in the range of 0.3 ≦ α ≦ 0.8.
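Putting the binarization rule and the centerline step together, a minimal Python sketch follows; the centerline-by-mean-column step is an assumption, since the text only says the marked region is traversed to generate a centerline.

```python
import numpy as np

def marker_region_and_calibration_pixel(roi_gray, delta, alpha=0.5):
    # Binarise the region of interest with the symmetric-difference rule above
    # (d1 > 0, d2 > 0 and D > alpha * I(x, y) -> 255), then take the marked region's
    # vertical centerline; its upper endpoint is taken as the calibration point pixel.
    roi = roi_gray.astype(np.int32)
    h, w = roi.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(delta, w - delta):
            center = roi[y, x]
            left, right = roi[y, x - delta], roi[y, x + delta]
            d1, d2 = center - left, center - right
            D = d1 + d2 - abs(right - left)
            if d1 > 0 and d2 > 0 and D > alpha * center:
                mask[y, x] = 255
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return mask, None
    center_col = int(round(xs.mean()))            # centerline of the marked region
    on_centerline = ys[xs == center_col]
    top_row = int(on_centerline.min()) if on_centerline.size else int(ys.min())
    return mask, (center_col, top_row)            # upper endpoint of the centerline
```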
In another aspect, in some embodiments, a storage medium is provided. The storage medium stores computer program instructions that, when executed by a processor, perform the following steps at least once: acquiring video image data, identifying the number of calibration points, and acquiring the positioning information of the vehicle-mounted positioning point; acquiring the positioning information of a first calibration position at a first calibration distance; acquiring the positioning information of a second calibration position at a second calibration distance; calculating the vehicle body heading from the positioning information of the first and second calibration positions; calculating the included angle between the vehicle body heading and the line connecting the first calibration point and the second calibration point; and sending a calibration instruction to trigger calibration when the absolute value of the difference between the included angle and 90° is less than or equal to a first threshold value.
In some embodiments, the storage medium stores computer program instructions that, when executed by the processor, perform the following steps at least once: identifying any one of the calibration points and calculating its theoretical world coordinates; and when the absolute value of the error between the theoretical and actual world coordinates of the calibration point is smaller than a second threshold value, storing the calibration parameters and terminating the calibration.
In another aspect, in some embodiments, an automatic calibration control system is provided, comprising a calibration point identification module, a vehicle body heading judgment module, a triggering module and a verification module. The calibration point identification module is used for acquiring and processing video image data and identifying calibration points; the vehicle body heading judgment module is used for acquiring and processing the positioning information of the vehicle-mounted positioning point and judging the position relation between the vehicle body heading and the calibration points; the triggering module is used for sending the calibration instruction and starting calibration; and the verification module is used for verifying the calibration result.
The various embodiments or features mentioned herein may be combined with each other as additional alternative embodiments without conflict, within the knowledge and ability level of those skilled in the art, and a limited number of alternative embodiments formed by a limited number of combinations of features not listed above are still within the scope of the present disclosure, as understood or inferred by those skilled in the art from the figures and above.
Finally, it is emphasized that the above-mentioned embodiments, which are typical and preferred embodiments of the present invention, are only used for explaining and explaining the technical solutions of the present invention in detail for the convenience of the reader, and are not used to limit the protection scope or application of the present invention.
Therefore, any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be covered within the protection scope of the present invention.
Claims (10)
1. A vehicle-mounted monocular calibration method is characterized in that: the method comprises the following steps:
acquiring image data, identifying a calibration point, and calculating a pixel coordinate of the calibration point;
acquiring world coordinate system coordinates of the vehicle-mounted positioning points O' and O, calculating the vehicle body heading, and triggering calibration when the position relation between the vehicle-mounted positioning point and the calibration points is judged to meet a preset condition;
establishing a right-hand rectangular coordinate system, namely a vehicle body coordinate system, by taking the vehicle-mounted positioning point O as an origin, the vehicle body course as a y-axis, the vertical ground direction as a z-axis and the vertical vehicle body course direction as an x-axis;
acquiring world coordinate system coordinates of the calibration points, and calculating vehicle body coordinate system coordinates of the calibration points;
calculating a transformation matrix M from the vehicle body coordinate system to the pixel coordinate system according to the vehicle body coordinate system coordinates and the pixel coordinates of the calibration points;
calculating a conversion function from the coordinates of the vehicle body coordinate system of the point P to the coordinates of the world coordinate system by the coordinates of the world coordinate systems of the vehicle-mounted positioning points O' and O and the coordinates of the vehicle body coordinate system of any point P;
and storing the result and ending the calibration.
2. The method of claim 1, wherein: the calibration point is identified and its pixel coordinates are calculated through the following steps:
processing the image data to generate a circumscribed rectangle of the calibration object in the image;
extending the lower side of the circumscribed rectangle downward by a preset pixel coordinate length to generate a region of interest;
traversing the region of interest to generate the marked region of the preset marker associated with the calibration object;
and traversing the marked region to generate a centerline, the pixel coordinate of the upper endpoint of the centerline being the pixel coordinate of the calibration point.
3. The method of claim 1, wherein: the RTK positioning device is installed on the vehicle-mounted positioning point to acquire world coordinate system coordinates; the coordinates of the world coordinate system of the vehicle-mounted positioning point O are the coordinates of the real-time world coordinate system of the RTK positioning device, and the coordinates of the world coordinate system of the vehicle-mounted positioning point O' are the coordinates of the world coordinate system of the RTK positioning device at any historical moment.
4. The method of claim 3, wherein: the RTK positioning device is installed in the center of the top of the vehicle.
5. The method of claim 1, wherein: the vehicle body heading is the direction defined by a space vector indicating the advancing direction of the vehicle; the judgment that the position relation with the calibration points meets a preset condition is carried out by calculating the included angle α between the vehicle body heading and a line connecting calibration points, and calibration is triggered when one of the following preset conditions is met:
preset condition 1: a first included angle between the heading of the vehicle body and a connecting line of any two parallel calibration points on the same side is less than or equal to a first threshold value,
preset condition 2: the absolute value of the difference between 90° and a second included angle between the vehicle body heading and a connecting line of any two opposite-side symmetric calibration points is less than or equal to the first threshold value.
6. The method of claim 5, wherein: the first threshold value is 5 °.
7. The method of claim 1, wherein: the world coordinate system coordinates of the calibration points are obtained by any one of the following methods:
method 1: the world coordinate system coordinates of the calibration point are obtained by manual measurement,
method 2: an RTK positioning device is arranged at the calibration point and transmits them.
8. The method of claim 1, wherein: the vehicle body coordinate system coordinates of the calibration point are calculated as follows: take an arbitrary calibration point A1 and let the included angle between vector OA1 and the heading vector O'O be α1; then:
through point C1, draw a vertical line perpendicular to the ground plane with foot B1; |B1C1| equals the Z coordinate of point O in the world coordinate system minus the Z coordinate of point A1 in the world coordinate system;
the coordinates of A1 in the vehicle body coordinate system with O as the coordinate origin are then obtained.
10. The method of claim 1, wherein: the conversion function from the vehicle body coordinate system coordinates of point P to world coordinate system coordinates is:
let the coordinates of point O and point O' in the world coordinate system be (x_O, y_O, z_O) and (x_O', y_O', z_O'), and let the clockwise included angle between the vehicle heading and true north of the earth be α2; then:
with μ the included angle with the y-axis,
① x_O' > x_O, y_O' > y_O: α2 = 2π − μ
② x_O' < x_O, y_O' > y_O: α2 = μ
③ x_O' > x_O, y_O' < y_O: α2 = π + μ
④ x_O' < x_O, y_O' < y_O: α2 = π − μ
Let the coordinates of point P in the vehicle body coordinate system be (x_P, y_P, z_P); then the coordinates of point P in the world coordinate system are:
x = x_P·cos α2 − y_P·sin α2 + x_O
y = x_P·sin α2 + y_P·cos α2 + y_O
z = z_P + z_O.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910833525.6A CN111145262B (en) | 2019-09-04 | 2019-09-04 | Vehicle-mounted-based monocular calibration method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910833525.6A CN111145262B (en) | 2019-09-04 | 2019-09-04 | Vehicle-mounted-based monocular calibration method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111145262A true CN111145262A (en) | 2020-05-12 |
CN111145262B CN111145262B (en) | 2024-01-26 |
Family
ID=70516804
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910833525.6A Active CN111145262B (en) | 2019-09-04 | 2019-09-04 | Vehicle-mounted-based monocular calibration method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111145262B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112550014A (en) * | 2020-11-20 | 2021-03-26 | 开迈斯新能源科技有限公司 | Automatic charging method, device and system |
CN113033441A (en) * | 2021-03-31 | 2021-06-25 | 广州敏视数码科技有限公司 | Pedestrian collision early warning method based on wide-angle imaging |
CN113820698A (en) * | 2021-09-13 | 2021-12-21 | 广州小鹏自动驾驶科技有限公司 | Obstacle ranging method and device, electronic equipment and readable medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106740872A (en) * | 2017-01-13 | 2017-05-31 | 驭势科技(北京)有限公司 | Intelligent automobile sensor self-checking system and method, accessory system and intelligent automobile |
DE102016104729A1 (en) * | 2016-03-15 | 2017-09-21 | Connaught Electronics Ltd. | Method for extrinsic calibration of a camera, computing device, driver assistance system and motor vehicle |
CN107862719A (en) * | 2017-11-10 | 2018-03-30 | 未来机器人(深圳)有限公司 | Scaling method, device, computer equipment and the storage medium of Camera extrinsic |
CN108805934A (en) * | 2017-04-28 | 2018-11-13 | 华为技术有限公司 | A kind of method for calibrating external parameters and device of vehicle-mounted vidicon |
CN108898638A (en) * | 2018-06-27 | 2018-11-27 | 江苏大学 | A kind of on-line automatic scaling method of vehicle-mounted camera |
CN109767475A (en) * | 2018-12-28 | 2019-05-17 | 广州小鹏汽车科技有限公司 | A kind of method for calibrating external parameters and system of sensor |
CN109991613A (en) * | 2017-12-29 | 2019-07-09 | 长城汽车股份有限公司 | Localization method, positioning device, vehicle and readable storage medium storing program for executing |
CN111145260A (en) * | 2019-08-30 | 2020-05-12 | 广东星舆科技有限公司 | Vehicle-mounted binocular calibration method |
-
2019
- 2019-09-04 CN CN201910833525.6A patent/CN111145262B/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016104729A1 (en) * | 2016-03-15 | 2017-09-21 | Connaught Electronics Ltd. | Method for extrinsic calibration of a camera, computing device, driver assistance system and motor vehicle |
CN106740872A (en) * | 2017-01-13 | 2017-05-31 | 驭势科技(北京)有限公司 | Intelligent automobile sensor self-checking system and method, accessory system and intelligent automobile |
CN108805934A (en) * | 2017-04-28 | 2018-11-13 | 华为技术有限公司 | A kind of method for calibrating external parameters and device of vehicle-mounted vidicon |
CN107862719A (en) * | 2017-11-10 | 2018-03-30 | 未来机器人(深圳)有限公司 | Scaling method, device, computer equipment and the storage medium of Camera extrinsic |
CN109991613A (en) * | 2017-12-29 | 2019-07-09 | 长城汽车股份有限公司 | Localization method, positioning device, vehicle and readable storage medium storing program for executing |
CN108898638A (en) * | 2018-06-27 | 2018-11-27 | 江苏大学 | A kind of on-line automatic scaling method of vehicle-mounted camera |
CN109767475A (en) * | 2018-12-28 | 2019-05-17 | 广州小鹏汽车科技有限公司 | A kind of method for calibrating external parameters and system of sensor |
CN111145260A (en) * | 2019-08-30 | 2020-05-12 | 广东星舆科技有限公司 | Vehicle-mounted binocular calibration method |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112550014A (en) * | 2020-11-20 | 2021-03-26 | 开迈斯新能源科技有限公司 | Automatic charging method, device and system |
CN113033441A (en) * | 2021-03-31 | 2021-06-25 | 广州敏视数码科技有限公司 | Pedestrian collision early warning method based on wide-angle imaging |
CN113033441B (en) * | 2021-03-31 | 2024-05-10 | 广州敏视数码科技有限公司 | Pedestrian collision early warning method based on wide-angle imaging |
CN113820698A (en) * | 2021-09-13 | 2021-12-21 | 广州小鹏自动驾驶科技有限公司 | Obstacle ranging method and device, electronic equipment and readable medium |
CN113820698B (en) * | 2021-09-13 | 2024-04-16 | 广州小鹏自动驾驶科技有限公司 | Obstacle ranging method, obstacle ranging device, electronic equipment and readable medium |
Also Published As
Publication number | Publication date |
---|---|
CN111145262B (en) | 2024-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109074668B (en) | Path navigation method, related device and computer readable storage medium | |
US20230360260A1 (en) | Method and device to determine the camera position and angle | |
CN110146869B (en) | Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium | |
KR102121974B1 (en) | Disaster damage investigation·analysis system using drone and disaster damage investigation·analysis method | |
US11781863B2 (en) | Systems and methods for pose determination | |
CN113657224B (en) | Method, device and equipment for determining object state in vehicle-road coordination | |
US10909395B2 (en) | Object detection apparatus | |
CN112189225B (en) | Lane line information detection apparatus, method, and computer-readable recording medium storing computer program programmed to execute the method | |
CN110842940A (en) | Building surveying robot multi-sensor fusion three-dimensional modeling method and system | |
US20190392228A1 (en) | Integrated sensor calibration in natural scenes | |
CN101216937B (en) | Parameter calibration method for moving containers on ports | |
CN110969663A (en) | Static calibration method for external parameters of camera | |
US10996337B2 (en) | Systems and methods for constructing a high-definition map based on landmarks | |
CN112232275B (en) | Obstacle detection method, system, equipment and storage medium based on binocular recognition | |
CN111145262A (en) | Vehicle-mounted monocular calibration method | |
CN111145260A (en) | Vehicle-mounted binocular calibration method | |
CN108387206A (en) | A kind of carrier three-dimensional attitude acquisition method based on horizon and polarised light | |
CN114755662A (en) | Calibration method and device for laser radar and GPS with road-vehicle fusion perception | |
CN109345591A (en) | A kind of vehicle itself attitude detecting method and device | |
CN114529585A (en) | Mobile equipment autonomous positioning method based on depth vision and inertial measurement | |
CN112446915A (en) | Picture-establishing method and device based on image group | |
JP4132068B2 (en) | Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus | |
EP3816938A1 (en) | Region clipping method and recording medium storing region clipping program | |
CN115100290B (en) | Monocular vision positioning method, monocular vision positioning device, monocular vision positioning equipment and monocular vision positioning storage medium in traffic scene | |
CN111145263A (en) | Vehicle-mounted-based automatic camera calibration method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |