
CN103065323A - Subsection space aligning method based on homography transformational matrix - Google Patents

Subsection space aligning method based on homography transformational matrix Download PDF

Info

Publication number
CN103065323A
CN103065323A
Authority
CN
China
Prior art keywords
wave radar
millimeter wave
distance
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100130458A
Other languages
Chinese (zh)
Other versions
CN103065323B (en)
Inventor
付梦印
靳璐
杨毅
宗民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201310013045.8A priority Critical patent/CN103065323B/en
Publication of CN103065323A publication Critical patent/CN103065323A/en
Application granted granted Critical
Publication of CN103065323B publication Critical patent/CN103065323B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a segmented space alignment method based on a homography transformation matrix. The method divides a large calibration distance into segments and obtains, for each segment, the homography transformation matrix between the camera coordinate system and the millimeter wave radar coordinate system. This avoids the error caused in the prior art by representing the coordinate relationship of the two sensors with a single homography transformation matrix, and achieves spatial alignment for target detection over a large calibration distance. The relationships among the different coordinate systems of the camera and the millimeter wave radar are derived, and the coordinate relationship between the two sensors is finally represented by a homography transformation matrix N. Target data obtained by the two sensors are used to solve N directly, which avoids estimating the camera intrinsic parameter matrix (composed of the scaling factor, focal length, and the like) and the camera extrinsic parameter matrix (composed of the rotation matrix and translation vector). The calculation process is thereby greatly simplified and computation time is saved.

Description

Segmented space alignment method based on homography transformation matrix
Technical Field
The invention relates to the technical field of unmanned vehicle multi-sensor information fusion, in particular to a segmented space alignment method based on a homography transformation matrix.
Background
An unmanned vehicle, also called an outdoor intelligent mobile robot, is a highly intelligent device integrating environment perception, dynamic decision and planning, and behavior control and execution; the rapidity and accuracy of its environment perception are inseparable from multi-sensor information fusion technology. In multi-sensor information fusion, a computer makes full use of sensor resources: by reasonably governing and using various measurement information, complementary and redundant information is combined in space and time according to certain optimization criteria to produce a consistent interpretation or description of the observed environment and generate a new fusion result. In the environment perception module, the vision sensor and the millimeter wave radar are two commonly used sensors. The vision sensor has a wide detection range and can obtain the size and contour information of a target in the external environment, but it is easily influenced by external factors and suffers from target loss. The millimeter wave radar has high resolution and strong anti-interference capability and can accurately obtain the distance, relative speed, and azimuth angle of a target in various weather conditions, but it cannot identify the shape and size of the target. Their complementary characteristics are therefore exploited to fuse the information of the two sensors and obtain more comprehensive and reliable environment information. Spatial alignment is a prerequisite for this fusion: its essence is to estimate the transformation matrix between the camera and radar coordinate systems.
At present, in the traditional spatial alignment method, within a calibration distance range of 20 meters, points on a target are detected at randomly chosen distances so as to obtain the target's representation in the camera coordinate system and in the radar coordinate system respectively. From the data obtained by the two sensors, the camera intrinsic parameter matrix (formed by the scaling factor, focal length, and the like) and the camera extrinsic parameter matrix (formed by the rotation matrix and translation vector) are estimated; the calculation process is relatively complicated and errors are easily introduced. Moreover, when this algorithm is used to solve the transformation matrix for a target at a calibration distance of more than 20 meters, the large range causes huge errors and spatial alignment fails.
Disclosure of Invention
In view of this, the invention provides a segmented space alignment method based on a homography transformation matrix, which can realize spatial alignment of a camera and a millimeter wave radar mounted on an unmanned vehicle over a larger calibration distance range, and which simplifies the calculation process of solving the homography transformation matrix.
The invention relates to a segmented space alignment method based on a homography transformation matrix, which comprises the following steps:
step 1: establishing a homography transformation matrix-based relation between a camera coordinate system and a millimeter wave radar coordinate system:
defining O′uv as the image coordinate system of the camera, where O′ is located at the upper left corner of the imaging plane of the camera; the u axis is parallel to the camera scan line direction; the v axis is perpendicular to the camera scan line direction;
defining O''''ρθ as the millimeter wave radar polar coordinate system, where O'''' is the center of the millimeter wave radar surface; ρ is the straight-line distance between the target and the millimeter wave radar; θ is the angle by which the target deviates from the center line of the radar scanning plane. The relationship between the camera image coordinate system O′uv and the radar polar coordinate system O''''ρθ is expressed as:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where

$$N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$$

is defined as the homography transformation matrix;
step 2: determining a proper calibration distance between the unmanned vehicle and a calibration target:
definition O "" XrYrZrThe rectangular coordinate system of the millimeter wave radar is represented, and O' is the center of the surface of the millimeter wave radar; y isrThe axis is the central line of a scanning plane of the millimeter wave radar, is vertical to the surface of the millimeter wave radar and points to the right front; xrAxis and YrVertical, pointing to the right; zrAxis perpendicular to Xr、YrA determined plane pointing upwards;
the relationship between the millimeter wave radar rectangular coordinate system and the millimeter wave radar polar coordinate system is as follows:
$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)'$$
The projection of the distance between the calibration target and the unmanned vehicle onto the longitudinal axis $Y_r$ of the millimeter wave radar rectangular coordinate system is called the calibration distance. Within the detection range of the millimeter wave radar, a suitable calibration distance L is determined according to the maximum movement speed of the unmanned vehicle. The calibration distance L is divided from near to far into a short-distance range L1 and a long-distance range L2; L1 is divided into m1 segments and L2 into m2 segments, ensuring that L1/m1 is smaller than L2/m2;
and step 3: respectively acquiring images and data information of a calibration target through a camera and a millimeter wave radar which are loaded in the unmanned vehicle:
Place the calibration target in turn at the different segments into which the calibration distance L was divided in step 2, and detect the target with the millimeter wave radar and the camera at each of the m1+m2 segment distances. During detection, for the target at each segment distance, divide the target along the $Y_r$ axis direction into m rows and divide each row along the $X_r$ axis direction into h equal sections; control the millimeter wave radar to acquire the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each section and the camera to capture the image data $f_k^M$ of each section, where M = 1, …, (m1+m2) and k = 1, 2, …, m·h;
Step 4: for the image data $f_k^M$ of each section in each segment obtained by the camera in step 3, calculate the centroid coordinates $(u_k^M, v_k^M)$ of each image separately;
Step 5: solving the homography spatial transformation matrix representing the relation between the millimeter wave radar coordinate system and the camera coordinate system:
For each segment distance into which the whole calibration distance L is divided, the millimeter wave radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all the small sections and the corresponding camera image data $(u_k^M, v_k^M)$ form a data set within each segment; substituting each data set into equations (7) and (7)′ respectively yields:
$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \quad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \quad (9)$$

and

$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \quad (10)$$
Define

$$P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}, \quad N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T,$$

$$U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T, \quad V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T, \quad I_{k\times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T.$$

Then the least-squares solution of the homography spatial transformation matrix $N^M$ can be expressed as $N^M = \begin{bmatrix} N_1^{M\,T} & N_2^{M\,T} & N_3^{M\,T} \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = (P^{M\,T} P^M)^{-1} P^{M\,T} U^M, \quad N_2^M = (P^{M\,T} P^M)^{-1} P^{M\,T} V^M, \quad N_3^M = (P^{M\,T} P^M)^{-1} P^{M\,T} I_{k\times 1};$$
step 6: the space alignment of the vision sensor and the millimeter wave radar is realized:
According to the actual distance of the calibration target scanned by the millimeter wave radar, determine which segment of step 2 the distance falls in, and look up the homography spatial transformation matrix corresponding to that distance among the m1+m2 results calculated in step 5 to realize spatial alignment.
In step 2, when the calibration distance is 50 meters, the short-distance range 0–20 m is divided into 4 segments and the long-distance range 20–50 m is divided into 3 segments.
The method for calculating the centroid coordinates of each small segment of image in the calibration target image in the step 4 is as follows:
s40, manually selecting a candidate area containing a calibration target;
s41, carrying out median filtering on the candidate area image to eliminate the noise in the image;
s42, performing Sobel operator edge detection on the candidate region image to obtain a binarized calibration target edge image;
S43, in the image coordinate system in units of pixels, find along the u-axis the pixel coordinates $u_{\min}, u_{\max}$ of the minimum and maximum coordinates in the calibration target edge image, and along the v-axis the pixel coordinates $v_{\min}, v_{\max}$ of the minimum and maximum coordinates; connect these 4 points with straight lines in clockwise or counterclockwise order to form a quadrilateral region, and within the quadrilateral region use the formulas

$$u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \quad \text{and} \quad v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$$

to calculate the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target, where $f_k^M(i,j)$ represents the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th section of the target at the M-th distance segment.
The invention has the following beneficial effects:
1) By segmenting a larger calibration distance and solving the homography transformation matrix between the camera and millimeter wave radar coordinate systems separately for each segment, the error caused in the prior art by representing the coordinate relation of the two sensors with a single homography transformation matrix is avoided, and spatial alignment for target detection over a larger calibration distance is achieved;
2) by deriving the relationships among the different coordinate systems of the camera and the millimeter wave radar, finally representing the coordinate relationship with a homography transformation matrix N, and solving N from the target data obtained by the two sensors, estimation of the camera intrinsic parameter matrix (formed by the scaling factor, focal length, and the like) and the camera extrinsic parameter matrix (formed by the rotation matrix and translation vector) is avoided; the calculation process is greatly simplified and computation time is saved;
3) since the sensors on the unmanned vehicle pay different degrees of attention to near and distant targets, different segmentation granularities are used for the near and distant ranges, which reduces the amount of calculation while ensuring spatial alignment;
4) in the image coordinate system, after the edge image of the target is obtained, the pixel points with the minimum and maximum values in the two axial directions are found, the 4 pixel points enclose a quadrilateral, and the centroid of the quadrilateral is found.
Drawings
FIG. 1 is a schematic view of the camera pinhole model;
FIG. 2 is a schematic diagram of a millimeter wave radar coordinate system;
FIG. 3 is a schematic diagram of a mapping relationship between an image coordinate system and a millimeter wave radar rectangular coordinate system.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The invention provides a segmented space alignment method based on a homography transformation matrix, which comprises the following steps:
step 1: establishing a homography transformation matrix-based relation between a camera coordinate system and a millimeter wave radar coordinate system:
As shown in FIG. 1, $OX_cY_cZ_c$ denotes the camera coordinate system, with origin O at the optical center of the camera; the $X_c$ axis is parallel to the camera scan line direction and points in the direction of increasing scan pixels; the $Y_c$ axis is perpendicular to the scan line direction and points in the direction of increasing scan lines; the $Z_c$ axis is perpendicular to the imaging plane and points along the camera's line of sight. O′uv denotes the image coordinate system in units of pixels, with O′ at the upper left corner of the imaging plane; the u axis is parallel to $X_c$; the v axis is parallel to $Y_c$. O″xy denotes the image coordinate system in millimeters, with O″ at the focal point (principal point) of the image plane; the x axis is parallel to $X_c$; the y axis is parallel to $Y_c$. f is the focal length of the camera and I denotes the imaging plane. Suppose point P has coordinates $(X_c, Y_c, Z_c)$, $(x, y)$ and $(u, v)$ in the coordinate systems $OX_cY_cZ_c$, O″xy and O′uv respectively. From the geometric proportionality of point P between $OX_cY_cZ_c$ and O″xy:

$$x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}$$
the above relationship is expressed in homogeneous form as:
$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} \quad (1)$$
The scaling and translation relations of point P between the coordinate systems O″xy and O′uv are $x = S(u - u_0)$ and $y = S(v - v_0)$, expressed in homogeneous form as:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = S \begin{bmatrix} 1 & 0 & -u_0 \\ 0 & 1 & -v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \quad (2)$$

where S is a scaling factor and $(u_0, v_0)$ are the coordinates of the origin O″ of O″xy in the coordinate system O′uv. Suppose the world coordinate system is $O'''X_wY_wZ_w$ and the coordinates of point P in this coordinate system are $(X_w, Y_w, Z_w)$; the relationship between the two coordinate systems is expressed in homogeneous form as:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \quad (3)$$
where R and T represent the rotation matrix and the translation vector, respectively. Combining formulas (1), (2) and (3) yields:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \quad (4)$$

where $\beta = Z_c \cdot S$.
As shown in FIG. 2, O "" XrYrZrThe rectangular coordinate system of the millimeter wave radar is represented, and O' is the center of the surface of the millimeter wave radar; y isrThe axis is the central line of a scanning plane of the millimeter wave radar, is vertical to the surface of the millimeter wave radar and points to the right front; xrAxis and YrVertical, pointing to the right; zrAxis perpendicular to Xr、YrA determined plane, pointing upwards. O "" rho θ represents the millimeter wave radar polar coordinate system, the origin and the coordinate system O "" XrYrZrThe original points of the two are overlapped; rho is the linear distance between the target and the millimeter wave radar; theta is the angle of the target deviating from the central line of the millimeter wave radar scanning plane. Point P is at O "" ρ θ and O "" XrYrZrThe coordinates of (p, θ) and (X) arer,Yr,Zr) Since the scanning plane of the millimeter wave radar is a two-dimensional plane next to the coordinate system O' rho theta, Z is presentr0, point P is at O "" ρ θ and O "" XrYrZrThe following trigonometric relationship is expressed as:
$$\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 0 \end{bmatrix} \quad (5)$$
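As a small illustration, the polar-to-rectangular conversion of equation (5) can be sketched in Python; the function name is mine, not the patent's:

```python
import math

def radar_polar_to_rect(rho, theta):
    """Convert a radar return (rho, theta) into rectangular radar
    coordinates per equation (5); theta is the deviation from the
    scanning-plane center line, in radians."""
    x_r = rho * math.sin(theta)
    y_r = rho * math.cos(theta)
    z_r = 0.0  # the radar scanning plane is two-dimensional
    return x_r, y_r, z_r
```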
Taking the millimeter wave radar rectangular coordinate system as the world coordinate system and using it as an intermediate variable, combining equations (4) and (5), the relationship between the image coordinate system O′uv and the millimeter wave radar polar coordinate system O''''ρθ is expressed as:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 0 \\ 1 \end{bmatrix} \quad (6)$$
As shown in FIG. 3, when the camera and the millimeter wave radar observe the same target, the target point scanned by the radar can be projected into the image collected by the camera using equation (6). However, the scaling factor S, the camera focal length f and other intrinsic parameters, as well as the rotation matrix R, the translation vector T and other extrinsic parameters, would all need to be estimated, and the calculation process is complicated. To simplify the calculation, an equivalent transformation relation N is used to represent the relation between O′uv and O''''ρθ, expressed as:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where

$$N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$$

is referred to as the homography transformation matrix.
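For illustration only, applying the homography transformation matrix N of equation (7) to a radar return can be sketched as follows (the helper name is mine; in the patent's least-squares setup the third homogeneous component is constrained to 1, so the final normalization is a safeguard):

```python
import numpy as np

def project_radar_to_image(N, rho, theta):
    """Project a radar point (rho, theta) to pixel coordinates (u, v)
    via the homography transformation matrix N of equation (7)."""
    p = np.array([rho * np.sin(theta), rho * np.cos(theta), 1.0])
    u, v, w = N @ p
    return u / w, v / w  # normalize the homogeneous coordinate
```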
Step 2: determining a proper calibration distance and reasonably segmenting the calibration distance:
The target distance ρ measured in the millimeter wave radar polar coordinate system is called the calibration distance. The calibration distance L is divided from near to far into a short-distance range L1 and a long-distance range L2; L1 is divided into m1 segments and L2 into m2 segments, where L1/m1 is smaller than L2/m2. Because during detection the unmanned vehicle must detect a close target accurately but need only judge a distant target roughly, the long-distance segments may be longer than the short-distance segments;
In this embodiment, the specified maximum speed of the unmanned vehicle is 36 km/h, and according to the relation between the distance region of interest during driving and the vehicle speed, 50 m is determined to be a suitable calibration distance. Within the 50 m calibration distance, experiments show that using a single homography spatial transformation matrix causes alignment failure. Therefore, according to the vehicle speed requirement and the different distance regions of interest, within 0–20 m the calibration distance is divided into 4 segments at 5 m intervals; within 20–50 m it is divided into 3 segments at 10 m intervals.
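The embodiment's segmentation scheme (4 segments of 5 m, then 3 segments of 10 m) can be reproduced with a short sketch; the constants and function name are illustrative, not from the patent:

```python
# Segmentation of the embodiment's 50 m calibration distance:
# 0-20 m in 4 segments of 5 m, 20-50 m in 3 segments of 10 m.
SHORT_RANGE, SHORT_SEGS = 20.0, 4
LONG_RANGE, LONG_SEGS = 50.0, 3

def build_boundaries():
    """Return the segment edge distances in meters."""
    step1 = SHORT_RANGE / SHORT_SEGS                      # 5 m
    step2 = (LONG_RANGE - SHORT_RANGE) / LONG_SEGS        # 10 m
    edges = [i * step1 for i in range(SHORT_SEGS + 1)]
    edges += [SHORT_RANGE + i * step2 for i in range(1, LONG_SEGS + 1)]
    return edges  # [0, 5, 10, 15, 20, 30, 40, 50]
```

Note that the short-range step (5 m) is smaller than the long-range step (10 m), satisfying the L1/m1 < L2/m2 condition of step 2.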
And step 3: respectively acquiring images and data information of a calibration target through a camera and a millimeter wave radar which are loaded in the unmanned vehicle:
In order to ensure the accuracy of spatial alignment, five rows are taken in the longitudinal direction within each segment, and 7 sets of image and data information of the calibration target are taken in the transverse direction of each row. A camera with an IEEE 1394 interface and a millimeter wave radar with a CAN bus interface mounted on the unmanned vehicle acquire the image and data information of the fixed target in the same scene and transmit the information to the industrial personal computer.
Step 4: for the image data $f_k^M$ of each section in each segment obtained by the camera in step 3, calculate the centroid coordinates $(u_k^M, v_k^M)$ of each image separately;
Generally, the clustering method adopted by the millimeter wave radar is the nearest neighbor method; therefore, the calibration target data scanned by the radar in the radar coordinate system correspond to the coordinates of the centroid position in the image coordinate system. For a digital image f, f(i, j) denotes the gray value and (i, j) denotes a point in the image region. The centroid coordinates of the calibration target in the image information are calculated by the following steps:
and S40, manually selecting a candidate area containing the calibration target to reduce the interference of the background on the calculation of the centroid of the calibration target.
And S41, performing median filtering on the candidate area to eliminate noise interference in the image. Median filtering is implemented by the expression $\hat f(i,j) = \mathrm{median}\{ f(i-k, j-l) \}$, where $(k, l) \in \omega$ and ω is a 3×3 neighborhood of the pixel.
And S42, performing Sobel operator edge detection on the candidate region image to obtain a candidate region edge image containing the calibration target. The Sobel operator template is

$$mask = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix},$$

and performing a convolution operation on the candidate area image with the mask yields a binarized image of the candidate area.
S43, in the image coordinate system in units of pixels, find along the u-axis the pixel coordinates $u_{\min}, u_{\max}$ of the minimum and maximum coordinates in the calibration target edge image, and along the v-axis the pixel coordinates $v_{\min}, v_{\max}$ of the minimum and maximum coordinates; connect these 4 points with straight lines in clockwise or counterclockwise order to form a quadrilateral region, and within the quadrilateral region use the formulas

$$u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \quad \text{and} \quad v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$$

to calculate the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target, where $f_k^M(i,j)$ represents the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th section of the target at the M-th distance segment;
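A minimal sketch of the gray-value-weighted centroid formulas of S43, assuming the region outside the quadrilateral has already been zeroed out (function name is illustrative):

```python
import numpy as np

def gray_centroid(f):
    """Gray-value-weighted centroid of an image region f, following
    the S43 formulas: u = sum(i*f)/sum(f), v = sum(j*f)/sum(f).
    f is a 2-D array of gray values, zero outside the quadrilateral."""
    f = np.asarray(f, dtype=float)
    i_idx, j_idx = np.indices(f.shape)
    total = f.sum()
    u = (i_idx * f).sum() / total
    v = (j_idx * f).sum() / total
    return u, v
```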
and 5: solving a homography space transformation matrix representing the relation between the millimeter wave radar coordinate system and the camera coordinate system:
For each segment distance into which the whole calibration distance L is divided, the millimeter wave radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all the small sections and the corresponding camera image data $(u_k^M, v_k^M)$ form a data set within each segment; substituting each data set into equations (5) and (7) respectively yields:
$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \quad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \quad (9)$$

and

$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \quad (10)$$
Define

$$P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}, \quad N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T,$$

$$U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T, \quad V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T, \quad I_{k\times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T.$$

Then the least-squares solution of the homography spatial transformation matrix $N^M$ can be expressed as $N^M = \begin{bmatrix} N_1^{M\,T} & N_2^{M\,T} & N_3^{M\,T} \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = (P^{M\,T} P^M)^{-1} P^{M\,T} U^M, \quad N_2^M = (P^{M\,T} P^M)^{-1} P^{M\,T} V^M, \quad N_3^M = (P^{M\,T} P^M)^{-1} P^{M\,T} I_{k\times 1};$$
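The per-segment least-squares solution of equations (8)–(10) can be sketched with NumPy; `lstsq` solves the same normal equations row by row (function name and data layout are my own):

```python
import numpy as np

def solve_homography(radar_xy, pixels):
    """Least-squares solution of a per-segment homography matrix N^M
    from k paired observations, per equations (8)-(10).
    radar_xy: (k, 2) array of (X_r, Y_r); pixels: (k, 2) array of (u, v)."""
    radar_xy = np.asarray(radar_xy, dtype=float)
    pixels = np.asarray(pixels, dtype=float)
    k = radar_xy.shape[0]
    P = np.hstack([radar_xy, np.ones((k, 1))])  # the matrix P^M
    # One least-squares fit per row of N: (P^T P)^-1 P^T b.
    n1, *_ = np.linalg.lstsq(P, pixels[:, 0], rcond=None)
    n2, *_ = np.linalg.lstsq(P, pixels[:, 1], rcond=None)
    n3, *_ = np.linalg.lstsq(P, np.ones(k), rcond=None)
    return np.vstack([n1, n2, n3])  # the 3x3 matrix N^M
```

At least 3 non-collinear calibration points are needed per segment; the embodiment's 5×7 grid gives a well-overdetermined system.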
step 6: the space alignment of the vision sensor and the millimeter wave radar is realized:
According to the actual distance of the calibration target scanned by the millimeter wave radar, first determine which segment of step 2 the distance falls in, and then apply the homography spatial transformation matrix $N^M$ (M = 1, 2, …, 7) of that segment calculated in step 5 to achieve spatial alignment.
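A sketch of the step-6 lookup for the embodiment's 7 segments, assuming the boundary list below; the per-segment matrices come from step 5, and all names here are illustrative:

```python
import bisect

# Segment edges for the embodiment's 50 m calibration distance
# (0-20 m at 5 m intervals, 20-50 m at 10 m intervals).
BOUNDARIES = [0.0, 5.0, 10.0, 15.0, 20.0, 30.0, 40.0, 50.0]

def select_matrix(distance, matrices):
    """Pick the homography matrix for the segment containing the
    scanned distance; matrices holds the 7 per-segment results N^M."""
    if not BOUNDARIES[0] <= distance <= BOUNDARIES[-1]:
        raise ValueError("distance outside the calibrated range")
    idx = bisect.bisect_right(BOUNDARIES, distance) - 1
    return matrices[min(idx, len(matrices) - 1)]  # 50 m -> last segment
```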
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (3)

1. A segmented space alignment method based on a homography transformation matrix, characterized by comprising the following steps:
step 1: establishing a homography transformation matrix-based relation between a camera coordinate system and a millimeter wave radar coordinate system:
defining O′uv as the image coordinate system of the camera, where O′ is located at the upper left corner of the imaging plane of the camera; the u axis is parallel to the camera scan line direction; the v axis is perpendicular to the camera scan line direction;
defining O''''ρθ as the millimeter wave radar polar coordinate system, where O'''' is the center of the millimeter wave radar surface; ρ is the straight-line distance between the target and the millimeter wave radar; θ is the angle by which the target deviates from the center line of the radar scanning plane. The relationship between the camera image coordinate system O′uv and the radar polar coordinate system O''''ρθ is expressed as:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where

$$N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$$

is defined as the homography transformation matrix;
step 2: determining a proper calibration distance between the unmanned vehicle and a calibration target:
definition O "" XrYrZrThe rectangular coordinate system of the millimeter wave radar is represented, and O' is the center of the surface of the millimeter wave radar; y isrThe axis is the central line of a scanning plane of the millimeter wave radar, is vertical to the surface of the millimeter wave radar and points to the right front; xrAxis and YrVertical, pointing to the right; zrAxis perpendicular to Xr、YrA determined plane pointing upwards;
the relationship between the millimeter wave radar rectangular coordinate system and the millimeter wave radar polar coordinate system is as follows:
$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)'$$
The projection of the distance between the calibration target and the unmanned vehicle onto the longitudinal axis $Y_r$ of the millimeter wave radar rectangular coordinate system is called the calibration distance. Within the detection range of the millimeter wave radar, a suitable calibration distance L is determined according to the maximum movement speed of the unmanned vehicle. The calibration distance L is divided from near to far into a short-distance range L1 and a long-distance range L2; L1 is divided into m1 segments and L2 into m2 segments, ensuring that L1/m1 is smaller than L2/m2;
and step 3: respectively acquiring images and data information of a calibration target through a camera and a millimeter wave radar which are loaded in the unmanned vehicle:
Place the calibration target in turn at the different segments into which the calibration distance L was divided in step 2, and detect the target with the millimeter wave radar and the camera at each of the m1+m2 segment distances. During detection, for the target at each segment distance, divide the target along the $Y_r$ axis direction into m rows and divide each row along the $X_r$ axis direction into h equal sections; control the millimeter wave radar to acquire the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each section and the camera to capture the image data $f_k^M$ of each section, where M = 1, …, (m1+m2) and k = 1, 2, …, m·h;
Step 4: for the image data $f_k^M$ of each section in each segment obtained by the camera in step 3, calculate the centroid coordinates $(u_k^M, v_k^M)$ of each image separately;
Step 5: solving the homography spatial transformation matrix representing the relation between the millimeter wave radar coordinate system and the camera coordinate system:
For each segment distance into which the whole calibration distance L is divided, the millimeter wave radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all the small sections and the corresponding camera image data $(u_k^M, v_k^M)$ form a data set within each segment; substituting each data set into equations (7) and (7)′ respectively yields:
u 1 M · · · u k M = X r 1 M Y r 1 M 1 · · · · · · · · · X rk M Y rk M 1 n 11 M n 12 M n 13 M - - - ( 8 )
v 1 M · · · v k M = X r 1 M Y r 1 M 1 · · · · · · · · · X rk M Y rk M 1 n 21 M n 22 M n 23 M - - - ( 9 )
and 1 · · · 1 = X r 1 M Y r 1 M 1 · · · · · · · · · X rk M Y rk M 1 n 31 M n 32 M n 33 M - - - ( 10 )
defining

$$P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}, \quad U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T, \quad V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T, \quad I_{k \times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T,$$

the least-squares solution of the homography space transformation matrix $N^M$ can be expressed as $N^M = \begin{bmatrix} N_1^{MT} & N_2^{MT} & N_3^{MT} \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = (P^{MT} P^M)^{-1} P^{MT} U^M, \quad N_2^M = (P^{MT} P^M)^{-1} P^{MT} V^M \quad \text{and} \quad N_3^M = (P^{MT} P^M)^{-1} P^{MT} I_{k \times 1};$$
step 6: realizing the space alignment of the vision sensor and the millimeter wave radar:
according to the actual distance of the calibration target scanned by the millimeter wave radar, judging which segment of step 2 the distance falls in, and looking up the homography space transformation matrix corresponding to that segment among the m1+m2 results calculated in step 5, thereby realizing the space alignment.
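The per-segment least-squares fit of steps 5 and 6 can be sketched numerically as follows (an illustrative NumPy sketch, not the patent's implementation; the function names `fit_segment_homography` and `radar_to_pixel` are assumptions):

```python
import numpy as np

def fit_segment_homography(radar_xy, pixels_uv):
    """Least-squares fit of the 3x3 homography N^M for one distance segment.

    radar_xy : (k, 2) array of radar-plane points (X_r, Y_r)
    pixels_uv: (k, 2) array of the matching image centroids (u, v)
    """
    k = radar_xy.shape[0]
    P = np.hstack([radar_xy, np.ones((k, 1))])  # (k, 3), rows [X_r, Y_r, 1]
    # lstsq computes (P^T P)^-1 P^T b, i.e. the least-squares solutions
    # of equations (8), (9) and (10), one row of N at a time.
    N1, *_ = np.linalg.lstsq(P, pixels_uv[:, 0], rcond=None)  # n11, n12, n13
    N2, *_ = np.linalg.lstsq(P, pixels_uv[:, 1], rcond=None)  # n21, n22, n23
    N3, *_ = np.linalg.lstsq(P, np.ones(k), rcond=None)       # n31, n32, n33
    return np.vstack([N1, N2, N3])  # 3x3 homography for this segment

def radar_to_pixel(N, x_r, y_r):
    """Map a radar point into the image with the segment's homography."""
    u, v, w = N @ np.array([x_r, y_r, 1.0])
    return u / w, v / w
```

In step 6, the segment index selected from the radar range simply decides which fitted matrix is passed to `radar_to_pixel`.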
2. The method according to claim 1, wherein when the calibration distance in step 2 is 50m, the short-distance range is 0-20m and is divided into 4 segments, and the long-distance range is 20-50m and is divided into 3 segments.
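The segmentation of claim 2 can be sketched as follows (a hedged illustration with the claimed values 0-20m in 4 segments and 20-50m in 3; the helper names are not from the patent):

```python
def segment_boundaries(l1, m1, l_total, m2):
    """Split a calibration distance into m1 near and m2 far segments.

    l1: end of the short range (e.g. 20 m); l_total: total distance (e.g. 50 m).
    Near segments must be shorter than far ones, per the L1/m1 < L2/m2 condition.
    """
    assert l1 / m1 < (l_total - l1) / m2, "claim requires L1/m1 < L2/m2"
    near = [l1 * i / m1 for i in range(m1 + 1)]                # 0 .. L1 edges
    far = [l1 + (l_total - l1) * i / m2 for i in range(1, m2 + 1)]  # L1 .. L
    return near + far

def segment_index(edges, distance):
    """Index of the segment a measured radar distance falls into (step 6)."""
    for i in range(len(edges) - 1):
        if edges[i] <= distance < edges[i + 1]:
            return i
    raise ValueError("distance outside the calibrated range")
```

With the claimed values this yields 7 segments with edges at 0, 5, 10, 15, 20, 30, 40 and 50 m.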
3. The method as claimed in claim 1, wherein the method of calculating the coordinates of the centroid of each small segment of the calibration target image in step 4 is as follows:
s40, manually selecting a candidate area containing a calibration target;
s41, carrying out median filtering on the candidate area image to eliminate the noise in the image;
s42, performing Sobel operator edge detection on the candidate region image to obtain a binarized calibration target edge image;
s43, in an image coordinate system with pixels as units, finding along the u-axis the pixel coordinates u_min and u_max of the minimum and maximum u-coordinates in the calibration target edge image, and finding along the v-axis the pixel coordinates v_min and v_max of the minimum and maximum v-coordinates; connecting these 4 points with straight lines in clockwise or counterclockwise order to form a quadrilateral region, and within the quadrilateral region using the formulas

$$u_k^M = \frac{\sum_i \sum_j i \, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \quad \text{and} \quad v_k^M = \frac{\sum_i \sum_j j \, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$$

to calculate the centroid coordinates (u_k^M, v_k^M) of the calibration target, wherein f_k^M(i,j) represents the gray value of the pixel point (i,j) in the quadrilateral region corresponding to the k-th section of the target at the M-th distance segment.
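The centroid computation of claim 3 can be sketched with NumPy alone (an assumption-laden sketch: steps S41-S42, the median filter and Sobel edge detection, are replaced here by a simple gray-level threshold, and the bounding quadrilateral is taken as the axis-aligned box of the extreme edge pixels):

```python
import numpy as np

def target_centroid(gray, threshold=128):
    """Intensity-weighted centroid of a calibration-target region (step S43).

    gray: 2-D array of gray values f(i, j) for the candidate region.
    Returns (u, v) per the claim: u = sum(i*f)/sum(f), v = sum(j*f)/sum(f).
    """
    edge = gray >= threshold                 # stand-in for the binarised edge map
    rows, cols = np.nonzero(edge)
    # bounding region from the extreme edge pixels (u_min/u_max, v_min/v_max)
    i0, i1 = rows.min(), rows.max() + 1
    j0, j1 = cols.min(), cols.max() + 1
    f = gray[i0:i1, j0:j1].astype(float)
    i_idx, j_idx = np.mgrid[i0:i1, j0:j1]    # pixel index grids over the region
    u = (i_idx * f).sum() / f.sum()
    v = (j_idx * f).sum() / f.sum()
    return u, v
```

For a uniform bright square the result is simply the geometric center of the square, which is a quick sanity check for the weighting.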
CN201310013045.8A 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix Active CN103065323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310013045.8A CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix


Publications (2)

Publication Number Publication Date
CN103065323A true CN103065323A (en) 2013-04-24
CN103065323B CN103065323B (en) 2015-07-15

Family

ID=48107940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310013045.8A Active CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Country Status (1)

Country Link
CN (1) CN103065323B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033972A (en) * 2007-02-06 2007-09-12 华中科技大学 Method for obtaining three-dimensional information of space non-cooperative object
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
US20110025841A1 (en) * 2009-07-29 2011-02-03 Ut-Battelle, Llc Estimating vehicle height using homographic projections
CN102062576A (en) * 2010-11-12 2011-05-18 浙江大学 Device for automatically marking additional external axis robot based on laser tracking measurement and method thereof


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIANRU LIU ET AL.: "Advanced Obstacles Detection and Tracking by Fusing Millimeter Wave Radar and Image Sensor Data", International Conference on Control, Automation and Systems 2010 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104280019A (en) * 2013-07-10 2015-01-14 德尔福电子(苏州)有限公司 All-round looking system calibration device based on flexible calibration plate
CN104200483B (en) * 2014-06-16 2018-05-18 南京邮电大学 Object detection method based on human body center line in multi-cam environment
CN104200483A (en) * 2014-06-16 2014-12-10 南京邮电大学 Human body central line based target detection method under multi-camera environment
CN104464173A (en) * 2014-12-03 2015-03-25 国网吉林省电力有限公司白城供电公司 Power transmission line external damage protection system based on space image three-dimensional measurement
CN104965202A (en) * 2015-06-18 2015-10-07 奇瑞汽车股份有限公司 Barrier detection method and device
CN105818763A (en) * 2016-03-09 2016-08-03 乐卡汽车智能科技(北京)有限公司 Method, device and system for confirming distance of object around vehicle
CN105818763B (en) * 2016-03-09 2018-06-22 睿驰智能汽车(广州)有限公司 A kind of method, apparatus and system of determining vehicle periphery object distance
CN106730106A (en) * 2016-11-25 2017-05-31 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173A (en) * 2016-11-25 2018-06-01 宁波舜宇光电信息有限公司 Vision positioning method, camera system and automation equipment
CN106730106B (en) * 2016-11-25 2019-10-08 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173B (en) * 2016-11-25 2022-06-28 宁波舜宇光电信息有限公司 Visual positioning method, camera system and automation equipment
CN108226906A (en) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
CN108226906B (en) * 2017-11-29 2019-11-26 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
US11733370B2 (en) * 2018-05-21 2023-08-22 Johnson Controls Tyco IP Holdings LLP Building radar-camera surveillance system
US20210318426A1 (en) * 2018-05-21 2021-10-14 Johnson Controls Tyco IP Holdings LLP Building radar-camera surveillance system
CN110660186A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN110658518B (en) * 2018-06-29 2022-01-21 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN110660186B (en) * 2018-06-29 2022-03-01 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN110658518A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN109471096B (en) * 2018-10-31 2023-06-27 奇瑞汽车股份有限公司 Multi-sensor target matching method and device and automobile
CN109471096A (en) * 2018-10-31 2019-03-15 奇瑞汽车股份有限公司 Multi-Sensor Target matching process, device and automobile
CN111538008A (en) * 2019-01-18 2020-08-14 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN110879598A (en) * 2019-12-11 2020-03-13 北京踏歌智行科技有限公司 Information fusion method and device of multiple sensors for vehicle
CN111429530A (en) * 2020-04-10 2020-07-17 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN111429530B (en) * 2020-04-10 2023-06-02 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN112162252A (en) * 2020-09-25 2021-01-01 南昌航空大学 Data calibration method for millimeter wave radar and visible light sensor
CN112162252B (en) * 2020-09-25 2023-07-18 南昌航空大学 Data calibration method for millimeter wave radar and visible light sensor
CN112348863A (en) * 2020-11-09 2021-02-09 Oppo广东移动通信有限公司 Image alignment method, image alignment device and terminal equipment

Also Published As

Publication number Publication date
CN103065323B (en) 2015-07-15

Similar Documents

Publication Publication Date Title
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
US10930015B2 (en) Method and system for calibrating multiple cameras
CN105445721B (en) Based on laser radar and video camera combined calibrating method with feature protrusion V-type calibration object
US9454816B2 (en) Enhanced stereo imaging-based metrology
Alismail et al. Automatic calibration of a range sensor and camera system
CN106529587B (en) Vision course recognition methods based on object detection
Xie et al. Infrastructure based calibration of a multi-camera and multi-lidar system using apriltags
EP3032818B1 (en) Image processing device
US20190120934A1 (en) Three-dimensional alignment of radar and camera sensors
CN112907676A (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
US10760907B2 (en) System and method for measuring a displacement of a mobile platform
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN104574406A (en) Joint calibration method between 360-degree panorama laser and multiple visual systems
CN110827361B (en) Camera group calibration method and device based on global calibration frame
CN109523579B (en) Method and device for matching video image of unmanned aerial vehicle with three-dimensional map
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
Rosero et al. Calibration and multi-sensor fusion for on-road obstacle detection
CN117496467A (en) Special-shaped lane line detection method based on fusion of monocular camera and 3D LIDAR
CN116091603A (en) Box workpiece pose measurement method based on point characteristics
Manivannan et al. Vision based intelligent vehicle steering control using single camera for automated highway system
Hasheminasab et al. Linear Feature-based image/LiDAR integration for a stockpile monitoring and reporting technology
Del Pizzo et al. Reliable vessel attitude estimation by wide angle camera
Kim et al. Urban localization based on aerial imagery by correcting projection distortion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant