
CN114792345B - Calibration method based on monocular structured light system

Info

Publication number: CN114792345B (application CN202210733202.1A; also published as CN114792345A)
Authority: CN (China)
Prior art keywords: camera, projector, vector, formula, parameters
Inventors: 杨静, 时岭, 高勇
Assignee: Hangzhou Lanxin Technology Co ltd
Filing date: 2022-06-27
Grant publication date: 2022-09-27
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a calibration method based on a monocular structured light system, which comprises the following steps: after the projector projects the coding pattern, the control device decodes the coded image collected by the camera to obtain the coded values of all pixel points in the coded image; feature points of the calibration board are extracted from a calibration board image collected by the camera, the depth values of all pixel points within the calibration board range are obtained according to the basic parameters of the calibration board, and the coded values of all pixel points within the calibration board range are obtained; according to the mathematical relation among the coded values, the pixel point information and the depth values of all pixel points within the calibration board range, distortion models of the camera vector and the projector vector are constructed, the constructed models are solved based on the depth values of the pixel points, and the camera internal parameters, projector internal parameters and system external parameters are obtained. With this calibration method, the three-dimensional reconstruction result can be obtained with only a table lookup and simple calculation during three-dimensional reconstruction, which is fast and accurate.

Description

Calibration method based on monocular structured light system
Technical Field
The invention relates to the technical field of robots, in particular to a calibration method based on a monocular structured light system and a three-dimensional reconstruction method of an image.
Background
At present, surface structured light three-dimensional imaging is a non-contact, high-precision three-dimensional measurement technology widely applied in measurement, detection, automation and other fields. Compared with a binocular structured light system, a monocular structured light system has lower cost, faster algorithms and fewer blind areas while still ensuring sufficient data quality, and has therefore received increasingly broad attention and application. Unlike the mature and reliable calibration methods available for binocular systems, the calibration and algorithms of a monocular structured light system are complex, the main difficulty being the calibration of the projector parameters; the methods studied so far are either complex and hard to operate or too imprecise to meet requirements. On this basis, the industry has proposed calibration and three-dimensional reconstruction methods based on monocular structured light systems.
Specifically, several calibration and reconstruction methods are generally available. In the first, a high-precision translation stage collects plane data at different distances as reference surfaces, and the standard data are restored during reconstruction; the calibration precision then depends on the precision of the translation stage, large-field and long-distance calibration is difficult, and on-site calibration is impossible. The second calibrates the camera and the projector step by step; it is not suitable for systems that can only project one-dimensional line structured light, such as galvanometers and MEMS micro-galvanometers, errors accumulate easily across the steps, and the computed homography between camera and projector is not precise. The third, the eight-parameter method, directly computes the relation between the projection phase and the camera coordinates; it is simple to operate and compute but handles projector distortion poorly and has large errors.
Therefore, a calibration method and a three-dimensional reconstruction method based on a monocular structured light system, which are simple to operate, high in calibration precision and high in processing speed, are urgently needed.
Disclosure of Invention
Technical problem to be solved
In view of the above disadvantages and shortcomings of the prior art, the present invention provides a calibration method based on a monocular structured light system and a three-dimensional reconstruction method of an image.
(II) technical scheme
In order to achieve the purpose, the invention adopts the main technical scheme that:
in a first aspect, an embodiment of the present invention provides a calibration method based on a monocular structured light system, where the monocular structured light system includes: a projector for projecting a coding pattern, a camera for acquiring the coding pattern, and a control device connecting the projector and the camera; the calibration method includes:
s10, after the projector projects the coding pattern, the control device decodes the coding image collected by the camera to obtain the coding values of all the pixel points in the coding image;
s20, the control device extracts the characteristic points of the calibration board based on the calibration board image collected by the camera, and obtains the depth values of all the pixel points in the calibration board range according to the basic parameters of the calibration board, and obtains the coding values of all the pixel points in the calibration board range based on the coding values of all the pixel points in the coding image;
s30, according to the mathematical relation among the encoding values, the pixel point information and the depth values of all the pixel points in the range of the calibration board, constructing a distortion model of a camera vector and a projector vector with unknown quantity, and solving the constructed model based on the depth values of the pixel points to obtain camera internal parameters, projector internal parameters and system external parameters;
the distortion model of the projector vector is a virtual model according to a three-dimensional reconstruction result;
the system external parameter is an external parameter rotation matrix when a projector vector is converted into a camera coordinate system
Figure 100002_DEST_PATH_IMAGE001
And external reference translation vector
Figure 452051DEST_PATH_IMAGE002
Optionally, the S30 includes:
S31, acquiring a distortion model of the camera vector, and determining the camera internal parameters serving as unknown quantities;
S32, constructing a distortion model of the virtual projector vector, and determining the projector internal parameters serving as unknown quantities; the first dimension p of the distortion model of the projector vector is the code/phase dimension, and the second dimension y is the virtual row number;
S33, converting the projector vector into the camera coordinate system, and determining the virtual parameter of the projector vector according to the geometric constraint relation;
S34, constructing a solving equation for the depth value of the intersection point of the camera vector and the projector vector based on the distortion model of the camera vector and the distortion model of the virtual projector vector;
and S35, solving the equation based on the depth value of each pixel point in the calibration board range to obtain the camera internal parameters, the projector internal parameters and the system external parameters.
Optionally, the S30 includes:
S31, for the camera vector V_c, the two-dimensional coordinates of each camera pixel point are expressed by the standard camera model of formula (2):

x = (v - c_x)/f_x,  y = (u - c_y)/f_y    formula (2);

in formula (2), x is the component of the camera vector V_c in the horizontal direction, y is the component of the camera vector V_c in the vertical direction, u is the row where the current pixel point is located, v is the column where the current pixel point is located, f_x is the focal length of the camera in the horizontal direction, f_y is the focal length of the camera in the vertical direction, and (c_x, c_y) are the coordinates of the optical center;
based on formula (2), the distortion model of the two-dimensional coordinates in the camera vector V_c is expressed as formula (3):

x_d = x(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2 x^2),
y_d = y(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2 y^2) + 2 p_2 x y,
with r^2 = x^2 + y^2    formula (3);

k_1, k_2, k_3 are radial distortion coefficients, p_1, p_2 are tangential distortion coefficients, and r is an intermediate variable; f_x, f_y, c_x, c_y, k_1, k_2, k_3, p_1 and p_2 are the internal parameters of the camera vector V_c and are unknown quantities;
s32 projector vector under virtual projector coordinate system
Figure 928589DEST_PATH_IMAGE024
The distortion model of (2);
the standard projector model is:
Figure 100002_DEST_PATH_IMAGE025
formula (4);
virtual projector vector according to basic parameters of projector
Figure 616054DEST_PATH_IMAGE026
The distortion model of (a) is formula (5) or formula (5 a):
Figure 100002_DEST_PATH_IMAGE027
formula (5);
alternatively, equation (5 a) is:
Figure 851863DEST_PATH_IMAGE028
in the formula (5) and the formula (5 a),
Figure 100002_DEST_PATH_IMAGE029
expressed as projector vector
Figure 878856DEST_PATH_IMAGE030
In the component of the code/phase dimension,
Figure 100002_DEST_PATH_IMAGE031
expressed as projector vector
Figure 174708DEST_PATH_IMAGE032
In the component of the virtual row number, p is the encoded value corresponding to each pixel point in the projector vector,
Figure 100002_DEST_PATH_IMAGE033
for the number of rows corresponding to each pixel point in the projector vector,
Figure 536419DEST_PATH_IMAGE034
is the focal length of the projector in the horizontal direction,
Figure 100002_DEST_PATH_IMAGE035
is the focal length of the projector in the vertical direction,
Figure 323722DEST_PATH_IMAGE036
is the coordinates of the optical center;
Figure 100002_DEST_PATH_IMAGE037
are all the distortion coefficients of the projector,
Figure 454489DEST_PATH_IMAGE038
Figure 100002_DEST_PATH_IMAGE039
is the intermediate variable(s) of the variable,
Figure 671975DEST_PATH_IMAGE040
an internal reference of the projector vector is an unknown quantity;
s33, vector projector
Figure 100002_DEST_PATH_IMAGE041
Converting into a camera coordinate system, wherein the coordinate system conversion relation formula is as follows:
Figure 520982DEST_PATH_IMAGE042
formula (6);
camera vector in camera coordinate system
Figure 100002_DEST_PATH_IMAGE043
Projector vector in camera coordinate system
Figure 833015DEST_PATH_IMAGE044
Translation vector
Figure 100002_DEST_PATH_IMAGE045
The coplanar relationship is:
Figure 896917DEST_PATH_IMAGE046
formula (1);
Figure DEST_PATH_IMAGE047
in order to refer to the rotation matrix externally,
Figure 737834DEST_PATH_IMAGE048
the vector is an external reference translation vector,
Figure 339717DEST_PATH_IMAGE047
and
Figure 675014DEST_PATH_IMAGE048
as an external parameter of the system is of unknown quantity,
Figure DEST_PATH_IMAGE049
to convert the projector vector to virtual parameters of the projector vector determined from geometric constraint relationships in the camera coordinate system.
Optionally, the S34 includes:
calculating the depth value according to formula (7) based on the camera vector and the projector vector obtained from formula (1) to formula (6);
solving the unknown quantities based on the condition that the depth value obtained in S20 is consistent with the depth value obtained by formula (7), to obtain the camera internal parameters, projector internal parameters and system external parameters;

z = (T_x V'_pz - T_z V'_px) / (x V'_pz - V'_px)    formula (7),

where V_c = (x, y, 1), V'_p = (V'_px, V'_py, V'_pz) and T = (T_x, T_y, T_z).
Optionally, the calibration board is a checkerboard calibration board, and the basic parameters of the calibration board include: calibrating the total number of the checkerboards in the board and the size information of each check in the board;
the coded image is an image coded by one or more coding modes of Gray code, phase shift method and multi-frequency extrapolation.
Optionally, the S20 includes:
the control device extracts the checkerboard corner points serving as the feature points from the checkerboard image, takes the rows and columns of the corner points as the two-dimensional image coordinates, establishes a world coordinate system, and obtains the three-dimensional coordinates of the feature points according to the basic parameters of the checkerboard;
and acquiring a homography matrix of the calibration board and the camera based on the two-dimensional image coordinates and the three-dimensional coordinates of the feature points, and calculating the depth values of all pixel points in the checkerboard range according to the homography matrix.
In a second aspect, an embodiment of the present invention further provides a method for three-dimensional reconstruction of an image, including:
acquiring an image of a target to be detected, and acquiring a camera vector and a projector vector of each pixel point under a camera coordinate system based on pre-acquired camera internal parameters, projector internal parameters and system external parameters;
based on the camera vector and the projector vector, obtaining the three-dimensional space coordinates of each pixel point in the image;
the pre-acquired camera internal parameters, projector internal parameters and system external parameters are calibrated by using any one of the calibration methods of the first aspect.
Optionally, obtaining the three-dimensional space coordinates of each pixel point in the image based on the camera vector and the projector vector includes:
calculating the depth value z of the intersection of the camera vector and the projector vector according to formula (7):

z = (T_x V'_pz - T_z V'_px) / (x V'_pz - V'_px)    formula (7);

wherein V_c = (x, y, 1) is the camera vector in the camera coordinate system, V'_p = (V'_px, V'_py, V'_pz) is the projector vector in the camera coordinate system, and T = (T_x, T_y, T_z) is the translation vector;
based on formula (8), obtaining the spatial coordinate value of each pixel point in the image in the camera coordinate system:

(X, Y, Z) = (z x, z y, z)    formula (8);

z is the depth value, x is the component of the camera vector V_c in the horizontal direction, and y is the component of the camera vector V_c in the vertical direction.
Optionally, acquiring an image of the target to be measured, and acquiring the camera vector and the projector vector of each pixel point in the camera coordinate system based on the pre-acquired camera internal parameters, projector internal parameters and system external parameters, includes:
looking up the camera vector and the projector vector of each pixel point in the camera coordinate system in a pre-built lookup table;
wherein, in the calibration method, a correspondence table between pixel points and the parameters is constructed according to the camera internal parameters, projector internal parameters and system external parameters corresponding to each pixel point, and the correspondence table is the lookup table used during three-dimensional reconstruction to quickly look up the camera internal parameters, projector internal parameters and system external parameters for each pixel point.
In a third aspect, an embodiment of the present invention further provides a control apparatus, including: a memory for storing a computer program and a processor for executing the computer program stored in the memory and performing the steps of the method of any of the first or second aspects.
(III) advantageous effects
The calibration process of the calibration method is simple to operate and identical to the calibration process of a conventional binocular camera: only one calibration plate is needed, and it can be completed synchronously with hand-eye calibration, which is very convenient. Only one-dimensional line structured light needs to be projected during calibration, so the working mode is fully consistent with normal operation and no additional configuration is needed. The method of the invention handles projector distortion well and is suitable for different projector types, including DMD projectors, galvanometers, MEMS micro-galvanometers and other one-dimensional line-structured-light projection. All parameters are calibrated synchronously, there is no accumulated error, and the precision is higher than that of conventional methods.
The method provided by the embodiment of the invention obtains the three-dimensional reconstruction result with only a table lookup and simple calculation during the three-dimensional reconstruction process, which is fast and accurate.
Drawings
Fig. 1 is a schematic flowchart of a calibration method based on a monocular structured light system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a method for three-dimensional reconstruction of an image according to an embodiment of the present invention.
Detailed Description
For the purpose of better explaining the present invention and to facilitate understanding, the present invention will be described in detail by way of specific embodiments with reference to the accompanying drawings.
In this embodiment, the line structured light may project vertical stripes instead of horizontal stripes; it should be understood that the line structured light here is structured light carrying encoded images.
Specifically, the controller controls the camera to collect coded images and calibration plate images at one or more calibration plate positions, then constructs a model according to the principle that the camera vector, the projector vector and the translation vector are coplanar, solves the unknown parameters, and obtains the camera internal parameters, projector internal parameters and system external parameters. The projector's distortion is fully accounted for through the virtual projector internal parameters while solving the unknown quantities, so three-dimensional reconstruction based on the camera internal parameters, projector internal parameters and system external parameters is more accurate.
The method can be suitable for different projector models, including DMD projectors, galvanometers, MEMS micro galvanometer and other one-dimensional line structured light projection, all parameters are calibrated synchronously, and no accumulated error exists.
Example one
As shown in fig. 1, an embodiment of the present invention provides a calibration method based on a monocular structured light system, where an execution subject of the method of this embodiment may be any computing device/control device, and the monocular structured light system of this embodiment includes: the projector for projecting the one-dimensional coding pattern, the camera for collecting the coding pattern, and the control device for connecting the projector and the camera, the specific implementation method comprises the following steps:
and S10, after the projector projects the coding pattern, the control device decodes the coding image collected by the camera to obtain the coding values of all the pixel points in the coding image.
This step can be implemented in a conventional manner, with a projector that projects one or more coding patterns.
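As an illustrative sketch only (the patent gives no reference code), decoding a stack of Gray-code images into per-pixel code values might look like the following Python fragment; the image layout (most significant pattern first), the threshold and the validity mask are assumptions made for the example:

    import numpy as np

    def decode_gray_code(images, threshold=10):
        # images: list of 2D arrays, most significant Gray-code pattern first
        # (an assumed input layout). Returns an int32 code map and a mask of
        # pixels with enough contrast to be decoded reliably.
        mean = np.mean(images, axis=0)
        bits = [(img.astype(np.int32) - mean) > threshold for img in images]
        # Gray -> binary: b[0] = g[0], b[i] = b[i-1] XOR g[i]
        binary = bits[0].astype(np.int32)
        code = binary.copy()
        for g in bits[1:]:
            binary = binary ^ g.astype(np.int32)
            code = (code << 1) | binary
        contrast = np.max(images, axis=0) - np.min(images, axis=0)
        mask = contrast > 2 * threshold  # naive validity test
        return code, mask

A phase-shift or multi-frequency decoder would replace the bit loop with a phase computation, but the output, one code value p per pixel, is the same.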
S20, the control device extracts the characteristic points of the calibration board based on the calibration board image collected by the camera, and obtains the depth values of all the pixel points in the calibration board range according to the basic parameters of the calibration board, and obtains the coding values of all the pixel points in the calibration board range based on the coding values of all the pixel points in the coding image.
Generally, the projector projects full bright light, coded images and the like according to instructions of the control device, and in order to acquire the calibration plate image, the projector projects full bright light according to the instructions, so that the camera acquires the calibration plate image.
The feature points in this step may be predefined corner points.
For example, the control device extracts the checkerboard corner points serving as the feature points from the checkerboard image, takes the rows and columns of the corner points as the two-dimensional image coordinates, establishes a world coordinate system, and obtains the three-dimensional coordinates of the feature points according to the basic parameters of the checkerboard;
and acquiring a homography matrix of the calibration plate and the camera based on the two-dimensional image coordinates and the three-dimensional coordinates of the feature points, and calculating the depth values of all pixel points in the checkerboard range according to the homography matrix.
S30, according to the mathematical relation among the coded values of all pixel points in the range of the calibration board, pixel point information (such as rows, columns, identifications and the like where the pixel points are located) and depth values, constructing a distortion model of a camera vector and a projector vector with unknown quantity, and solving the constructed model based on the depth values of the pixel points to obtain camera internal parameters, projector internal parameters and system external parameters;
the distortion model of the projector vector is a virtual model according to a three-dimensional reconstruction result;
the system external parameters are an external parameter rotation matrix R and an external parameter translation vector when the projector vector is converted into a camera coordinate system
Figure 549539DEST_PATH_IMAGE059
The pixel information may include the pixel information mentioned in the following embodiments
Figure 475907DEST_PATH_IMAGE011
Figure 555858DEST_PATH_IMAGE012
Figure 191370DEST_PATH_IMAGE013
Figure 989562DEST_PATH_IMAGE014
And
Figure 922883DEST_PATH_IMAGE015
Figure 857341DEST_PATH_IMAGE016
and the like, and the present embodiment is not limited thereto, and is configured and adjusted according to actual needs.
In addition, the distortion model of the projector vector in this embodiment may be built by analogy with the distortion model of the camera vector, with the first dimension of the virtual model being the coding/phase dimension and the second dimension being the virtual row number. On this basis, the embodiment mainly relies on the distortion model of the virtual projector vector to guarantee the precision of the three-dimensional reconstruction result.
It is understood that, for the convenience of the subsequent three-dimensional reconstruction and the improvement of the processing speed, after the step S30, the following step S40 may be further performed to obtain a lookup table of the parameters and the pixel points of the camera vector and the projector vector.
And S40, based on the acquired camera internal parameters, projector internal parameters and system external parameters, constructing a correspondence table between pixel points and the parameters, wherein the correspondence table is a lookup table used during three-dimensional reconstruction to quickly look up the camera internal parameters, projector internal parameters and system external parameters for each pixel point.
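A minimal Python sketch of such a lookup table might simply precompute the per-pixel camera vector once, using the camera_ray() helper sketched later in Example two (formulas (2) and (3)); the dictionary layout below is an assumption for illustration:

    import numpy as np

    def build_lookup_table(width, height, cam, proj, R, T):
        # cam and proj are the calibrated camera/projector parameter tuples;
        # the per-pixel camera vector V_c is precomputed once so that at run
        # time only the decoded value p varies per pixel.
        lut = np.empty((height, width, 3))
        for u in range(height):          # row
            for v in range(width):       # column
                lut[u, v] = camera_ray(u, v, *cam)  # formulas (2)-(3)
        return {"Vc": lut, "R": R, "T": T, "proj": proj}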
It should be noted that, in theory, the calibration step of this embodiment can be completed with a single encoded image to obtain the camera internal parameters, the projector internal parameters and the system external parameters, and further to obtain the lookup tables of all parameters and pixel points. With this calibration method, the three-dimensional reconstruction result is obtained with only a table lookup and simple calculation during three-dimensional reconstruction, which is fast and accurate.
In specific practical application, in order to improve the accuracy of the acquired parameters, the position of the calibration board can be adjusted, the information of the camera vector internal parameters, the projector vector internal parameters and the system external parameters corresponding to the calibration boards at different positions is acquired, the average value is further calculated, the most accurate information of the camera vector internal parameters, the projector vector internal parameters and the system external parameters is acquired, and therefore a lookup table with higher accuracy and faster lookup is acquired.
Example two
In this embodiment, the method of the present invention is described in detail. In this embodiment, both the 3D camera (camera for short) and the projector are connected to the controller, and all calculation is completed in the controller. The specific scheme first provides a calibration method that obtains a lookup table of the camera vector and the projector vector; further, based on the lookup table, this embodiment also provides a three-dimensional reconstruction method for images.
Before the method of the embodiment is executed, a calibration board can be placed at least at one designated position in the common visual field range of the camera and the projector, and the projector projects the one-dimensional coded structured light pattern and defines the coding direction, so that the placing directions of the camera and the projector are consistent with the coding direction. In practical applications, the calibration board in this embodiment may be a checkerboard calibration board, and the basic parameters of the calibration board may include: the overall size of the calibration board, the total number of the checkerboards in the calibration board, the size information of each check in the calibration board and the like. The current position information of the calibration plate is calculated based on the encoded image and the size information of the calibration plate.
The calibration method based on the monocular structured light system in the embodiment may include the following steps:
and step 01, connecting the camera and the projector with a controller, controlling the projector to project the coded pattern by the controller, synchronously acquiring images by the camera, and decoding the acquired coded images by the controller.
In this embodiment, the controller may implement decoding by using a plurality of encoded images to obtain an encoded value of each pixel point.
For example, the encoded image is an image encoded by using one or more of gray code, phase shift method, and multi-frequency extrapolation.
And step 02, identifying the boundary information of the calibration plate in the coded image by the controller, and acquiring the coded value of each pixel point corresponding to the calibration plate according to the identified boundary information.
Of course, in other embodiments, the controller may directly identify the encoded value of each pixel corresponding to the calibration board in the encoded image without the process of identifying the boundary information and then determining the pixels of the calibration board.
The coded image in this step may be an image coded in a gray code manner, and thus, the controller may directly obtain the coded value in the existing manner, and this step does not detail the process of obtaining the coded value.
Step 03, in this step, the Zhang Zhengyou calibration method can be adopted to obtain the depth value of each pixel point within the calibration board range.
Of course, for ease of understanding, a brief explanation is provided here. The controller establishes a world coordinate system according to the basic parameters of the calibration plate and obtains the three-dimensional coordinates of the feature points, which in this embodiment may be the corner points of the checkerboard. The feature points of the calibration plate are extracted from the image acquired by the camera, their two-dimensional coordinates are obtained, and the homography matrix between the checkerboard and the camera is obtained using the PnP solving principle. Through the homography matrix, the three-dimensional coordinates of each pixel in the camera coordinate system, i.e. the depth values, can be obtained.
That is, feature points (checkerboard corner points) are extracted from the checkerboard image photographed by the camera, their two-dimensional coordinates (rows and columns of the corner points) are obtained, a world coordinate system is established, the three-dimensional coordinates of the feature points are obtained from the checkerboard parameters, the homography matrix is then computed from the three-dimensional and two-dimensional coordinates, and the three-dimensional coordinates of all pixels within the checkerboard range are computed from the homography matrix. These pixel points are used for the subsequent calibration: if only the feature points were used in the calibration process, there would be too few points to solve for all parameters, so the three-dimensional coordinates of all pixel points within the calibration plate range must be calculated.
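A minimal sketch of this step in Python, assuming OpenCV, a 9x6 checkerboard with 20 mm squares, and rough initial intrinsics K and dist (the PnP step needs some camera model; lens distortion of the board image is ignored for brevity), might look as follows; none of the names come from the patent:

    import cv2
    import numpy as np

    def board_depth_map(gray, K, dist, pattern_size=(9, 6), square=0.02):
        # Find the checkerboard corners (the feature points).
        ok, corners = cv2.findChessboardCorners(gray, pattern_size)
        assert ok, "calibration board not found"
        # World coordinates of the corners on the board plane z = 0.
        n_pts = pattern_size[0] * pattern_size[1]
        obj = np.zeros((n_pts, 3), np.float32)
        obj[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square
        ok, rvec, tvec = cv2.solvePnP(obj, corners, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        # Board plane in the camera frame: n^T X = d with normal n = R[:, 2].
        n, d = R[:, 2], float(R[:, 2] @ tvec)
        # Depth of the ray through pixel (row u, col v): z = d / (n^T V_c).
        h, w = gray.shape
        u, v = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        x = (v - K[0, 2]) / K[0, 0]
        y = (u - K[1, 2]) / K[1, 1]
        return d / (n[0] * x + n[1] * y + n[2])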
It can be understood that, since the encoded value of each pixel has a specified mathematical relationship with the three-dimensional coordinates, a mathematical model is established based on the mathematical relationship in step 04, and relevant parameters including camera internal parameters, projector internal parameters and system external parameters are obtained.
And step 04, calculating information of camera internal parameters, projector internal parameters and external parameters used in three-dimensional reconstruction by the controller based on the acquired three-dimensional coordinates and the principle that the current camera vector, projector vector and translation vector are coplanar.
It will be appreciated that the calibration process is the process of constructing an equation, i.e., an equation of the depth value z as shown in the following formula (7).
For a better understanding of this step 04, the following description is made in connection with sub-steps 041 to 044. It is understood that each camera vector corresponds to each pixel point within the calibration board, the camera vector and the projector vector are paired, the projector vector is generated virtually, and the projector vector and the camera vector together correspond to one pixel point.
And a sub-step 041, calculating the projector vector corresponding to each pixel according to the principle that the camera vector, the projector vector and the translation vector are coplanar.
The camera vector V_c in the camera coordinate system, the projector vector V'_p in the camera coordinate system and the translation vector T obey the coplanarity principle:

V_c · (V'_p × T) = 0    formula (1)
For the camera vector V_c, the two-dimensional coordinates of the corresponding pixel can be represented by the standard camera model, as formula (2):

x = (v - c_x)/f_x,  y = (u - c_y)/f_y    formula (2).

In formula (2), x is the component of the camera vector V_c in the horizontal direction, y is the component of the camera vector V_c in the vertical direction, u is the row where the current pixel point is located, v is the column where the current pixel point is located, f_x is the focal length of the camera in the horizontal direction, f_y is the focal length of the camera in the vertical direction, and (c_x, c_y) are the coordinates of the optical center. The optical center is an inherent structural parameter of the camera.

Based on formula (2), the distortion model of the two-dimensional coordinates in the camera vector V_c is expressed as formula (3):

x_d = x(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2 x^2),
y_d = y(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2 y^2) + 2 p_2 x y,
with r^2 = x^2 + y^2    formula (3)

k_1, k_2, k_3 are radial distortion coefficients, p_1, p_2 are tangential distortion coefficients, and r is an intermediate variable; f_x, f_y, c_x, c_y, k_1, k_2, k_3, p_1 and p_2 are the internal parameters of the camera vector V_c; they are unknown quantities to be solved, i.e., calibrated.
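As a sketch of formulas (2) and (3) as reconstructed above, the distorted camera vector for one pixel can be computed as follows in Python; applying the model in the forward (ideal-to-distorted) direction is an assumption of this example:

    import numpy as np

    def camera_ray(u, v, fx, fy, cx, cy, k1, k2, k3, p1, p2):
        x = (v - cx) / fx          # horizontal component, formula (2)
        y = (u - cy) / fy          # vertical component, formula (2)
        r2 = x * x + y * y         # intermediate variable r^2
        radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
        xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)   # formula (3)
        yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y   # formula (3)
        return np.array([xd, yd, 1.0])  # camera vector V_c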
Sub-step 042, defining the projector vector V_p in the projector coordinate system, whose first dimension p is the code/phase dimension and whose second dimension y is the virtual row number. The first dimension and the second dimension are both dimensions in the projector coordinate system.
The virtual model may be a model without rows and columns assigned, and the first dimension may also be understood as a phase dimension, both of which are predefined according to user requirements.
With reference to the above formula (2), the projector vector V_p is defined by formula (4):

x_p = (p - c'_x)/f'_x,  y_p = (y - c'_y)/f'_y    formula (4)

Depending on the basic parameters of the projector, the distortion model of the virtual projector vector is formula (5) or formula (5a):

x_pd = x_p (1 + k'_1 x_p^2 + k'_2 x_p^4),  y_pd = y_p    formula (5);

alternatively, formula (5a) is:

x_pd = x_p (1 + k'_1 r_p^2 + k'_2 r_p^4 + k'_3 r_p^6) + 2 p'_1 x_p y_p + p'_2 (r_p^2 + 2 x_p^2),
y_pd = y_p (1 + k'_1 r_p^2 + k'_2 r_p^4 + k'_3 r_p^6) + p'_1 (r_p^2 + 2 y_p^2) + 2 p'_2 x_p y_p,
with r_p^2 = x_p^2 + y_p^2.

In formula (5) and formula (5a), x_p is the component of the projector vector in the code/phase dimension, y_p is the component of the projector vector in the virtual-row-number dimension, p is the encoded value corresponding to each pixel point in the projector vector, y is the virtual row number corresponding to each pixel point in the projector vector, f'_x is the focal length of the projector in the horizontal direction, f'_y is the focal length of the projector in the vertical direction, and (c'_x, c'_y) are the coordinates of the optical center; k'_1, k'_2, k'_3, p'_1 and p'_2 are the distortion coefficients of the projector, r_p is an intermediate variable, and together these quantities are the internal parameters of the projector vector, belonging to the unknown quantities to be calibrated.

Here, y is the virtual parameter of the projector vector and needs to be calculated from formula (1). That is, p is obtained from the encoded image captured by the camera: the camera vector V_c is computed for a given pixel point, p is the value decoded at this pixel point, and y is then solved from formula (1). Each pixel point has its own distinct u, v and p, and each pixel point corresponds to its own solvable value of y.
If the projector is a DMD projector, its distortion model can be considered similar to the camera distortion model described above, as shown in formula (5a); the distortion in this case comprises radial and tangential distortion parameters.
If the projector is a MEMS micro-galvanometer, its line structured light has no radial distortion, and mainly the nonlinearity of the coding direction needs to be considered; the distortion model of the virtual projector vector in this case is formula (5).
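A hedged Python sketch of how y could be solved from formula (1) for one pixel is given below. projector_ray() uses the MEMS-style model of formula (5) as reconstructed above, the Params container and its parameter layout are invented for illustration, and brentq assumes the search bracket straddles the root:

    import numpy as np
    from collections import namedtuple
    from scipy.optimize import brentq

    Params = namedtuple("Params", "R T proj")  # hypothetical container

    def projector_ray(p, y, proj):
        # Formulas (4)-(5): virtual projector vector for code value p and
        # virtual row y; proj = (fx, fy, cx, cy, k1, k2) is an assumed layout.
        fx, fy, cx, cy, k1, k2 = proj
        xp = (p - cx) / fx
        yp = (y - cy) / fy
        xpd = xp * (1 + k1 * xp ** 2 + k2 * xp ** 4)  # coding-direction nonlinearity
        return np.array([xpd, yp, 1.0])

    def solve_virtual_row(Vc, p, params, y_lo=-5000.0, y_hi=5000.0):
        # Formula (1): find y such that V_c, R V_p and T are coplanar.
        def coplanarity(y):
            Vp_cam = params.R @ projector_ray(p, y, params.proj)  # formula (6)
            return float(Vc @ np.cross(Vp_cam, params.T))         # formula (1)
        return brentq(coplanarity, y_lo, y_hi)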
Sub-step 043, the projector vector V_p defined in the projector coordinate system needs to be converted into the projector vector V'_p in the camera coordinate system; the coordinate-system conversion relation is formula (6):

V'_p = R V_p    formula (6)

R is the external-parameter rotation matrix; it is an unknown quantity to be calibrated. T is the external-parameter translation vector; it is likewise an unknown quantity to be calibrated.
Sub-step 044, calculating the depth value according to formula (7) based on the camera vector and the projector vector obtained from formula (1) to formula (6);
solving the unknown quantities based on the condition that the depth value obtained in S20 is consistent with the depth value obtained by formula (7), and obtaining the camera internal parameters, projector internal parameters and system external parameters;

z = (T_x V'_pz - T_z V'_px) / (x V'_pz - V'_px)    formula (7),

where V_c = (x, y, 1), V'_p = (V'_px, V'_py, V'_pz) and T = (T_x, T_y, T_z).
That is to say, the known depth values of a sufficient number of pixel points (at least as many as the parameters to be solved) are taken as the z values on one side of the equation, the two-dimensional coordinates of the selected pixel points provide the known quantities on the other side, and the equations are solved to obtain the camera internal parameters, projector internal parameters and system external parameters.
Further, after the camera internal parameters, projector internal parameters and system external parameters are obtained, a correspondence table between pixel points and the parameters is constructed, yielding a lookup table that makes parameter lookup per pixel point convenient.
The calibration process of the method is simple to operate and identical to the calibration process of a conventional binocular camera: only one calibration plate is needed, and it can be completed synchronously with hand-eye calibration, which is very convenient. Only one-dimensional line structured light needs to be projected during calibration, so the working mode is fully consistent with normal operation and no additional configuration is needed. The distortion problem of the projector is well solved, and the method is suitable for different projector types, including DMD projectors, galvanometers, MEMS micro-galvanometers and other one-dimensional line-structured-light projection. All parameters are calibrated synchronously, there is no accumulated error, and the precision is higher than that of conventional methods.
EXAMPLE III
The embodiment further provides a method for three-dimensional reconstruction of an image, an implementation subject of which may be any computing device/controller, as shown in fig. 2, the method for three-dimensional reconstruction may include the following steps:
and A01, acquiring an image of the target to be measured, and acquiring a camera vector and a projector vector of each pixel point under a camera coordinate system based on the pre-acquired camera internal parameters, projector internal parameters and system external parameters.
The pre-acquired camera internal parameters, projector internal parameters and system external parameters are calibrated using the calibration method of either the first or the second embodiment.
A02, obtaining the three-dimensional space coordinates of each pixel point in the image based on the camera vector and the projector vector of each pixel point in the camera coordinate system;
for example, the depth value z of the intersection of the camera vector and the projector vector is calculated according to formula (7):

z = (T_x V'_pz - T_z V'_px) / (x V'_pz - V'_px)    formula (7);

wherein V_c = (x, y, 1) is the camera vector in the camera coordinate system, V'_p = (V'_px, V'_py, V'_pz) is the projector vector in the camera coordinate system, and T = (T_x, T_y, T_z) is the translation vector;
based on formula (8), the spatial coordinate value of each pixel point in the image in the camera coordinate system is obtained:

(X, Y, Z) = (z x, z y, z)    formula (8);

z is the depth value, x is the component of the camera vector V_c in the horizontal direction, and y is the component of the camera vector V_c in the vertical direction.
It can be understood that the spatial coordinates of each pixel point are the final result to be obtained by the 3D camera, and based on the foregoing calibration method, the reconstruction process of the embodiment is simple in calculation and fast in speed.
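Putting the pieces together, a hedged Python sketch of A01-A02 using the lookup table and the helpers sketched in Example two might look like this; the per-pixel loop is kept for clarity and would be vectorized in practice:

    import numpy as np

    def reconstruct(code_map, lut):
        # code_map: decoded value p per pixel; lut: from build_lookup_table().
        h, w = code_map.shape
        points = np.full((h, w, 3), np.nan)
        R, T, proj = lut["R"], lut["T"], lut["proj"]
        for u in range(h):
            for v in range(w):
                p = code_map[u, v]
                Vc = lut["Vc"][u, v]
                y = solve_virtual_row(Vc, p, Params(R, T, proj))
                Vp = R @ projector_ray(p, y, proj)
                z = (T[0] * Vp[2] - T[2] * Vp[0]) / (Vc[0] * Vp[2] - Vp[0])  # formula (7)
                points[u, v] = z * Vc                                        # formula (8)
        return points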
The present embodiment also provides a control apparatus, including a memory and a processor; the processor is configured to execute the computer program stored in the memory and to perform the steps of the calibration method described in the first or second embodiment, or of the three-dimensional reconstruction method provided in the third embodiment.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the terms first, second, third and the like are for convenience only and do not denote any order. These words are to be understood as part of the name of the component.
Furthermore, it should be noted that in the description of the present specification, the description of the term "one embodiment", "some embodiments", "examples", "specific examples" or "some examples", etc., means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the claims should be construed to include preferred embodiments and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention should also include such modifications and variations.

Claims (7)

1. A calibration method based on a monocular structured light system, characterized in that the monocular structured light system comprises: a projector for projecting a coding pattern, a camera for acquiring the coding pattern, and a control device connecting the projector and the camera; the calibration method comprises:
s10, after the projector projects the coding pattern, the control device decodes the coding image collected by the camera to obtain the coding values of all the pixel points in the coding image;
s20, the control device extracts the characteristic points of the calibration board based on the calibration board image collected by the camera, and obtains the depth values of all the pixel points in the calibration board range according to the basic parameters of the calibration board, and obtains the coding values of all the pixel points in the calibration board range based on the coding values of all the pixel points in the coding image;
s30, according to the mathematical relation among the encoding values, the pixel point information and the depth values of all the pixel points in the range of the calibration board, constructing a distortion model of a camera vector and a projector vector with unknown quantity, and solving the constructed model based on the depth values of the pixel points to obtain camera internal parameters, projector internal parameters and system external parameters;
the distortion model of the projector vector is a virtual model according to a three-dimensional reconstruction result;
the system external parameter is an external parameter rotation matrix when a projector vector is converted into a camera coordinate system
Figure DEST_PATH_IMAGE001
And external reference translation vector
Figure 903988DEST_PATH_IMAGE002
The S30 includes:
S31, obtaining a distortion model of the camera vector, and determining the camera internal parameters serving as unknown quantities; specifically, for the camera vector V_c, the two-dimensional coordinates of each camera pixel point are expressed by the standard camera model of formula (2):

x = (v - c_x)/f_x,  y = (u - c_y)/f_y    formula (2);

in formula (2), x is the component of the camera vector V_c in the horizontal direction, y is the component of the camera vector V_c in the vertical direction, u is the row where the current pixel point is located, v is the column where the current pixel point is located, f_x is the focal length of the camera in the horizontal direction, f_y is the focal length of the camera in the vertical direction, and (c_x, c_y) are the coordinates of the optical center;
based on formula (2), the distortion model of the two-dimensional coordinates in the camera vector V_c is expressed as formula (3):

x_d = x(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2 x^2),
y_d = y(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2 y^2) + 2 p_2 x y,
with r^2 = x^2 + y^2    formula (3);

k_1, k_2, k_3 are radial distortion coefficients, p_1, p_2 are tangential distortion coefficients, and r is an intermediate variable; f_x, f_y, c_x, c_y, k_1, k_2, k_3, p_1 and p_2 are the internal parameters of the camera vector V_c and are unknown quantities;
s32, a distortion model of the virtual projector vector, and determining projector internal parameters as unknown quantities;
projector vector in virtual projector coordinate system
Figure 170869DEST_PATH_IMAGE022
The distortion model of (2);
the standard projector model is:
Figure DEST_PATH_IMAGE023
formula (4);
virtual projector vector according to basic parameters of projector
Figure 897516DEST_PATH_IMAGE024
The distortion model of (a) is formula (5) or formula (5 a):
Figure DEST_PATH_IMAGE025
formula (5);
alternatively, equation (5 a) is:
Figure 999464DEST_PATH_IMAGE026
in the formula (5) and the formula (5 a),
Figure DEST_PATH_IMAGE027
expressed as projector vector
Figure 975511DEST_PATH_IMAGE028
In the component of the code/phase dimension,
Figure DEST_PATH_IMAGE029
expressed as projector vector
Figure 922738DEST_PATH_IMAGE030
In the component of the virtual row number, p is the encoded value corresponding to each pixel point in the projector vector,
Figure DEST_PATH_IMAGE031
for the number of rows corresponding to each pixel point in the projector vector,
Figure 300630DEST_PATH_IMAGE032
is the focal length of the projector in the horizontal direction,
Figure DEST_PATH_IMAGE033
is the focal length of the projector in the vertical direction,
Figure 307900DEST_PATH_IMAGE034
is the coordinates of the optical center;
Figure DEST_PATH_IMAGE035
are all the distortion coefficients of the projector,
Figure 974505DEST_PATH_IMAGE036
Figure DEST_PATH_IMAGE037
is the intermediate variable(s) of the variable,
Figure 787740DEST_PATH_IMAGE038
an internal reference of the projector vector is an unknown quantity;
s33, converting the projector vector into a camera coordinate system, and determining virtual parameters of the projector vector according to the geometric constraint relation; specifically, the projector vector
Figure DEST_PATH_IMAGE039
Converting into a camera coordinate system, wherein the coordinate system conversion relation formula is as follows:
Figure 954892DEST_PATH_IMAGE040
formula (6);
direction of camera in camera coordinate systemMeasurement of
Figure DEST_PATH_IMAGE041
Projector vector in camera coordinate system
Figure 664222DEST_PATH_IMAGE042
Translation vector
Figure DEST_PATH_IMAGE043
The coplanar relationship is:
Figure 349281DEST_PATH_IMAGE044
formula (1);
Figure DEST_PATH_IMAGE045
in order to refer to the rotation matrix externally,
Figure 903890DEST_PATH_IMAGE046
the vector is an external reference translation vector,
Figure 521953DEST_PATH_IMAGE045
and
Figure 933343DEST_PATH_IMAGE046
as an external parameter of the system is of unknown quantity,
Figure 574540DEST_PATH_IMAGE047
converting the projector vector into virtual parameters of the projector vector determined according to the geometric constraint relation under a camera coordinate system;
s34, constructing a solving equation of the depth value of the intersection point of the camera vector and the projector vector based on the distortion model of the camera vector and the distortion model of the virtual projector vector;
specifically, based on the camera vector and the projector vector acquired by formula (1) to formula (6), the depth value is calculated according to formula (7);
solving the unknown quantity based on the fact that the depth value obtained by the S20 is consistent with the depth value obtained by the formula (7), and obtaining camera internal parameters, projector internal parameters and system external parameters;
Figure DEST_PATH_IMAGE048
formula (7);
and S35, solving an equation based on the depth value of each pixel point in the calibration board range to obtain camera internal parameters, projector internal parameters and system external parameters.
2. Calibration method according to claim 1,
the calibration plate is a checkerboard calibration plate, and the basic parameters of the calibration plate comprise: calibrating the total number of the checkerboards in the board and the size information of each check in the board;
the coded image is an image coded by one or more coding modes of Gray code, phase shift method and multi-frequency extrapolation.
3. The calibration method according to claim 2, wherein the S20 includes:
the control device extracts the checkerboard corner points serving as the feature points from the checkerboard image, takes the rows and columns of the corner points as the two-dimensional image coordinates, establishes a world coordinate system, and obtains the three-dimensional coordinates of the feature points according to the basic parameters of the checkerboard;
and acquiring a homography matrix of the calibration plate and the camera based on the two-dimensional image coordinates and the three-dimensional coordinates of the feature points, and calculating the depth values of all pixel points in the checkerboard range according to the homography matrix.
4. A method of three-dimensional reconstruction of an image, comprising:
acquiring an image of a target to be measured, and acquiring a camera vector and a projector vector of each pixel point under a camera coordinate system based on pre-acquired camera internal parameters, projector internal parameters and system external parameters;
based on the camera vector and the projector vector, obtaining the three-dimensional space coordinates of each pixel point in the image;
wherein the pre-acquired camera internal parameters, projector internal parameters and system external parameters are calibrated by the calibration method of any one of the above claims 1 to 3.
5. The three-dimensional reconstruction method of claim 4, wherein obtaining the three-dimensional space coordinates of each pixel point in the image based on the camera vector and the projector vector comprises:
calculating the depth value z of the intersection of the camera vector and the projector vector according to formula (7):

z = (T_x V'_pz - T_z V'_px) / (x V'_pz - V'_px)    formula (7);

wherein V_c = (x, y, 1) is the camera vector in the camera coordinate system, V'_p = (V'_px, V'_py, V'_pz) is the projector vector in the camera coordinate system, and T = (T_x, T_y, T_z) is the translation vector;
based on formula (8), obtaining the spatial coordinate value of each pixel point in the image in the camera coordinate system:

(X, Y, Z) = (z x, z y, z)    formula (8);

z is the depth value, x is the component of the camera vector V_c in the horizontal direction, and y is the component of the camera vector V_c in the vertical direction.
6. The three-dimensional reconstruction method according to claim 4, wherein acquiring an image of the target to be measured, and acquiring the camera vector and the projector vector of each pixel point in the camera coordinate system based on the pre-acquired camera internal parameters, projector internal parameters and system external parameters, comprises:
looking up the camera vector and the projector vector of each pixel point in the camera coordinate system in a pre-built lookup table;
wherein, in the calibration method, a correspondence table between pixel points and the parameters is constructed according to the camera internal parameters, projector internal parameters and system external parameters corresponding to each pixel point, and the correspondence table is the lookup table used during three-dimensional reconstruction to quickly look up the camera internal parameters, projector internal parameters and system external parameters for each pixel point.
7. A control device, comprising a memory for storing a computer program and a processor for executing the computer program stored in the memory and for performing the steps of the method of any of the preceding claims 1 to 6.
CN202210733202.1A 2022-06-27 2022-06-27 Calibration method based on monocular structured light system Active CN114792345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210733202.1A CN114792345B (en) 2022-06-27 2022-06-27 Calibration method based on monocular structured light system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210733202.1A CN114792345B (en) 2022-06-27 2022-06-27 Calibration method based on monocular structured light system

Publications (2)

Publication Number  Publication Date
CN114792345A (en)  2022-07-26
CN114792345B (en)  2022-09-27

Family

ID=82463574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210733202.1A Active CN114792345B (en) 2022-06-27 2022-06-27 Calibration method based on monocular structured light system

Country Status (1)

Country Link
CN (1) CN114792345B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115546311B (en) * 2022-09-28 2023-07-25 中国传媒大学 Projector calibration method based on scene information
CN116091619A (en) * 2022-12-27 2023-05-09 北京纳通医用机器人科技有限公司 Calibration method, device, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109506589A (en) * 2018-12-25 2019-03-22 东南大学苏州医疗器械研究院 A kind of measuring three-dimensional profile method based on light field imaging
CN112927340A (en) * 2021-04-06 2021-06-08 中国科学院自动化研究所 Three-dimensional reconstruction acceleration method, system and equipment independent of mechanical placement

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10861177B2 (en) * 2015-11-11 2020-12-08 Zhejiang Dahua Technology Co., Ltd. Methods and systems for binocular stereo vision
CN107945268B (en) * 2017-12-15 2019-11-29 深圳大学 A kind of high-precision three-dimensional method for reconstructing and system based on binary area-structure light
CN111028297B (en) * 2019-12-11 2023-04-28 凌云光技术股份有限公司 Calibration method of surface structured light three-dimensional measurement system
CN113008163B (en) * 2021-03-01 2022-09-27 西北工业大学 Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109506589A (en) * 2018-12-25 2019-03-22 东南大学苏州医疗器械研究院 A kind of measuring three-dimensional profile method based on light field imaging
CN112927340A (en) * 2021-04-06 2021-06-08 中国科学院自动化研究所 Three-dimensional reconstruction acceleration method, system and equipment independent of mechanical placement

Also Published As

Publication number Publication date
CN114792345A (en) 2022-07-26

Similar Documents

Publication Publication Date Title
US20240153143A1 (en) Multi view camera registration
US9965870B2 (en) Camera calibration method using a calibration target
US10916033B2 (en) System and method for determining a camera pose
CN107705333B (en) Space positioning method and device based on binocular camera
CN114792345B (en) Calibration method based on monocular structured light system
JP4245963B2 (en) Method and system for calibrating multiple cameras using a calibration object
CN106408556B (en) A kind of small items measuring system scaling method based on general imaging model
CN101697233B (en) Structured light-based three-dimensional object surface reconstruction method
CN103106688B (en) Based on the indoor method for reconstructing three-dimensional scene of double-deck method for registering
CN103400366B (en) Based on the dynamic scene depth acquisition methods of fringe structure light
CN110648274B (en) Method and device for generating fisheye image
CN113298886B (en) Calibration method of projector
Mahdy et al. Projector calibration using passive stereo and triangulation
CN104794718A (en) Single-image CT (computed tomography) machine room camera calibration method
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN112785685B (en) Assembly guiding method and system
CN113160393A (en) High-precision three-dimensional reconstruction method and device based on large field depth and related components thereof
CN111145268A (en) Video registration method and device
CN104156962A (en) Method of calibrating spatial position relationship of camera and projector based on trapezoidal patterns
JP2002135807A (en) Method and device for calibration for three-dimensional entry
Uyanik et al. A method for determining 3D surface points of objects by a single camera and rotary stage
CN116233392B (en) Calibration method and device of virtual shooting system, electronic equipment and storage medium
CN116433848A (en) Screen model generation method, device, electronic equipment and storage medium
CN116659550A (en) Automatic correction method for monocular ranging based on focusing depth method
CN118115586A (en) Pose labeling method, pose estimation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant