CN113034565A - Monocular structured light depth calculation method and system
- Publication number: CN113034565A
- Application number: CN202110321751.3A
- Authority: CN (China)
- Prior art keywords: structured light camera, image, reference image, acquiring
- Prior art date: 2021-03-25
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/521 — Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G06T7/55 — Depth or shape recovery from multiple images
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

All classifications fall under G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T7/00—Image analysis.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The application is applicable to the field of image processing and provides a monocular structured light depth calculation method comprising the following steps: performing epipolar line correction on a first structured light camera, acquiring a conversion matrix and constructing a second structured light camera; acquiring a first reference image with the first structured light camera, and mapping the first reference image into a first coordinate system corresponding to the second structured light camera according to the conversion matrix to obtain a second reference image; transforming the second reference image into a normalized projection image in a second coordinate system; acquiring, through the second structured light camera, a third reference image corresponding to the normalized projection image; and acquiring a target image of a target area with the second structured light camera and performing parallax calculation against the third reference image to acquire depth information. According to this scheme, a structured light system whose original extrinsic parameters are in a non-ideal state is converted into a structured light system whose extrinsic parameters are in an ideal state, realizing high-precision depth measurement.
Description
Technical Field
The application belongs to the field of image processing, and particularly relates to a monocular structured light depth calculation method and system.
Background
A monocular structured light system consists of a projection module and a camera. To achieve accurate depth measurement, the relative poses of the projection module and the camera generally must be strictly constrained so that the optical axis of the camera is perpendicular to the baseline formed by the projection module and the camera. In actual assembly, however, errors are inevitable; if depth is computed directly with the depth calculation method for the ideal pose, high-precision depth information cannot be obtained, which reduces the measurement precision of the structured light system.
Disclosure of Invention
The embodiments of the application provide a monocular structured light depth calculation method and system, which can solve the problem that the measurement accuracy of a structured light system is reduced because the existing depth calculation method cannot obtain high-accuracy depth information.
In a first aspect, an embodiment of the present application provides a method for calculating a depth of monocular structured light, including:
performing epipolar line correction on the first structured light camera, acquiring a conversion matrix and constructing a second structured light camera;
acquiring a first reference image by using a first structured light camera, and mapping the first reference image to a first coordinate system corresponding to a second structured light camera according to a conversion matrix to obtain a second reference image;
transforming the second reference image into a normalized projection image in a second coordinate system; the second coordinate system is a coordinate system corresponding to the projection module;
acquiring a third reference image corresponding to the normalized projection image through a second structured light camera;
and acquiring a target image of the target area by using the second structured light camera, and performing parallax calculation with the third reference image to acquire depth information.
Further, performing epipolar line correction on the first structured light camera, acquiring a conversion matrix and constructing a second structured light camera, including:
and performing epipolar line correction on the first structured light camera according to the camera parameters of the first structured light camera to obtain a second structured light camera, and acquiring a conversion matrix between the first structured light camera and the second structured light camera.
Further, mapping the first reference image to a first coordinate system corresponding to the second structured light camera according to the transformation matrix to obtain a second reference image, including:
acquiring a target projection image of a first reference image on a first reference plane;
acquiring coordinate information of a target projection image in a third coordinate system corresponding to the first structured light camera;
and mapping the target projection image to a first coordinate system corresponding to the second structured light camera according to the coordinate information and the conversion matrix to obtain a second reference image.
Further, transforming the second reference image into a normalized projection image in a second coordinate system, comprising:
transforming the second reference image to a coordinate system of the first projection module by using a baseline vector of the second structured light camera, and acquiring a coordinate of the second reference image in a second coordinate system corresponding to the first projection module;
and normalizing the coordinates of the second reference image in the coordinate system of the first projection module along the coordinate axis to obtain a normalized projection image.
Further, acquiring a third reference image corresponding to the normalized projection image by the second structured light camera includes:
projecting the normalized projection image to a second reference plane through coordinate scaling to obtain a scaled projection image;
and acquiring internal parameters of the second structured light camera, and mapping the scaled projection image onto an imaging plane of the second structured light camera according to the internal parameters to obtain a third reference image.
Further, acquiring a target image of the target area by using the second structured light camera, and performing parallax calculation with the third reference image to acquire depth information, including:
acquiring a target image of a target area by using a second structured light camera;
performing parallax calculation according to the pixel coordinates of the target image and the pixel coordinates of the third reference image in the first coordinate system corresponding to the second structured light camera to obtain a parallax value;
depth information is calculated from the disparity value and camera parameters of the second structured light camera.
In a second aspect, an embodiment of the present application provides a monocular structured light depth calculation system, including:
a projection module for projecting a structured light pattern onto a target area;
the acquisition module is used for acquiring the structured light pattern reflected back by the target area and generating a target image by utilizing a preset conversion matrix;
the processing module is used for carrying out depth calculation by utilizing a preset third reference image and the target image;
and the storage module is used for storing the preset conversion matrix and the third reference image acquired based on the calculation method.
Further, the obtaining of the preset transformation matrix comprises:
and performing epipolar line correction on the acquisition module according to the camera parameters of the acquisition module, and acquiring the conversion matrix.
Further, the processing module performing depth calculation includes:
performing parallax calculation by using the pixel coordinates of the target image and the pixel coordinates of the third reference image to obtain a parallax value;
and calculating depth information according to the parallax value and the camera parameters of the acquisition module.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the method for calculating the depth information of the monocular structured light according to the first aspect is implemented.
In the embodiment of the application, epipolar line correction is performed on a first structured light camera, a conversion matrix is obtained, and a second structured light camera is constructed; a first reference image is acquired with the first structured light camera and mapped into a first coordinate system corresponding to the second structured light camera according to the conversion matrix to obtain a second reference image; the second reference image is transformed into a normalized projection image in a second coordinate system; a third reference image corresponding to the normalized projection image is acquired through the second structured light camera; and a target image of the target area is acquired with the second structured light camera and subjected to parallax calculation against the third reference image to acquire depth information. In this scheme, the structured light system is calibrated to obtain the internal parameters of the camera and the accurate external parameters between the projection module and the camera, and the captured speckle images and reference images are corrected using the calibrated internal and external parameters, converting a structured light system whose original external parameters are in a non-ideal state into one whose external parameters are in an ideal state and realizing high-precision depth measurement.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a schematic flowchart of a method for calculating a depth of monocular structured light according to a first embodiment of the present application;
fig. 2 is a schematic view of a structured light system in a depth calculation method for monocular structured light according to a first embodiment of the present application;
fig. 3 is a schematic flowchart of S102 in a method for calculating a depth of monocular structured light according to a first embodiment of the present application;
fig. 4 is a schematic flowchart of S103 in a method for calculating a depth of monocular structured light according to a first embodiment of the present application;
fig. 5 is a schematic flowchart of S105 in a method for calculating a depth of monocular structured light according to a first embodiment of the present application;
fig. 6 is a schematic diagram of a monocular structured light depth calculating system according to a second embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Fig. 1 is a schematic flowchart of a monocular structured light depth calculating method according to a first embodiment of the present application. An execution main body of the monocular structured light depth calculating method in this embodiment is a system having a function of calculating depth information of monocular structured light. The method for calculating the depth information of the monocular structured light as shown in fig. 1 may include:
S101: And performing epipolar line correction on the first structured light camera, acquiring a conversion matrix and constructing a second structured light camera.
In one embodiment, step S101 more specifically includes:
and performing epipolar line correction on the first structured light camera according to the camera parameters of the first structured light camera to obtain a second structured light camera, and acquiring a conversion matrix between the first structured light camera and the second structured light camera.
It should be understood that the first structured light camera after epipolar rectification is defined as the second structured light camera; the second structured light camera is a virtual camera whose imaging plane is parallel to the baseline between the projection module and the second structured light camera. The projection module and the first structured light camera constitute a first structured light system, and the projection module and the second structured light camera constitute a second structured light system, as shown in fig. 2.
More specifically, with the optical center of the first structured light camera as the origin, epipolar line correction is performed on the first structured light camera to make the image plane of the first structured light camera parallel to the baseline of the first structured light system, so as to construct a second structured light camera, and obtain a transformation matrix between the first structured light camera and the second structured light camera. Wherein the baseline of the first structured light system characterizes a line between the optical center of the projection module and the optical center of the first structured light camera.
In one embodiment, the camera coordinate system of the first structured light camera is constructed with the optical center of the first structured light camera as the origin, the direction parallel to the camera image plane of the first structured light camera and pointing to its right side as the x' axis, and the optical axis direction of the first structured light camera as the z' axis. From the structured light calibration parameters of the first structured light camera, the baseline direction vector between the first projection module and the first structured light camera is $T = [T_x\ T_y\ T_z]^T$, and the unit vector in the baseline direction is $e_1 = T / \lVert T \rVert$. From the optical axis z' of the first structured light camera, the y-axis vector of the camera coordinate system of the second structured light camera can be found as

$$e_2 = \frac{z' \times e_1}{\lVert z' \times e_1 \rVert}$$

With the optical center of the first structured light camera as the origin, the baseline unit vector $e_1$ as the x-axis and $e_2$ as the y-axis, the camera coordinate system of the second structured light camera is constructed; the unit vector of its z-axis can then be represented as $e_3 = e_1 \times e_2$.

The conversion matrix $R_{rect}$ that makes the imaging plane of the first structured light camera parallel to the baseline is then

$$R_{rect} = \begin{bmatrix} e_1^T \\ e_2^T \\ e_3^T \end{bmatrix}$$
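As a rough illustrative sketch (not part of the original disclosure), the construction of $R_{rect}$ from a calibrated baseline vector might look as follows in Python with NumPy; the function name and the default optical-axis argument are assumptions made here for exposition:

```python
import numpy as np

def rectification_matrix(baseline, optical_axis=(0.0, 0.0, 1.0)):
    """Sketch: build the rotation R_rect whose rows are the axes of the
    virtual (epipolar-rectified) second structured light camera, expressed
    in the coordinate system of the first structured light camera."""
    T = np.asarray(baseline, dtype=float)      # baseline vector T = [Tx, Ty, Tz]
    z = np.asarray(optical_axis, dtype=float)  # optical axis z' of the first camera

    e1 = T / np.linalg.norm(T)   # x-axis: unit vector along the baseline
    e2 = np.cross(z, e1)         # y-axis: e2 = z' x e1
    e2 /= np.linalg.norm(e2)     # renormalize, since z' and e1 need not be orthogonal
    e3 = np.cross(e1, e2)        # z-axis: e3 = e1 x e2

    return np.vstack([e1, e2, e3])  # rows e1^T, e2^T, e3^T
```

Applying this matrix to a point expressed in the first camera's coordinate system yields its coordinates in the rectified (second) camera's coordinate system.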
S102: and acquiring a first reference image by using the first structured light camera, and mapping the first reference image to a first coordinate system corresponding to the second structured light camera according to the conversion matrix to obtain a second reference image.
In one embodiment, as shown in fig. 3, step S102 more specifically includes:
S1020: A target projection image of the first reference image on the first reference plane is acquired.
More specifically, the first reference image may be back projected from the image plane of the first structured light camera onto the first reference plane of the first structured light camera using the internal parameters of the first structured light camera.
S1021: Coordinate information of the target projection image in the third coordinate system corresponding to the first structured light camera is acquired, that is, the coordinates, in the third coordinate system of the first structured light camera, of each point of the target projection image on the first reference plane.
S1022: and mapping the target projection image to a first coordinate system corresponding to the second structured light camera according to the coordinate information and the conversion matrix to obtain a second reference image.
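To make steps S1020 to S1022 concrete, here is a hedged NumPy sketch (not from the patent; the intrinsic matrix K1, the reference-plane distance z_ref and the image size are assumed calibration inputs):

```python
import numpy as np

def map_reference_to_rectified(K1, R_rect, z_ref, width, height):
    """Sketch: back-project every pixel of the first reference image onto
    the first reference plane at depth z_ref using the intrinsics K1 of the
    first structured light camera (S1020), read off the 3-D coordinates in
    the first camera's coordinate system (S1021), and rotate them into the
    first coordinate system of the second, rectified camera (S1022)."""
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N homogeneous pixels

    rays = np.linalg.inv(K1) @ pix   # back-projected viewing rays with z = 1
    points = rays * z_ref            # intersect the reference plane z = z_ref
    points_rect = R_rect @ points    # rotate into the rectified camera frame

    return points_rect.T.reshape(height, width, 3)
```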
S103: transforming the second reference image into a normalized projection image in a second coordinate system; the second coordinate system is a coordinate system corresponding to the projection module.
In one embodiment, the second reference image may be transformed into a projection image in a second coordinate system with the optical center of the projection module in the second structured light system as the origin through a baseline vector of the second structured light camera, and the coordinate information of the projection image is normalized in the z-axis direction to obtain a normalized projection image.
More specifically, as shown in fig. 4, step S103 includes:
S1030: And transforming the second reference image to the coordinate system of the first projection module by using the baseline vector of the second structured light camera, and acquiring the coordinates of the second reference image in the second coordinate system corresponding to the first projection module.
Based on step S101, the second structured light system has undergone epipolar line correction: the optical axis of the projection module in the second structured light system is parallel to the optical axis of the second structured light camera, and the imaging plane of the second structured light camera is parallel to the baseline between the projection module and the second structured light camera. Preferably, the second reference image may be transformed to the projection pattern in the second coordinate system, with the optical center of the projection module as the origin, using an offset matrix between the projection module and the second structured light camera.
Assume that the coordinates of a point of the second reference image in the first coordinate system of the second structured light camera are $P_c = [x_c\ y_c\ z_c]^T$ and that the offset matrix between the projection module and the second structured light camera is $T$. Translating by the offset matrix, the coordinates of the second reference image in the second coordinate system, with the optical center of the first projection module as the origin, are

$$P_p = [x_p\ y_p\ z_p]^T = P_c - T$$
it should be noted that, as the optical centers of the projection module and the second structured light camera are not changed, the offset matrix between the projection module and the second structured light camera is equivalent to the baseline vector between the projection module and the first structured light camera.
S1031: and normalizing the coordinates of the second reference image in the coordinate system of the first projection module along the coordinate axis to obtain a normalized projection image.
In one embodiment, the z-axis of the second coordinate system, with the optical center of the projection module as the origin, is parallel to the z-axis of the first coordinate system corresponding to the second structured light camera. The coordinates of the second reference image in the second coordinate system of the projection module are therefore normalized along the z-axis of that coordinate system, giving the normalized coordinates

$$P_n = \left[\frac{x_p}{z_p}\ \ \frac{y_p}{z_p}\ \ 1\right]^T$$

The normalized coordinates represent a normalized projection image whose z-coordinate is 1 and which is parallel to the baseline between the projection module and the second structured light camera.
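A minimal sketch of steps S1030 and S1031, under the assumption that the offset enters as P_p = P_c - T (the sign depends on the direction in which the baseline vector is defined):

```python
import numpy as np

def normalize_in_projector_frame(points_rect, T):
    """Sketch: translate points from the rectified camera frame into the
    projector frame (the second coordinate system) by the offset T, then
    divide by z so the image lies in the plane z = 1, parallel to the
    baseline between the projection module and the second camera."""
    P = points_rect.reshape(-1, 3) - np.asarray(T, dtype=float)  # P_p = P_c - T (sign is an assumption)
    P_n = P / P[:, 2:3]                                          # (x/z, y/z, 1)
    return P_n.reshape(points_rect.shape)
```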
S104: and acquiring a third reference image corresponding to the normalized projection image through the second structured light camera.
In one embodiment, the normalized projection image may be projected onto the second reference plane by coordinate scaling to obtain a scaled projection image, and the scaled projection image may be mapped onto the imaging plane of the second structured light camera according to the internal parameters of that camera to obtain a third reference image.
It should be noted that the second structured light camera is the first structured light camera after epipolar line correction, and the imaging plane of the second structured light camera is parallel to the baseline between the first projection module and the second structured light camera. The second reference plane preset for the second structured light camera is parallel to the imaging plane of the second structured light camera, and the normalized projection image is likewise parallel to the baseline between the projection module and the second structured light camera. Therefore, the normalized projection image can be projected onto the preset second reference plane by coordinate scaling, the coordinates projected onto the second reference plane being

$$P_s = \left[L \cdot \frac{x_p}{z_p}\ \ L \cdot \frac{y_p}{z_p}\ \ L\right]^T$$

where L is the distance between the second reference plane and the second structured light system.
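An illustrative sketch of step S104 under these assumptions; K2 stands for the internal parameters of the second structured light camera, and, following the text, only the intrinsics are applied, so the scaled points are assumed to be expressed in the second camera's frame (translate by the baseline offset first if they are still in the projector frame):

```python
import numpy as np

def third_reference_image_coords(P_n, K2, L):
    """Sketch: scale the normalized projection image onto the second
    reference plane at distance L, then map the scaled points through the
    internal parameters K2 to pixel coordinates on the imaging plane of
    the second structured light camera."""
    P_s = P_n.reshape(-1, 3) * L    # (L*x/z, L*y/z, L): the scaled projection image
    pix = (K2 @ P_s.T).T            # pinhole projection through the intrinsics
    pix = pix[:, :2] / pix[:, 2:3]  # perspective divide to (u, v)
    return pix.reshape(P_n.shape[:-1] + (2,))
```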
S105: and acquiring a target image of the target area by using the second structured light camera, and performing parallax calculation with the third reference image to acquire depth information.
In one embodiment, as shown in fig. 5, step S105 more specifically includes:
S1050: And acquiring a target image of the target area by using the second structured light camera.
And acquiring a target image of a target area, wherein the target area is a preset area. The target image of the target area is an image of the target area corresponding to the first structured light camera.
In one embodiment, the obtaining of the target image of the target area by the second structured light camera is equivalent to performing epipolar line correction on the target image obtained by the first structured light camera, so that the imaging plane corresponding to the target image is parallel to the base line of the second structured light camera and the first projection module. Therefore, an initial image of the target area acquired by the first structured light camera may be acquired; and obtaining a target image according to the conversion matrix and the initial image.
In particular, according to the transformation matrix $R_{rect}$, an initial image acquired by the first structured light camera can be transformed into the target image in the first coordinate system of the second structured light camera.
Further, a point on the normalized image plane of the first structured light camera is transformed to the normalized image plane of the second structured light camera, so that the initial image acquired by the first structured light camera is transformed into the target image in the coordinate system of the second structured light camera, specifically:

$$p_2 = \mathrm{NormZ}(R_{rect}\, p_1)$$

where the NormZ operation normalizes the coordinates in the z direction, $p_1$ denotes a point on the image plane of the first structured light camera, and $p_2$ denotes a point on the image plane of the second structured light camera.
It should be noted that the above formula can also be in other expression forms, and is not limited herein.
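One possible realization of this warp (an assumption for illustration, not the patent's implementation) computes, for each pixel of the target image, the corresponding source pixel in the initial image by inverting the formula above and then resamples; OpenCV's remap is used here purely for convenience:

```python
import numpy as np
import cv2  # assumed dependency, used only for the final resampling step

def rectify_target_image(img, K1, K2, R_rect):
    """Sketch: warp an initial image from the first structured light camera
    into the target image of the second (rectified) camera. For each target
    pixel we go to the normalized plane of the second camera, rotate back
    with R_rect^T, apply the NormZ step, and re-project with K1 to find the
    source pixel to sample."""
    h, w = img.shape[:2]
    u, v = np.meshgrid(np.arange(w, dtype=np.float32),
                       np.arange(h, dtype=np.float32))
    pix2 = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T

    rays2 = np.linalg.inv(K2) @ pix2  # normalized image plane of camera 2
    rays1 = R_rect.T @ rays2          # inverse rotation into camera 1
    rays1 = rays1 / rays1[2]          # NormZ: normalize by the z component
    pix1 = K1 @ rays1                 # source pixel coordinates in the initial image

    map_x = pix1[0].reshape(h, w).astype(np.float32)
    map_y = pix1[1].reshape(h, w).astype(np.float32)
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```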
S1051: and performing parallax calculation according to the pixel coordinates of the target image and the pixel coordinates of the third reference image in the first coordinate system corresponding to the second structured light camera to obtain a parallax value d.
In one embodiment, given a certain speckle point in the target image, the homonymous speckle point in the corrected third reference image lies on the same pixel row; that is, the matching point for that speckle can be found by searching directly along the same row of the third reference image. The pixel coordinates of the speckle point in the target image and in the third reference image are then obtained for the parallax calculation.
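A toy sketch of such a row-constrained search; the window size, search range and the ZNCC similarity score are illustrative choices, not prescribed by the patent:

```python
import numpy as np

def row_disparity(target_row, reference_row, x, win=9, max_d=64):
    """Sketch: find the offset d at which a window of the target row around
    column x best matches the same row of the rectified third reference
    image, scored by zero-mean normalized cross-correlation (ZNCC).
    Assumes x is at least win//2 pixels away from the image border."""
    half = win // 2
    patch = target_row[x - half: x + half + 1].astype(float)
    patch = (patch - patch.mean()) / (patch.std() + 1e-9)

    best_d, best_score = 0, -np.inf
    for d in range(-max_d, max_d + 1):
        xr = x + d
        if xr - half < 0 or xr + half + 1 > len(reference_row):
            continue  # candidate window would leave the reference row
        ref = reference_row[xr - half: xr + half + 1].astype(float)
        ref = (ref - ref.mean()) / (ref.std() + 1e-9)
        score = float(patch @ ref) / win  # ZNCC similarity in [-1, 1]
        if score > best_score:
            best_d, best_score = d, score
    return best_d
```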
S1052: depth information is calculated from the disparity value and camera parameters of the second structured light camera.
In one embodiment, assume that the parallax value of a certain speckle is d, the baseline length between the projection module and the first structured light camera is b, the focal length of the first structured light camera is f, and the distance of the second reference plane is $z_{ref}$. By triangulation, the depth information of the target image is obtained as

$$z = \frac{f\,b\,z_{ref}}{f\,b + d\,z_{ref}}$$
it should be understood that, by traversing all the pixel points of the target image, the complete depth information of the target image can be obtained; the formula for calculating the depth is not limited to the above formula and is not limited thereto.
In the embodiment of the application, a conversion matrix between a first structured light camera and a second structured light camera is acquired, together with a first reference image acquired by the first structured light camera; the first reference image is mapped into a first coordinate system corresponding to the second structured light camera according to the conversion matrix to obtain a second reference image; the second reference image is transformed into a normalized projection image in a second coordinate system; a third reference image corresponding to the normalized projection image is acquired through the second structured light camera; a target image of a target area is acquired; and parallax calculation is performed between the third reference image and the target image to obtain depth information. In this scheme, the structured light system is calibrated to obtain the internal parameters of the camera and the accurate external parameters between the projection module and the camera, and the captured speckle images and reference images are corrected using the calibrated internal and external parameters, converting a structured light system whose original external parameters are in a non-ideal state into one whose external parameters are in an ideal state and realizing high-precision depth measurement.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 6 is a schematic diagram of a monocular structured light depth calculating system provided in a second embodiment of the present application, where the calculating system 300 includes:
a projection module 301 for projecting a structured light pattern onto a target area 305;
an acquisition module 302, configured to acquire the structured light pattern reflected back through the target area 305 and generate a target image by using a preset transformation matrix;
the processing module 303 is configured to perform depth calculation by using a preset third reference image and a target image;
the storage module 304 is configured to store a preset transformation matrix and a third reference image obtained based on the above method.
In one embodiment, epipolar rectification can be performed on the acquisition module according to the camera parameters of the acquisition module, and the conversion matrix of the acquisition module after epipolar rectification is obtained.
In one embodiment, the processing module calculating the depth further comprises:
performing parallax calculation by using the pixel coordinates of the target image and the pixel coordinates of the third reference image to obtain a parallax value;
and calculating depth information according to the parallax value and the camera parameters of the acquisition module.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps that can be implemented in the above method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer memory, read-only memory (ROM), random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, a module or a unit may be divided into only one logical function, and may be implemented in other ways, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. A method for calculating the depth of monocular structured light is characterized by comprising the following steps:
performing epipolar line correction on the first structured light camera, acquiring a conversion matrix and constructing a second structured light camera;
acquiring a first reference image by using the first structured light camera, and mapping the first reference image to a first coordinate system corresponding to the second structured light camera according to the conversion matrix to obtain a second reference image;
transforming the second reference image into a normalized projection image in a second coordinate system; the second coordinate system is a coordinate system corresponding to the projection module;
acquiring a third reference image corresponding to the normalized projection image through the second structured light camera;
and acquiring a target image of a target area by using the second structured light camera, and performing parallax calculation with the third reference image to acquire depth information.
2. The method of claim 1, wherein the epipolar correction of the first structured light camera, obtaining the transformation matrix and constructing the second structured light camera comprises:
performing epipolar line correction on a first structured light camera according to camera parameters of the first structured light camera to obtain a second structured light camera, and acquiring a conversion matrix between the first structured light camera and the second structured light camera.
3. The method according to claim 1, wherein the mapping the first reference image into the first coordinate system corresponding to the second structured light camera according to the transformation matrix to obtain a second reference image comprises:
acquiring a target projection image of the first reference image on a first reference plane;
acquiring coordinate information of the target projection image in a third coordinate system corresponding to the first structured light camera;
and mapping the target projection image to a first coordinate system corresponding to the second structured light camera according to the coordinate information and the conversion matrix to obtain a second reference image.
4. The monocular structured light depth calculating method of claim 1, wherein transforming the second reference image into a normalized projection image in a second coordinate system comprises:
transforming the second reference image to a coordinate system of a first projection module by using the baseline vector of the second structured light camera, and acquiring the coordinate of the second reference image in a second coordinate system corresponding to the first projection module;
and normalizing the coordinates of the second reference image in the coordinate system of the first projection module along the coordinate axis to obtain a normalized projection image.
5. The monocular structured light depth calculating method as set forth in claim 1, wherein the acquiring of the third reference image corresponding to the normalized projection image by the second structured light camera includes:
projecting the normalized projection image to a second reference plane through coordinate scaling to obtain a scaled projection image;
acquiring internal parameters of the second structured light camera, and mapping the scaled projection image onto an imaging plane of the second structured light camera according to the internal parameters to obtain a third reference image.
6. The method of claim 1, wherein the acquiring a target image of a target area by the second structured light camera and performing a disparity calculation with the third reference image to acquire depth information comprises:
acquiring a target image of a target area by using the second structured light camera;
performing parallax calculation according to the pixel coordinates of the target image and the pixel coordinates of the third reference image in the first coordinate system corresponding to the second structured light camera to obtain a parallax value;
and calculating depth information according to the parallax value and the camera parameters of the second structured light camera.
7. A monocular structured light depth calculation system comprising:
a projection module for projecting a structured light pattern onto a target area;
the acquisition module is used for acquiring the structured light pattern reflected back by the target area and generating a target image by utilizing a preset conversion matrix;
the processing module is used for carrying out depth calculation by utilizing a preset third reference image and the target image;
a storage module, configured to store the preset conversion matrix and the third reference image obtained based on the calculation method of any one of claims 1 to 5.
8. The monocular structured light depth calculating system of claim 7, wherein the obtaining of the transformation matrix comprises:
and carrying out polar line correction on the acquisition module machine according to the camera parameters of the acquisition module, and acquiring the conversion matrix.
9. The monocular structured light depth calculating system of claim 7, wherein the processing module performs the depth calculation comprising:
performing parallax calculation by using the pixel coordinates of the target image and the pixel coordinates of the third reference image to obtain a parallax value;
and calculating depth information according to the parallax value and the camera parameters of the acquisition module.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110321751.3A | 2021-03-25 | 2021-03-25 | Depth calculation method and system for monocular structured light |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN113034565A | 2021-06-25 |
| CN113034565B | 2023-07-04 |
Family: ID=76474032

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202110321751.3A | Depth calculation method and system for monocular structured light | 2021-03-25 | 2021-03-25 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN113034565B (en) |
Cited By (2)

| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| CN113763448A | 2021-08-24 | 2021-12-07 | Depth imaging method, electronic device, and computer-readable storage medium |
| CN113870430A | 2021-12-06 | 2021-12-31 | Workpiece data processing method and device |
Citations (9)

| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| US20150348313A1 | 2014-02-18 | 2015-12-03 | Projection system, semiconductor integrated circuit, and image correction method |
| CN108917639A | 2018-05-15 | 2018-11-30 | Depth imaging system and its temperature error correction method |
| CN109146980A | 2018-08-12 | 2019-01-04 | Optimized depth extraction and passive ranging method based on monocular vision |
| US20190178634A1 | 2017-12-12 | 2019-06-13 | High contrast structured light patterns for QIS sensors |
| CN109889799A | 2017-12-06 | 2019-06-14 | Monocular structured light depth perception method and device based on RGBIR camera |
| CN111243002A | 2020-01-15 | 2020-06-05 | Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement |
| CN111540004A | 2020-04-16 | 2020-08-14 | Single-camera epipolar line correction method and device |
| CN112070844A | 2020-08-27 | 2020-12-11 | Calibration method and device of structured light system, calibration tool diagram, equipment and medium |
| CN112184811A | 2020-09-22 | 2021-01-05 | Monocular space structured light system structure calibration method and device |
Also Published As

| Publication number | Publication date |
|---|---|
| CN113034565B | 2023-07-04 |
Similar Documents

| Publication | Title |
|---|---|
| CN108734744B | Long-distance large-view-field binocular calibration method based on total station |
| CN109405765B | High-precision depth calculation method and system based on speckle structured light |
| JP6573419B1 | Positioning method, robot and computer storage medium |
| JP4095491B2 | Distance measuring device, distance measuring method, and distance measuring program |
| JP3347508B2 | Captured image processing device and captured image processing method |
| US20020113878A1 | Camera calibration device and method, and computer system |
| WO2018201677A1 | Bundle adjustment-based calibration method and device for a three-dimensional imaging system containing a telecentric lens |
| CN112184811B | Monocular space structured light system structure calibration method and device |
| CN111123242B | Combined calibration method based on laser radar and camera, and computer-readable storage medium |
| KR101926953B1 | Matching method of feature points in planar array of four-camera group and measurement method based thereon |
| WO2023201578A1 | Extrinsic parameter calibration method and device for monocular laser speckle projection system |
| CN110009687A | Color three-dimensional imaging system based on three cameras and its calibration method |
| CN113034565A | Monocular structured light depth calculation method and system |
| CN1561502A | Strapdown system for three-dimensional reconstruction |
| CA3233222A1 | Method, apparatus and device for photogrammetry, and storage medium |
| CN113034612A | Calibration device and method, and depth camera |
| CN112381921B | Edge reconstruction method and system |
| CN112229323A | Six-degree-of-freedom measurement method for checkerboard cooperative targets based on mobile-phone monocular vision, and application thereof |
| CN110542540A | Optical axis alignment correction method of structured light module |
| CN114926538A | External parameter calibration method and device for monocular laser speckle projection system |
| CN110470216B | Three-lens high-precision vision measurement method and device |
| CN115375773A | External parameter calibration method and related device for monocular laser speckle projection system |
| CN115018922A | Distortion parameter calibration method, electronic device and computer-readable storage medium |
| Zhou et al. | CCD camera calibration based on natural landmarks |
| CN116862999B | Calibration method, system, equipment and medium for three-dimensional measurement with two cameras |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |