
CN111243028B - Electronic equipment and lens association method and device - Google Patents


Info

Publication number
CN111243028B
CN111243028B
Authority
CN
China
Prior art keywords
light spot
lens module
spot image
lens
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811334220.2A
Other languages
Chinese (zh)
Other versions
CN111243028A (en)
Inventor
王春茂
浦世亮
徐鹏
俞海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811334220.2A priority Critical patent/CN111243028B/en
Priority to PCT/CN2019/112850 priority patent/WO2020093873A1/en
Publication of CN111243028A publication Critical patent/CN111243028A/en
Application granted granted Critical
Publication of CN111243028B publication Critical patent/CN111243028B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H04N23/55 - Optical parts specially adapted for electronic image sensors; Mounting thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20112 - Image segmentation details
    • G06T2207/20164 - Salient point detection; Corner detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the application provide an electronic device and a lens association method and apparatus. The electronic device comprises a first lens module, a second lens module, a projection component and a processor. The projection component projects a light spot; the first lens module captures a first light spot image of the light spot; the second lens module captures a second light spot image of the light spot; and the processor calibrates an association result between the first lens module and the second lens module based on the first light spot image and the second light spot image. In this scheme, the projection component projects the light spot, each lens module captures its own image of the light spot, and the processor automatically calibrates the association result from the images captured by the lens modules, so the whole calibration process requires no human involvement and the convenience of lens association is improved.

Description

Electronic equipment and lens association method and device
Technical Field
The present application relates to the field of security technologies, and in particular, to an electronic device and a lens association method and apparatus.
Background
In some scenes, it is often necessary to associate different lenses, where lens association refers to establishing a correspondence between the same objects in images captured by different lenses. For example, a long-focal-length lens can capture images with higher definition, while a short-focal-length lens can capture images with a larger viewing angle; by associating the two lenses, both a higher-definition image and a larger-viewing-angle image of the same scene can be obtained.
In one existing scheme, a professional calibrates the intrinsic and extrinsic parameters of the lenses to be associated, and the coordinates of the overlapping regions between the lenses are associated based on these parameters. In this scheme, once several lenses have been associated, their positions, angles and focal lengths are fixed; if any adjustment is needed, the association can only be redone by a professional, so the scheme is inconvenient to use.
Disclosure of Invention
The embodiment of the application aims to provide electronic equipment and a lens association method and device so as to improve convenience of lens association.
To achieve the above object, an embodiment of the present application provides an electronic device, including: a first lens module, a second lens module, a projection component and a processor; wherein,
the projection component is used for projecting light spots;
the first lens module is used for acquiring a first light spot image aiming at the light spot;
the second lens module is used for acquiring a second light spot image aiming at the light spot;
the processor is used for calibrating and obtaining a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
Optionally, the first lens module includes a long focal length lens, the second lens module includes a short focal length lens, the first light spot image corresponds to a partial area of the second light spot image, and an angle of the first lens module changes synchronously with an angle of the projection component.
Optionally, the device further includes a third lens module, and an angle of the third lens module synchronously changes with an angle of the projection component;
the third lens module is used for acquiring a third light spot image aiming at the light spot, and the third light spot image corresponds to a partial area of the second light spot image;
the processor is specifically configured to calibrate and obtain a correlation result among the first lens module, the second lens module and the three lens modules based on the first light spot image, the second light spot image and the third light spot image.
Optionally, the first lens module includes a first lens and a first imaging device, where the first lens is configured to collect an optical signal for the light spot, and the first imaging device is configured to convert the optical signal collected by the first lens into an electrical signal, so as to obtain a first light spot image;
the second lens module comprises a second lens and a second imaging device, the second lens is used for collecting optical signals aiming at the light spots, and the second imaging device is used for converting the optical signals collected by the second lens into electric signals to obtain second light spot images.
In order to achieve the above object, an embodiment of the present application further provides a lens association method, including:
acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component;
acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
and calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
Optionally, the light spot includes a positioning area; the calibrating to obtain the association result between the first lens module and the second lens module based on the first light spot image and the second light spot image includes:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the points in the matching point pairs.
Optionally, the first lens module includes a long focal length lens, the second lens module includes a short focal length lens, and the first light spot image corresponds to a partial area of the second light spot image.
Optionally, the light spot further includes a coded texture region, and the positioning region includes a rectangular block having a size different from that of the rectangular block included in the coded texture region.
In order to achieve the above object, an embodiment of the present application further provides a lens association apparatus, including:
the first acquisition module is used for acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component;
the second acquisition module is used for acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
and the calibration module is used for calibrating and obtaining the association result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
Optionally, the light spot includes a positioning area; the calibration module is specifically configured to:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the points in the matching point pairs.
Optionally, the first lens module includes a long focal length lens, the second lens module includes a short focal length lens, and the first light spot image corresponds to a partial area of the second light spot image.
Optionally, the light spot further includes a coded texture region, and the positioning region includes a rectangular block having a size different from that of the rectangular block included in the coded texture region.
To achieve the above object, an embodiment of the present application further provides a computer-readable storage medium having a computer program stored therein, which when executed by a processor, implements any one of the above-described lens association methods.
The electronic device provided by the embodiment of the application comprises: a first lens module, a second lens module, a projection component and a processor. The projection component projects a light spot; the first lens module captures a first light spot image of the light spot; the second lens module captures a second light spot image of the light spot; and the processor calibrates an association result between the first lens module and the second lens module based on the first light spot image and the second light spot image. In this scheme, the projection component projects the light spot, each lens module captures its own image of the light spot, and the processor automatically calibrates the association result from the images captured by the lens modules, so the whole calibration process requires no human involvement and the convenience of lens association is improved.
Of course, it is not necessary for any one product or method of practicing the application to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic view of a first spot image according to an embodiment of the present application;
fig. 3 is a schematic diagram of a second structure of an electronic device according to an embodiment of the present application;
FIG. 4a is a schematic view of a light spot according to an embodiment of the present application;
FIG. 4b is a schematic view of another spot according to an embodiment of the present application;
fig. 5 is a schematic view of a corner point according to an embodiment of the present application;
FIG. 6a is a schematic diagram of a second spot image according to an embodiment of the present application;
fig. 6b is a schematic view of a third spot image according to an embodiment of the present application;
fig. 6c is a schematic view of a fourth spot image according to an embodiment of the present application;
fig. 7 is a schematic flow chart of a lens association method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a lens associating device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In order to solve the above technical problems, the embodiments of the present application provide an electronic device, a lens association method and a lens association device, and the electronic device provided by the embodiments of the present application will be described in detail.
Referring to fig. 1, the electronic device may include a first lens module, a second lens module, a projection component, and a processor; wherein,
a projection unit configured to project a light spot;
the first lens module is used for acquiring a first light spot image aiming at the light spot;
the second lens module is used for acquiring a second light spot image aiming at the light spot;
and the processor is used for calibrating and obtaining a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
The device provided by the embodiment of the application comprises at least two lens modules; two lens modules are taken as an example here for description. To distinguish them, one lens module is called the first lens module and the other is called the second lens module; the image of the light spot collected by the first lens module is called the first light spot image, and the image of the light spot collected by the second lens module is called the second light spot image.
For example, a lens module may include a lens and an imaging device, and the imaging device may be a CCD (charge-coupled device) image sensor, a CMOS (complementary metal-oxide-semiconductor) image sensor, or the like, which is not limited here. The lens collects optical signals, and the imaging device converts the optical signals into electrical signals to obtain images.
In this way, the first lens module comprises a first lens and a first imaging device, the first lens is used for collecting optical signals aiming at the light spots, and the first imaging device is used for converting the optical signals collected by the first lens into electric signals to obtain first light spot images;
the second lens module comprises a second lens and a second imaging device, the second lens is used for collecting optical signals aiming at the light spots, and the second imaging device is used for converting the optical signals collected by the second lens into electric signals to obtain second light spot images.
The first lens module and the second lens module are both used to collect images of the light spot. In one implementation, the first lens module comprises a long-focal-length lens and the second lens module comprises a short-focal-length lens; in other words, the first lens is a long-focal-length lens and the second lens is a short-focal-length lens. The short-focal-length lens has a larger viewing angle and the long-focal-length lens a smaller one, so the image collected by the second lens module (short focal length) may be used as the main picture and the image collected by the first lens module (long focal length) as a sub-picture. Referring to fig. 2, the first spot image corresponds to a partial region of the second spot image; in other words, the second spot image corresponds to a large scene and the first spot image corresponds to a small scene that is part of the large scene.
In one case, as shown in fig. 2, the spot range may be slightly larger than the acquisition range of the first spot image, or the first spot image may be a part of the spot, in which case the center point of the spot range may substantially coincide with the center point of the first spot image. Alternatively, the first spot image may substantially coincide with the spot range, or the spot range may be slightly smaller than the acquisition range of the first spot image.
Referring to fig. 3, both the angle (or referred to as the acquisition direction) and focal length of the two lens modules, as well as the angle (or referred to as the projection direction) of the projection means, are adjustable. Alternatively, the focal length of the projection member may be adjustable, and is not particularly limited. In this embodiment, the angle of the first lens module is synchronously changed with the angle of the projection component, so that the small scene corresponding to the long-focal-length lens is approximately overlapped with the light spot projected by the projection component, or the long-focal-length lens is aligned with the light spot.
In one embodiment, the light spot may include a positioning area and a coded texture area, where the positioning area comprises rectangular blocks of a different size from the rectangular blocks in the coded texture area. For example, referring to fig. 4a, the positioning area is a partial area in the middle that comprises four small rectangles and cross hairs, and the coded texture area is the remaining area, which is a checkerboard of alternating black and white squares. As another example, referring to fig. 4b, the positioning areas are the middle row and the middle column, the remaining areas are the coded texture areas, and the checkerboard squares of the positioning areas are larger than those of the coded texture areas.
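As a purely illustrative sketch (not taken from the patent), a coded checkerboard pattern with a larger-block middle row and column, in the spirit of fig. 4b, could be generated as follows; the cell counts and pixel sizes are assumed values.

```python
import numpy as np

def make_spot_pattern(cells=16, small=20, big=40):
    """Illustrative coded-checkerboard spot: small squares form the coded texture
    area, and a middle row/column of larger squares forms the positioning area.
    All sizes are assumptions for the sketch, not values from the patent."""
    side = cells * small
    img = np.zeros((side, side), dtype=np.uint8)
    # coded texture area: ordinary black/white checkerboard of small squares
    for r in range(cells):
        for c in range(cells):
            if (r + c) % 2 == 0:
                img[r*small:(r+1)*small, c*small:(c+1)*small] = 255
    # positioning area: overwrite the middle row and middle column with larger squares
    m0 = side // 2 - big // 2
    for i in range(0, side, big):
        shade = 255 if (i // big) % 2 == 0 else 0
        img[m0:m0 + big, i:i + big] = shade          # middle row
        img[i:i + big, m0:m0 + big] = 255 - shade    # middle column
    return img

spot = make_spot_pattern()
```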
The processor calibrates the two lens modules based on the first light spot image and the second light spot image. Specifically, a plurality of corner points in the first light spot image may be determined as first corner points; a plurality of corner points in the second light spot image may be determined as second corner points; the first corner points and the second corner points are matched according to the positional relation between the positioning area and each corner point to obtain a plurality of matching point pairs; and the association result between the first lens module and the second lens module is calibrated according to the coordinate values of the points in the matching point pairs.
Corner points may be understood as the intersections of the black and white squares of the checkerboard, as shown in fig. 5. For example, a Harris corner detection algorithm may be applied to the first light spot image and the second light spot image respectively to obtain the corners in each image. To distinguish them, a corner in the first light spot image is referred to as a first corner, and a corner in the second light spot image is referred to as a second corner.
As another example, checkerboard corners have distinctive local features; a filtering template may be constructed based on these features and used to identify the corners in the image.
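A minimal sketch of the Harris-based corner detection described above, using OpenCV; the threshold and window parameters, and the image file names, are assumptions rather than values given in the patent.

```python
import cv2
import numpy as np

def detect_corners(gray, block_size=2, ksize=3, k=0.04, rel_thresh=0.01):
    """Return an (N, 2) array of (x, y) corner coordinates found by the Harris detector.
    Parameter values are illustrative assumptions."""
    response = cv2.cornerHarris(np.float32(gray), block_size, ksize, k)
    ys, xs = np.where(response > rel_thresh * response.max())
    return np.stack([xs, ys], axis=1).astype(np.float32)

# hypothetical file names for the images captured by the two lens modules
first_spot_image = cv2.imread("first_spot.png", cv2.IMREAD_GRAYSCALE)
second_spot_image = cv2.imread("second_spot.png", cv2.IMREAD_GRAYSCALE)
first_corners = detect_corners(first_spot_image)    # first corners
second_corners = detect_corners(second_spot_image)  # second corners
```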
The correspondence between the first corner points and the second corner points can be determined based on the positioning area in the spot images, and matching point pairs are formed according to this correspondence; each matching point pair comprises one first corner point and one second corner point that correspond to the same point in the projected light spot. From the coordinate values of the two corner points in each matching point pair, the intrinsic parameters of each lens module and the extrinsic parameters between the lens modules can be calibrated; from the intrinsic and extrinsic parameters, the mapping relation between pixel coordinates in the first light spot image and pixel coordinates in the second light spot image can be obtained, and this mapping relation can be used as the association result between the lens modules.
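If the mapping relation is modeled as a planar homography between the two spot images, it can also be fitted directly to the matched corner pairs. The following is a hedged sketch of that direct route (function and variable names are assumptions); the calibration route described next instead derives the mapping from the intrinsic and extrinsic parameters.

```python
import cv2
import numpy as np

def fit_association(first_pts, second_pts):
    """Fit a 3x3 mapping H from first-spot-image pixels to second-spot-image pixels.
    first_pts / second_pts: matched (N, 2) float32 corner coordinates (N >= 4)."""
    H, inlier_mask = cv2.findHomography(first_pts, second_pts, cv2.RANSAC, 3.0)
    return H
```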
Specifically, in one case, distortion correction may first be performed on the light spot image to obtain a corrected image, and intrinsic parameters such as the focal length and principal point coordinates of the lens module are then solved based on the corrected image. For example, intrinsic parameters such as the focal length and principal point coordinates can be calculated using Zhang's calibration method or the calibration routines in OpenCV (Open Source Computer Vision Library). The specific manner of calibrating the intrinsic parameters is not limited.
The extrinsic parameters between the lens modules, which comprise a rotation relation and a translation relation, are then solved through an optimization function, such as a reprojection error function, or through the solvePnP function.
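A sketch of these two calibration steps with OpenCV, assuming the spot-plane coordinates of the detected corners are known from the coded pattern; the variable names and data layout are assumptions.

```python
import cv2
import numpy as np

def calibrate_intrinsics(object_points, image_points, image_size):
    """Zhang-style intrinsic calibration: object_points is a list of (N, 3) arrays of
    spot-plane corner coordinates (z = 0), image_points the matching (N, 2) pixel
    coordinates observed by one lens module over one or more captures."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    return K, dist

def module_pose(obj_pts, img_pts, K, dist):
    """Pose (rotation matrix and translation) of one module relative to the spot plane."""
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec

# The relative extrinsics between the two modules then follow from their individual
# poses, e.g. R_12 = R2 @ R1.T and t_12 = t2 - R_12 @ t1.
```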
After the association result between the lens modules is obtained through calibration, coordinate mapping can be performed between images acquired by the two lens modules. For example, the first lens module comprising a long-focal-length lens acquires a face image while the second lens module comprising a short-focal-length lens acquires a whole-body image of the person; the face image and the whole-body image can be mapped through the association result, so that the face region is located in the whole-body image.
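For instance, if the association result is expressed as a 3x3 homography H from the long-focal-length image to the short-focal-length image (an assumption for this sketch), a face rectangle detected in the long-focal-length image can be located in the whole-body image as follows.

```python
import cv2
import numpy as np

def map_face_box(H, face_box):
    """Map a face rectangle (x, y, w, h) from the long-focal-length image into the
    short-focal-length image using the association result H."""
    x, y, w, h = face_box
    corners = np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]]).reshape(-1, 1, 2)
    mapped = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
    x0, y0 = mapped.min(axis=0)
    x1, y1 = mapped.max(axis=0)
    return int(x0), int(y0), int(x1 - x0), int(y1 - y0)  # axis-aligned box in the wide image
```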
With the electronic device provided by the embodiment of the application, the projection component projects the light spot, each lens module collects its own image of the light spot, and the processor automatically calibrates the association result from the images collected by the lens modules, so the whole calibration process requires no human involvement and the convenience of lens association is improved.
In some related schemes, a fixed calibration plate usually has to be set up, the camera captures a calibration image of the calibration plate, and camera calibration is performed using that image. In harsh environments such as deserts or mountains, setting up a fixed calibration plate is very inconvenient. With the electronic device provided by the embodiment of the application, the projection component projects the light spot and no fixed calibration plate is needed, which improves the convenience of calibration.
In the above, two lens modules are taken as an example for illustration, the number of the lens modules is not limited in the embodiment of the present application, and the electronic device may include more than two lens modules, and the specific association schemes are similar.
As one embodiment, the device further comprises a third lens module, wherein the angle of the third lens module synchronously changes with the angle of the projection component;
the third lens module is used for acquiring a third light spot image aiming at the light spot, and the third light spot image corresponds to a partial area of the second light spot image;
the processor is specifically configured to calibrate and obtain a correlation result among the first lens module, the second lens module and the three lens modules based on the first light spot image, the second light spot image and the third light spot image.
In this embodiment, the third lens module also includes a long-focal-length lens. In one case, referring to fig. 6a, the focal length of the third lens module is greater than that of the first lens module, and the third light spot image corresponds to a partial area of the first light spot image. In another case, referring to fig. 6b, the focal length of the third lens module is smaller than that of the first lens module, and the first light spot image corresponds to a partial area of the third light spot image. In yet another case, referring to fig. 6c, the first light spot image and the third light spot image may partially overlap.
The image collected by the second lens module including the short-focal-length lens can be used as the main picture, and the images collected by the first lens module and the third lens module including long-focal-length lenses can be used as sub-pictures. The number of sub-pictures may be one or more and is not particularly limited.
The electronic device provided by the embodiment of the application can be also understood as a lens association system.
The embodiment of the application also provides a lens association method and a lens association device, which can be applied to a processor in electronic equipment or can be applied to other electronic equipment, and are not particularly limited.
Fig. 7 is a schematic flow chart of a lens association method according to an embodiment of the present application, including:
s701: and acquiring a first light spot image. The first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component.
S702: and acquiring a second light spot image. The second light spot image is an image acquired by the second lens module aiming at the light spot.
In the embodiment of the application, the projection component projects the light spot, and different lens modules each acquire an image of the light spot to obtain the light spot images. For the sake of distinguishing the description, one of the lens modules is referred to as the first lens module and the other as the second lens module; the image of the light spot collected by the first lens module is called the first light spot image, and the image of the light spot collected by the second lens module is called the second light spot image.
For example, a lens module may include a lens and an imaging device, and the imaging device may be a CCD (charge-coupled device) image sensor, a CMOS (complementary metal-oxide-semiconductor) image sensor, or the like, which is not limited here.
As one embodiment, the first lens module includes a long focal length lens and the second lens module includes a short focal length lens. The short focal length lens has a larger viewing angle and the long focal length lens has a smaller viewing angle; in this case, the image collected by the second lens module including the short focal length lens may be used as the main picture, and the image collected by the first lens module including the long focal length lens may be used as a sub-picture. Referring to fig. 2, the first spot image corresponds to a partial region of the second spot image; in other words, the second spot image corresponds to a large scene and the first spot image corresponds to a small scene that is part of the large scene.
In one case, as shown in fig. 2, the spot range may be slightly larger than the acquisition range of the first spot image, or the first spot image may be a part of the spot, in which case the center point of the spot range may substantially coincide with the center point of the first spot image. Alternatively, the first spot image may substantially coincide with the spot range, or the spot range may be slightly smaller than the acquisition range of the first spot image.
In this embodiment, the angle of the first lens module is synchronously changed with the angle of the projection component, so that the small scene corresponding to the long-focal-length lens is approximately overlapped with the light spot projected by the projection component, or the long-focal-length lens is aligned with the light spot.
In one embodiment, the light spot may include a positioning area and a coded texture area, where the positioning area comprises rectangular blocks of a different size from the rectangular blocks in the coded texture area. For example, referring to fig. 4a, the positioning area is a partial area in the middle that comprises four small rectangles and cross hairs, and the coded texture area is the remaining area, which is a checkerboard of alternating black and white squares. As another example, referring to fig. 4b, the positioning areas are the middle row and the middle column, the remaining areas are the coded texture areas, and the checkerboard squares of the positioning areas are larger than those of the coded texture areas.
S703: calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
As an embodiment, a plurality of corner points in the first light spot image may be determined as first corner points; a plurality of corner points in the second light spot image may be determined as second corner points; the first corner points and the second corner points are matched according to the positional relation between the positioning area and each corner point to obtain a plurality of matching point pairs; and the association result between the first lens module and the second lens module is calibrated according to the coordinate values of the points in the matching point pairs.
Corner points may be understood as the intersections of the black and white squares of the checkerboard, as shown in fig. 5. For example, a Harris corner detection algorithm may be applied to the first light spot image and the second light spot image respectively to obtain the corners in each image. To distinguish them, a corner in the first light spot image is referred to as a first corner, and a corner in the second light spot image is referred to as a second corner.
As another example, checkerboard corners have distinctive local features; a filtering template may be constructed based on these features and used to identify the corners in the image.
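One hedged way to realize the filtering-template idea is to correlate the image with a two-by-two quadrant template, which responds strongly at checkerboard-style corners; the template shape and size below are assumptions, not the patent's specified filter.

```python
import cv2
import numpy as np

def template_corner_response(gray, half=8):
    """Correlate with a quadrant template (+1 / -1 quadrants); checkerboard corners
    match either the template or its inverse, so the absolute response is used.
    'half' (the template half-width in pixels) is an assumed value."""
    t = np.ones((2 * half, 2 * half), dtype=np.float32)
    t[:half, half:] = -1.0
    t[half:, :half] = -1.0
    g = gray.astype(np.float32) - float(gray.mean())
    return np.abs(cv2.filter2D(g, cv2.CV_32F, t))
```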
The correspondence between the first corner points and the second corner points can be determined based on the positioning area in the spot images, and matching point pairs are formed according to this correspondence; each matching point pair comprises one first corner point and one second corner point that correspond to the same point in the projected light spot. From the coordinate values of the two corner points in each matching point pair, the intrinsic parameters of each lens module and the extrinsic parameters between the lens modules can be calibrated; from the intrinsic and extrinsic parameters, the mapping relation between pixel coordinates in the first light spot image and pixel coordinates in the second light spot image can be obtained, and this mapping relation can be used as the association result between the lens modules.
Specifically, in one case, distortion correction may first be performed on the light spot image to obtain a corrected image, and intrinsic parameters such as the focal length and principal point coordinates of the lens module are then solved based on the corrected image. For example, intrinsic parameters such as the focal length and principal point coordinates can be calculated using Zhang's calibration method or the calibration routines in OpenCV (Open Source Computer Vision Library). The specific manner of calibrating the intrinsic parameters is not limited.
The extrinsic parameters between the lens modules, which comprise a rotation relation and a translation relation, are then solved through an optimization function, such as a reprojection error function, or through the solvePnP function.
After the correlation result between the lens modules is obtained through calibration, coordinate mapping can be performed between images acquired by the two lens modules, for example, a first lens module comprising a long-focus lens acquires a face image, a second lens module comprising a short-focus lens acquires a whole-body image of a person, and the face image and the whole-body image can be mapped through the correlation result, so that a face area is determined in the whole-body image.
With the lens association method provided by the embodiment of the application, the association result is calibrated automatically from the images acquired by each lens module, so the whole calibration process requires no human involvement and the convenience of lens association is improved.
The embodiment of the application also provides a lens association device, as shown in fig. 8, comprising:
a first acquiring module 801, configured to acquire a first light spot image; the first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component;
a second acquiring module 802, configured to acquire a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
and the calibration module 803 is configured to calibrate and obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
As an embodiment, the spot comprises a positioning area; the calibration module 803 is specifically configured to:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the points in the matching point pairs.
As one embodiment, the first lens module includes a long focal length lens, the second lens module includes a short focal length lens, and the first light spot image corresponds to a partial region of the second light spot image.
As an embodiment, the light spot further comprises a coded texture region, and the positioning region comprises rectangular blocks with different sizes from the rectangular blocks contained in the coded texture region.
The embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, and the computer program realizes any lens association method when being executed by a processor.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, the embodiments are described in a progressive manner: identical or similar parts of the embodiments may be referred to among one another, and each embodiment focuses on its differences from the others. In particular, the method, apparatus, device and computer-readable storage medium embodiments are described relatively briefly because they are substantially similar to the electronic device embodiment; for the relevant parts, refer to the description of the device embodiment.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (8)

1. An electronic device, comprising: the projection system comprises a first lens module, a second lens module, a projection component and a processor, wherein the first lens module comprises a long-focus lens, and the second lens module comprises a short-focus lens; wherein,
the projection component is used for projecting a light spot, the light spot comprises a positioning area and a coding texture area, and the size of a rectangular block contained in the positioning area is different from that of a rectangular block contained in the coding texture area;
the first lens module is used for acquiring a first light spot image aiming at the light spot, and the distance between the center point of the first light spot image and the center point of the light spot range is smaller than a preset value;
the second lens module is used for acquiring a second light spot image aiming at the light spot, and the scene corresponding to the first light spot image is a part of the scene corresponding to the second light spot image;
the processor is used for calibrating and obtaining a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image, wherein the correlation result is a mapping relation between pixel point coordinates in the first light spot image and pixel point coordinates in the second light spot image.
2. The apparatus of claim 1, wherein an angle of the first lens module varies synchronously with an angle of the projection member.
3. The apparatus of claim 2, further comprising a third lens module, an angle of the third lens module synchronously changing with an angle of the projection member;
the third lens module is used for acquiring a third light spot image aiming at the light spot, and the third light spot image corresponds to a partial area of the second light spot image;
the processor is specifically configured to calibrate and obtain a correlation result among the first lens module, the second lens module and the three lens modules based on the first light spot image, the second light spot image and the third light spot image.
4. The apparatus of claim 1, wherein the first lens module comprises a first lens for collecting optical signals for the light spots and a first imaging device for converting the optical signals collected by the first lens into electrical signals to obtain first light spot images;
the second lens module comprises a second lens and a second imaging device, the second lens is used for collecting optical signals aiming at the light spots, and the second imaging device is used for converting the optical signals collected by the second lens into electric signals to obtain second light spot images.
5. A lens association method, comprising:
acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component; the first lens module comprises a long-focus lens; the light spot comprises a positioning area and a coding texture area, and the size of a rectangular block contained in the positioning area is different from that of a rectangular block contained in the coding texture area; the distance between the center point of the first light spot image and the center point of the light spot range is smaller than a preset value;
acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot; the second lens module comprises a short-focal-length lens; the scene corresponding to the first light spot image is a part of the scene corresponding to the second light spot image;
and calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image, wherein the correlation result is a mapping relation between pixel point coordinates in the first light spot image and pixel point coordinates in the second light spot image.
6. The method of claim 5, wherein the calibrating the association between the first lens module and the second lens module based on the first spot image and the second spot image comprises:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the points in the matching point pairs.
7. A lens association device, comprising:
the first acquisition module is used for acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at light spots projected by the projection component; the first lens module comprises a long-focus lens; the light spot comprises a positioning area and a coding texture area, and the size of a rectangular block contained in the positioning area is different from that of a rectangular block contained in the coding texture area; the distance between the center point of the first light spot image and the center point of the light spot range is smaller than a preset value;
the second acquisition module is used for acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot; the second lens module comprises a short-focal-length lens; the scene corresponding to the first light spot image is a part of the scene corresponding to the second light spot image;
the calibration module is used for calibrating and obtaining a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image, wherein the correlation result is a mapping relation between pixel point coordinates in the first light spot image and pixel point coordinates in the second light spot image.
8. The device according to claim 7, characterized in that said calibration module is in particular adapted to:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the points in the matching point pairs.
CN201811334220.2A 2018-11-09 2018-11-09 Electronic equipment and lens association method and device Active CN111243028B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811334220.2A CN111243028B (en) 2018-11-09 2018-11-09 Electronic equipment and lens association method and device
PCT/CN2019/112850 WO2020093873A1 (en) 2018-11-09 2019-10-23 Electronic device, method and device for lens association

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811334220.2A CN111243028B (en) 2018-11-09 2018-11-09 Electronic equipment and lens association method and device

Publications (2)

Publication Number Publication Date
CN111243028A CN111243028A (en) 2020-06-05
CN111243028B true CN111243028B (en) 2023-09-08

Family

ID=70610817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811334220.2A Active CN111243028B (en) 2018-11-09 2018-11-09 Electronic equipment and lens association method and device

Country Status (2)

Country Link
CN (1) CN111243028B (en)
WO (1) WO2020093873A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201101152A (en) * 2009-06-30 2011-01-01 Avisonic Technology Corp Light pointing touch panel display device and related touch panel detecting method
US9915857B2 (en) * 2013-12-09 2018-03-13 Geo Semiconductor Inc. System and method for automated test-pattern-free projection calibration
CN205563716U (en) * 2016-03-30 2016-09-07 广州市盛光微电子有限公司 Panoramic camera calibration device based on many camera lenses multisensor
CN106846415B (en) * 2017-01-24 2019-09-20 长沙全度影像科技有限公司 A kind of multichannel fisheye camera binocular calibration device and method
CN106791337B (en) * 2017-02-22 2023-05-12 北京汉邦高科数字技术股份有限公司 Zoom camera with double-lens optical multiple expansion and working method thereof

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006120146A2 (en) * 2005-05-11 2006-11-16 Sony Ericsson Mobile Communication Ab Digital cameras with triangulation autofocus systems and related methods
CN102148965A (en) * 2011-05-09 2011-08-10 上海芯启电子科技有限公司 Video monitoring system for multi-target tracking close-up shooting
CN102291569A (en) * 2011-07-27 2011-12-21 上海交通大学 Double-camera automatic coordination multi-target eagle eye observation system and observation method thereof
EP2615580A1 (en) * 2012-01-13 2013-07-17 Softkinetic Software Automatic scene calibration
CN103116889A (en) * 2013-02-05 2013-05-22 海信集团有限公司 Positioning method and electronic device
CN104092939A (en) * 2014-07-07 2014-10-08 山东神戎电子股份有限公司 Laser night vision device synchronous zooming method based on piecewise differentiation technology
CN104363986A (en) * 2014-10-31 2015-02-18 华为技术有限公司 Image processing method and device
CN106934861A (en) * 2017-02-09 2017-07-07 深圳先进技术研究院 Object dimensional method for reconstructing and device
CN107370934A (en) * 2017-09-19 2017-11-21 信利光电股份有限公司 A kind of multi-cam module

Also Published As

Publication number Publication date
CN111243028A (en) 2020-06-05
WO2020093873A1 (en) 2020-05-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant