CN112073707A - Camera module, electronic equipment, vehicle-mounted distance measuring system and imaging method - Google Patents
- Publication number
- CN112073707A (application CN202010820220.4A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
Abstract
The application provides a camera module, comprising: a transmitting module for transmitting signal waves; and a plurality of receiving modules arranged on the peripheral side of the transmitting module for receiving the signal waves so as to obtain a target depth image of a target object. Compared with the single image obtained by depth imaging with the traditional time-of-flight ranging method, the camera module of the application receives signal waves at a plurality of different angles through the plurality of receiving modules arranged on the peripheral side of the transmitting module, performs depth imaging of the object from a plurality of directions, and obtains multiple groups of imaging information, which effectively enhances the signal strength of the object's spatial information, enriches the imaging detail, and improves the accuracy. The application also provides an electronic device and a vehicle-mounted distance measuring system having the camera module, as well as an imaging method.
Description
Technical Field
The invention relates to the technical field of space ranging imaging, in particular to a camera module, electronic equipment, a vehicle-mounted ranging system and an imaging method.
Background
In recent years, with the rapid development of the consumer electronics industry, 3D cameras having three-dimensional space ranging and imaging functions have been increasingly applied. Referring to fig. 1, it is common to use a Time of flight (TOF) camera to perform spatial imaging, where the TOF camera projects a signal to a measured object through a transmitter 1, and then receives the signal reflected by the measured object through a receiver 2, and calculates a three-dimensional image of the measured object according to a Time difference between the transmitted and received signals.
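The relation the TOF camera relies on can be sketched directly: the distance to the measured object is the speed of light times the round-trip flight time, halved. The following minimal illustration (not from the patent) shows the arithmetic:

```python
# Minimal sketch of the time-of-flight distance relation used by a TOF
# camera. The round-trip time is halved because the signal travels to
# the measured object and back.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the measured object from the round-trip flight time."""
    return C * round_trip_seconds / 2.0

# A round trip of 20 ns corresponds to roughly 3 m.
```

Because light covers about 30 cm per nanosecond, the timing resolution of the receiver directly bounds the depth resolution of the image.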
However, in the process of implementing the present application, the inventors found at least the following problems in the prior art: the existing TOF lens has only one transmitter 1 and one receiver 2; a certain loss occurs in the process of transmitting and receiving signals, the signal may be partially absorbed by the measured object, and the probability that signal waves of a fixed band are absorbed differs between objects. These conditions affect the accuracy of the spatial imaging and cause image distortion.
Disclosure of Invention
In view of the above, it is desirable to provide a camera module, an electronic device, a vehicle-mounted distance measuring system and an imaging method, so as to solve the above problems.
The embodiment of the invention provides a camera module, which comprises:
the transmitting module is used for transmitting signal waves;
and a plurality of receiving modules arranged on the peripheral side of the transmitting module and used for receiving the signal waves so as to obtain a target depth image of a target object.
Compared with the single image obtained by spatial imaging with a traditional time-of-flight ranging method, the camera module receives signal waves at a plurality of different angles through the plurality of receiving modules arranged on the peripheral side of the transmitting module, performs depth imaging of the object from multiple directions, and obtains multiple groups of imaging information. When the signal is lost at one angle, a receiving module at another angle can complete the reception, effectively enhancing the signal strength of the object's spatial information, enriching the imaging detail, and improving the accuracy.
Furthermore, the signal wave emitted by the emission module is an infrared laser wave;
the receiving module is an infrared camera.
The vertical-cavity surface-emitting laser emits infrared laser waves invisible to the human eye, and the infrared camera captures the laser waves, so the flight time of the light waves can be measured and calculated more accurately, enabling depth imaging.
Further, the receiving module comprises an imaging lens and a photosensitive chip;
the imaging lens is used for converging incident light rays to the photosensitive chip, and the photosensitive chip is used for receiving signal waves so as to obtain a target depth image of a target object.
By providing these components, the receiving module converges light better and can perform depth imaging according to the received signal waves.
Further, the receiving module also includes an adjusting mechanism, which includes a connected rotating member and driving member; the rotating member is connected with the receiving module, and the driving member is used to drive the rotating member to rotate, so as to adjust the position and angle of the receiving module.
The adjusting mechanism adjusts the position and angle of the receiving module, so that the receiving module can be kept at a suitable receiving angle in different application environments and a better picture effect is obtained.
Furthermore, the camera module further comprises a processing unit and a circuit board;
the transmitting module and the receiving module are electrically connected with the circuit board;
the processing unit is arranged on the circuit board and used for obtaining a plurality of target depth images corresponding to the receiving modules and superposing the plurality of target depth images to obtain a synthesized depth image.
The circuit board can carry the corresponding electronic components and the processing unit, and can be arranged corresponding to the transmitting module and the receiving modules. The processing unit performs image translation, image combination, image cropping and the like on the plurality of target depth images; after the identical image portions of the plurality of target depth images are superimposed, the detail portions are more accurate.
Further, the emitting module is a vertical cavity surface emitting laser.
The vertical-cavity surface-emitting laser can emit laser directly from the top surface of the integrated circuit and can be tested directly during fabrication. Compared with other types of laser emitters, this saves cost and simplifies use of the device in the semiconductor manufacturing process.
Further, the emission module comprises a laser emitter, a collimation element and a diffraction optical device;
the collimating element is arranged on the optical path of the laser emitter and used to collimate the laser emitted by the laser emitter, and the diffractive optical device is arranged on the optical path of the laser emitter, on the side of the collimating element away from the laser emitter.
By providing these components, the emitting module can emit laser waves vertically and better convey the surface information of the target object.
An embodiment of the present invention further provides an electronic device, including:
a housing, and
the camera module of any of the above embodiments, wherein the camera module is disposed in the housing.
An embodiment of the present invention further provides a vehicle-mounted distance measuring system, including:
the camera module described in any of the above embodiments.
The embodiment of the invention also provides an imaging method, which comprises the following steps:
controlling a transmitting module to transmit signal waves and controlling a plurality of receiving modules arranged on the peripheral side of the transmitting module to receive the signal waves;
respectively acquiring target depth images corresponding to the receiving modules;
and superposing the plurality of target depth images to obtain a synthesized depth image.
The plurality of receiving modules respectively correspond to a plurality of target depth images with different angles and different fields of view; the synthesized depth image is obtained by superimposing these target depth images, and its details are more accurate.
Further, the step of "superimposing a plurality of target depth images to obtain a synthesized depth image" specifically includes:
and carrying out image signal processing or digital signal processing on the plurality of target depth images to obtain the synthesized depth image.
Processing the image data by digital signal processing or image signal processing gives a good superposition effect while also taking computational efficiency into account.
According to the camera module and the imaging method provided by the embodiments of the invention, a plurality of receiving modules are arranged on the periphery of the transmitting module, so that the signal waves transmitted by the transmitting module are received by the plurality of receiving modules respectively, i.e., a plurality of depth images are obtained, enhancing the picture detail and improving the accuracy. Compared with the existing TOF lens, which has only a single transmitting element and a single receiving element, this solves the problems of imaging distortion and detail loss caused in the prior art by transmission loss of the signal wave or its absorption by an object.
Drawings
Fig. 1 is a schematic plan view of a TOF lens in the prior art.
Fig. 2 is a schematic plan view of a camera module according to a first embodiment of the present application.
Fig. 3 is a schematic cross-sectional view of the camera module shown in fig. 2.
Fig. 4 is a schematic block diagram of the camera module shown in fig. 2.
Fig. 5 is a schematic plan view of a camera module according to a second embodiment of the present application.
Fig. 6 is a schematic perspective view of an electronic device according to an embodiment of the present application.
Fig. 7 is a block diagram of a vehicle-mounted ranging system according to an embodiment of the present application.
Fig. 8 is a flowchart of an imaging method according to an embodiment of the present application.
Description of the main elements
Transmitting module 10
Collimating element 14
Receiving module 20
Adjusting mechanism 26
Processing unit 30
Circuit board 40
Vehicle-mounted distance measuring system 300
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that when a component is referred to as being "electrically connected" to another component, it can be connected by contact, for example by wires, or by a contactless connection, for example by contactless coupling.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 2, a camera module 100 according to a first embodiment of the present invention is used for capturing a target object (not shown) and obtaining depth image information of the object. The camera module 100 includes a transmitting module 10 and a plurality of receiving modules 20.
The transmitting module 10 is used for transmitting signal waves to an object. The receiving modules 20 are disposed around the transmitting module 10, and are configured to receive the signal wave reflected by the target object and obtain a target depth image of the target object according to the received signal wave.
Compared with the single image information obtained by depth imaging with a traditional time-of-flight ranging method, the camera module 100 in the embodiment of the invention receives signal waves at a plurality of different angles through the plurality of receiving modules 20 arranged on the peripheral side of the transmitting module 10, performs depth imaging of the object from a plurality of directions, and obtains multiple groups of imaging information. When the signal is lost at one angle, a receiving module 20 at another angle can complete the reception, effectively enhancing the signal strength of the object's spatial information, enriching the imaging detail, and improving the accuracy.
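The redundancy described above can be sketched as a simple fusion rule: average the depth maps per pixel, and where one module received no signal, fill in from the others. This is an illustrative assumption, not the patent's algorithm; missing readings are encoded here as NaN.

```python
import numpy as np

# Illustrative fusion sketch (assumed, not the patent's algorithm):
# combine depth maps from several receiving modules. A pixel where one
# module received no signal (NaN) is filled from the other modules, so
# a loss at one angle is completed by a module at another angle.

def fuse_depth_maps(maps: list) -> np.ndarray:
    """Per-pixel mean over all maps that have a valid (non-NaN) reading.

    Pixels invalid in every map remain NaN.
    """
    stack = np.stack(maps)
    return np.nanmean(stack, axis=0)
```

With two modules, a pixel absorbed along one path but reflected along the other still receives a depth value, which is exactly the claimed robustness benefit of the multi-receiver layout.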
Referring to fig. 3, in the first embodiment, the emitting module 10 is a Vertical-Cavity Surface-Emitting Laser (VCSEL), which can emit tens of thousands of light spots vertically and spread them uniformly over the surface of the object to be measured.
Specifically, the transmitting module 10 includes a laser emitter 12, a collimating element 14, and a Diffractive Optical Element (DOE) 16. The laser emitter 12 is used for emitting laser light (an infrared laser wave in the present embodiment), and the collimating element 14 is disposed on the optical path of the laser emitter 12 to collimate the laser light emitted by the laser emitter 12. The diffractive optical element 16 is disposed on the optical path of the laser emitter 12 to receive the laser light collimated by the collimating element 14 and diffuse it to form a laser pattern.
Further, the transmitting module 10 can adjust the transmitting power of the laser transmitter 12, so as to achieve the function of signal enhancement or signal reduction. It is understood that when the target object is closer, the emission power of the laser emitter 12 can be appropriately reduced; similarly, when the target is at a greater physical distance, the emission power of the laser emitter 12 may be increased appropriately.
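The power-adjustment policy above (lower power for nearer targets, higher power for farther ones) could be sketched as a clamped linear scaling. The thresholds and power levels below are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the described power-adjustment policy: scale
# the laser emission power linearly with target distance, clamped to a
# [min, max] range. All numeric values are illustrative assumptions.

def emission_power_mw(target_distance_m: float,
                      min_mw: float = 1.0,
                      max_mw: float = 10.0,
                      full_power_range_m: float = 5.0) -> float:
    """Emission power for a target at the given distance, in milliwatts."""
    frac = min(max(target_distance_m / full_power_range_m, 0.0), 1.0)
    return min_mw + frac * (max_mw - min_mw)
```

A real implementation would also respect eye-safety limits for the infrared laser, which the linear sketch does not model.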
Further, in this embodiment, the signal wave is an infrared laser wave emitted from the VCSEL, and the receiving module 20 is an infrared camera for receiving the infrared laser wave reflected by the surface of the object.
The number of the receiving modules 20 is two, the two receiving modules 20 are respectively located at two opposite sides of the transmitting module 10, and the two receiving modules 20 respectively receive the signal waves emitted by the transmitting module 10 from different positions at the two sides.
Specifically, the receiving module 20 includes an imaging lens 22 and a photosensitive chip 24. The photosensitive chip 24 is located on the image side of the imaging lens 22, and the imaging lens 22 is used to converge the incident light onto the photosensitive chip 24. The imaging lens 22 includes at least one optical lens. In one example, the imaging lens 22 may be a single optical lens. In another example, the imaging lens 22 may be a combination of a plurality of optical lenses. In this way, the receiving module 20 can improve the imaging effect of the photosensitive chip 24 through the imaging lens 22. In some embodiments, the receiving module 20 further includes a narrow-band infrared bandpass filter (not shown), disposed between the imaging lens 22 and the photosensitive chip 24, for filtering stray light that passes through the imaging lens 22 and reaches the photosensitive chip 24, so as to improve the imaging clarity of the photosensitive chip 24.
It is understood that in other embodiments, the receiving module 20 may include only a photosensitive chip 24, which can receive the signal wave and acquire the target depth image of the target object.
Thus, the photosensitive chip 24 can collect the light reflected by the object. The photosensitive chip 24 is one specially used for optical time-of-flight (TOF) measurement, and may be, for example, a CMOS (complementary metal-oxide-semiconductor) sensor, an APD (avalanche photodiode), or an SPAD (single-photon avalanche diode); its pixels may be arranged as a single point, a linear array, an area array, or the like.
Further, the receiving module 20 also includes an adjusting mechanism 26, which includes a connected rotating member and driving member; the rotating member is connected with the receiving module 20, and the driving member is used to drive the rotating member to rotate, so as to adjust the position and angle of the receiving module 20.
In this embodiment, the rotating member is an adjusting shaft and the driving member is a motor. Since the receiving module 20 can rotate around the adjusting shaft, the motor can drive the adjusting shaft to rotate and thereby rotate the receiving module 20, adjusting the shooting angles of the imaging lens 22 and the photosensitive chip 24. It can be understood that when the target object is closer, the receiving module 20 can be rotated so that the side where the imaging lens 22 is located turns toward the transmitting module 10, giving the receiving module 20 a better receiving angle. It will be appreciated that in other embodiments, the rotating member may be a pair of meshing gears, one connected to the power end of the driving member and the other connected to the receiving module 20.
It is understood that the adjusting mechanism 26 can also be implemented with other elements capable of adjusting the position and angle of the receiving module 20: for example, an electrically controlled sliding table may move the imaging lens 22 and the photosensitive chip 24 transversely, or a motor may extend and retract the receiving module 20 along the shooting direction.
Further, referring to fig. 4, in the present embodiment, the camera module 100 further includes a processing unit 30 and a circuit board 40 shown in fig. 2. The circuit board 40 is electrically connected to the transmitting module 10 and the receiving module 20, and the processing unit 30 is disposed on the circuit board 40. The processing unit 30 is configured to obtain target depth images corresponding to the multiple receiving modules 20, and superimpose the multiple target depth images to obtain a synthesized depth image.
In particular, the processing unit 30 includes a control module 32 and an image processing module 34.
The control module 32 is used for sending a work instruction to the transmitting module 10 and the receiving module 20 to control the transmitting module 10 and the receiving module 20.
The image processing module 34 obtains the target depth images corresponding to the plurality of receiving modules 20, and superimposes the plurality of target depth images to obtain a synthesized depth image.
Specifically, each receiving module 20 obtains a target depth image from its light sensing chip 24 according to the received reflected signal wave, and the image processing module 34 obtains a synthesized depth image with finer image details by overlapping a plurality of target depth images.
It can be understood that the methods for superimposing multiple target depth images include, but are not limited to, image translation, image combination, and picture cropping; after the identical picture portions of the multiple target depth images are superimposed and overlapped, the detail portions are more accurate.
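The translation-and-overlap step can be illustrated concretely. The sketch below (an assumption for illustration, not the patent's algorithm) aligns two depth maps by a known integer horizontal pixel shift, averages the overlapping region, and crops the non-overlapping borders:

```python
import numpy as np

# Illustrative superposition sketch (assumed): shift view B right by a
# known integer pixel offset relative to view A, keep only the region
# the two views share, and average it so details reinforce each other.

def superimpose(depth_a: np.ndarray, depth_b: np.ndarray, shift_px: int) -> np.ndarray:
    """Average the region where depth_b, shifted by shift_px, overlaps depth_a."""
    overlap_a = depth_a[:, shift_px:]                      # right part of view A
    overlap_b = depth_b[:, :depth_b.shape[1] - shift_px]   # left part of view B
    return (overlap_a + overlap_b) / 2.0
```

In practice the shift would come from the calibrated baseline between the two receiving modules 20 rather than being a fixed constant, and sub-pixel registration would be needed for best detail.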
Further, the image processing module 34 may superimpose the plurality of target depth images by means of Image Signal Processing (ISP) or Digital Signal Processing (DSP).
Further, the processing unit 30 may include a processor and a memory, and may also be an entity with computing capability, such as a single chip microcomputer.
Specifically, the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The memory may be used to store computing instructions and/or modules/units, and the processor may implement the various functions of the control module 32 and the image processing module 34 by running or executing computer programs and/or modules/units stored in the memory and by invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
It is understood that the circuit board 40 is a support for other electronic components.
Specifically, the Circuit Board 40 may be one of a Printed Circuit Board (PCB) and a Flexible Circuit Board (FPC), but is not limited thereto.
Further, in the present embodiment, the number of receiving modules 20 is two, and the signal wave emitted by the transmitting module 10 is reflected by the object and then received by both receiving modules 20. Because the distance between the two receiving modules 20 is small, most of the picture content of their corresponding target depth images is the same; the identical parts can be superimposed directly after translation, and the details of the superimposed image are clearer. The parts of the picture that cannot be overlapped (i.e., received by only one receiving module 20) can either be cropped away or retained. When the entire imaged picture is retained, the synthesized depth image has a wider field of view.
Using the camera module 100 provided by the present application to perform depth imaging of a target object, compared with a traditional TOF lens that has only a single transmitting element and a single receiving element, the present application achieves multi-angle imaging by arranging a plurality of receiving modules 20 on the peripheral side of the transmitting module 10, and then obtains a depth image with more accurate details through image synthesis.
Referring to fig. 3 to 5, a camera module 100 according to a second embodiment of the present invention includes a transmitting module 10, a receiving module 20, a processing unit 30, and a circuit board 40.
Compared with the first embodiment, the number of the receiving modules 20 of the camera module 100 of the present embodiment is increased from two to three. The three receiving modules 20 are respectively located in three directions of the peripheral side of the transmitting module 10.
After the signal waves transmitted by the transmitting module 10 are received by the three receiving modules 20, the three receiving modules 20 obtain three target depth images from three positions respectively, and then the three target depth images are superimposed by the image processing module 34, so as to obtain a synthesized depth image with more precise details.
It can be understood that, by using three receiving modules 20 for cooperative imaging instead of two as in the first embodiment, the second embodiment obtains a wider field of view and a more precise picture, but the volume of the corresponding camera module 100 and the amount of calculation both increase.
It is understood that in other embodiments of the present application, the number of the receiving modules 20 is not limited to two or three in the above embodiments, and the number of the receiving modules 20 may be set reasonably according to the actually required imaging range, imaging angle, and required precision.
Referring to fig. 6, an embodiment of the invention also provides an electronic device 200, where the electronic device 200 includes a housing 210 and the camera module 100. The camera module 100 is disposed in the housing 210.
The electronic device 200 of the embodiment of the invention includes, but is not limited to, electronic products supporting depth imaging, such as smart phones, tablet computers, notebook computers, electronic book readers, Portable Multimedia Players (PMPs), portable phones, video phones, digital still cameras, mobile medical devices, wearable devices, and the like.
Referring to fig. 7, an embodiment of the invention also provides a vehicle-mounted distance measuring system 300, where the vehicle-mounted distance measuring system 300 includes a control module 310 and the camera module 100.
A conventional vehicle-mounted distance measuring system usually provides only a simple ranging alarm function. The vehicle-mounted distance measuring system 300 provided by the present application, while using the camera module 100 for ranging, can also perform depth imaging of objects within range to obtain a clear three-dimensional image.
It can be understood that one vehicle-mounted ranging system 300 can carry a plurality of camera modules 100 simultaneously, thereby realizing imaging from multiple angles and positions.
The control module 310 is configured to control the operation of at least one camera module 100 and process the imaging information of each camera module 100, and the control module 310 may perform processing and analysis of multiple images and may also perform driving functions such as warning or prompting by using the imaging information.
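The patent does not detail how the control module 310 turns imaging information into a warning; the sketch below is an illustrative assumption, with the `1.5` m threshold and the helper names chosen purely for the example:

```python
def min_obstacle_distance(depth_images):
    """Smallest valid (non-zero) depth across all camera modules' images.

    depth_images: one nested list of rows per camera module; 0 means
    no measurement at that pixel.
    """
    per_module = [min(v for row in img for v in row if v > 0)
                  for img in depth_images]
    return min(per_module)

def should_warn(depth_images, threshold_m=1.5):
    """Raise a proximity warning when any object is closer than the threshold."""
    return min_obstacle_distance(depth_images) < threshold_m

# Two camera modules; the second one sees an object at 1.2 m.
frames = [[[3.0, 0.0], [2.5, 4.0]],
          [[1.2, 5.0], [0.0, 6.0]]]
warn = should_warn(frames)
```

Taking the minimum over all modules is what makes the multi-module arrangement useful here: an obstacle visible from only one angle still triggers the warning.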
Referring to fig. 8, an embodiment of the invention also provides an imaging method, including the following steps:
S1: control the transmitting module 10 to transmit signal waves, and control the plurality of receiving modules 20 disposed on the peripheral side of the transmitting module 10 to receive the signal waves.
Specifically, the plurality of receiving modules 20 are disposed on the peripheral side of the transmitting module 10, and the signal waves transmitted by the transmitting module 10 are received by the plurality of receiving modules 20 from different angles, respectively.
Further, the signal wave may be an infrared laser wave, the transmitting module 10 may be a vertical cavity surface emitting laser, and the receiving module 20 may be an infrared camera.
S2: the target depth images corresponding to the plurality of receiving modules 20 are obtained respectively.
Specifically, the signal wave transmitted by the transmitting module 10 toward the object is reflected by the object and then received by the receiving module 20. The distance between the object and the receiving module 20 can therefore be determined from the propagation time of the signal wave, forming depth information from which a target depth image of the target object is obtained.
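The time-of-flight step above reduces to one formula: the wave travels to the object and back, so the one-way distance is half the round-trip path. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance to the object from the signal wave's round-trip time.

    The wave covers the emitter-to-object-to-receiver path, so the
    one-way distance is c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A round trip of 20 ns corresponds to roughly 3 m.
d = tof_distance(20e-9)
```

A full depth image is obtained by evaluating this per pixel over the propagation times measured by the photosensitive chip.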
S3: superimpose the plurality of target depth images to obtain a synthesized depth image.
Specifically, the plurality of receiving modules 20 correspond to a plurality of target depth images with different angles and different fields of view; superimposing these target depth images yields a synthesized depth image whose details are more accurate.
It can be understood that the methods for superimposing multiple target depth images include, but are not limited to, image translation, image combination, and frame cropping. After the common picture portions of the multiple target depth images are aligned and superimposed, the detail portions become more accurate. A picture portion that cannot be overlapped (i.e. a portion received by only one receiving module 20) may be either cropped away or retained; when all imaged portions are retained, the synthesized depth image has a wider field of view.
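The translate/combine/crop choice above can be sketched for a pair of receivers. The patent gives no implementation; the code below assumes a known horizontal pixel offset between the two frames (set by the baseline between the receiving modules) and averages the overlap:

```python
import numpy as np

def combine_pair(left, right, offset, keep_all=True):
    """Superimpose two depth maps whose frames differ by a horizontal
    pixel offset.

    In the overlap the two measurements are averaged; portions seen by
    only one module are either kept (wider field of view) or cropped
    away (keep_all=False, overlap only).
    """
    h, w = left.shape
    acc = np.zeros((h, w + offset))
    cnt = np.zeros((h, w + offset))
    acc[:, :w] += left          # translate: left frame starts at column 0
    cnt[:, :w] += 1
    acc[:, offset:] += right    # right frame shifted by the baseline offset
    cnt[:, offset:] += 1
    fused = acc / np.maximum(cnt, 1)
    if keep_all:
        return fused            # union: retain single-view portions
    return fused[:, offset:w]   # intersection: crop to the overlap

left = np.full((2, 4), 2.0)
right = np.full((2, 4), 4.0)
wide = combine_pair(left, right, offset=1)                     # 2 x 5 canvas
narrow = combine_pair(left, right, offset=1, keep_all=False)   # 2 x 3 overlap
```

`keep_all=True` realizes the wider-angle option described above, while `keep_all=False` keeps only the region every module measured.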
Further, the superposition of the plurality of target depth images may be performed by, but is not limited to, image signal processing or digital signal processing. Processing the image signals in this way achieves a good superposition effect while keeping the computation efficient.
According to the camera module 100 and the imaging method provided by the embodiments of the invention, the plurality of receiving modules 20 are arranged on the peripheral side of the transmitting module 10, so that the signal waves transmitted by the transmitting module 10 are received by the plurality of receiving modules 20 respectively, yielding a plurality of depth images, enhancing picture detail, and improving accuracy. Compared with an existing TOF lens having only a single transmitting element and a single receiving element, this solves prior-art problems such as imaging distortion or loss of detail caused by transmission loss of the signal wave or its absorption by the object.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents substituted without departing from the spirit and scope of the invention. Those skilled in the art may also make other changes to the design of the present invention within its spirit, provided they do not depart from its technical effects. Such variations are intended to be included within the scope of the invention as claimed.
Claims (11)
1. A camera module, characterized by comprising:
the transmitting module is used for transmitting signal waves;
and a plurality of receiving modules arranged on the peripheral side of the transmitting module and used for receiving the signal waves so as to obtain a target depth image of a target object.
2. The camera module according to claim 1, wherein the signal wave transmitted by the transmitting module is an infrared laser wave;
the receiving module is an infrared camera.
3. The camera module of claim 2, wherein the receiving module comprises an imaging lens and a photo-sensing chip;
the imaging lens is used for converging incident light rays to the photosensitive chip, and the photosensitive chip is used for receiving signal waves so as to obtain a target depth image of a target object.
4. The camera module according to claim 3, wherein the receiving module further comprises an adjusting mechanism, the adjusting mechanism comprises a rotating member and a driving member, the rotating member is connected to the receiving module, and the driving member is used for driving the rotating member to rotate so as to adjust the position and the angle of the receiving module.
5. The camera module of claim 1, wherein the camera module further comprises a processing unit and a circuit board;
the transmitting module and the receiving module are electrically connected with the circuit board;
the processing unit is arranged on the circuit board and used for obtaining a plurality of target depth images corresponding to the receiving modules and superposing the plurality of target depth images to obtain a synthesized depth image.
6. The camera module of claim 1, wherein the transmitting module is a vertical cavity surface emitting laser.
7. The camera module of claim 6, wherein the transmitting module comprises a laser emitter, a collimating element, and a diffractive optical element;
the collimating element is arranged on the optical path of the laser emitter and is used for collimating the laser emitted by the laser emitter, and the diffractive optical element is arranged on the optical path of the laser emitter, on the side of the collimating element facing away from the laser emitter.
8. An electronic device, comprising:
a housing, and
the camera module of any of claims 1-7, wherein the camera module is disposed in the housing.
9. An on-vehicle ranging system, comprising:
the camera module of any of claims 1-7.
10. An imaging method, comprising the steps of:
controlling a transmitting module to transmit signal waves and controlling a plurality of receiving modules arranged on the peripheral side of the transmitting module to receive the signal waves;
respectively acquiring target depth images corresponding to the receiving modules;
and superposing the plurality of target depth images to obtain a synthesized depth image.
11. The imaging method according to claim 10, wherein the step of superimposing the plurality of target depth images to obtain one synthesized depth image includes:
and carrying out image signal processing or digital signal processing on the plurality of target depth images to obtain the synthesized depth image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010820220.4A CN112073707A (en) | 2020-08-14 | 2020-08-14 | Camera module, electronic equipment, vehicle-mounted distance measuring system and imaging method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112073707A true CN112073707A (en) | 2020-12-11 |
Family
ID=73661744
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010820220.4A Pending CN112073707A (en) | 2020-08-14 | 2020-08-14 | Camera module, electronic equipment, vehicle-mounted distance measuring system and imaging method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112073707A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113325426A (en) * | 2021-06-28 | 2021-08-31 | 深圳市银星智能科技股份有限公司 | Obstacle detection device and intelligent robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20201211 |