
CN113805185A - Time-of-flight TOF apparatus and electronic device - Google Patents

Time-of-flight TOF apparatus and electronic device

Info

Publication number
CN113805185A
Authority
CN
China
Prior art keywords
light source
light
sub
light beams
external object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010463224.1A
Other languages
Chinese (zh)
Inventor
甘远
林峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd filed Critical Shenzhen Fushi Technology Co Ltd
Priority to CN202010463224.1A priority Critical patent/CN113805185A/en
Publication of CN113805185A publication Critical patent/CN113805185A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a TOF apparatus and an electronic device. The TOF apparatus includes: a transmitting module, including a light source and a modulation element, where the light source includes a plurality of sub light source groups, each sub light source group includes at least one sub light source, and the modulation element is configured to modulate the light beams emitted by the light source to form modulated light beams and project them onto an external object; a receiving module, including a pixel array with a plurality of pixel units, where, when the plurality of sub light source groups emit light beams in a time-sharing manner, the pixel array receives the modulated light beams returned from the external object in a time-sharing manner to obtain a plurality of depth maps of the external object; and a processing module, configured to synthesize a target depth map of the external object from the plurality of depth maps of the external object, where the target depth map includes depth information determined from the time difference or phase difference between the light beams emitted by each sub light source group of the plurality of sub light source groups and the modulated light beams received by each pixel unit of the plurality of pixel units while that sub light source group emits light beams.

Description

Time-of-flight TOF apparatus and electronic device
Technical Field
The present application relates to the field of 3D technology, and more particularly, to time-of-flight TOF apparatuses and electronic devices.
Background
A Time of Flight (TOF) module calculates the distance, i.e., the depth, of an object by measuring the flight time of a light beam in space. Owing to its advantages of high precision and large measurement range, it is widely used in consumer electronics, autonomous driving, AR/VR, and other fields.
The TOF module includes a light source and a camera. The light source emits a light beam into the target space to provide illumination, the camera images the light beam returning from an external object, and the distance of the object is calculated from the time the light beam takes from emission to reception.
However, the pixel units in the camera are generally implemented with Single Photon Avalanche Diodes (SPADs), which are large in size and therefore difficult to build into high-resolution pixel arrays, limiting the resolution of 3D depth imaging.
How to improve the resolution of 3D depth imaging is therefore a problem in urgent need of a solution.
Disclosure of Invention
The application provides a time-of-flight TOF apparatus and an electronic device that can improve the resolution of 3D depth imaging without increasing the array resolution of the pixel array.
In a first aspect, there is provided a time of flight TOF apparatus comprising:
a transmitting module, comprising a light source and a modulation element, wherein the light source comprises a plurality of sub light source groups, each sub light source group comprises at least one sub light source, the light source is configured to emit light beams, and the modulation element is configured to modulate the light beams emitted by the light source to form a modulated light beam and project the modulated light beam onto an external object;
a receiving module, comprising a pixel array having a plurality of pixel units, configured to receive the light beams returned from the external object in a time-sharing manner while the plurality of sub light source groups emit light beams in a time-sharing manner, so as to obtain a plurality of depth maps of the external object;
and a processing module, configured to synthesize a target depth map of the external object from the plurality of depth maps of the external object, wherein the target depth map comprises depth information determined from the time difference or phase difference between the light beams emitted by each sub light source group of the plurality of sub light source groups and the light beams received by each pixel unit of the plurality of pixel units while that sub light source group emits light beams.
In some optional implementations, the modulation element includes a diffractive optical element (DOE), the modulated light beam is a light spot array, and the DOE is configured to diffract the light beams emitted by each sub light source group to form the light spot array.
In some optional implementations, the receiving module further includes: a first lens unit, configured to receive the light beams returned from the external object, collimate or converge them, and transmit them to the plurality of pixel units.
In some optional implementations, the transmitting module further includes: a second lens unit, disposed between the light source and the modulation element, configured to collimate or converge the light beams emitted by the sub light source groups and then transmit them to the modulation element.
In some optional implementations, the processing module is specifically configured to:
sequentially controlling the sub light source groups of the plurality of sub light source groups to emit light beams one at a time in a preset order, so that the plurality of pixel units collect the light beams returned from the external object;
determining a depth map of the external object from the time difference or phase difference between the light beams emitted by the currently lit sub light source group and the light beams received by the plurality of pixel units.
In some optional implementations, each pixel unit corresponds to a plurality of depth values in the target depth map, where each depth value is determined from the time difference or phase difference between the light beams emitted by one of the sub light source groups and the light beams received by that pixel unit while that sub light source group emits light beams.
In some optional implementations, the modulation element is configured to replicate the light beams emitted by the plurality of sub light source groups to obtain a plurality of light beams, and the same pixel unit is configured to receive the light beams returned from the external object in a time-sharing manner while different sub light source groups emit light beams in a time-sharing manner.
In some optional implementations, different pixel units simultaneously receive the replicated light beams that the modulation element forms from the same sub light source group.
In some optional implementations, the plurality of light beams replicated by the modulation element from the same sub light source group do not overlap.
In some optional implementations, the light source is a dot matrix light source, the dot matrix light source includes a plurality of light emitting points arranged at intervals, the plurality of light emitting points form the plurality of sub light source groups, each sub light source group includes at least one light emitting point, each light emitting point is configured to emit a light beam, and the light beam emitted by each light emitting point includes a single pulse wave signal.
In some alternative implementations, the light source is a vertical cavity surface emitting laser, VCSEL, or light emitting diode, LED.
In some optional implementations, the emission module further includes a driving circuit for driving the light source to emit light, the driving circuit including:
a light source switch circuit, configured to control the light source to turn on and off; and
a current-limiting resistor and a driving capacitor, where one end of the light source is connected to the light source switch circuit, the other end of the light source is connected to one end of the current-limiting resistor, the other end of the current-limiting resistor is connected to a power supply voltage, and the driving capacitor is connected in parallel with the current-limiting resistor and is configured to increase the current flowing through the light source when the light source switch is turned on, so as to shorten the rise time of the rising edge of the light beam emitted by the light source.
In some optional implementations, the light source switch circuit includes a light source switch and a light source switch driving circuit, the light source switch driving circuit is connected to the light source switch, and the light source switch driving circuit is configured to control the light source switch to be turned on and off according to a driving signal.
In some optional implementations, the light source switch includes a plurality of control switches, each control switch for controlling on and off of one of the plurality of sub light source groups.
In some optional implementations, the light source switch driving circuit is specifically configured to:
when the driving signal is at a high level, controlling the light source switch to be turned on so that the light source emits light beams; or
when the driving signal is at a low level, controlling the light source switch to be turned off so that the light source stops emitting light beams.
In some optional implementations, the driving circuit of the transmitting module further includes:
and one end of the filter capacitor is connected with the power supply voltage, and the other end of the filter capacitor is grounded.
In a second aspect, an electronic device is provided, comprising: a time of flight TOF apparatus as described in the first aspect or in any of the alternative implementations described above.
In the embodiments of the application, the light sources within the light source are turned on and off in units of sub light source groups. The need to raise resolution by adding pixel units in space is thereby converted into acquiring depth maps in the time domain by lighting the sub light source groups at different times, and the depth maps acquired during this time-shared illumination are then synthesized into a target depth map. High-resolution 3D imaging is thus realized without raising the resolution of the pixel array, and subsequent operations such as face recognition or 3D modeling can be performed on the high-resolution depth map, which helps improve system performance.
Drawings
Fig. 1 is a schematic configuration diagram of a TOF apparatus according to an embodiment of the present application.
FIG. 2 is a schematic diagram of the sub light source groups of the emission module of the TOF apparatus of FIG. 1.
FIG. 3 is a schematic diagram of a receive module of the TOF apparatus of FIG. 1.
Fig. 4 is a schematic diagram of each pixel unit of the receiving module receiving a light beam emitted by one sub-light source group.
Fig. 5 is a schematic structural diagram of a driving circuit of the transmission module.
Fig. 6 is a block schematic diagram of an electronic device of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Further, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, features defined with "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject technology can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the focus of the application.
Referring to fig. 1, fig. 1 shows a schematic block diagram of a Time of Flight (TOF) apparatus 10 according to an embodiment of the present application. Optionally, the TOF apparatus 10 can be mounted in an electronic device. The electronic device includes, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart wearable device, a smart door lock, a vehicle-mounted electronic device, a medical device, an aviation device, and other devices or apparatuses requiring a 3D information sensing function.
In particular, as shown in fig. 1, the TOF apparatus 10 comprises a transmitting module 11, a receiving module 12, and a processing module 13. The transmitting module 11 is configured to emit a light beam 201 toward the space of the external object 20; at least a portion of the emitted light beam 201 returns from the external object 20 as a light beam 202, which carries depth information (i.e., distance information) of the external object 20. At least a portion of the light beam 202 is received by the receiving module 12, and the processing module 13 is configured to calculate the time difference or phase difference between the light beam 201 and the light beam 202 to determine the depth information of the external object 20, thereby implementing the depth imaging function of the TOF apparatus 10 for the external object. Optionally, the depth information of the external object 20 in the embodiments of the present application may be used, for example, for 3D modeling, face recognition, or simultaneous localization and mapping (SLAM), which is not limited in the present application.
Optionally, the processing module 13 is connected to the transmitting module 11 and the receiving module 12, and the processing module 13 is further configured to synchronize trigger signals of the transmitting module 11 and the receiving module 12 to calculate a time difference required for the light beam 201 to be emitted from the transmitting module 11 to be received by the receiving module 12 or calculate a phase difference between the light beam received by the receiving module 12 and the light beam emitted by the transmitting module 11, so as to determine depth information of a corresponding point on the external object 20.
Optionally, the processing module 13 may be a processing module of the TOF apparatus 10, or may be a processing module of an electronic device that includes the TOF apparatus 10, for example the main control module of the electronic device; alternatively, part of the processing module 13 may be disposed in the TOF apparatus 10 and part in the processing module of the electronic device, which is not limited in the embodiments of the present application.
Alternatively, the TOF apparatus 10 is, for example, a Direct Time of Flight (D-TOF) apparatus. The D-TOF apparatus performs depth information sensing based on the direct time-of-flight detection principle: it obtains depth information of the external object 20 by directly calculating the time difference between the light beam emitted by the transmitting module 11 and the light beam received by the receiving module 12. Alternatively, the TOF apparatus 10 may be an Indirect Time of Flight (I-TOF) apparatus, which performs depth information sensing based on the indirect time-of-flight detection principle: it obtains depth information of the external object 20 by calculating the phase difference between the light beam emitted by the transmitting module 11 and the light beam received by the receiving module 12.
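In code, the two detection principles reduce to two short conversions. The following is a minimal sketch in Python (the function and variable names are ours, not from the application), using the standard round-trip relations for depth from a measured time difference (D-TOF) and from a measured phase difference of a modulated beam (I-TOF):
```python
import math

C = 3.0e8  # speed of light, m/s (approximation)

def depth_from_time(delta_t_s: float) -> float:
    """D-TOF: the beam travels the distance twice, so halve the round trip."""
    return C * delta_t_s / 2.0

def depth_from_phase(delta_phi_rad: float, f_mod_hz: float) -> float:
    """I-TOF: a phase shift delta_phi at modulation frequency f_mod
    corresponds to a round-trip time of delta_phi / (2*pi*f_mod)."""
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

print(depth_from_time(6.67e-9))             # ~1.0 m for a 6.67 ns round trip
print(depth_from_phase(math.pi / 2, 20e6))  # ~1.87 m at 20 MHz modulation
```
Note that the I-TOF form is unambiguous only within half a modulation wavelength (about 7.5 m at 20 MHz), which is one reason the two principles suit different measurement ranges.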
Since the resolution of the receiving module 12 of a D-TOF apparatus is typically lower, the following embodiments of the present application mainly take the TOF apparatus 10 as a D-TOF apparatus as an example to describe how to improve the resolution of 3D depth imaging. However, the following embodiments can also be extended to other suitable TOF apparatuses, such as I-TOF apparatuses. This is not a limitation of the present application.
The transmitting module 11 includes a light source 110 and a modulation element 111. The light source 110 is configured to emit light beams, and the modulation element 111 is configured to modulate the light beams emitted by the light source 110 to form a modulated light beam, namely the light beam 201, and to project the light beam 201 onto the external object 20. Optionally, the light beam 201 is, for example, a speckle pattern.
In some embodiments, the receiving module 12 includes an image sensor containing a pixel array 120 composed of a plurality of pixel units, and the pixel array 120 is configured to receive the light beam 202 returned from the external object 20. Each pixel unit converts the received light beam 202 into a corresponding electrical signal to obtain depth information. Optionally, a pixel unit may be a Charge-Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS) sensor, an Avalanche Diode (AD), a Single Photon Avalanche Diode (SPAD), or the like.
Optionally, the receiving module 12 further includes a readout circuit formed by one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and the like, which are connected to the image sensor, and the application is not limited thereto. Alternatively, part or all of the readout circuitry may also be integrated in the image sensor.
The receiving module 12 further includes a first lens unit 121, configured to receive the light beam 202 returned from the external object 20, collimate or converge the light beam 202, and transmit it to the plurality of pixel units.
When the pixel units used by the receiving module 12 are large, for example when SPADs are used (the size of a SPAD is generally large), the resolution of the acquired image suffers: the number of pixel units in the pixel array 120 determines the image resolution, so for an image sensor of a given size, larger pixel units mean lower resolution.
In view of this technical problem, in the embodiments of the present application, the light source 110 is designed to include a plurality of sub light source groups, each of which includes at least one sub light source, and the plurality of sub light source groups emit light beams in a time-sharing manner. Correspondingly, while the plurality of sub light source groups emit light beams, the plurality of pixel units in the pixel array 120 receive the light beams returned from the external object 20 in a time-sharing manner to acquire a plurality of depth maps of the external object 20. For example, suppose the plurality of sub light source groups comprises nine groups, denoted sub light source group 1 to sub light source group 9. In one implementation, one sub light source group may be turned on at a time in ascending order of its number, and while each group is on, the pixel array 120 collects the corresponding depth map, yielding 9 depth maps.
Further, the processing module 13 may be configured to synthesize a target depth map of the external object 20 from the plurality of depth maps of the external object 20. In this way, the target depth map includes depth information determined from the time difference or phase difference between the light beams emitted by each sub light source group of the plurality of sub light source groups and the light beams received by each pixel unit of the plurality of pixel units while that sub light source group emits light beams. Following the above example, the target depth map may include depth information from all 9 depth maps. By dividing the light source 110 into a plurality of sub light source groups that emit light in a time-sharing manner, and then synthesizing the depth maps acquired during the time-shared emission into the target depth map, the resolution of the acquired depth map can be improved without increasing the number of pixel units in the pixel array 120.
The light source 110 is configured to emit light beams, such as infrared light, ultraviolet light, or visible light. The light source 110 may be, for example, an infrared Light Emitting Diode (LED), a Vertical Cavity Surface Emitting Laser (VCSEL), a Fabry-Perot (FP) laser, a Laser Diode (LD), a Distributed Feedback (DFB) laser, or an Electro-absorption Modulated Laser (EML), which is not limited in the embodiments of the present application. Hereinafter, the light source 110 is described taking a VCSEL-type light source as an example, but the present application is not limited thereto.
In the embodiment of the present application, the light source may include a single light source or a plurality of light sources, for example, the plurality of light sources may be a regularly arranged or irregularly arranged light source array. Taking the light source 110 in the form of a VCSEL as an example, the light source 110 may include a semiconductor substrate and a VCSEL array chip formed by a plurality of VCSEL light sources arranged on the semiconductor substrate.
The present application does not limit how the plurality of sub light source groups are implemented in the light source. Taking the light source 110 in the form of a VCSEL as an example, the light source 110 may include a single VCSEL light source with a plurality of light emitting points, where the plurality of light emitting points form the plurality of sub light source groups and each sub light source group includes at least one light emitting point. As another example, the light source 110 may include a plurality of VCSEL light sources, each with one light emitting point, where the plurality of VCSEL light sources form the plurality of sub light source groups and each sub light source group includes at least one VCSEL light source.
In the embodiment of the present application, the light beam emitted by the light source 110 may be, for example, a pulse signal, such as but not limited to a square wave signal or a sine wave signal. The modulation element 111 is used for modulating the light beam emitted by the light source 110, for example, modulating the light beam into a speckle pattern 201, and emitting the speckle pattern 201 to the space of the external object 20.
Optionally, in some embodiments, the transmitting module 11 further includes a second lens unit 112, disposed between the light source 110 and the modulation element 111, configured to collimate or converge the light beams emitted by the sub light source groups and then transmit them to the modulation element 111.
In some embodiments, the modulation element 111 may be a Diffractive Optical Element (DOE) that diffracts an incident beam to form the speckle pattern 201. In one embodiment, the DOE may split the incident light beam into a plurality of light beams, for example tens of thousands, hundreds, tens, or just a few, and emit them toward the external object 20, with each light beam forming a light spot on the surface of the external object 20. In one embodiment, the DOE may diffract an incident beam to form a light spot array, i.e., a regularly arranged array of spots. In other embodiments, the DOE may diffract the incident beam to form other patterns, for example a speckle pattern with randomly arranged spots.
When the light beam emitted from the light source 110 is replicated by the DOE, the light projected onto the external object 20 consists of a plurality of replicated light beams, which helps enlarge the field of view and the number of light beams of the TOF apparatus 10 and improves the imaging effect.
In some embodiments, the modulating element 111 may also include a microlens array formed by arranging a plurality of microlens units. In one embodiment, the plurality of microlens elements are configured to receive light beams from the light source 110 and generate an array of light beams corresponding to the arrangement of the microlens elements for emission. In one embodiment, the light source 110 also includes a plurality of sub-light sources corresponding to the arrangement of the micro-lens array, and each micro-lens unit receives the light beams of the sub-light source corresponding thereto and emits the light beams outward after collimating or focusing the light beams. The array of light beams may be in a random or regular arrangement.
Hereinafter, the modulation element 111 is described taking a DOE as an example, but the present application is not limited thereto.
In the embodiments of the present application, the number of times the modulation element 111 replicates the light beam emitted by the light source 110 is not limited. For example, the modulation element 111 may replicate the light beam emitted by one light emitting point in the light source 110 into an N × M light beam array; when the light source 110 is an A × B dot matrix light source and all of its light emitting points emit simultaneously, a light beam array of N × A columns and M × B rows is formed after the modulation element 111.
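As a concrete check on this arithmetic, the following minimal sketch (Python; the names are ours, not from the application) computes the size of the projected spot array from the DOE replication factor and the lattice source layout:
```python
def projected_spot_grid(doe_cols, doe_rows, src_cols, src_rows):
    """Spot grid when an src_cols x src_rows lattice source is
    replicated doe_cols x doe_rows times by the DOE."""
    return doe_cols * src_cols, doe_rows * src_rows

# With the numbers of the embodiment below: a 3 x 3 source and a
# 5 x 3 DOE replication give a 15 x 9 (columns x rows) spot array.
cols, rows = projected_spot_grid(doe_cols=5, doe_rows=3, src_cols=3, src_rows=3)
print(cols, rows)  # 15 9
```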
Optionally, in some embodiments, the replication number of the modulation element 111 may match the array size of the pixel array 120; in other embodiments, the replication number of the modulation element 111 may differ from the array size of the pixel array 120, which is not limited in the present application.
Optionally, in some embodiments of the present application, the processing module 13 is specifically configured to:
sequentially controlling one of the sub light source groups to emit light beams according to a preset sequence so that the pixel units collect light beams 202 returned from the external object 20;
determining a depth map of the external object 20 according to the time difference or phase difference between the light beam 201 emitted by the sub light source group and the light beam 202 received by the plurality of pixel units.
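As shown in the sketch below, these two steps amount to a simple acquisition loop. This is a minimal illustration in Python under the D-TOF interpretation; the driver objects and the methods turn_on, emit_pulse, read_timestamps, and turn_off are hypothetical stand-ins, not an API defined by the application:
```python
C = 3.0e8  # speed of light, m/s

def acquire_depth_maps(sub_light_source_groups, pixel_array):
    """Light each sub light source group in the preset order and collect
    one depth map per group (direct time-of-flight interpretation)."""
    depth_maps = []
    for group in sub_light_source_groups:         # preset order
        group.turn_on()
        t_emit = group.emit_pulse()               # emission timestamp, seconds
        t_return = pixel_array.read_timestamps()  # per-pixel arrival times (2D, NumPy-like)
        group.turn_off()
        depth_maps.append(C * (t_return - t_emit) / 2.0)  # halve the round trip
    return depth_maps
```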
As an example, as shown in fig. 2, the light source 110 includes 9 sub light source groups 1101, each containing one light emitting point; that is, the light source 110 includes 9 light emitting points, which may be arranged in a 3 × 3 array. The replication number of the DOE is 5 × 3, i.e., 5 columns and 3 rows. As shown in fig. 3, the pixel array 120 includes 15 pixel units 1201, arranged, for example, in a 5 × 3 array.
When all the light emitting points in the light source 110 emit simultaneously, the light source 110 emits a 3 × 3 array of light spots; these are collimated or converged by the second lens unit 112, enter the DOE, and are diffracted to form a 15 × 9 array of light spots irradiated onto the external object 20. If a resolution of 15 × 9 were desired in this way, a 15 × 9 pixel array 120 would be required. As can be seen from the foregoing description, enlarging the pixel array 120 when a single pixel unit is large would increase the size of the whole TOF apparatus 10 and reduce its practicality.
In the embodiments of the present application, the light sources in the light source 110 may instead be turned on and off in units of sub light source groups, with the pixel array 120 collecting the light beam 202 returning from the external object 20 while each sub light source group is on, so as to obtain one depth map of the external object 20 per group. In the example of fig. 2, the 9 sub light source groups may be numbered 1 to 9 and turned on one after another in ascending order; each time a light emitting point is lit, a 5 × 3 pattern of light spots is formed after diffraction by the DOE and irradiated onto the external object 20, and a 5 × 3 depth map is obtained from the received light beam 202, so the 9 light emitting points yield 9 frames of 5 × 3 depth maps of the external object 20. As shown in fig. 3, when sub light source group 1 emits light, the pixel array 120 acquires the corresponding depth map 1; the symbol '1' in fig. 3 indicates that each pixel unit in the pixel array 120 acquires a corresponding depth value while sub light source group 1 emits light. By analogy, when sub light source groups 2 to 9 emit light in turn, the pixel array 120 collects the corresponding depth maps 2 to 9.
Further, the plurality of depth maps obtained while the sub light source groups emit light in turn may be combined into a target depth map. The target depth map then contains the depth information acquired by each pixel unit of the pixel array 120 for each sub light source group. Its resolution is therefore determined not only by the arrangement of the pixel units in the pixel array 120 but also by the number of sub light source groups; in other words, the resolution of the target depth map is higher than that of a depth map obtained when a single sub light source group emits light, since the latter is determined solely by the arrangement of the pixel array 120.
In the above example, the 9 frames of 5 × 3 depth maps may be synthesized into a 15 × 9 target depth map, as shown in fig. 4. Each pixel unit 1201 then corresponds to 9 depth values, denoted 1 to 9, which are the depth values obtained while each of sub light source groups 1 to 9 emits light alone. For example, among the 9 depth values corresponding to the pixel unit 1201 in the upper right corner, the symbol '6' represents the depth value acquired by that pixel unit while sub light source group 6 emits light alone.
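The synthesis step amounts to interleaving the low-resolution frames on a sub-pixel grid. The sketch below (Python with NumPy, continuing the 9-group example) shows one way to do it; the assumption that group g lands at sub-pixel offset (g // 3, g % 3) is ours for illustration, since the actual mapping depends on the geometry of the sub light source groups and the DOE:
```python
import numpy as np

GROUP_ROWS, GROUP_COLS = 3, 3   # 9 sub light source groups in a 3 x 3 layout
H, W = 3, 5                     # pixel array: 3 rows x 5 columns

# Stand-ins for the 9 frames of 5 x 3 depth maps, one per group.
depth_maps = [np.random.rand(H, W) for _ in range(GROUP_ROWS * GROUP_COLS)]

# Interleave: each pixel unit contributes a 3 x 3 block of depth values
# in the target map, one value per sub light source group.
target = np.zeros((H * GROUP_ROWS, W * GROUP_COLS))
for g, dmap in enumerate(depth_maps):
    sub_r, sub_c = divmod(g, GROUP_COLS)         # assumed group-to-offset mapping
    target[sub_r::GROUP_ROWS, sub_c::GROUP_COLS] = dmap

print(target.shape)  # (9, 15), i.e., the 15 x 9 (columns x rows) target depth map
```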
Therefore, in the embodiments of the present application, by turning the light sources in the light source 110 on and off in units of sub light source groups, the need to raise resolution by adding pixel units in space is converted into acquiring depth maps by lighting the sub light source groups at different times, and the depth maps so acquired are synthesized into a target depth map. High-resolution 3D imaging is thus realized without raising the resolution of the pixel array 120, and subsequent operations such as face recognition and 3D modeling can be performed on the high-resolution depth map, which helps improve system performance.
In the embodiments of the present application, the light source 110 is controlled group by group in units of sub light source groups. At any given moment, the density of the light spots emitted by the transmitting module 11 is unchanged, but when the light spots produced by the successive emissions of the plurality of sub light source groups are superimposed, the spot density increases; that is, the target depth map can be regarded as obtained from the superimposed light spots, which improves the image resolution.
In the embodiments of the present application, the transmitting module 11 may emit pulsed light toward the external object 20. The shorter the rise time of the pulse, the higher the depth detection accuracy; achieving high depth accuracy therefore places extremely demanding requirements on the rise and fall times of the pulsed light, typically within a few ns.
In one embodiment, as shown in fig. 5, the transmitting module 11 further includes a driving circuit for driving the light source 110 to emit light. The driving circuit includes a light source switch circuit 114 and a current-limiting resistor 113. The light source switch circuit 114 is connected to one end (the cathode) of the light source 110; the other end (the anode) of the light source 110 is connected to one end of the current-limiting resistor 113, whose other end is connected to a power supply voltage 117. The light source switch circuit 114 controls the light source 110 to turn on and off, while the current-limiting resistor 113 limits the current and protects the light source 110; specifically, the current-limiting resistor 113 divides the power supply voltage 117 with the light source 110 to reduce the current flowing through the light source 110. Optionally, in this embodiment, as shown in fig. 5, the light source switch circuit 114 may include a light source switch 1140 and a light source switch driving circuit 1141 connected to the light source switch 1140, which generates a driving signal V_IN to control the light source switch 1140 to turn on and off. Optionally, the driving signal V_IN may be a square wave signal alternating between high and low levels; for example, when the driving signal V_IN generated by the light source switch driving circuit 1141 is at a high level, the light source switch 1140 is turned on so that the light source 110 emits a light beam, and when V_IN is at a low level, the light source switch 1140 is turned off so that the light source 110 stops emitting the light beam.
Optionally, in some embodiments, when the light source 110 is divided into a plurality of sub light source groups, the light source switch 1140 includes a plurality of control switches, and each control switch is used for controlling on and off of one of the plurality of sub light source groups.
In a specific implementation, parasitic capacitance and parasitic inductance inevitably exist in the light source switch 1140, the light source 110, and the circuit board of the TOF apparatus 10, so the current through the light source 110 cannot reach its set value very quickly (for example, within several nanoseconds), and the rising edge of the optical signal produced by the light source 110 is consequently relatively gentle.
In view of this technical problem, in the embodiments of the present application, the driving circuit may include a driving capacitor 116 connected in parallel with the current-limiting resistor 113; in other words, a driving capacitor 116 may be added between the power supply voltage 117 and the light source 110. The driving capacitor 116 increases the current flowing through the light source 110 at the moment the light source switch 1140 is turned on, i.e., it supplies a pulse of current at the instant the optical signal of the light source 110 rises, thereby compensating for the slow current rise caused by the parasitic capacitance and parasitic inductance. As can be seen from fig. 5, the rising edge of the optical signal becomes significantly steeper after the driving capacitor 116 is added, and the rise time can be shortened to within 10 ns.
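The speed-up effect of the parallel capacitor can be reproduced with a first-order circuit model. The sketch below (Python, explicit Euler integration) simulates the drive loop of supply, current-limiting resistor with an optional parallel capacitor, parasitic inductance, and the light source modeled as a fixed forward-voltage drop. All component values are illustrative assumptions, not values from the application; the point is only that the capacitor shortens the 10%-90% current rise time:
```python
import numpy as np

V, VF = 3.3, 2.0     # supply voltage and assumed VCSEL forward drop (V)
R = 10.0             # current-limiting resistor (ohm)
L = 20e-9            # assumed parasitic inductance (H)
C_DRV = 200e-12      # driving ("speed-up") capacitor across R (F)
I_SS = (V - VF) / R  # steady-state drive current

def simulate(c, t_end=30e-9, dt=1e-12):
    """Euler integration of the loop current after the switch closes."""
    n = int(t_end / dt)
    i, v_r = 0.0, 0.0  # inductor current; voltage across R (and C, if fitted)
    trace = np.empty(n)
    for k in range(n):
        di = (V - v_r - VF) / L            # voltage left over for the inductance
        if c > 0.0:
            v_r += (i - v_r / R) / c * dt  # capacitor absorbs the difference current
        else:
            v_r = i * R                    # no capacitor: resistor alone sets v_r
        i = max(i + di * dt, 0.0)          # the diode blocks reverse current
        trace[k] = i
    return trace, dt

def rise_time(trace, dt):
    """10%-90% rise time relative to the steady-state current."""
    t10 = np.argmax(trace >= 0.1 * I_SS) * dt
    t90 = np.argmax(trace >= 0.9 * I_SS) * dt
    return t90 - t10

for c in (0.0, C_DRV):
    tr, dt = simulate(c)
    print(f"C = {c:.0e} F -> 10-90% rise time = {rise_time(tr, dt) * 1e9:.2f} ns")
```
With these values the capacitor momentarily bypasses the current-limiting resistor at turn-on, so nearly the full supply headroom appears across the parasitic inductance and the current ramps several times faster, which is the compensation mechanism described above.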
Optionally, in some embodiments, the driving circuit further includes:
and one end of the filter capacitor 115 is connected with the power supply voltage 117, and the other end of the filter capacitor 115 is grounded. The filter capacitor 115 is used to reduce the influence of the internal resistance, wires, etc. of the light source 110 on the rising edge of the optical signal.
By extension, the driving circuit is also suitable for use in an I-TOF apparatus and the like.
As shown in fig. 6, an embodiment of the present application further provides an electronic apparatus 100, and the electronic apparatus 100 includes the TOF apparatus 10 described in the embodiments above. For the specific implementation of the TOF apparatus 10, refer to the related description of the foregoing embodiments, which is not repeated here. The electronic apparatus 100 includes, for example, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart wearable device, a smart door lock, a vehicle-mounted electronic device, a medical device, an aviation device, and other devices or apparatuses with TOF function requirements.
The processing module may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be performed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
The storage module described above may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of example, but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The specific examples in the embodiments of the present application are only for helping those skilled in the art to better understand the embodiments of the present application, and do not limit the scope of the embodiments of the present application, and those skilled in the art may make various modifications and variations on the embodiments described above, and those modifications and variations fall within the scope of the present application.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A time of flight TOF apparatus, comprising:
a transmitting module, comprising a light source and a modulation element, wherein the light source comprises a plurality of sub light source groups, each sub light source group comprises at least one sub light source, the light source is configured to emit light beams, and the modulation element is configured to modulate the light beams emitted by the light source to form a modulated light beam and project the modulated light beam onto an external object;
a receiving module, comprising a pixel array having a plurality of pixel units, configured to receive the light beams returned from the external object in a time-sharing manner while the plurality of sub light source groups emit light beams in a time-sharing manner, so as to obtain a plurality of depth maps of the external object;
and a processing module, configured to synthesize a target depth map of the external object from the plurality of depth maps of the external object, wherein the target depth map comprises depth information determined from the time difference or phase difference between the light beams emitted by each sub light source group of the plurality of sub light source groups and the light beams received by each pixel unit of the plurality of pixel units while that sub light source group emits light beams.
2. The TOF apparatus of claim 1, wherein the modulating element comprises a diffractive optical element DOE, the modulated light beam is an array of light spots, and the diffractive optical element is configured to diffract the light beam emitted by each of the sub-light source groups to form the array of light spots.
3. The TOF apparatus of claim 1, wherein the receiving module further comprises: a first lens unit, configured to receive the light beams returned from the external object, collimate or converge them, and transmit them to the plurality of pixel units.
4. The TOF apparatus of claim 1, wherein the transmitting module further comprises: a second lens unit, disposed between the light source and the modulation element and configured to collimate or converge the light beams emitted by the sub light source groups and then transmit them to the modulation element.
5. The TOF apparatus of claim 1, wherein the processing module is specifically configured to:
sequentially controlling the sub light source groups of the plurality of sub light source groups to emit light beams one at a time in a preset order, so that the plurality of pixel units collect the light beams returned from the external object;
determining a depth map of the external object from the time difference or phase difference between the light beams emitted by the currently lit sub light source group and the light beams received by the plurality of pixel units.
6. The TOF apparatus of claim 1, wherein each pixel unit corresponds to a plurality of depth values in the target depth map, and each depth value is determined from the time difference or phase difference between the light beams emitted by one of the sub light source groups and the light beams received by the pixel unit while that sub light source group emits light beams.
7. The TOF apparatus of claim 1, wherein the modulation element is configured to replicate the light beams emitted by the plurality of sub light source groups to obtain a plurality of light beams, and the same pixel unit is configured to receive the light beams returned from the external object in a time-sharing manner while different sub light source groups emit light beams in a time-sharing manner.
8. The TOF apparatus of claim 7, wherein different pixel units simultaneously receive the replicated light beams that the modulation element forms from the same sub light source group.
9. The TOF apparatus of claim 7, wherein the plurality of light beams replicated by the modulation element from the same sub light source group do not overlap.
10. The TOF apparatus of claim 1, wherein the light source is a dot matrix light source, the dot matrix light source comprises a plurality of spaced apart light emitting points, the plurality of light emitting points form the plurality of sub light source groups, each sub light source group comprises at least one light emitting point, each light emitting point is configured to emit a light beam, and the light beam emitted by each light emitting point comprises a single pulse wave signal.
11. The TOF apparatus of any of claims 1-10, wherein the emission module further comprises a driving circuit for driving the light source to emit light, the driving circuit comprising:
a light source switch circuit, configured to control the light source to turn on and off; and
a current-limiting resistor and a driving capacitor, wherein one end of the light source is connected to the light source switch circuit, the other end of the light source is connected to one end of the current-limiting resistor, the other end of the current-limiting resistor is connected to a power supply voltage, and the driving capacitor is connected in parallel with the current-limiting resistor and is configured to increase the current flowing through the light source when the light source switch is turned on, so as to shorten the rise time of the rising edge of the light beam emitted by the light source.
CN202010463224.1A 2020-05-27 2020-05-27 Time-of-flight TOF apparatus and electronic device Pending CN113805185A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010463224.1A CN113805185A (en) 2020-05-27 2020-05-27 Time-of-flight TOF apparatus and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010463224.1A CN113805185A (en) 2020-05-27 2020-05-27 Time-of-flight TOF apparatus and electronic device

Publications (1)

Publication Number Publication Date
CN113805185A 2021-12-17

Family

ID=78943653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010463224.1A Pending CN113805185A (en) 2020-05-27 2020-05-27 Time-of-flight TOF apparatus and electronic device

Country Status (1)

Country Link
CN (1) CN113805185A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113805187A (en) * 2020-05-27 2021-12-17 深圳阜时科技有限公司 Time-of-flight TOF apparatus and electronic device
CN113805186A (en) * 2020-05-27 2021-12-17 深圳阜时科技有限公司 Time-of-flight TOF apparatus and electronic device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02272778A (en) * 1989-04-13 1990-11-07 Sharp Corp Driving circuit for optical semiconductor element
KR20150129187A (en) * 2014-05-08 2015-11-19 주식회사 히타치엘지 데이터 스토리지 코리아 Method for detecting signal in TOF camera
CN110824490A (en) * 2019-09-27 2020-02-21 深圳奥锐达科技有限公司 Dynamic distance measuring system and method
CN110687541A (en) * 2019-10-15 2020-01-14 深圳奥锐达科技有限公司 Distance measuring system and method
CN111007523A (en) * 2019-12-09 2020-04-14 Oppo广东移动通信有限公司 Time-of-flight transmitter, time-of-flight degree of depth module and electron device
CN111580118A (en) * 2020-05-14 2020-08-25 深圳阜时科技有限公司 Emission module of time of flight TOF device
CN213041994U (en) * 2020-05-27 2021-04-23 深圳阜时科技有限公司 Time-of-flight TOF apparatus and electronic device
CN113805186A (en) * 2020-05-27 2021-12-17 深圳阜时科技有限公司 Time-of-flight TOF apparatus and electronic device
CN113805187A (en) * 2020-05-27 2021-12-17 深圳阜时科技有限公司 Time-of-flight TOF apparatus and electronic device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
白彦霞 (Bai Yanxia), "数字电子技术基础" [Fundamentals of Digital Electronic Technology], Huazhong University of Science and Technology Press, 30 June 2017, page 67 *
蔡杏山 (Cai Xingshan), "电子电路识图咱得这么学" [How to Read Electronic Circuit Diagrams], China Machine Press, 30 April 2019, pages 86-87 *

Similar Documents

Publication Publication Date Title
US10598482B2 (en) Curved array of light-emitting elements for sweeping out an angular range
US11371833B2 (en) Calibration of depth sensing using a sparse array of pulsed beams
CN111487637B (en) Distance measurement system and method based on time delay
CN111487638B (en) Distance measurement system and method based on time delay
CN111123289B (en) Depth measuring device and measuring method
CN111580120A (en) Time-of-flight TOF apparatus and electronic device
EP3424278B1 (en) Staggered array of light-emitting elements for sweeping out an angular range
JP2022510817A (en) Methods and systems for spatially distributed strobing
CN111580118A (en) Emission module of time of flight TOF device
JP2018537680A (en) Light detection distance sensor
CN113805185A (en) Time-of-flight TOF apparatus and electronic device
CN113805187A (en) Time-of-flight TOF apparatus and electronic device
CN113805186A (en) Time-of-flight TOF apparatus and electronic device
KR20190031422A (en) 3D (3D) imaging systems and electronics
CN213210474U (en) Time-of-flight TOF apparatus and electronic device
CN110780312A (en) Adjustable distance measuring system and method
CN213041994U (en) Time-of-flight TOF apparatus and electronic device
CN214703999U (en) Time-of-flight TOF apparatus and electronic device
CN213041995U (en) Time-of-flight TOF apparatus and electronic device
CN114355384B (en) Time-of-flight TOF system and electronic device
CN213210470U (en) Emission module of time of flight TOF device
US10139217B1 (en) Array based patterned illumination projector
Bantounos et al. Towards a solid‐state light detection and ranging system using holographic illumination and time‐of‐flight image sensing
CN214503893U (en) Three-dimensional imaging device and electronic equipment
CN113820725A (en) System and method for performing time-of-flight measurement and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination