
CN106662640B - Intelligent lighting time-of-flight system and method - Google Patents

Intelligent lighting time-of-flight system and method

Info

Publication number
CN106662640B
Authority
CN
China
Prior art keywords
light source, interest, view, field, region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201580034052.3A
Other languages
Chinese (zh)
Other versions
CN106662640A (en)
Inventor
温宗晋
J.科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Publication of CN106662640A
Application granted
Publication of CN106662640B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815 Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/65 Control of camera operation in relation to power supply
    • H04N23/651 Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/705 Pixels for depth measurement, e.g. RGBZ

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
  • Stroboscope Apparatuses (AREA)

Abstract

An apparatus is described that includes a time-of-flight camera system having an illuminator. The illuminator has an optical component and a light source. The optical component and light source are designed to illuminate, at the upper limit of the illuminator's emitted optical power, an area that is smaller than the camera system's field of view and that substantially encompasses an object of interest within that field of view.

Description

Intelligent lighting time-of-flight system and method
Technical Field
The present invention relates generally to camera systems and, more particularly, to smart lighting time-of-flight systems and methods.
Background
Many existing computing systems include one or more conventional image capture cameras as integrated peripherals. A current trend is to enhance computing system imaging capability by integrating depth capture into its imaging components. Depth capture may be used, for example, to perform various intelligent object recognition functions, such as facial recognition (e.g., for security system unlock) or gesture recognition (e.g., for contactless user interface functions).
One method of depth information capture, referred to as "time-of-flight" imaging, emits light from the system onto an object and measures, for each of a plurality of pixels of an image sensor, the time between the emission of the light and the reception of its reflection at the sensor. The image produced by the time-of-flight pixels corresponds to a three-dimensional profile of the object, characterized by a unique depth measurement (z) at each different (x, y) pixel location.
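As a simple illustration of the relationship just described (not taken from the patent), the depth at a pixel follows directly from the measured round-trip delay and the speed of light. A minimal sketch, with an assumed function name and example delay:

```python
# Minimal sketch of the time-of-flight relationship described above.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip(delay_s: float) -> float:
    """Depth in meters for a measured emission-to-reception delay (seconds)."""
    # The light travels to the object and back, so halve the total path length.
    return SPEED_OF_LIGHT_M_PER_S * delay_s / 2.0

# Example: a round-trip delay of about 6.67 ns corresponds to roughly 1 m of depth.
print(depth_from_round_trip(6.67e-9))  # ~1.0
```

In a real sensor this computation is repeated at each (x, y) pixel to build the depth profile described above.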
Since many imaging-capable computing systems are mobile in nature (e.g., laptops, tablets, smartphones, etc.), the integration of light sources ("illuminators") into the system to achieve time-of-flight operation presents a number of design challenges, such as cost challenges, packaging challenges, and/or power consumption challenges.
Disclosure of Invention
An apparatus is described that includes a time-of-flight camera system having an illuminator. The illuminator has an optical component and a light source. The optical component and light source are designed to illuminate, at the upper limit of the illuminator's emitted optical power, an area that is smaller than the camera system's field of view and that substantially encompasses an object of interest within that field of view.
An apparatus is described that includes means for illuminating, at an upper optical power limit of an illuminator of a time-of-flight camera, a region of interest shaped to contain an object of interest in the field of view of the time-of-flight camera, the region of interest being smaller than the field of view. The apparatus further includes means for determining depth profile information within the region using a time-of-flight measurement technique.
Drawings
The following description and the annexed drawings set forth in detail certain illustrative embodiments of the invention. In the drawings:
FIGS. 1a(i) and 1a(ii) relate to a first possible intelligent lighting feature;
FIGS. 1b(i), 1b(ii) and 1b(iii) relate to a partitioned intelligent lighting method;
FIGS. 1c(i) and 1c(ii) also relate to a partitioned intelligent lighting method;
FIGS. 1d(i) and 1d(ii) relate to another possible intelligent lighting feature;
FIG. 1e shows an embodiment of a light source for a partitioned field of view;
FIGS. 2a through 2e relate to scanning in an intelligent lighting system;
FIG. 3 shows an embodiment of an intelligent lighting system;
FIGS. 4a through 4c show an embodiment of an intelligent lighting method;
FIG. 5 shows a first illuminator embodiment;
FIG. 6 shows a second illuminator embodiment;
FIG. 7 shows a third illuminator embodiment;
FIGS. 8a and 8b show a fourth illuminator embodiment;
FIG. 9 shows a 2D/3D camera system;
FIG. 10 illustrates a computing system.
Detailed Description
The intelligent lighting time-of-flight system described herein addresses some of the design challenges mentioned in the background section. As discussed below, intelligent lighting involves intelligent control of any or all of the size, shape, and movement of the light emitted by the time-of-flight system. Time-of-flight systems, and in particular time-of-flight systems integrated into battery-powered systems, generally exhibit a trade-off between the energy that can be supplied and the strength of the transmitted and received optical signals.
That is, as the intensity of the emitted light signal increases, the intensity of the received light signal increases. Better received optical signal strength yields better accuracy and performance of the time-of-flight system. However, supporting a higher emitted optical signal strength requires a more expensive battery solution and/or drains the battery faster, either of which can hurt user enjoyment and/or acceptance of systems that take time-of-flight measurements.
Intelligent lighting addresses this problem by concentrating the illumination power into a smaller illuminated area that is directed onto a region of interest in the camera's field of view. By focusing the optical power into a smaller illuminated area, the received optical signal strength and time-of-flight performance are enhanced without increasing the energy drawn from the battery. Thus, the aforementioned user-perceived disadvantages can be acceptably minimized.
Using a smaller illuminated area entails the ability to direct that smaller area onto a region of interest in the camera's field of view. A region of interest is, for example, an area that is smaller than the camera's field of view and that is prioritized over other areas in the field of view for obtaining depth information. Examples of regions of interest include areas containing an object whose depth information is desired, or areas where previously made time-of-flight measurements yielded poor received signal strength.
Thus, after a region of interest has been identified in the field of view, the lighting system receives information indicative of the region of interest and concentrates the light intensity onto it. Concentrating the light intensity onto the region of interest may involve emitting light at or near the power limit of the illuminator, but directing that light primarily onto the region of interest.
A first example, for an illuminator having a single light source, is to emit light from the light source at the power limit of the illuminator and focus a smaller "spot" of light on the region of interest. A second example, for an illuminator having multiple light sources, is to emit light from one of the light sources at the power limit of the illuminator (such that the other light sources must remain off) and to direct the beam from the emitting light source to the region of interest.
Other intelligent lighting strategies may illuminate a smaller region of interest with less than full illuminator power. For example, if the region of interest is small enough, sufficiently accurate information about it may be obtained with less than full illuminator power.
Various possible features of the intelligent lighting system are discussed in detail below. Generally, however, the intelligent lighting system may be designed to change either or both the size and shape of the illuminated area in order to illuminate the object of interest in the camera field of view. Furthermore, the intelligent illumination system may be designed to change the position of the illuminated area, for example, by scanning the emitted beam over the field of view.
FIGS. 1a through 1d, discussed below, relate to varying the size and shape of the illuminated area. FIGS. 2a through 2e, also discussed below, relate to changing the position of the illuminated area.
FIGS. 1a(i) and 1a(ii) illustrate that the size of the illuminated area can be adjusted in view of the object of interest to be illuminated. That is, in the first scenario 111 of FIG. 1a(i), a first, smaller object of interest 102 consumes less area in the camera's field of view 101. The illuminated region of interest 103 emitted by the illuminator is therefore shrunk in size to encompass the smaller object of interest 102. By contrast, in scenario 112 of FIG. 1a(ii), a second, larger object of interest 104 consumes a larger area in the field of view 101 of the camera. The illuminated region of interest 105 emitted by the illuminator is therefore expanded to encompass the larger object of interest 104.
The shrinking and expanding of the size of the illuminated areas 103, 105 may be achieved, for example, with an illuminator having movable optical components (e.g., a movable light source, a movable lens, a movable mirror, etc.). Controlled movement of optical components in the illuminator can be used to controllably set the size of the illuminated area. Examples of illuminators with movable optical components are discussed in more detail further below with respect to FIGS. 5 through 7.
Alternatively, as shown in FIGS. 1b(i) and 1b(ii), the field of view 101 may be partitioned into different portions, which may be individually illuminated (e.g., as shown in FIG. 1b(i), the field of view is partitioned into nine different portions 106_1 through 106_9). As shown in case 121 of FIG. 1b(i), the smaller object of interest 107 is illuminated by illuminating one of the partitions (partition 106_1). In contrast, as shown in case 122 of FIG. 1b(ii), a larger object of interest 109 is illuminated by illuminating four of the partitions (partitions 106_1, 106_2, 106_4, and 106_5). Thus, the illuminated region of interest 110 of FIG. 1b(ii) is significantly larger than the illuminated region of interest 108 of FIG. 1b(i).
With reference to FIG. 1b(ii), note that the entire field of view may be illuminated by illuminating all of the partitions simultaneously, or by illuminating each partition individually in succession (or by some mix of the two methods). The former method tends to produce a weaker received light signal. The latter method can be performed with higher light concentration on each individually illuminated partition, but at the expense of the time required to scan the field of view. Here, successively illuminating each partition corresponds to a form of scanning. Scanning is described in more detail further below with respect to FIGS. 2a through 2e.
Illumination of the partitioned areas in the field of view may be achieved with a partitioned light source. FIG. 1b(iii) depicts a top view (looking down onto the emitting surface of the illuminator) of an exemplary partitioned illuminator light source 117 having nine individual light sources 113_1 through 113_9. In various embodiments, each individual light source is implemented as a vertical-cavity surface-emitting laser (VCSEL) or a light-emitting diode (LED) and is responsible for illuminating a particular partition. All of the individual light sources 113_1 through 113_9 may be integrated, for example, on the same semiconductor chip. In one embodiment, each individual light source is implemented as an array of light source devices (VCSELs or LEDs) so that the entire illuminator power budget can be used to illuminate a single partition (in which case the individual light sources of the other partitions must be turned off).
If illuminator light source 117 were used in case 121 of FIG. 1b(i), individual light source 113_1 would be turned on to illuminate partition 106_1. By contrast, if illuminator light source 117 were used in case 122 of FIG. 1b(ii), individual light sources 113_1, 113_2, 113_4, and 113_5 would be turned on. More details about illuminators with partitioned fields of view and corresponding light source embodiments are described further below.
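To make the partition selection concrete, the sketch below (an illustration only; the grid layout, normalized coordinate convention, and function name are assumptions, not taken from the patent) picks the partitions whose footprints overlap a bounding box around the object of interest:

```python
# Hypothetical helper: choose which of nine partitions (numbered 1..9,
# row-major, like 106_1..106_9 of FIG. 1b) to illuminate so that a bounding
# box around the object of interest is covered. Coordinates are normalized
# to the field of view: (0, 0) is the top-left corner, (1, 1) the bottom-right.
def partitions_for_bbox(x0, y0, x1, y1, rows=3, cols=3):
    selected = []
    for r in range(rows):
        for c in range(cols):
            # Partition (r, c) spans [c/cols, (c+1)/cols) x [r/rows, (r+1)/rows).
            px0, px1 = c / cols, (c + 1) / cols
            py0, py1 = r / rows, (r + 1) / rows
            if (x0 < px1 and x1 > px0) and (y0 < py1 and y1 > py0):
                selected.append(r * cols + c + 1)  # 1-based, row-major
    return selected

# A small object near the upper-left corner needs only the first partition
# (compare case 121 of FIG. 1b(i)); an object spanning the upper-left quadrant
# needs the first, second, fourth and fifth (compare case 122 of FIG. 1b(ii)).
print(partitions_for_bbox(0.05, 0.05, 0.25, 0.25))  # [1]
print(partitions_for_bbox(0.05, 0.05, 0.55, 0.55))  # [1, 2, 4, 5]
```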
FIG. 1c(i) shows another partitioning approach in which the partitions are not all the same size. Because the partitions have different sizes, the size of the illuminated area can be changed by illuminating differently sized partitions in turn. For example, if only a smaller partition is illuminated and then only a larger partition is illuminated, the size of the illuminated area expands. An exemplary light source 119 for the partitioning approach of FIG. 1c(i) is shown in FIG. 1c(ii). Note that the individual light sources that illuminate the larger partitions have a larger potential optical power output (illustrated here by having more VCSELs or LEDs) than the individual light sources that illuminate the smaller partitions.
Note that, for the same emitted optical power, expanding and/or contracting the size of the illuminated area (whether by a partitioned or a non-partitioned approach) involves a trade-off between the size of the illuminated area and the strength of the received signal. That is, for the same emitted optical power, a smaller illuminated area corresponds to a stronger received signal, while a larger illuminated area corresponds to a weaker received signal.
If a larger illuminated area is desired without any loss in received signal strength, another trade-off arises between the illuminated area size and the amount of power consumed by the illuminator. That is, to increase the size of the illuminated area while maintaining the received light signal strength, the illuminator (absent any scanning, as described below) will generally need to emit more intense light, which causes the illuminator to consume more energy.
Some intelligent lighting systems may be designed to maintain a minimum received light signal strength at the image sensor. If the illuminated region of interest shrinks, the illuminator power can be reduced, since a sufficiently strong light intensity per unit of illuminated surface area can still be maintained. Conversely, the illuminator power may increase as the size of the region of interest expands.
Furthermore, even when the region of interest is small, the emitted light intensity may decrease only slightly, remain constant, or even increase if the object to be illuminated is farther away, since the received light signal typically weakens with the distance from the reflecting object to the camera. Thus, when determining the appropriate illuminator optical power, the intelligent lighting system may consider the distance to the object of interest in addition to its size. A more complete discussion of the factors that the intelligent lighting system may consider when setting the lighting characteristics is provided below with respect to FIG. 3.
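The following sketch folds both the illuminated area and the object distance into a choice of emitted power. The signal model (received signal proportional to emitted power, inversely proportional to illuminated area, falling off with distance) and all numbers are simplifying assumptions for illustration, not the patent's formula:

```python
# Illustrative only: smallest emitted power predicted to keep the per-pixel
# received signal at or above a floor, under a simplified falloff model.
def required_power_mw(area_m2, distance_m, reflectivity,
                      min_signal=1.0, falloff_exp=2.0, calib=0.05):
    # Assumed model: signal = calib * power * reflectivity / (area * distance**falloff_exp)
    return min_signal * area_m2 * distance_m ** falloff_exp / (calib * reflectivity)

# Power grows with the illuminated area and the target distance, and shrinks
# as the target reflects IR light better.
print(f"{required_power_mw(area_m2=0.01, distance_m=1.5, reflectivity=0.6):.2f} mW")
```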
In various embodiments, the shape of the illuminated area may also vary. FIG. 1d(i) shows a first situation 131 in which a pointed beam directed at the middle of the field of view is substantially circular, whereas, as shown in situation 132 of FIG. 1d(ii), the same beam becomes more elliptical in shape when it is pointed at a corner of the field of view. Movement of the pointed beam, as discussed in more detail below, may be achieved with an illuminator having movable optical components.
FIG. 1e shows an embodiment of a light source for a partitioned field-of-view approach in which the partitions themselves have different shapes. Illuminating only a first partition having a first shape and then only a second partition having a second, different shape correspondingly produces an illuminated area of changing shape in the field of view.
FIGS. 1a through 1e described above relate to an intelligent lighting system that can vary the size and/or shape of the illuminated area as the system attempts to properly illuminate different objects of interest appearing in the field of view of the time-of-flight camera system.
By contrast, FIGS. 2a through 2e relate to scanning the emitted light within the field of view of the camera. Scanning involves intelligently changing, over time, the area that receives illumination in order to capture an entire region of interest that is larger than the illuminated area itself. Here, recalling FIGS. 1a through 1e above, as the region of interest expands, the emitted light intensity may have to increase in order to maintain sufficiently strong illumination and a corresponding received signal intensity. Some regions of interest may be large enough that the appropriate emitted light intensity exceeds the desired or allowed illuminator power budget.
Scanning helps maintain or enhance received signal strength over a larger region of interest without a corresponding increase in emitted optical power. That is, for example, by scanning a smaller "spot" of illumination over a larger region of interest, i.e., using only enough optical power to illuminate the smaller spot, depth information can be collected for the larger region of interest.
FIG. 2a shows an example of scanning as just described. As shown in FIG. 2a, a larger object of interest 205 is illuminated by scanning a smaller illuminated area, initially from location 206 at time T1 to location 207 at time T2, then to location 208 at time T3, and finally to location 209 at time T4. The scanning of FIG. 2a may be realized, for example, with an illuminator having movable optical components that can direct or sweep a light beam in a scanning motion over the field of view.
Alternatively or in combination, as shown in FIG. 2b, scanning may be achieved with a partitioned illumination system by illuminating different partitions in an on/off sequence. That is, as shown in FIG. 2b, the first partition 211 is illuminated at time T1. Immediately thereafter, at time T2, the first partition 211 is turned off and the second partition 212 is turned on. Similar sequences occur for the third and fourth partitions 213, 214 at times T3 and T4. Thus, a region of interest spanning all four partitions is illuminated over times T1 through T4.
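A sequential partition scan such as that of FIG. 2b can be expressed as a simple on/off schedule. The sketch below is illustrative; the dwell time and the reuse of the drawing's reference numerals as partition identifiers are assumptions:

```python
# Sketch of a sequential partition scan: each partition is switched on for one
# dwell period while the others stay off, so the full illuminator power budget
# is concentrated on a single partition at any instant.
def zone_scan_schedule(partitions, dwell_s=0.001, t_start=0.0):
    """Yield (t_on, t_off, partition) tuples for a sequential scan."""
    t = t_start
    for p in partitions:
        yield (t, t + dwell_s, p)
        t += dwell_s

for t_on, t_off, p in zone_scan_schedule([211, 212, 213, 214]):
    print(f"partition {p}: on at {t_on * 1e3:.1f} ms, off at {t_off * 1e3:.1f} ms")
```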
FIG. 2c shows that scanning may be disjointed. That is, the embodiments of FIGS. 2a and 2b assume that the next region to be illuminated in the scan is adjacent to the region that was just illuminated. By contrast, FIG. 2c illustrates that scanning may include illuminating two regions that are not adjacent. Here, a first region 221 is illuminated at time T1 and a second region 222 is illuminated at time T2, the two regions being mutually non-adjacent in the field of view. Disjointed scanning may be performed, for example, when a "region of interest" includes two or more separate, non-adjacent areas or events in the field of view that require illumination. Disjointed scanning may be performed with both partitioned and non-partitioned illumination strategies.
Note that the example of FIG. 2c also shows that the size of the illuminated area may vary over the scan sequence (illuminated area 222 is larger than illuminated area 221). Varying the illuminated area size during a scan is not limited to disjointed scans and may also be a feature of continuous scans such as those of FIGS. 2a and 2b discussed above. In the case of partitioned scanning, the size of the illuminated area can be varied, for example, by first illuminating a single partition and then illuminating multiple partitions.
FIG. 2d further shows that a partitioned intelligent lighting system may be designed to perform scanning within a partition. That is, the illuminator may have both a partitioned light source and movable optical components, such that a beam smaller than a partition is scanned over the surface area of the partition to effectively illuminate it. As shown in FIG. 2d, an illumination "spot" smaller than the partition is scanned within the upper-left partition to effectively illuminate that partition. The entire field of view may be scanned by scanning each partition sequentially or simultaneously (as discussed further below with respect to FIG. 2e), or by some mix of the two methods.
As discussed above, various illuminator embodiments are capable of varying the size of the illuminated area (by varying the cross-section of the emitted beam), while other illuminator embodiments take a partitioned approach in which the field of view is partitioned and the illuminator can illuminate each partition individually. The approach of FIG. 2d may be integrated into an illuminator having both of these characteristics. That is, an illuminator whose design supports varying the size of the illuminated area could conceivably form a beam large enough to illuminate an entire partition, and also form a beam smaller than the partition so that it can scan within the partition.
FIG. 2e shows another partitioned scanning method in which the respective partitions are scanned simultaneously, each with its own respective beam. In an embodiment, the illuminator is designed not only to direct separate beams to each of the partitions simultaneously, but also to scan those beams. Embodiments of illuminator designs capable of simultaneously scanning multiple partitions in a field of view are described in more detail further below.
Note that while the embodiment of FIG. 2d is directed to a partitioned approach, other embodiments may scan over areas even though the illuminator design does not otherwise embrace partitioning (e.g., a particular beam may be directed anywhere in the field of view). Simultaneous scanning of multiple beams, however, implies that each beam has its own respective area within which it scans. Such an area may be regarded as a partition for purposes of a simultaneous multi-beam scanning sequence.
Any of the scanning methods of FIGS. 2a through 2e discussed above may introduce a trade-off between the time taken to collect time-of-flight information for a region of interest and the size of the region of interest. That is, for a constant illuminated area size (e.g., "spot size"), more scan time is consumed as the size of the region of interest grows. Conversely, if the region of interest grows, the scan time can be reduced by increasing the size of the illuminated area, but at the expense of either increased emitted optical power (if the light intensity per unit area is to be maintained) or reduced received signal intensity (if the light intensity per unit area is allowed to decrease).
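A back-of-the-envelope way to see the scan-time side of this trade-off (the dwell-time model and the numbers are illustrative assumptions):

```python
import math

# Total scan time grows with the ratio of the region-of-interest area to the
# illuminated spot area, for a fixed per-spot dwell time.
def scan_time_s(roi_area, spot_area, dwell_per_spot_s):
    return math.ceil(roi_area / spot_area) * dwell_per_spot_s

print(scan_time_s(roi_area=4.0, spot_area=1.0, dwell_per_spot_s=0.002))  # 0.008
print(scan_time_s(roi_area=4.0, spot_area=2.0, dwell_per_spot_s=0.002))  # 0.004 (bigger spot, shorter scan)
```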
The discussion of FIGS. 1a through 1e and 2a through 2e highlights some of the basic trade-offs that exist in intelligent lighting systems, such as: 1) a trade-off between illuminated area size and received signal strength; 2) a trade-off between received signal strength and illuminator power consumption; 3) a trade-off between illuminated area size and scan time; and 4) a trade-off between illuminator power and the distance between the object of interest and the camera. An additional trade-off exists between the reflectivity of the object of interest and the emitted optical power. Here, a typical time-of-flight illuminator emits infrared (IR) light. The illuminator may emit less optical power if the object of interest to be illuminated reflects IR light well. By contrast, the illuminator may increase its emitted optical power if the object of interest does not reflect IR light particularly well.
Which trade-offs control, in which direction, and how heavily any particular trade-off is weighted should be a function of the particular circumstances surrounding any given lighting situation.
For example, consider an object of interest that is of moderate size and far from the camera. Here, if the available power budget is large and the reading must be completed in a short time, the intelligent lighting control system may choose to fully illuminate the target area with high illuminator power and without any scanning. By contrast, in another case where the object of interest is large and close to the camera, but the available power budget is small and there is no need to complete the reading immediately, the same intelligent illumination system may choose to form a smaller illuminated area and scan it over the region of interest.
From these examples, it should be clear that the intelligent lighting system can take the surrounding circumstances into account before illuminating a particular region of interest with a particular illuminated area size and illuminator power, and before deciding whether any scanning occurs.
FIG. 3 illustrates the integration of intelligent lighting technology 301 into a working computing system, such as a handheld tablet or smartphone. Here, the intelligent lighting technology may be implemented, for example, partially or wholly in device driver software and/or firmware for an integrated camera device that includes time-of-flight measurement capability. The software/firmware may be stored, for example, in non-volatile memory of the computing system (e.g., in FLASH-based firmware storage) or in system memory.
As shown in FIG. 3, the intelligent lighting software/firmware may be implemented as a method designed to seek an appropriate balance among the aforementioned trade-offs, given a set of input signals corresponding to the surrounding conditions of a depth profile image capture sequence.
As shown in FIG. 3, the intelligent lighting method 301 may receive one or more of the following input parameters from the host system 302: 1) the object of interest (which may be specified in terms of what the object is (e.g., a hand, a face, etc.) and/or characteristics of the object's location and/or shape in the field of view); 2) how time-critical the time-of-flight measurement is (i.e., within how much time it needs to be performed); and 3) the power budget (e.g., the maximum allowed power) of the time-of-flight system and/or its illuminator. The components of the host system 302 that generate these input parameters may include intelligent object recognition software applications and/or hardware logic circuitry (e.g., for facial recognition, hand recognition, etc.). The power budget input information may be generated by power management software, firmware, and/or hardware of the host system 302.
The intelligent lighting method 301 may also receive input information from the camera system 303b itself, such as: 1) the distance between the object of interest and the camera; 2) the reflectivity of the object of interest; 3) the position and/or shape of the object of interest; and 4) the intensity of the background light. Any input parameters provided by the camera may be provided after an initial illumination of the object (or of the field of view generally). That is, for example, as an initial response to input from the host system 302, the time-of-flight system may initially illuminate the object and/or the field of view as a first pass. The data collected from the first pass is then presented to the intelligent lighting method 301 so that it can better optimize capture of the object in terms of which area is illuminated and how strong the emitted light should be.
To generate image capture control commands for camera 303b, the intelligent lighting method 301 effectively determines, in view of the applicable input parameters, which trade-offs control, in which direction, and how heavily each is weighted, and then specifies what area is illuminated, the intensity of the emitted light, whether any scanning is applied and, if so, the applicable scanning parameters (e.g., scan time, scan speed, scan pattern, etc.).
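To make the role of method 301 concrete, the sketch below shows one possible shape that such driver-level decision logic could take. The data structures, the signal model, the thresholds, and the decision rules are all illustrative assumptions and are not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CaptureInputs:
    roi_area: float          # region of interest as a fraction of the field of view
    distance_m: float        # distance between the object of interest and the camera
    reflectivity: float      # IR reflectivity of the object (0..1)
    time_critical: bool      # does the host need the reading immediately?
    power_budget_mw: float   # maximum allowed illuminator power

@dataclass
class IlluminationCommand:
    power_mw: float
    scan: bool
    spot_fraction: float     # fraction of the region of interest lit at any instant

def plan_illumination(inp: CaptureInputs) -> IlluminationCommand:
    # Power needed to flood the whole region at once (simplified, made-up model).
    needed_mw = 200.0 * inp.roi_area * inp.distance_m ** 2 / max(inp.reflectivity, 0.05)
    if needed_mw <= inp.power_budget_mw or inp.time_critical:
        # Flood the region in one shot, capped at the budget.
        return IlluminationCommand(power_mw=min(needed_mw, inp.power_budget_mw),
                                   scan=False, spot_fraction=1.0)
    # Otherwise shrink the spot and scan it over the region so the per-area
    # intensity stays high while the emitted power stays inside the budget.
    return IlluminationCommand(power_mw=inp.power_budget_mw, scan=True,
                               spot_fraction=inp.power_budget_mw / needed_mw)

print(plan_illumination(CaptureInputs(roi_area=0.5, distance_m=2.0, reflectivity=0.4,
                                      time_critical=False, power_budget_mw=300.0)))
```

The two branches mirror the examples above: a time-critical reading floods the region at the highest allowed power, while a relaxed, power-constrained reading falls back to scanning a smaller spot.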
FIGS. 4a through 4c illustrate another method that the intelligent lighting method 301 of FIG. 3 may be designed to perform. As shown in FIG. 4a, a large area 410 that, for example, substantially covers the field of view 401 of the camera is first illuminated by the time-of-flight illuminator. In some embodiments, as shown in FIG. 4a, the large area 410 may correspond to the entire field of view 401. In other embodiments, the large area 410 may correspond to a majority of, but less than, the entire field of view 401 (e.g., about 33%, 50%, 60%, or 75% of the field of view 401). Here, note that illuminating a large area 410 in the field of view 401 may correspond to a weaker received light intensity, since the emitted illumination is "spread out" over a wider surface area.
The image sensor that receives the reflected light includes circuitry (e.g., sense amplifier circuitry) that measures the received signal intensity at each pixel relative to some threshold. The pixels that receive light at a weak intensity are identified (e.g., pixels whose received light intensity falls below the threshold). In many cases, as shown in FIG. 4b, a group of neighboring pixels is expected to fall below the threshold, which in turn corresponds to identification of a region 411 in the field of view 401 that received a weak light signal.
Here, the intelligent lighting method 301 of FIG. 3 may receive the signal strength information for all of the image sensor's pixels and apply the threshold itself to determine the size and location of region 411, or, alternatively, may receive only the characteristics of the pixels that fell below the threshold and determine region 411 from them. Upon identifying the weak-signal region 411, the intelligent lighting method proceeds as shown in FIG. 4c to issue instructions to the time-of-flight illuminator to re-illuminate this same region 411.
The re-illumination is performed with more concentrated light to "boost" the light intensity directed at region 411. The concentration is achieved by forming a smaller area of illumination (as compared with illuminating the entire field of view), for example with the same emitted illuminator intensity that was used when flooding the field of view. With the region re-illuminated with stronger light, the time-of-flight measurement should now succeed, since the pixels that previously received weak light signals will now receive sufficiently strong light signals.
In the case of an illuminator with movable optical components, the portions of the field of view that need to be re-illuminated can be re-illuminated by moving one or more optical components to direct a beam of light onto each such region. In the case of an illuminator with a partitioned field of view, the portions of the field of view that need to be re-illuminated are re-illuminated by illuminating their corresponding partitions. In one embodiment, the total amount of optical power used to initially illuminate the entire field of view may be the same as the total amount of power used to illuminate only the re-illuminated partitions.
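The weak-signal detection of FIGS. 4a through 4c can be sketched as a thresholding pass over per-pixel signal strengths. The array contents, threshold, and bounding-box representation below are illustrative assumptions:

```python
# Find pixels whose received signal fell below a threshold after flooding the
# field of view, and return a bounding box for the weak-signal region to be
# re-illuminated (compare region 411 of FIG. 4b).
def weak_signal_bbox(pixel_strengths, threshold):
    weak = [(r, c)
            for r, row in enumerate(pixel_strengths)
            for c, v in enumerate(row) if v < threshold]
    if not weak:
        return None  # every pixel already received a strong enough signal
    rows = [r for r, _ in weak]
    cols = [c for _, c in weak]
    return (min(rows), min(cols), max(rows), max(cols))

frame = [
    [9, 9, 9, 9],
    [9, 3, 2, 9],
    [9, 4, 3, 9],
    [9, 9, 9, 9],
]
print(weak_signal_bbox(frame, threshold=5))  # (1, 1, 2, 2) -> re-illuminate this area
```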
FIGS. 5 through 8a,b provide different illuminator embodiments capable of performing the intelligent lighting techniques described above.
FIG. 5 shows an embodiment of an illuminator with a movable lens assembly 501, 502 that can adjust the illuminated area size (by moving the lens vertically over the light source) and scan the illuminated area, or at least direct the illumination to an arbitrary location within the camera's field of view (by tilting the lens plane over the light source).
As shown in FIG. 5, light source 503 is located below lens 501 and, when driven, emits light that propagates through the lens and into the field of view of the camera. The light source 503 may be implemented, for example, as a semiconductor chip having an array of infrared (IR) VCSELs or LEDs. The use of an array helps "boost" the maximum optical output power, which is approximately N x L, where N is the number of VCSELs/LEDs in the array and L is the maximum output power per VCSEL/LED. In one embodiment, all VCSELs/LEDs in the array receive the same drive current, so that each VCSEL/LED emits approximately the same optical power as the others in the array. The optical output power is controlled by controlling the magnitude of the drive current.
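The N x L relationship can be illustrated as follows; the linear light-versus-current model (which ignores the lasing threshold of a real VCSEL) and the numbers are assumptions for illustration only:

```python
# Total optical output of an N-element VCSEL/LED array driven with a common
# current: roughly N times the per-device output at that drive current.
def array_optical_power_mw(n_emitters, mw_per_ma, drive_ma):
    per_emitter_mw = mw_per_ma * drive_ma  # simplified linear L-I model
    return n_emitters * per_emitter_mw

# e.g. 16 emitters, each producing 0.5 mW per mA of drive current, driven at 5 mA:
print(array_optical_power_mw(16, 0.5, 5.0))  # 40.0 mW total
```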
A pair of voice coil motors 531, 532, each having a return spring 533, 534, serve as actuators that define the vertical position of each of two points along the outer edge of the lens 501. The tilt angle of the lens 501 about the y-axis is substantially defined by the force exerted by the first motor 531 against its return spring 533. The tilt angle of the lens 501 about the x-axis is substantially defined by the force exerted by the second motor 532 against its return spring 534. From these basics, an arbitrary tilt angle of the lens can be established as a function of the corresponding forces exerted by the motors and the reaction forces exerted by the springs. A hinge pin or ball joint may be located on the side of the lens support 502 opposite the return springs, for example, to allow the lens support 502 to pivot about the x and y axes.
Furthermore, the vertical position of the lens 501 may be set by actuating the two motors 531, 532 equally. That is, if both motors 531, 532 are extended outwardly by an equal amount, the lens is raised in the +z direction. Likewise, if both motors 531, 532 are retracted inwardly by an equal amount, the lens is lowered in the -z direction. Instead of the hinge pin or ball joint described above, one or more additional voice coil actuators may be positioned along the outer perimeter of the lens support 502 to further stabilize both the tilt angle and the vertical position of the lens (e.g., three actuators spaced 120° apart, four actuators spaced 90° apart, etc.).
FIG. 6 shows an illuminator with a movable light source 603. The light source itself may be implemented like the light source 503 discussed above with respect to FIG. 5. In the approach of FIG. 6, the lens assembly position remains substantially fixed, but the platform or substrate 610 on which the light source is mounted is movable according to the same principles discussed above for the lens support 502 of FIG. 5. That is, voice coil motor actuator and return spring pairs 631/633, 632/634 may be used to set the tilt angle of the platform 610 about either or both of the x and y axes. Changing the tilt angle of the platform changes the angle of incidence of the emitted light into the lens, which in turn changes the pointing direction of the beam emitted from the lens into the field of view of the camera.
A third voice coil actuator and return spring pair (not shown) may be coupled to an edge of the platform 610 other than the two edges where the pairs 631/633, 632/634 are positioned, to enable movement of the platform 610 along the z-axis, which in turn affects the size of the illuminated area (spot size) in the camera field of view.
FIG. 7 shows another illuminator embodiment in which a light source 712 is affixed to the underside of an arm 713 that is oriented at an angle so that the light source directs light onto a mirror 714 mounted on a movable stage 710. The lens and lens support are fixed in position over the mirror such that light reflected from the mirror surface propagates through the lens into the field of view of the camera. The light source may be implemented as discussed above with respect to FIGS. 5 and 6.
A set of voice coil motor actuator and return spring pairs 731/733, 732/734 may be used to set the tilt angle of the stage 710 about either or both of the x and y axes. Changing the tilt angle of stage 710 changes the angle of incidence of the emitted light into the lens, which in turn changes the pointing direction of the beam emitted from the lens into the field of view of the camera.
A third voice coil actuator and return spring pair (not shown) may be coupled to an edge of the stage 710 other than the two edges where the pairs 731/733, 732/734 are positioned, to enable movement of the stage 710 along the z-axis, which in turn affects the size of the illuminated area (spot size) in the camera field of view.
Either of the illuminator designs of FIGS. 6 and 7 may be augmented to include a movable lens arrangement as discussed with respect to FIG. 5. Adding a movable lens capability to the designs of FIGS. 6 and 7 may, for example, provide faster scan times and/or larger emission angles from the illuminator. Each of the movable stages 610, 710 of FIGS. 6 and 7 may be implemented as a micro-electromechanical (MEMS) device capable of placing the light source (FIG. 6) or the mirror (FIG. 7) anywhere in the xy plane.
FIGS. 8a and 8b show an embodiment of an illuminator designed to individually illuminate different partitions in the field of view. As shown in FIGS. 8a and 8b, the illuminator 801 includes a semiconductor chip 804 with an array of light sources 806_1 through 806_9, one for each partition in the field of view. Although the particular embodiment of FIGS. 8a and 8b shows nine field-of-view partitions arranged in an orthogonal grid, other numbers and/or arrangements of partitions may be used. Similarly, although each light source array is depicted as an NxN square array of the same size, other array patterns and/or shapes, including arrays of different sizes and/or shapes on the same semiconductor die, may be used. Each of the light source arrays 806_1 through 806_9 may be implemented, for example, as an array of VCSELs or LEDs.
As shown in FIGS. 8a and 8b, in an embodiment the illuminator 801 further includes an optical element 807 having a microlens array 808 on its bottom surface facing the semiconductor chip 804, and an emitting surface with a different lens structure 805 for each partition to direct light received from that partition's particular light source array to its corresponding field-of-view partition. Each lens of the microlens array 808 essentially behaves as a smaller objective lens that collects the diverging light from the underlying light source and shapes it into light of smaller divergence inside the optical element as it approaches the emitting surface. In one embodiment, a microlens is assigned to and aligned with each light source of the underlying light source arrays, although other embodiments may have more or fewer microlenses per light source for any particular array.
The microlens array 808 enhances optical efficiency by capturing most of the light emitted from the underlying light source arrays and forming a more concentrated beam. Here, the individual light sources of the various arrays typically emit light with a wide divergence angle. The microlens array 808 is able to collect most or all of the divergent light from the light sources of an array and helps form a transmitted beam with a smaller divergence angle.
Collecting most or all of the light from a light source array and forming a beam with a lower divergence angle essentially forms a beam of higher optical power density (that is, higher light intensity per unit surface area), resulting in a stronger received signal at the sensor for the region of interest illuminated by the beam. By one calculation, if the divergence angle from the light source array is 60°, reducing the divergence angle of the emitted beam to 30° increases the signal strength at the sensor by a factor of 4.6, and reducing it to 20° increases the signal strength at the sensor by a factor of 10.7.
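One way to reproduce the quoted factors is to assume that the received signal scales inversely with the illuminated spot area and that, at a fixed target distance, the spot area scales with the square of the tangent of the beam's half divergence angle. That geometric model is an assumption used here only to show where numbers of this magnitude come from:

```python
import math

def concentration_gain(full_angle_from_deg, full_angle_to_deg):
    # Ratio of spot areas ~ ratio of tan^2(half angle) at a fixed distance.
    wide = math.tan(math.radians(full_angle_from_deg / 2.0)) ** 2
    narrow = math.tan(math.radians(full_angle_to_deg / 2.0)) ** 2
    return wide / narrow

print(round(concentration_gain(60, 30), 1))  # 4.6
print(round(concentration_gain(60, 20), 1))  # 10.7
```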
In addition, optical element 807 can be designed to provide further diffusion of the collected light, for example by constructing element 807 from a material that is translucent in the IR spectrum and/or by otherwise designing the optical path within element 807 to impose diffuse internal reflection (e.g., constructing element 807 as a multi-layer structure). As briefly mentioned above, the emitting surface of the optical element 807 may include distinct lens structures 805, each shaped to direct light to its correct field-of-view partition. In the embodiment of FIGS. 8a and 8b, each lens structure 805 has a dome shape. Other embodiments may have, for example, a sharper trapezoidal shape or no structure at all.
Consistent with the discussion provided above with respect to FIGS. 5, 6, and 7, the optical element 807 may also be made movable, for example by mechanically coupling it to two or three voice coil motor actuator and return spring pairs. By making the optical element 807 movable, scanning a single beam within a partition as discussed above with respect to FIG. 2d may be achieved by moving the optical element 807 in a scanning motion while illuminating only the light source associated with the partition being scanned. Furthermore, simultaneous scanning of multiple partitions as shown in FIG. 2e may be achieved by illuminating a respective light source for each partition and moving the optical element 807 in a scanning motion.
Fig. 9 shows an integrated conventional camera and time-of-flight image system 900. System 900 has a connector 901 for electrically connecting with, for example, a larger system/motherboard, such as that of a desktop computer, tablet computer, or smartphone. Depending on the layout and implementation, connector 901 may be connected to, for example, a flexible cable that is physically connected to the system/motherboard, or connector 901 may be directly connected to the system/motherboard.
The connector 901 is affixed to a planar board 902, which may be implemented as a multi-layer structure of alternating conductive and insulating layers, where the conductive layers are patterned to form electrical traces that support the internal electrical connections of the system 900. Commands, such as configuration commands that write configuration information to or read configuration information from configuration registers of the camera system 900, are received from the larger host system through the connector 901. The commands may also include any command associated with the intelligent lighting system, such as any of the outputs provided by the intelligent lighting method 301 discussed above with respect to FIG. 3.
An RGBZ image sensor 903 is mounted on the planar board 902 beneath a receiving lens 904. The RGBZ image sensor 903 includes a pixel array having RGBZ unit pixel cells. The RGB pixel cells are used to support conventional "2D" visible-light capture (traditional picture taking). The Z pixel cells are sensitive to IR light and are used to support 3D depth profile imaging using time-of-flight techniques. Although this basic embodiment uses RGB pixels for visible image capture, other embodiments may use different colored pixel schemes (e.g., cyan, magenta, and yellow).
The image sensor 903 may also include an ADC circuit for digitizing signals from the image sensor and a timing and control circuit for generating timing and control signals for the pixel array and the ADC circuit.
The planar board 902 may include signal traces to convey digital information provided by the ADC circuitry to the connector 901 for processing by higher-end components of the host computing system, such as an image signal processing pipeline (e.g., integrated on an application processor).
The camera lens module 904 is integrated above the RGBZ image sensor 903. The camera lens module 904 contains a system of one or more lenses that focus received light onto the image sensor 903. Because the camera lens module 904's reception of visible light may interfere with the reception of IR light by the image sensor's time-of-flight pixel cells and, conversely, because the camera module's reception of IR light may interfere with the reception of visible light by the image sensor's RGB pixel cells, either or both of the image sensor's pixel array and the lens module 904 may contain a filter system arranged to substantially block IR light destined for the RGB pixel cells and to substantially block visible light destined for the time-of-flight pixel cells.
As explained in the discussion above, an illuminator 905 capable of illuminating a particular region within the field of view consistent with the intelligent lighting techniques is also mounted on the planar board 902. The illuminator 905 may be implemented, for example, as any of the illuminators discussed above with respect to FIGS. 5 through 8a,b. A light source driver is coupled to the illuminator's light source 907 to cause it to emit light with a particular intensity and modulation waveform.
In one embodiment, the integrated system 900 of FIG. 9 supports three modes of operation: 1) 2D mode; 2) 3D mode; and 3) 2D/3D mode. In 2D mode, the system behaves as a conventional camera: the illuminator 905 is disabled and the image sensor is used to receive visible images through its RGB pixel cells. In 3D mode, the system captures time-of-flight depth information of an object within the field of view of the illuminator 905: the illuminator 905 is enabled and emits IR light (e.g., in an on-off-on-off... sequence) onto the object; the IR light reflects from the object, is received through the camera lens module 904, and is sensed by the Z pixels of the image sensor. In 2D/3D mode, both the 2D and 3D modes are active simultaneously.
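The three operating modes can be captured in a small driver-side state definition; the names and helper below are illustrative assumptions rather than an API defined by the patent:

```python
from enum import Enum

class CameraMode(Enum):
    MODE_2D = 1      # conventional visible capture, illuminator disabled
    MODE_3D = 2      # time-of-flight depth capture, illuminator pulsed
    MODE_2D_3D = 3   # both capture paths active simultaneously

def illuminator_enabled(mode: CameraMode) -> bool:
    # The illuminator is only driven when depth capture is active.
    return mode in (CameraMode.MODE_3D, CameraMode.MODE_2D_3D)

print(illuminator_enabled(CameraMode.MODE_2D))     # False
print(illuminator_enabled(CameraMode.MODE_2D_3D))  # True
```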
FIG. 10 shows a depiction of an exemplary computing system 1000, such as a personal computing system (e.g., a desktop or notebook) or a mobile or handheld computing system such as a tablet device or smartphone. As shown in FIG. 10, the basic computing system may include a central processing unit 1001 (which may include, for example, a plurality of general-purpose processing cores) and a main memory controller 1017 disposed on an application processor or multi-core processor 1050, system memory 1002, a display 1003 (e.g., touchscreen, flat panel), a local wired point-to-point connection (e.g., USB) interface 1004, various network I/O functions 1005 (such as an Ethernet interface and/or cellular modem subsystem), a wireless local area network (e.g., WiFi) interface 1006, a wireless point-to-point connection (e.g., Bluetooth) interface 1007, a global positioning system interface 1008, various sensors 1009_1 through 1009_N, one or more cameras 1010, a battery 1011, a power management control unit 1012, a speaker and microphone 1013, and an audio encoder/decoder 1014.
The application processor or multi-core processor 1050 may include, within its CPU 1001, one or more general-purpose processing cores 1015, one or more graphics processing units 1016, a main memory controller 1017, an I/O control function 1018, and one or more image signal processing pipelines 1019. The general-purpose processing cores 1015 typically execute the operating system and application software of the computing system. The graphics processing units 1016 typically execute graphics-intensive functions to, for example, generate graphics information that is presented on the display 1003. The memory control function 1017 interfaces with the system memory 1002. The image signal processing pipelines 1019 receive image information from the camera and process the raw image information for downstream use. The power management control unit 1012 generally controls the power consumption of the system 1000.
Each of the touch-screen display 1003, the communication interfaces 1004-1007, the GPS interface 1008, the sensors 1009, the camera 1010, and the speaker/microphone codec 1013, 1014 can be viewed as a form of I/O (input and/or output) relative to the overall computing system, which, where appropriate, also includes integrated peripherals (e.g., the one or more cameras 1010). Depending on the implementation, various of these I/O components may be integrated on the application processor/multi-core processor 1050 or may be located off the die or outside the package of the application processor/multi-core processor 1050.
In an embodiment, consistent with the intelligent lighting techniques explained in the discussion above, the one or more cameras 1010 include an illuminator capable of illuminating a particular region within the camera's field of view. Application software, operating system software, device driver software, and/or firmware executing on a general-purpose CPU core (or other functional block having an instruction execution pipeline for executing program code) of the application processor or other processor may direct intelligent lighting commands, or other commands, to the camera system and receive image data from it. Other commands that may be received by the camera 1010 include commands for entering into or exiting from any of the 2D, 3D, or 2D/3D system states discussed above.
The intelligent lighting technology itself may be implemented partially or wholly as any one or more of the following: 1) software running on a general purpose processing core; 2) system firmware (e.g., BIOS firmware); 3) dedicated logic circuitry (e.g., provided in one or more of the following: the camera 1010, integrated within the ISP 1090, integrated with the I/O or peripheral controller 1080). The intelligent lighting techniques as discussed above may receive input information from the power management control unit, which itself may be partially or fully implemented with software running on one or more general purpose processing cores, with system firmware, with dedicated logic circuitry, and so on.
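One way the power management input could influence the illumination decision is sketched below. This is a hypothetical policy with placeholder power figures, not a disclosed algorithm; choose_illumination_region and its parameters are assumptions made for illustration.

    def choose_illumination_region(power_budget_mw, full_fov, roi,
                                   full_fov_cost_mw=500, roi_cost_mw=150):
        """Pick an illumination region that fits the reported power budget.
        Falls back from flooding the whole field of view to the smaller
        region of interest when the budget is tight. Cost figures are
        illustrative placeholders."""
        if power_budget_mw >= full_fov_cost_mw:
            return full_fov
        if power_budget_mw >= roi_cost_mw:
            return roi
        return None  # insufficient power: skip illumination for this frame

    # Example: fall back to the upper-left quadrant of a 640x480 field of view.
    region = choose_illumination_region(power_budget_mw=200,
                                        full_fov=(0, 0, 640, 480),
                                        roi=(0, 0, 320, 240))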
Embodiments of the invention may include various processes as described above. The processes may be embodied in machine-executable instructions. The instructions may be used to cause a general-purpose or special-purpose processor to perform a certain process. Alternatively, the processes may be performed by specific hardware components that contain hardwired logic for performing the processes, or by any combination of programmed computing components and custom hardware components.
Elements of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs, magneto-optical disks, FLASH memory, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media, or other types of media/machine-readable media suitable for storing electronic instructions. For example, the present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (17)

1. An apparatus for illumination, comprising:
a time-of-flight camera system having an illuminator with an optical component, a light source, and an image sensor,
wherein the apparatus is configured to:
illuminating, using an illuminator of a time-of-flight camera, a portion of a field of view of the time-of-flight camera with optical power;
receiving characteristics of pixels of the image sensor corresponding to a region of interest, the characteristics falling below a threshold;
re-illuminating the region of interest with the same optical power using an illuminator of a time-of-flight camera, the region of interest shaped to contain an object of interest within a field of view of the time-of-flight camera, the region of interest being smaller than the portion of the field of view; and
determining depth profile information in the re-illuminated region of interest using a time-of-flight measurement technique on the region,
wherein the camera system is configured to scan the field of view or the target.
2. The apparatus of claim 1, wherein the size of the area is changeable to a larger sized area, an upper emitted optical power range being maintained while illuminating the larger sized area.
3. The apparatus of claim 1, wherein said field of view is predefined into a number of zones, said region corresponding to one of said zones.
4. The apparatus of claim 3, wherein said light source comprises a plurality of light source devices, each of said light source devices being illuminated when only said area is substantially illuminated.
5. The apparatus of claim 1, wherein said light source comprises a plurality of light source devices, each of said light source devices being illuminated when only said area is substantially illuminated.
6. The apparatus of claim 5, wherein the area is a smallest sized area that the illuminator is capable of illuminating.
7. The apparatus of claim 1, wherein the optical component is mechanically coupled to an electrically powered device such that a position of the optical component is movable relative to the light source.
8. The apparatus of claim 7, wherein the position of the light source is fixed relative to the illuminator.
9. The apparatus of claim 7, wherein the optical component comprises any one of:
a lens;
a mirror.
10. The apparatus of claim 1, wherein the light source is mechanically coupled to an electrically powered device such that a position of the light source is movable relative to the optical component.
11. A method for illumination, comprising:
illuminating, using an illuminator of a time-of-flight camera, a portion of a field of view of the time-of-flight camera with optical power;
receiving characteristics of pixels of the image sensor corresponding to a region of interest, the characteristics falling below a threshold;
re-illuminating the region of interest with the same optical power using an illuminator of a time-of-flight camera, the region of interest shaped to encompass an object of interest within a field of view of the time-of-flight camera, the region of interest being smaller than the portion of the field of view; and
determining depth profile information within the re-illuminated region of interest using a time-of-flight measurement technique on the region,
wherein the camera is configured to scan the field of view or the target.
12. The method of claim 11, wherein said field of view is predefined into a number of zones, said region corresponding to one of said zones.
13. The method of claim 12, wherein said illuminating comprises emitting infrared light from a plurality of light source devices, each of said light source devices being illuminated when only said area is substantially illuminated.
14. The method of claim 11, wherein said illuminating comprises emitting infrared light from a plurality of light source devices, each of said light source devices being illuminated when only said area is substantially illuminated.
15. The method of claim 11, wherein the area is a smallest sized area that the illuminator is capable of illuminating.
16. The method of claim 11, further comprising illuminating different regions in the field of view.
17. A computing system, comprising:
an application processor having a plurality of processing cores and a memory controller; a system memory coupled to the memory controller; and
the device according to any one of claims 1-10.
CN201580034052.3A 2014-12-22 2015-11-04 Intelligent lighting time-of-flight system and method Active CN106662640B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/578,959 2014-12-22
US14/578,959 US20160178991A1 (en) 2014-12-22 2014-12-22 Smart illumination time of flight system and method
PCT/US2015/058945 WO2016105668A1 (en) 2014-12-22 2015-11-04 Smart illumination time of flight system and method

Publications (2)

Publication Number Publication Date
CN106662640A CN106662640A (en) 2017-05-10
CN106662640B true CN106662640B (en) 2022-11-29

Family

ID=56129237

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580034052.3A Active CN106662640B (en) 2014-12-22 2015-11-04 Intelligent lighting time-of-flight system and method

Country Status (4)

Country Link
US (1) US20160178991A1 (en)
EP (1) EP3238430A4 (en)
CN (1) CN106662640B (en)
WO (1) WO2016105668A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9992396B1 (en) 2015-02-02 2018-06-05 Apple Inc. Focusing lighting module
KR102406327B1 (en) * 2016-02-02 2022-06-10 삼성전자주식회사 Device and operating method thereof
WO2018100082A1 (en) * 2016-11-30 2018-06-07 Sony Semiconductor Solutions Corporation Apparatus and method
US20180224100A1 (en) * 2017-04-03 2018-08-09 Robe Lighting S.R.O. Follow Spot Control System
US10678220B2 (en) 2017-04-03 2020-06-09 Robe Lighting S.R.O. Follow spot control system
EP3607353B1 (en) * 2017-04-05 2023-03-01 Telefonaktiebolaget LM Ericsson (PUBL) Illuminating an environment for localisation
JP7042582B2 (en) * 2017-09-28 2022-03-28 株式会社東芝 Image pickup device and distance calculation method for image pickup device
DE102017222970A1 (en) 2017-12-15 2019-06-19 Ibeo Automotive Systems GmbH LIDAR measuring system
JP2020076619A (en) * 2018-11-07 2020-05-21 ソニーセミコンダクタソリューションズ株式会社 Floodlight control system, floodlight control method
US11831906B2 (en) * 2019-01-02 2023-11-28 Hangzhou Taro Positioning Technology Co., Ltd. Automated film-making using image-based object tracking
CN110995992B (en) * 2019-12-04 2021-04-06 深圳传音控股股份有限公司 Light supplement device, control method of light supplement device, and computer storage medium
CN111025329A (en) * 2019-12-12 2020-04-17 深圳奥比中光科技有限公司 Depth camera based on flight time and three-dimensional imaging method
EP4130649A4 (en) * 2020-04-01 2024-04-24 LG Electronics, Inc. Mobile terminal and control method therefor
US11108957B1 (en) * 2020-06-17 2021-08-31 Microsoft Technology Licensing, Llc Low power operation of differential image sensor pixels
JP2023003094A (en) * 2021-06-23 2023-01-11 ソニーセミコンダクタソリューションズ株式会社 Distance-measuring device and method, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120038903A1 (en) * 2010-08-16 2012-02-16 Ball Aerospace & Technologies Corp. Electronically steered flash lidar
US20140055771A1 (en) * 2012-02-15 2014-02-27 Mesa Imaging Ag Time of Flight Camera with Stripe Illumination
US8761594B1 (en) * 2013-02-28 2014-06-24 Apple Inc. Spatially dynamic illumination for camera systems

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7515822B2 (en) * 2006-05-12 2009-04-07 Microsoft Corporation Imaging systems' direct illumination level adjusting method and system involves adjusting operation of image sensor of imaging system based on detected level of ambient illumination
KR101184170B1 (en) * 2007-04-20 2012-09-19 소프트키네틱 에스.에이. Volume recognition method and system
US8081797B2 (en) * 2008-10-10 2011-12-20 Institut National D'optique Selective and adaptive illumination of a target
US20140139632A1 (en) * 2012-11-21 2014-05-22 Lsi Corporation Depth imaging method and apparatus with adaptive illumination of an object of interest
US9361502B2 (en) * 2014-07-31 2016-06-07 Symbol Technologies, Llc System for, and method of, controlling target illumination for an imaging reader

Also Published As

Publication number Publication date
CN106662640A (en) 2017-05-10
EP3238430A4 (en) 2018-08-01
WO2016105668A1 (en) 2016-06-30
US20160178991A1 (en) 2016-06-23
EP3238430A1 (en) 2017-11-01

Similar Documents

Publication Publication Date Title
CN106662640B (en) Intelligent lighting time-of-flight system and method
US10055855B2 (en) Time-of-flight camera system and method to improve measurement quality of weak field-of-view signal regions
US9832357B2 (en) Time-of-flight camera system with scanning iluminator
EP3238432B1 (en) Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with movable illuminated region of interest
US20180007347A1 (en) Integrated Camera System Having Two Dimensional Image Capture and Three Dimensional Time-of-Flight Capture With A Partitioned Field of View
EP3351964B1 (en) Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element
CN114089348A (en) Structured light projector, structured light system, and depth calculation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: California, USA
Applicant after: Google Inc.
Address before: California, USA
Applicant before: Google Inc.
GR01 Patent grant