
CN115866112A - Adaptive flash photography, video and/or flash using camera, scene or user input parameters - Google Patents


Info

Publication number
CN115866112A
Authority
CN
China
Prior art keywords
light source
source module
illumination
background
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211165800.XA
Other languages
Chinese (zh)
Inventor
张博升
A·M·阿来莫
N·D·彼达德
K·B·西埃斯里克基
陈通博
P·M·胡渤
F·A·巴卡伊
N·P·博尼耶
B·当
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/934,491 (US20230101548A1)
Priority claimed from US17/934,494 (US12132996B2)
Priority claimed from US17/934,506 (US12219267B2)
Application filed by Apple Inc
Publication of CN115866112A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B 15/02 Illuminating scene
    • G03B 15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B 15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/22 Illumination; Arrangements for improving the visibility of characters on dials
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure relates to adaptive flash photography, video and/or flash using camera, scene or user input parameters. Specifically, a light source module is provided that includes an array of illumination elements and an optional projection lens. The light source module is configured to receive or generate control signals for adjusting different ones of the illumination elements to control the light field emitted from the light source module. In some embodiments, the light source module is further configured to adjust the projection lens based on the objects in the scene to be illuminated and the field of view of the imaging device. A controller for the light source module may determine the light field pattern based on various parameters, including the field of view of the imaging device, an illumination sensitivity model of the imaging device, the depth, ambient lighting, and reflectivity of objects in the scene, configured lighting priorities such as ambient atmosphere preservation, background lighting, and direct/indirect lighting balance, and so on.

Description

Adaptive flash photography, video and/or flash using camera, scene or user input parameters
Technical Field
The present disclosure relates generally to light source modules that emit light, including but not limited to flash modules for illuminating a subject in an image captured by a camera device.
Background
Small devices, including devices having one or more miniature cameras, typically include a light source module that illuminates at least a portion of a scene within the field of view of the device's camera. Such camera and light source modules may be included in larger electronic devices, including mobile electronic devices such as mobile phones, smart phones, notebooks, and the like.
The light source module (which may also be referred to as a "flash" module, a "strobe" module, etc.) emits light that illuminates the space outside the light source module. The illuminated space may include a camera field of view, illuminating a subject within the camera field of view to obtain an image of the subject captured by the camera.
In some cases, the camera may be designed to capture images of a scene in the field of view of the camera, including objects at different distances from the camera, e.g., via a telephoto lens system or a wide angle lens system. In some cases, the camera system may be designed to capture images of objects in a scene at a particular distance from the camera in one of a plurality of camera modes (such as a wide-angle mode or a telephoto mode). Further, the camera may be designed to capture images of objects at a particular distance from the camera at any number of multiple zoom levels supported by the camera. In such cases, a light source module that is not adjusted for zoom level, for the distance to an object, or for different camera modes may cause insufficient or uneven illumination of the scene captured by the camera.
In some cases, a scene may include multiple objects at different distances from the camera that have different ambient lighting and reflectivity characteristics. In such cases, a light source module that does not adjust the illumination across the illumination field may cause the illumination of the scene captured by the camera to be non-uniform.
Disclosure of Invention
Some embodiments provide a mobile computing device comprising a camera arrangement having one or more lens systems. For one lens system, there may be different digital zooms to achieve different fields of view. For multiple lens systems, each lens system may have a different field of view, such as a wide angle lens system, a telephoto lens system, and an ultra-wide angle lens system. The field of view of the image captured by the camera device may be based on a combination of the fields of view of different lens systems, such as a combination of the field of view of a wide angle lens system and the field of view of a telephoto lens system, or a combination of a wide angle lens system and an ultra-wide angle lens system. In addition, the camera arrangement may be configured to capture photographs at multiple zoom levels using a combination of different lens systems, such as a combination of a telephoto lens system and a wide angle lens system. For example, the camera arrangement may comprise a camera with a telephoto lens system and another camera with a wide angle lens system, or may comprise a camera configured to operate both the telephoto lens system and the wide angle lens system to achieve an intermediate optical zoom level between a full wide-angle optical mode and a full telephoto optical mode. The mobile computing device also includes a light source module embedded in or coupled with the mobile computing device. The light source module includes an array of illumination elements configured to emit light through a projection lens. For example, the one or more lighting elements may be one or more Light Emitting Diodes (LEDs).
The mobile computing device includes a controller configured to determine respective amounts of light to be emitted from individual lighting elements in the array of lighting elements in order to shape the illumination field such that it optimizes illumination of the scene. It is noted that in some embodiments, the controller may determine an amount of current to be directed to respective ones of the lighting elements, where the amount of light emitted from a given lighting element is proportional to the current supplied to the lighting element. In some embodiments, the camera arrangement field of view resulting from the combination of the wide angle field of view and the telephoto field of view may have a pyramid shape, with the apex of the pyramid at the one or more lenses of the lens system of the camera arrangement.
Scenes containing objects at different distances within the camera arrangement field of view may have a quadrilateral shape. As the distance from the camera increases, the scene corresponding to the cross section of the pyramidal composite camera arrangement field of view at that distance may have a quadrilateral shape with an increasing area. The controller may determine an illumination pattern for the composite camera arrangement field of view based on the level of inclusion of the telephoto or wide-angle field of view in the composite field of view. The inclusion level may vary over a spectrum from a composite camera arrangement field of view based primarily on the wide-angle field of view to one based primarily on the telephoto field of view. In some embodiments, the controller may be configured to receive information indicative of a camera optical zoom level, a camera mode (such as a wide-angle mode or a telephoto mode), a digital zoom level, an estimated distance to an object in a scene to be captured by the camera, or other camera information (such as autofocus information). This information may correspond to the varying inclusion level of the wide-angle or telephoto field of view in the composite camera arrangement field of view. The controller may be further configured to infer the inclusion level of the wide-angle or telephoto field of view in the composite camera field of view based on an optical or digital zoom level of the camera, a distance to the scene, and/or a camera mode.
In some implementations, the illumination field may illuminate objects in the scene in the composite camera arrangement field of view at a particular distance such that corner portions of the scene (which include a quadrilateral cross-section through the composite camera arrangement field of view at the particular distance) are illuminated to a substantially similar extent as a central portion of the quadrilateral scene.
For a given image capture operation, the controller may further configure the illumination pattern based on ambient lighting, depth of objects in the scene, reflectivity of objects, illumination sensitivity of the imaging device, and the like. In some embodiments, the controller may be further configured to determine an overall illumination intensity of the array of lighting elements, and cause one or more lighting elements in the array of lighting elements to emit light according to the determined overall illumination intensity. In some implementations, the overall illumination intensity of the array of illumination elements may be determined based at least in part on a distance from the light source module to an object in the scene in the camera field of view to be illuminated. Furthermore, in some embodiments, the overall illumination intensity of the one or more lighting elements may be further determined based at least in part on ambient lighting conditions of the scene to be illuminated. In some embodiments, the total amount of current allocated to the light source module may be limited, and a controller for the light source module may strategically allocate current to the lighting elements of the lighting array such that some lighting elements are supplied with more current than others. The controller may determine the current distribution to the lighting elements of the light source module in a manner that optimizes how light is projected into the scene, e.g., to compensate for distance, ambient lighting conditions, reflectivity differences, lens effects, background lighting conditions, etc.
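To make the current-budget idea above concrete, the following sketch (illustrative only, not part of any claimed embodiment) assumes that emitted light is roughly proportional to drive current and scales a set of relative per-element weights so that the array stays within a fixed total budget; all function names and numeric values are hypothetical.

```python
def allocate_current(weights, total_budget_ma, per_element_max_ma):
    """Scale relative illumination weights into per-element drive currents.

    Assumes (as a simplification) that emitted light is proportional to
    drive current, so relative weights map directly to currents.
    """
    total_weight = sum(weights)
    if total_weight == 0:
        return [0.0] * len(weights)
    # Scale weights so the array as a whole uses the full budget ...
    currents = [w / total_weight * total_budget_ma for w in weights]
    # ... then clamp any element that would exceed its individual limit.
    return [min(c, per_element_max_ma) for c in currents]

# Example: emphasize the centre of a 4-element strip under a 400 mA budget.
print(allocate_current([1.0, 3.0, 3.0, 1.0], total_budget_ma=400, per_element_max_ma=200))
# -> [50.0, 150.0, 150.0, 50.0]
```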
Drawings
Fig. 1 illustrates a light source module with an adjustable illumination array and a projection lens, according to some embodiments.
Fig. 2 illustrates a system including an adjustable illumination array and a controller of a projection lens, according to some embodiments.
Fig. 3A illustrates a composite camera field of view according to some embodiments.
Figure 3B illustrates a camera with a single lens component providing a variable field of view according to some embodiments.
Fig. 3C illustrates a light source module with an adjustable illumination array and a projection lens embedded in a mobile computing device, according to some embodiments.
Fig. 4A-4C illustrate light source modules with adjustable illumination arrays and projection lenses illuminating scenes at different distances according to some embodiments.
Fig. 5 is a flow diagram illustrating a method for providing field of view compensation using an illumination array and a projection lens, according to some embodiments.
Fig. 6 is a flow diagram illustrating a method for providing field of view compensation using an illumination array and an adjustable projection lens, according to some embodiments.
Fig. 7 is a flow diagram illustrating a method for providing imaging lens shading compensation using an illumination array and a projection lens, according to some embodiments.
Fig. 8 illustrates an exemplary backlighting scenario according to some embodiments.
Fig. 9 is a flow diagram illustrating a method for providing backlight compensation using an illumination array and a projection lens, according to some embodiments.
Fig. 10 illustrates an exemplary scene including a background ambient atmosphere, according to some embodiments.
Fig. 11 is a flow diagram illustrating a method for providing ambience preservation using a lighting array and a projection lens, in accordance with some embodiments.
Fig. 12 illustrates an exemplary scene including isolated objects according to some embodiments.
Fig. 13 is a flow diagram illustrating a method of providing minimal interference using an illumination array and a projection lens, according to some embodiments.
Fig. 14 illustrates an exemplary scene including objects at different depths according to some embodiments.
Fig. 15 is a flow diagram illustrating a method for providing depth compensation using an illumination array and a projection lens, according to some embodiments.
Fig. 16 illustrates an exemplary scene of an object with varying ambient lighting, according to some embodiments.
Fig. 17 is a flow diagram illustrating a method for providing ambient atmosphere compensation using a lighting array and a projection lens, according to some embodiments.
Fig. 18 illustrates an exemplary scene including objects of different reflectivity, according to some embodiments.
Fig. 19 is a flow diagram illustrating a method for providing reflectivity compensation using an illumination array and a projection lens, according to some embodiments.
Fig. 20 illustrates an exemplary low light scene according to some embodiments.
Fig. 21 is a flow diagram illustrating a method for providing low light scene illumination using an illumination array and a projection lens, according to some embodiments.
Fig. 22 illustrates an exemplary scenario supporting a bouncing flash according to some embodiments.
Fig. 23 is a flow diagram illustrating a method for providing indirect flash using an illumination array and a projection lens, according to some embodiments.
Fig. 24 is a flow diagram illustrating a method for providing creative supplemental lighting matching artistic intent using an illumination array and a projection lens, according to some embodiments.
Fig. 25A illustrates a Total Internal Reflection (TIR) lens that may be included in a light source module according to some embodiments.
Fig. 25B illustrates a reflector that may be included in a light source module according to some embodiments.
Fig. 26A-26B illustrate a light source module embedded in a mobile computing device, according to some embodiments.
Fig. 27 is a flow diagram illustrating a method for enabling a flash mode using an illumination array and a projection lens, according to some embodiments.
FIG. 28 illustrates a portable multifunction device with an embedded light source module according to some embodiments.
Fig. 29 illustrates an exemplary computer system according to some embodiments.
This specification includes references to "one embodiment" or "an embodiment". The appearances of the phrase "in one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. The particular features, structures, or characteristics may be combined in any suitable manner consistent with this disclosure.
"comprising," the term is open-ended. As used in the appended claims, the term does not exclude additional structures or steps. Consider the claims as cited below: the claims hereof do not exclude the inclusion of additional components to the apparatus (e.g., network interface units, graphics circuitry, etc.).
"configured," various units, circuits, or other components may be described or recited as "configured to" perform a task or tasks. In such context, "configured to" is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs such task or tasks during operation. As such, the cells/circuits/components can be said to be configurable to perform this task even when the specified cell/circuit/component is not currently operable (e.g., not turned on). The units/circuits/components used with the "configured to" language include hardware-e.g., circuitry, memory storing program instructions executable to perform operations, and so on. Reference to a unit/circuit/component "being configured to" perform one or more tasks is expressly intended to not refer to the sixth paragraph of 35u.s.c. § 112 for that unit/circuit/component.
Further, "configured to" may include a general-purpose structure (e.g., a general-purpose circuit) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing one or more tasks to be solved. "configured to" may also include adjusting a manufacturing process (e.g., a semiconductor fabrication facility) to manufacture a device (e.g., an integrated circuit) suitable for performing or carrying out one or more tasks.
"first", "second", etc. As used herein, these terms serve as labels to the nouns preceding them, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, the buffer circuit may be described herein as performing a write operation of a "first" value and a "second" value. The terms "first" and "second" do not necessarily imply that the first value must be written before the second value.
"based on". As used herein, the term is used to describe one or more factors that affect the determination. The term does not exclude additional factors that influence the determination. That is, the determination may be based solely on these factors or at least partially on these factors. Consider the phrase "determine a based on B. In this case, B is a factor that affects the determination of a, and such phrases do not exclude that the determination of a may also be based on C. In other examples, a may be determined based on B alone.
Detailed Description
Introduction
Some embodiments provide a light source module having an adjustable illumination array and a projection lens such that light emitted from the array of illumination elements forms a light field of a particular illumination pattern, portions of which have variable illumination intensities. The light source module may emit a pyramidal light beam having a square or rectangular cross-section and may be configured to project a light pattern corresponding to a pyramidal composite field of view of one or more cameras associated with the light source module. The light pattern may have a variable light intensity within the pyramidal light beam such that some portions of the square or rectangular cross-section are illuminated more than other portions. The controller for the light source module may determine such variable illumination intensities based on the measured scene conditions, as discussed further herein.
The composite camera field of view may have a rectangular or square cross-section at different distances (e.g., for a scene) within the composite camera field of view. The composite field of view may be a combination of the wide field of view of a wide angle lens system and the telephoto field of view of a telephoto lens system. Further, the composite field of view may vary continuously over a spectrum from almost fully wide angle to almost fully telephoto based on the inclusion level of the wide-angle or telephoto field of view in the composite camera field of view. For example, in some embodiments, a first camera may include a telephoto lens system and a second camera may include a wide angle lens system. In such embodiments, the first camera and the second camera may capture a composite image using some image data from each of the two cameras, and the composite field of view of the two cameras may vary based on the level of inclusion of image data from each of the two cameras in the composite image. In other embodiments, a single camera may include an aperture associated with a telephoto lens system and an aperture associated with a wide angle lens system, and may combine light or image data from both the wide angle lens system and the telephoto lens system to form a composite image. The inclusion level of light or image data from the wide angle lens system or the telephoto lens system may be adjustable such that the inclusion level of the telephoto or wide-angle field of view in the composite camera field of view may be adjusted.
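As a rough illustration of how an effective field of view might be derived from an inclusion level, the sketch below uses a simple linear blend between hypothetical telephoto and wide-angle fields of view; the blend and the example angles are assumptions made for illustration only, not the disclosed method.

```python
def composite_field_of_view(wide_fov_deg, tele_fov_deg, wide_inclusion):
    """Interpolate an effective field of view from the inclusion level.

    wide_inclusion is 1.0 for a fully wide-angle composite image and 0.0
    for a fully telephoto one; intermediate values blend the two.
    """
    if not 0.0 <= wide_inclusion <= 1.0:
        raise ValueError("inclusion level must be between 0 and 1")
    return tele_fov_deg + wide_inclusion * (wide_fov_deg - tele_fov_deg)

# Example: mostly telephoto composite of hypothetical 75-degree wide / 28-degree tele systems.
print(round(composite_field_of_view(75.0, 28.0, wide_inclusion=0.2), 1))  # -> 37.4
```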
Further, the light source module may include or interact with a controller configured to adjust individual elements of the illumination array to capture an image of the scene based at least in part on the variable level of intensity of light to be projected into various portions of the scene, the determined light field pattern to be used to illuminate the scene, and the wide angle lens system field of view or telephoto lens system field of view used in the composite camera field of view.
In some embodiments, the light source module may include or interact with a controller configured to determine an estimated distance to an object in the camera field of view and adjust various elements of the illumination array based on the distance to the object in the camera field of view such that light emitted from the light source module substantially illuminates one or more objects in the scene within the camera field of view. For example, when the estimated distance to one or more objects in the scene in the camera field of view is a short distance, the controller may adjust the individual elements of the illumination array such that the light is evenly spread over the closer scene in the camera field of view.
Some embodiments may include a controller that estimates distances to objects in a scene in a field of view of the camera based on information received from the camera. For example, the controller may use zoom level information and/or autofocus information from the camera to estimate a distance to one or more objects in a scene to be captured by the camera. In some embodiments, a controller for a light source module may be included in the controller that also controls a camera associated with the light source module. In some embodiments, the controller for the light source module may be separate from the camera controller and may receive information from the camera controller, such as zoom information and/or autofocus information. In some embodiments, the light source module and/or a mobile device including the light source module may include one or more sensors that directly measure distance, such as a lidar sensor, a laser and a reflected light sensor, or other types of depth sensors.
In some embodiments, the controller for the light source module may also determine the illumination intensity of the illumination element of the light source module. For example, a controller for a light source module may use an estimated distance to an object in a scene to be captured by a camera, a camera sensor sensitivity setting (such as a camera ISO setting or shutter speed), and/or ambient lighting conditions to determine an illumination intensity of one or more lighting elements of the light source module. For example, in a darker light condition, the camera may select a particular ISO setting corresponding to the darker condition, and the controller may select a lighting setting corresponding to a higher lighting intensity for the lighting elements of the light source module to illuminate the darker field of view. The selected lighting setting may be greater than the lighting setting selected for the field of view having the brighter lighting condition. In some implementations, the controller may independently determine the illumination intensity setting for each of the array of illumination elements of the light source module based on a distance to an object in the scene in the camera field of view, light conditions of the scene in the camera field of view, and the like. In some implementations, different illumination intensities for different lighting elements of the array may be determined based on different distances to objects in the scene, where different ones of the lighting elements illuminate different portions of the scene, including objects at different distances.
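The following sketch illustrates, under stated assumptions, how a controller might combine subject distance, ambient light, and sensor sensitivity into a relative intensity for a single illumination element. The inverse-square distance term and the scaling constants are illustrative guesses rather than the disclosed algorithm.

```python
def element_intensity(distance_m, ambient_lux, iso, max_intensity=1.0):
    """Pick a relative intensity (0..1) for one illumination element.

    Illustrative heuristic only: the light needed grows with the square of
    the subject distance (inverse-square falloff) and shrinks when the
    ambient light or the sensor sensitivity (ISO) is already high.
    """
    falloff = distance_m ** 2                              # inverse-square compensation
    ambient_factor = 1.0 / (1.0 + ambient_lux / 100.0)     # dim scenes need more flash
    iso_factor = 100.0 / iso                               # high ISO needs less flash
    return min(max_intensity, 0.5 * falloff * ambient_factor * iso_factor)

# Example: a subject 2 m away in a dim (20 lux) scene at ISO 400.
print(element_intensity(distance_m=2.0, ambient_lux=20.0, iso=400))
```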
Light source module with adjustable illumination array and projection lens
Fig. 1 illustrates a light source module with an adjustable illumination array and a projection lens according to some embodiments. The lighting module 100 may include a lighting array 110 comprising a plurality of lighting elements, such as an array of Light Emitting Diodes (LEDs) or laser diodes. In some embodiments, the lighting elements may be arranged in a two-dimensional matrix such that each lighting element corresponds to a two-dimensional matrix of regions in the imaged scene.
In some embodiments, the light emitted by each of the lighting elements may be collectively projected by the shared projection lens 120 to generate the illumination field 140. The shared projection lens 120 may be implemented in any number of configurations. For example, as shown in fig. 1, a simple single-element lens may be used. In this example, the light emitted by each of the lighting elements may be inverted horizontally and vertically when projected onto the illumination field 140. In other projection lens embodiments, these inversions may not occur. In either case, the controller 130 may maintain a mapping of individual lighting elements of the lighting array 110 to positions in the illumination pattern 150 of the illumination field 140 in order to control the particular illumination pattern 150. In other embodiments, the shared projection lens 120 may be a multi-element lens, and each lens element may have a conventional shape or an alternative shape, such as discussed below with respect to fig. 25A. Further, in some embodiments, the lens 120 may be fixed, and in other embodiments it may be adjustable under the control of the controller 130. It should be understood that the above examples are not intended to be limiting and that any number of shared lens configurations may be used.
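The element-to-scene mapping mentioned above can be pictured with a small sketch that mirrors a desired scene-illumination grid when a simple inverting projection lens is assumed; the grid size, helper names, and values are hypothetical.

```python
def element_to_scene_region(row, col, n_rows, n_cols, lens_inverts=True):
    """Map an illumination-array element to the scene region it lights.

    With a simple single-element projection lens the image is flipped both
    horizontally and vertically, so element (0, 0) illuminates the
    opposite corner of the scene grid.
    """
    if lens_inverts:
        return (n_rows - 1 - row, n_cols - 1 - col)
    return (row, col)

def pattern_to_drive_levels(desired_scene_pattern, lens_inverts=True):
    """Reorder a desired scene-illumination grid into per-element levels."""
    n_rows, n_cols = len(desired_scene_pattern), len(desired_scene_pattern[0])
    levels = [[0.0] * n_cols for _ in range(n_rows)]
    for r in range(n_rows):
        for c in range(n_cols):
            sr, sc = element_to_scene_region(r, c, n_rows, n_cols, lens_inverts)
            levels[r][c] = desired_scene_pattern[sr][sc]
    return levels

# Example: brighten the top-left of the scene with a 2x3 array.
print(pattern_to_drive_levels([[1.0, 0.5, 0.2],
                               [0.4, 0.3, 0.1]]))
# -> [[0.1, 0.3, 0.4], [0.2, 0.5, 1.0]]
```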
Further, the controller 130 may determine an overall illumination intensity (not shown), may determine the illumination field 140, and may determine the illumination pattern 150 of the illumination array 110. In some embodiments, the controller 130 may be implemented in hardware or software. In some embodiments, the controller 130 may be implemented by one or more processors via program instructions stored in a memory of the mobile device. In some embodiments, the controller 130 may instruct the array of lighting elements to illuminate the scene using variable illumination intensities for a particular illumination field and illumination pattern by controlling the respective intensities of the respective lighting elements within the lighting array 110. In some embodiments, the controller 130 may additionally instruct an adjustable projection lens (such as projection lens 120) to be actuated via an actuator as part of achieving a particular illumination field 140 and illumination pattern 150.
The controller 130 may implement various illumination patterns 150. In some implementations, the controller can implement a wide illumination pattern to uniformly illuminate a scene with a wide field of view. In some embodiments, a wide pattern may be achieved by controlling the various elements of the illumination array 110 to all emit relatively the same amount of light, while in other embodiments, the adjustable projection lens 120 may be configured to project a wide illumination field 140 by adjusting the position of the projection lens 120 relative to the illumination array 110. In other embodiments, a combination of illumination element and projection lens control may be employed. In some embodiments, a narrow pattern may be achieved by controlling the elements contributing to the center of the illumination field to emit more light than the elements contributing to the periphery of the illumination field, while in other embodiments, the adjustable projection lens 120 may be configured to be adjusted via an actuator to project a narrow illumination field 140. In other embodiments, a combination of control of the illumination elements and adjustment of the projection lens may be employed. In some embodiments, more complex illumination patterns 150 may be employed, as discussed below in fig. 4-24.
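A minimal sketch of the wide versus narrow patterns described above is shown below: a flat grid for the wide pattern and a center-weighted grid for the narrow pattern. Modeling the center weighting as a Gaussian falloff is an assumption made only for illustration.

```python
import math

def illumination_pattern(n_rows, n_cols, mode="wide", sigma=1.0):
    """Return relative per-element levels for a wide or narrow pattern.

    "wide": all elements emit roughly the same amount of light.
    "narrow": elements near the centre emit more than peripheral ones,
    modelled here (as an assumption) with a Gaussian falloff.
    """
    if mode == "wide":
        return [[1.0] * n_cols for _ in range(n_rows)]
    cr, cc = (n_rows - 1) / 2.0, (n_cols - 1) / 2.0
    pattern = []
    for r in range(n_rows):
        row = []
        for c in range(n_cols):
            d2 = (r - cr) ** 2 + (c - cc) ** 2
            row.append(math.exp(-d2 / (2.0 * sigma ** 2)))
        pattern.append(row)
    return pattern

# Example: 5x5 narrow pattern concentrating light toward the centre.
for row in illumination_pattern(5, 5, mode="narrow", sigma=1.5):
    print([round(v, 2) for v in row])
```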
In some embodiments, the total amount of power consumed by the lighting array may be limited for various reasons, including battery life and heat dissipation. By varying the illumination produced by the individual elements of the illumination array, greater illumination of the object of interest may be achieved at a given amount of power, or alternatively, appropriate illumination of the object of interest may be provided at a reduced overall power consumption level.
Fig. 2 illustrates a system including a controller configured to adjust an illumination array and a projection lens, according to some embodiments. One or more sensors, such as sensor 200, may detect a condition of a scene to be illuminated by the light source module. In various embodiments, examples of such sensors are camera imaging sensors, depth sensors, focus sensors, and ambient light sensors. However, these examples are merely examples, and any number of sensors that detect various lighting conditions of a scene may be employed, and the above examples are not intended to be limiting.
The sensor is in communication with a controller, such as controller 210, and the controller determines the illumination intensity, illumination field, and illumination pattern of the illumination array based on measurements of the scene determined via the sensor. In some embodiments, a controller (such as controller 210) may be implemented in hardware or software. In some embodiments, controller 210 may be implemented via program instructions executing on one or more processors of the mobile device, where the program instructions are stored in a memory of the mobile device.
A light source module, such as light source module 220, may include an array of illumination elements 224, such as illumination array 110 of fig. 1, and a projection lens 226, such as projection lens 120 of fig. 1. In some embodiments, a controller (such as controller 210) may instruct an array of lighting elements to illuminate with a particular illumination field and illumination pattern at a particular overall illumination intensity by controlling the individual intensities of the respective lighting elements within lighting array 224. In some embodiments, the projection lens (such as projection lens 226) may be a fixed lens, while in other embodiments, controller 210 may additionally instruct the actuator to adjust the position of the adjustable projection lens as part of achieving a particular illumination field and illumination pattern.
Exemplary Mobile device including an Adjustable light Source Module
Fig. 3A illustrates an exemplary composite field of view including a combination of a telephoto field of view and a wide angle field of view, according to some embodiments. The camera 300 may include a wide angle camera and lens system 302 and a telephoto camera and lens system 304. Each camera and lens system, such as 302 and 304, may have one or more characteristic fields of view defined by the respective focal length of the system lens and the two-dimensional size of the camera sensor within the respective camera and lens system. Although not shown, in some embodiments, additional lens systems, such as ultra-wide angle lens systems, may be used.
In some implementations, the camera systems 302 and 304 may be arranged such that the fields of view of the cameras overlap one another. For example, the wide field of view 306 from the camera 302 may overlap with the telephoto field of view 308 from the camera system 304.
Further, in some embodiments, the camera systems 302 and 304 may be arranged such that at least a portion of one of the fields of view of the respective cameras does not overlap with the other camera fields of view. For example, at least a portion of the wide field of view 306 from the camera system 302 does not overlap with the telephoto field of view 308 from the camera system 304. In some embodiments, the composite field of view may include both a wide angle field of view and a telephoto field of view. In some embodiments, the camera arrangement may comprise other lens systems or additional lens systems. For example, in some embodiments, one or more intermediate lens systems may be included between the fully telephoto lens system and the fully wide angle lens system. Further, in some embodiments, a super wide angle lens system may be included. With respect to a particular image capture operation or an ongoing image capture operation, a controller of a mobile device including camera 300 may select a level of inclusion of image data from wide field of view 306 and telephoto field of view 308 in an image (or video) to be captured by camera 300. As described above, the light source controller may determine the illumination intensity with a particular illumination field and illumination pattern based on the inclusion level of the telephoto or wide-angle field in the composite field of view of the camera 300.
Fig. 3B illustrates an exemplary single-lens camera that may provide a variable field of view, according to some embodiments. The camera 310 may include a single lens system that, in various embodiments, may have one or more characteristic fields of view defined by the respective focal lengths of the lenses and the two-dimensional size of the camera sensor (not shown) within the camera 310. In some embodiments, the one or more characteristic fields of view may include the wide or ultra-wide field of view characteristic of a wide-angle or ultra-wide-angle lens, while in other embodiments, the one or more characteristic fields of view may include a narrower field of view, such as a field of view associated with a portrait or telephoto lens system. In various embodiments, the active two-dimensional area of the camera sensor may be configured to adjust the field of view, wherein the widest field of view of the camera 310 may be obtained by enabling the maximum or entire area of the camera sensor, and progressively narrower fields of view may be configured by reducing the effective area of the sensor, disabling or discarding data from sensor elements along the periphery of the sensor. Further, in some embodiments, each of the two dimensions of the sensor may be independently configurable, allowing the aspect ratio and field of view provided by the camera 310 to be varied. Further, in some embodiments, the controller 312 for configuring the field of view of the camera sensor may be implemented in software or hardware within the device hosting the camera 310 (such as the mobile device 320).
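For illustration, the sketch below computes a diagonal field of view from a focal length and a configured effective sensor area using the standard pinhole relation fov = 2·atan(d / 2f); the focal length and sensor dimensions are hypothetical values, not those of any particular device.

```python
import math

def field_of_view_deg(focal_length_mm, active_width_mm, active_height_mm):
    """Diagonal field of view for the currently enabled sensor area.

    Uses the thin-lens/pinhole relation fov = 2 * atan(d / (2 * f)),
    where d is the diagonal of the active sensor region.
    """
    diagonal = math.hypot(active_width_mm, active_height_mm)
    return math.degrees(2.0 * math.atan(diagonal / (2.0 * focal_length_mm)))

# Hypothetical sensor: full 6.4 x 4.8 mm area vs. a cropped 4.0 x 3.0 mm area.
print(round(field_of_view_deg(4.2, 6.4, 4.8), 1))  # wider view, full sensor
print(round(field_of_view_deg(4.2, 4.0, 3.0), 1))  # narrower view, cropped sensor
```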
Fig. 3C shows a mobile device 320 that includes a light source module 330 and a camera 325. The camera 325 may include a first aperture associated with a wide angle lens system and a second aperture associated with a telephoto lens system, or may include more than one camera, where at least one of the cameras has an aperture associated with a wide angle lens system and at least one of the cameras has an aperture associated with a telephoto lens system. In some embodiments, additional lens systems may be included. For example, the camera 325 may include a wide angle camera system 302, a telephoto camera system 304, both the wide angle camera system 302 and the telephoto camera system 304, or a hybrid camera configured to operate in both wide angle and telephoto modes. In some implementations, a scene in the camera field of view may be adjusted based on digital zoom. In some embodiments, optical zoom may alternatively or additionally be used to adjust the camera field of view.
Light source module 330 may also include an illumination array 340 (such as illumination array 110 of fig. 1 or illumination element array 224 of fig. 2), a projection lens 350 (such as projection lens 120 of fig. 1 or projection lens 226 of fig. 2), and so forth. A controller (not shown, such as controller 210 of fig. 2) may determine an illumination intensity with a particular illumination field and illumination pattern based on a camera field of view, user input, and/or data received from a sensor (such as sensor 200 of fig. 2). In some embodiments, an example of such a sensor may include a camera 325.
Light source module with adjustable illumination field
In some embodiments, the light source module may include an adjustable illumination array and a projection lens that illuminate scenes at different distances. In some embodiments, a controller (such as controller 410 shown in figs. 4A-4C) may determine a level of diffusion for illuminating a scene in a composite camera field of view based on the inclusion level of the wide field of view of a wide angle lens system and the inclusion level of the telephoto field of view of a telephoto lens system in the composite camera field of view (such as shown in fig. 3 above). For ease of illustration, a composite camera using two lens systems is described. However, in some embodiments, the composite camera may include additional lens systems, such as three or more lens systems.
In some implementations, the inclusion level of the wide field of view or the inclusion level of the telephoto field of view may be inferred from the camera zoom level information and/or the distance information. In some implementations, the camera may determine an estimated distance to an object in the camera field of view, and the controller for the light source module may adjust the level of diffusion for the scene illumination using the estimated distance determined by the camera. In some embodiments, the controller may receive camera information (such as autofocus information) from the camera and may determine an estimated distance to an object in the scene in the field of view of the camera based on the received camera information. In some implementations, the controller can determine an estimated distance to a scene to be captured by the camera based on whether the camera is operating in a telephoto mode or a wide-angle mode. Further, in some embodiments, the mobile device may include multiple cameras such that when operating in a telephoto mode, a telephoto camera is selected, and when operating in a wide-angle mode, a wide-angle camera is selected. In some embodiments, a single camera may include two or more apertures and two or more lens systems, where one of the lens systems has a wider angle than the other lens systems (such as a telephoto lens system). Further, in some embodiments, the mobile device may operate in a hybrid mode using both a telephoto camera and a wide angle camera. The controller may adjust the illumination array and projection lens system using any of the above combinations of wide angle, telephoto, and/or varying degrees of wide angle/telephoto composite field of view selection. Additionally, the controller may measure the distance directly, for example, via a lidar or radar sensor.
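A toy version of this inference is sketched below: a relative beam-spread (diffusion) level is chosen from the camera mode and an estimated subject distance. The mode names, thresholds, and spread values are invented for illustration only.

```python
def diffusion_level(camera_mode, estimated_distance_m):
    """Choose a relative beam-spread level from mode and subject distance.

    Returns 1.0 for the widest spread and smaller values for a more
    concentrated beam. The thresholds below are illustrative guesses.
    """
    base = {"ultrawide": 1.0, "wide": 0.7, "telephoto": 0.4}.get(camera_mode, 0.7)
    if estimated_distance_m is None:
        return base
    # Distant subjects get a narrower, more concentrated beam.
    if estimated_distance_m > 3.0:
        return base * 0.6
    if estimated_distance_m > 1.5:
        return base * 0.8
    return base

print(round(diffusion_level("wide", 2.0), 2))        # -> 0.56
print(round(diffusion_level("telephoto", 5.0), 2))   # -> 0.24
```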
Illuminating objects in the quadrilateral scene in the camera field of view of the one or more cameras associated with the light source module such that the objects are uniformly illuminated may result in a better image being captured by the associated one or more cameras than if the quadrilateral scene were illuminated non-uniformly. For example, in figs. 4A to 4C, the light source module 400 has rectangular (quadrilateral) output patterns 424 and 434. The rectangular output pattern of the light source module may be designed to match a rectangular (quadrilateral) scene in the camera field of view of the associated camera or cameras. Thus, the light source module may be configured to project light in a pyramidal pattern matching the field of view of the one or more cameras, the pyramidal pattern having a rectangular cross-section at different distances from the one or more cameras. However, as the width of the camera field of view varies, different levels of illumination from the various elements may be required if the illumination array is to uniformly illuminate objects in the camera field of view while maintaining a rectangular illumination pattern of the scene at a given distance within the camera field of view.
As shown in fig. 4A, camera 420 is primarily in the ultra-wide angle camera selection, and controller 410 may adjust the output of light source illumination module 400 to an ultra-wide angle pattern output level based on the inclusion level of the ultra-wide angle field of view and/or an estimated distance 423 to scene 422 in the composite field of view. This ultra-wide angle pattern may be achieved by controlling the individual elements 402 to all emit relatively the same amount of light, as discussed above in fig. 1. The controller 410 is adjusted to the ultra-wide angle pattern output level such that the light emitted from the illumination element 402 uniformly illuminates a scene 424 in the illumination field of the light source module, where the scene 424 has a quadrilateral shape that matches the scene 422 in the composite field of view. For clarity, the scene 422 in the composite field of view and the scene 424 in the illumination field are shown adjacent to each other in fig. 4A. However, in operation, the scene 424 in the illumination field and the scene 422 in the composite field of view may be on top of each other, e.g., the camera 420 may be taking a picture of the same scene illuminated by the light source module 400.
As shown in fig. 4B, the camera 430 is primarily in the wide-angle camera option, and the controller 410 may adjust the output of the light source illumination module 400 to a wide-angle pattern output level, providing a more focused pattern to reach a scene 434 in the illumination field located at a greater distance 433. This wide-angle pattern may be achieved by controlling the individual elements 402 such that elements contributing to the center of the illumination field emit more light than elements contributing to the periphery of the illumination field, as discussed above in fig. 1. Adjusting the controller 410 to the wide-angle pattern output level causes the light emitted from the illumination element 402 to uniformly illuminate a scene 434 in the illumination field of the light source module, where the scene 434 has a quadrilateral shape that matches the scene 432 in the composite field of view. As discussed above, the scene 434 in the illumination field and the scene 432 in the composite field of view are shown adjacent to each other in fig. 4B. However, in operation, scene 434 and scene 432 may be on top of each other or represent the same scene.
As shown in fig. 4C, the camera 440 is primarily in a telephoto camera selection, and the controller 410 may adjust the output of the light source illumination module 400 to a narrow or telephoto pattern output level, providing a more focused pattern to reach the scene 444 in the illumination field located at a greater distance 443. This narrow pattern may be achieved by controlling the individual elements 402 such that elements contributing to the center of the illumination field emit more light than elements contributing to the periphery of the illumination field, as discussed above in fig. 1. Adjusting the controller 410 to a telephoto or narrow pattern output level may cause the light emitted from the illumination element 402 to uniformly illuminate a scene 444 in the illumination field of the light source module, where the scene 444 has a quadrilateral shape that matches the scene 442 in the composite field of view. As discussed above, the scene 444 in the illumination field and the scene 442 in the composite field of view are shown adjacent to each other in fig. 4C. However, in operation, the scene 444 and the scene 442 may be on top of each other or represent the same scene.
Fig. 5 is a flow diagram illustrating a method for providing field of view compensation using an illumination array and a projection lens, according to some embodiments. The method begins at step 500, where during the capture of an image, a configured field of view of an imaging device may be determined. This determination may be performed in a variety of ways, as discussed above in fig. 3 and 4.
In some embodiments, once the configured field of view of the imaging device has been determined, the method may proceed to step 510, where a controller (such as controller 130 of fig. 1) may determine an illumination field that is narrowed to match the determined field of view of the imaging device. To accomplish this, in some embodiments, the controller may reduce or disable elements of the illumination array that contribute to the periphery of the illumination field outside of the determined field of view of the imaging device to narrow the illumination field to match the determined field of view of the imaging device.
Once the illumination field has been established, the method proceeds to step 520, where individual elements of an illumination array (such as illumination array 110) may be configured to provide an illumination pattern (such as illumination pattern 150 of fig. 1) that provides a narrowed illumination field (such as illumination field 140 of fig. 1) that matches the determined field of view of the imaging device. In addition, elements of the illumination array that contribute to illumination of the field of view of the imaging device may be scaled (e.g., by changing the current supplied to the illumination elements) to provide target illumination values for image capture.
In some embodiments, the method may then proceed to step 530, where the scene may be illuminated according to the determined illumination pattern.
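The steps of fig. 5 can be pictured with the following sketch, which narrows a hypothetical element grid to the region covering the imaging field of view (step 510), disables the peripheral elements, and scales the remainder to a target level (step 520); the grid geometry and helper names are assumptions made for illustration.

```python
def narrow_to_field_of_view(n_rows, n_cols, active_rows, active_cols, target_level):
    """Zero out peripheral elements whose light would fall outside the
    imaging field of view, and scale the remaining elements to the target
    illumination level for the capture."""
    pattern = [[0.0] * n_cols for _ in range(n_rows)]
    row_margin = (n_rows - active_rows) // 2
    col_margin = (n_cols - active_cols) // 2
    for r in range(row_margin, row_margin + active_rows):
        for c in range(col_margin, col_margin + active_cols):
            pattern[r][c] = target_level
    return pattern

# Example: a 6x6 array narrowed to a centred 4x4 region at 80% drive level.
for row in narrow_to_field_of_view(6, 6, 4, 4, target_level=0.8):
    print(row)
```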
Fig. 6 is a flow diagram illustrating a method for providing field of view compensation using an illumination array and an adjustable projection lens, according to some embodiments. The method begins at step 600 where a configured field of view of an imaging device may be determined during the capture of an image. This determination may be performed in a variety of ways, as discussed above in fig. 3 and 4.
In some embodiments, once the configured field of view of the imaging device has been determined, the method may proceed to step 610, where a controller (such as controller 130 of fig. 1) may determine the illumination field and illumination pattern that are narrowed to match the determined field of view of the imaging device.
In some embodiments, once the illumination field and illumination pattern have been established, the method proceeds to step 620, where the adjustable projection lens and various elements of the illumination array (such as illumination array 110) may be configured to provide the illumination field (such as illumination field 140 of fig. 1) and the illumination pattern (such as illumination pattern 150 of fig. 1). In some embodiments, to accomplish this, the controller may adjust the position of the projection lens via the actuator to provide the determined illumination field, and may reduce or disable elements of the illumination array that contribute to the periphery of the illumination field outside of the determined field of view of the imaging device to narrow the illumination field to match the determined field of view of the imaging device. In addition, elements of the illumination array that contribute to illumination of the field of view of the imaging device may be scaled (e.g., by varying the amount of current supplied to the illumination elements) to provide target illumination values for image capture.
In some embodiments, the method may then proceed to step 630, where the scene may be illuminated according to the determined illumination pattern.
Exemplary method for providing lens shading compensation
Fig. 7 is a flow diagram illustrating a method for providing imaging lens shading compensation using an illumination array and a projection lens, according to some embodiments. The method begins at step 700, where during the capture of an image, a configured field of view of an imaging device may be determined. This determination may be performed in a variety of ways, as discussed above in fig. 3 and 4.
The imaging device may include an imaging lens that projects an image to be captured onto an imaging sensor. The imaging lens may have a characteristic focal length, where in some embodiments the focal length may be fixed, while in other embodiments the focal length may be configurable. In other embodiments, the focal length may vary within the focal range of the lens.
The imaging sensor may have a characteristic size, and in some embodiments a configurable effective size, where the field of view of the imaging device is determined by the characteristic focal length of the imaging lens (as configured) and the characteristic size or effective size of the imaging sensor. In addition, the illumination of the imaging sensor by the imaging lens may vary over the surface of the imaging sensor for various reasons including lens shading and vignetting, resulting in variations in illumination sensitivity over the surface of the imaging sensor, such as, for example, light attenuation at the periphery of the image sensor which may cause the boundaries of the resulting image to darken. However, lens shading and vignetting are merely examples of illumination sensitivity variations; these examples are not intended to be limiting and any number of causes and effects may be envisioned.
The image projected by the imaging lens onto the imaging sensor may be characterized using an illumination sensitivity model of the imaging device that describes changes in illumination sensitivity across the imaging sensor surface. In some embodiments, once the configured field of view of the imaging device has been determined, the method may proceed to step 710, where a profile containing the illumination sensitivity model may be obtained by a controller (such as controller 130 of fig. 1).
This profile may be obtained in a number of ways similar to the way the field of view is determined, as discussed above in fig. 3 and 4. In some implementations, the illumination sensitivity model for the profile can be obtained from a look-up table associated with the field of view of the imaging device. For example, known lens shading effects may be stored in a look-up table for a particular camera configuration. However, this is merely an example and is not intended to be limiting.
In some embodiments, once the illumination sensitivity model has been obtained, the method may proceed to step 720, where a controller (such as controller 130 of fig. 1) may configure the illumination pattern to compensate for the illumination sensitivity model of the imaging device. In some embodiments, to achieve this, the controller may adjust individual elements of the illumination array, wherein elements contributing to portions of the scene with lower illumination sensitivity are configured to emit more light than elements contributing to portions of the scene with higher illumination sensitivity.
In some embodiments, the projection lens may be adjustable to control the illumination field, such as discussed above in fig. 1, 2, and 6. In these embodiments, the controller may also adjust the adjustable projection lens to provide the illumination field, while also configuring the illumination pattern to compensate for the illumination sensitivity model of the imaging device in these embodiments.
Once the illumination pattern has been configured to compensate the illumination sensitivity model of the imaging device, the method may proceed to step 730, where the overall illumination level of the array may be determined and the elements of the illumination array contributing to the illumination of the field of view of the imaging device may be scaled to provide an overall illumination value for image capture.
In some embodiments, the method may then proceed to step 740, where the scene may be illuminated according to the determined illumination pattern. In some embodiments, this method of compensating for lens shading and/or illumination sensitivity models may be combined with various other methods described herein.
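As an illustration only, the per-element compensation described above can be sketched in Python as follows. The function name, the array size, the sensitivity map values, and the base drive level are hypothetical assumptions made for illustration and do not describe a particular embodiment.

# Hypothetical sketch: drive each illumination element inversely to the relative
# illumination sensitivity of the image region it contributes to, then rescale so
# that no element exceeds full drive (1.0).
def compensate_sensitivity(sensitivity, base_level=0.6):
    # sensitivity: 2D list of relative sensitivity values (0..1], with lower values
    # toward the sensor periphery due to lens shading or vignetting.
    raw = [[base_level / s for s in row] for row in sensitivity]
    peak = max(value for row in raw for value in row)
    return [[value / peak for value in row] for row in raw]

pattern = compensate_sensitivity(
    [[0.70, 0.85, 0.85, 0.70],
     [0.85, 1.00, 1.00, 0.85],
     [0.85, 1.00, 1.00, 0.85],
     [0.70, 0.85, 0.85, 0.70]])
# Elements mapped to the image periphery are driven harder than center elements,
# offsetting the fall-off in illumination sensitivity.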
Exemplary method for providing backlight compensation
Fig. 9 is a flow diagram illustrating a method for providing backlight compensation using an illumination array and a projection lens, according to some embodiments. As shown in fig. 8, an exemplary backlit scene may include a foreground object or subject 800 and a background area 810.
The method begins at step 900, where during image capture, foreground objects (such as foreground object 800 of fig. 8) and background regions (such as background region 810 of fig. 8) within a field of view of an imaging device may be determined. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in fig. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting, and various methods of determining such objects and regions may be employed.
Once the foreground object and background regions have been determined, the method may proceed to step 910, where ambient lighting values for the foreground object and background regions may be determined. This determination may be performed in a variety of ways. For example, a sensor (such as sensor 200 shown in fig. 2) may be employed to determine ambient lighting values within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the ambient illumination value has been determined, the method may proceed to step 920 where groups of elements of the illumination array that emit light contributing to the illumination of the foreground object may be identified. Once identified, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the identified group of elements emits light to balance the illumination of the foreground object according to the ambient illumination values of the foreground object and the background area, as shown at step 930.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 940, where the scene may be illuminated according to the determined illumination pattern.
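For illustration, a minimal Python sketch of the balancing step might look as follows; the indices of the foreground-contributing elements, the ambient values, and the gain are hypothetical stand-ins for the quantities determined in steps 900 through 930.

# Hypothetical sketch: boost only the foreground-contributing elements so the
# foreground approaches the brighter background ambient level.
def backlight_compensation(pattern, foreground_idx, fg_ambient, bg_ambient, gain=1.0):
    boost = max(bg_ambient - fg_ambient, 0.0) * gain
    for row, col in foreground_idx:
        pattern[row][col] = min(pattern[row][col] + boost, 1.0)
    return pattern

pattern = backlight_compensation([[0.0] * 4 for _ in range(4)],
                                 foreground_idx=[(1, 1), (1, 2), (2, 1), (2, 2)],
                                 fg_ambient=0.2, bg_ambient=0.8)
# Only the central (foreground-contributing) elements receive the 0.6 boost.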
Exemplary method for providing ambient atmosphere preservation
Fig. 11 is a flow diagram illustrating a method for providing ambience preservation using a lighting array and a projection lens, in accordance with some embodiments. As shown in fig. 10, an exemplary ambient atmosphere preservation scene may include a foreground object or subject 1000 and a background region 1010.
The method begins at step 1100, where during image capture, foreground objects (such as foreground object 1000 of fig. 10) and background regions (such as background region 1010 of fig. 10) within a field of view of an imaging device may be determined. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in fig. 2) may be employed to locate various areas and objects within a scene. However, this example is not intended to be limiting, and various methods of determining such objects and regions may be employed.
Once the foreground object and background regions have been determined, the method may proceed to step 1110, where ambient lighting values for the foreground object and background regions may be determined. This determination may be performed in a variety of ways. For example, a sensor (such as sensor 200 shown in fig. 2) may be employed to determine ambient lighting values within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the ambient illumination values have been determined, the method may proceed to step 1120, where groups of elements of the illumination array that emit light contributing to the illumination of the foreground object may be identified. Further, in some embodiments, another group of elements of the illumination array that emit light contributing to the illumination of the background area may be identified.
Once identified, a controller (such as controller 130 of fig. 1) may configure a lighting pattern (such as lighting pattern 150 of fig. 1) such that the identified group of elements emits light to balance the lighting of the foreground object according to the ambient lighting values of the foreground object and the background area, as shown in step 1130. Further, in some embodiments, the identified other group of elements is disabled such that the background area is not illuminated by the other group of illuminating elements, thereby preserving ambient illumination of the background area.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 1140, where the scene may be illuminated according to the determined illumination pattern.
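A minimal sketch of this configuration, assuming a small rectangular array and hypothetical element groups, is shown below; it is illustrative only.

# Hypothetical sketch: light the foreground group while forcing the
# background-contributing group to zero so the background keeps its ambient look.
def preserve_ambience(rows, cols, foreground_idx, background_idx, fg_level=0.7):
    pattern = [[0.0] * cols for _ in range(rows)]
    for r, c in foreground_idx:
        pattern[r][c] = fg_level
    for r, c in background_idx:
        pattern[r][c] = 0.0   # background-contributing elements disabled
    return pattern

pattern = preserve_ambience(4, 4,
                            foreground_idx=[(1, 1), (2, 1)],
                            background_idx=[(0, 3), (1, 3), (2, 3), (3, 3)])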
Exemplary method for providing minimal interference using flash illumination
Fig. 13 is a flow diagram illustrating a method of providing minimal interference using an illumination array and a projection lens, according to some embodiments. As shown in fig. 12, an exemplary scene may include a foreground object or subject 1200 and a background area 1210.
The method begins at step 1300 in which, during image capture, a foreground object (such as foreground object 1200 of fig. 12) and a background region (such as background region 1210 of fig. 12) within a field of view of an imaging device may be determined. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in FIG. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting and various methods of determining such objects and regions may be employed.
In some embodiments, once the foreground object and background regions have been determined, the method may proceed to step 1310, where groups of elements of the illumination array that emit light contributing to the illumination of the foreground object may be identified.
Once identified, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the identified group of elements emits light to illuminate the foreground object, as shown in step 1320. Additionally, in some embodiments, the remaining elements of the lighting array are reduced or disabled such that interference with the ambient environment caused by the lighting of the lighting array is minimized.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 1330, where the scene may be illuminated according to the determined illumination pattern.
Exemplary method for providing depth compensation
Fig. 15 is a flow diagram illustrating a method for providing depth compensation using an illumination array and a projection lens, according to some embodiments. As shown in fig. 14, an exemplary scene may include objects or subjects 1440 and 1450 at respective distances 1445 and 1455 from an imaging device 1410 having a field of view 1420.
The method begins at step 1500 where, during image capture, a plurality of objects (such as objects 1440 and 1450 of fig. 14) and a background region within a field of view (such as field of view 1420 of fig. 14) of an imaging device (such as imaging device 1410 of fig. 14) may be determined. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in fig. 2) may be employed to locate various regions and objects within a scene.
However, this example is not intended to be limiting, and various methods of determining such objects and regions may be employed.
In some embodiments, once the object and background regions have been determined, the method may proceed to step 1510, where the respective distances of the objects from the imaging device (such as distances 1445 and 1455 of fig. 14) may be determined. This determination may be performed in a variety of ways. For example, a sensor (such as sensor 200 shown in fig. 2) may be employed to measure object distances within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the respective distances have been determined, the method may proceed to step 1520 in which respective groups of elements of the illumination array that emit light contributing to illumination of the respective objects may be identified.
In some embodiments, once the respective groups of elements are identified, as shown at step 1530, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the identified groups of elements emit light to illuminate the objects according to their respective distances and the ambient illumination values determined for the identified background region.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 1540, where the scene may be illuminated according to the determined illumination pattern.
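Purely for illustration, the distance-dependent scaling can be sketched as follows, assuming an inverse-square falloff; the element groups, distances, and base level are hypothetical.

# Hypothetical sketch: scale each object's element group by the inverse-square
# falloff of its distance relative to the nearest object, so farther subjects are
# not underexposed relative to nearer ones.
def depth_compensation(object_groups, distances_m, base_level=0.4):
    nearest = min(distances_m)
    levels = {}
    for group, dist in zip(object_groups, distances_m):
        scale = (dist / nearest) ** 2          # inverse-square compensation
        for idx in group:
            levels[idx] = min(base_level * scale, 1.0)
    return levels

levels = depth_compensation(object_groups=[[0, 1], [6, 7]], distances_m=[1.0, 2.0])
# Elements lighting the 2 m object are driven about 4x harder, clamped at full drive.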
Exemplary method for providing ambient atmosphere compensation
Fig. 17 is a flow diagram illustrating a method for providing ambient atmosphere compensation using an illumination array and a projection lens, according to some embodiments.
As shown in fig. 16, an exemplary scene may include objects or subjects 1600 and 1610.
The method begins at step 1700 where a plurality of objects within a field of view of an imaging device, such as objects 1600 and 1610 of fig. 16, may be determined during the capture of an image. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in FIG. 2) may be employed to locate various areas and objects within a scene. However, this example is not intended to be limiting and various methods of determining such objects and regions may be employed.
Once the object has been determined, the method may proceed to step 1710, where an ambient lighting value of the object may be determined. This determination may be performed in a variety of ways. For example, a sensor (such as sensor 200 shown in fig. 2) may be employed to determine ambient lighting values within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the ambient illumination values have been determined, the method may proceed to step 1720, where respective groups of elements of the illumination array that emit light contributing to illumination of respective objects may be identified.
In some embodiments, once the respective groups of elements are identified, as shown at step 1730, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the identified groups of elements emit light to illuminate the objects, thereby balancing the illumination of each object with the illumination of the other objects.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 1740, where the scene may be illuminated according to the determined illumination pattern.
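One possible way to express this balancing step, again purely as an illustrative assumption rather than a description of any particular embodiment, is sketched below.

# Hypothetical sketch: raise each object's fill light so its total illumination
# approaches that of the most brightly lit object.
def balance_objects(object_groups, ambient_levels, gain=1.0):
    target = max(ambient_levels)                  # brightest object's ambient level
    levels = {}
    for group, ambient in zip(object_groups, ambient_levels):
        fill = max(target - ambient, 0.0) * gain  # shortfall supplied by the array
        for idx in group:
            levels[idx] = min(fill, 1.0)
    return levels

levels = balance_objects(object_groups=[[2, 3], [10, 11]], ambient_levels=[0.8, 0.3])
# The dimly lit object's elements receive fill light; the bright object's receive none.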
Exemplary method for providing reflectivity compensation
Fig. 19 is a flow diagram illustrating a method for providing reflectivity compensation using an illumination array and a projection lens, according to some embodiments. As shown in fig. 18, an exemplary scene may include objects or subjects 1800 and 1810.
The method begins at step 1900 in which a plurality of objects, such as objects 1800 and 1810 of fig. 18, within a field of view of an imaging device may be determined during capture of an image. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in FIG. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting, and various methods of determining such objects and regions may be employed.
Once the object has been determined, the method can proceed to step 1910, where a corresponding reflectance value for the object can be determined. This determination may be performed in a variety of ways. For example, a combination of a sensor (such as sensor 200 shown in fig. 2) and light emitted by an illumination array (such as illumination element array 224 of fig. 2) may be employed to determine reflectance values within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the respective reflectance values have been determined, the method may proceed to step 1920 where respective groups of elements of the illumination array that emit light contributing to illumination of the respective objects may be identified.
In some embodiments, once the respective groups of elements are identified, as shown at step 1930, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the identified groups of elements emit amounts of light that illuminate the objects while compensating for reflectivity differences between the objects.
In some embodiments, once the illumination pattern has been configured, the method may proceed to step 1940, where the scene may be illuminated according to the determined illumination pattern.
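As an illustrative sketch only, with hypothetical reflectance values and element groupings, the inverse relationship between drive level and measured reflectance can be written as:

# Hypothetical sketch: drive each object's element group inversely to the object's
# measured reflectance so highly reflective objects are not overexposed.
def reflectivity_compensation(object_groups, reflectance, base_level=0.5):
    levels = {}
    for group, rho in zip(object_groups, reflectance):
        drive = min(base_level / max(rho, 1e-3), 1.0)   # inverse to reflectance
        for idx in group:
            levels[idx] = drive
    return levels

levels = reflectivity_compensation(object_groups=[[0, 1], [4, 5]], reflectance=[0.9, 0.3])
# The matte (0.3) object receives full drive; the shiny (0.9) object is dimmed.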
Exemplary method for providing low light scene illumination
Fig. 21 is a flow diagram illustrating a method for providing low light scene illumination using an illumination array and a projection lens, according to some embodiments. As shown in fig. 20, an exemplary scene may include a foreground object or subject 2000 and a background object 2010.
The method begins at step 2100, where during image capture, foreground objects (such as foreground object 2000 of fig. 20) and background objects (such as background object 2010 of fig. 20) within a field of view of an imaging device may be determined. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in FIG. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting, and various methods of determining such objects and regions may be employed.
Once the objects have been determined, the method can proceed to step 2110, where reference brightness values for the background and the foreground object, as well as a target brightness for the foreground object and its distance from the imaging device, can be determined. These determinations may be performed in a variety of ways. For example, a sensor (such as sensor 200 shown in fig. 2) may be employed to determine brightness values and distances within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the brightness values and distance values have been determined, the method may proceed to step 2120 in which respective groups of elements of the illumination array that emit light contributing to illumination of respective objects may be identified.
In some embodiments, once the respective group of elements is identified, as shown in step 2130, a controller (such as controller 130 of fig. 1) may configure an illumination pattern (such as illumination pattern 150 of fig. 1) such that the group of elements emitting light to illuminate the foreground object is configured according to the reference brightness, the target brightness, and the distance from the imaging device.
In some embodiments, once the lighting pattern has been configured, the method may proceed to step 2140, where the scene may be illuminated according to the determined lighting pattern.
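The relationship between reference brightness, target brightness, and distance can be illustrated with the short sketch below; the inverse-square weighting, the flash gain term, and the numeric values are assumptions made only for illustration.

# Hypothetical sketch: estimate the drive needed to raise a foreground subject from
# a reference brightness to a target brightness, with inverse-square falloff over
# the subject distance.
def low_light_drive(reference_lum, target_lum, distance_m, flash_gain=1.0):
    shortfall = max(target_lum - reference_lum, 0.0)
    required = shortfall * (distance_m ** 2) / flash_gain
    return min(required, 1.0)   # clamp to the element's maximum drive

drive = low_light_drive(reference_lum=0.05, target_lum=0.5, distance_m=1.2)
# A dim subject 1.2 m away would receive roughly 65% of maximum drive here.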
Exemplary method for providing indirect illumination
Fig. 23 is a flow diagram illustrating a method for providing indirect flash using an illumination array and a projection lens, according to some embodiments. As shown in fig. 22, an exemplary scene may include a foreground object or subject 2240 at a respective distance 2245 from an imaging device 2210 having a field of view 2220. Additionally, the scene may include reflective objects 2250 at respective distances 2255 from the imaging device 2210.
The method begins at step 2300, where during image capture, a foreground object (such as foreground object 2240 of fig. 22) and a reflective object (such as reflective object 2250 of fig. 22) may be determined. While the foreground object may be within the field of view of the imaging device, it should be noted that the reflective object may be within or outside of that field of view. However, in various embodiments, the reflective object may be located within the illumination field of the light source module.
This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in FIG. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the objects have been determined, the method may proceed to step 2310, where the orientation of the reflective object relative to the imaging device, the light source module, and the foreground object may be determined. This determination may be performed in a variety of ways. For example, a combination of a sensor (such as sensor 200 shown in fig. 2) and light emitted by an illumination array (such as illumination element array 224 of fig. 2) may be employed to determine the orientation of the reflective object. However, this example is not intended to be limiting and other approaches may be employed.
In some embodiments, once the orientation has been determined, the method may proceed to step 2320 in which respective groups of elements of the illumination array that emit light contributing to illumination of the foreground object may be identified, the groups of elements including groups that emit light that directly illuminates the foreground object and groups that emit light that indirectly illuminates the foreground object via reflection by the reflective object.
Once the corresponding groups of elements are identified, as shown in step 2330, a desired ratio of direct illumination to indirect illumination of the foreground object may be determined. This determination may be performed in a variety of ways. For example, the ratio may be determined according to a selected profile, according to user input through a user interface, or according to a configured default value. However, these examples are not intended to be limiting and other approaches may be employed.
In some embodiments, once the desired ratios have been determined, as shown in step 2340, a controller (such as controller 130 of fig. 1) may configure a lighting pattern (such as lighting pattern 150 of fig. 1) such that the emitted light is configured according to the desired ratios to illuminate respective groups of elements of the foreground object.
Once configured, the group of elements that emit light that directly illuminates the foreground object and the group of elements that emit light that indirectly illuminates the foreground object will be driven in proportion to one another, the proportion being based at least in part on the determined desired ratio.
In some embodiments, once the lighting pattern has been configured, the method may proceed to step 2350, where the scene may be illuminated according to the determined lighting pattern.
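The ratio-based split between the direct group and the bounce group might be sketched as below; the notion of a single total flash budget and the index sets are hypothetical simplifications for illustration only.

# Hypothetical sketch: split a total flash budget between the direct group and the
# bounce (indirect) group according to a desired direct-to-indirect ratio.
def split_direct_indirect(total_budget, ratio_direct, direct_idx, bounce_idx):
    # ratio_direct in [0, 1]: 1.0 means all direct light, 0.0 means all bounced light.
    direct_each = total_budget * ratio_direct / max(len(direct_idx), 1)
    bounce_each = total_budget * (1.0 - ratio_direct) / max(len(bounce_idx), 1)
    levels = {idx: min(direct_each, 1.0) for idx in direct_idx}
    levels.update({idx: min(bounce_each, 1.0) for idx in bounce_idx})
    return levels

levels = split_direct_indirect(total_budget=2.0, ratio_direct=0.3,
                               direct_idx=[0, 1], bounce_idx=[6, 7, 8])
# 30% of the budget goes to direct elements, 70% to elements aimed at the bounce object.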
Supplemental lighting adapted to artistic intent
Fig. 24 is a flow diagram illustrating a method for providing creative supplemental lighting matching artistic intent using an illumination array and a projection lens, according to some embodiments. The method starts in step 2400 where, during capture of an image, a lighting distribution model can be determined from artistic content. In various embodiments, this determination may be made in various ways. For example, in some embodiments, the lighting distribution model may be selected from a plurality of preconfigured lighting distribution models. Examples of such pre-configured illumination distribution models may include illumination models that provide illumination patterns on captured images to create regions of greater or lesser illumination, contrast, color, etc., e.g., as in Vermeer illumination or Hollywood illumination techniques. Another example of a pre-configured illumination distribution model may be a lighting pattern that creates an illumination intensity or color gradient, or one that increases contrast or depth perception in an image. Other examples may include lighting patterns intended to evoke an emotional response from the viewer of the image. Additionally, in some pre-configured illumination distribution models, objects identified in the field of view of the image may receive pre-configured different lighting, such as focused illumination on the objects identified in the image. In some implementations, an illumination distribution model may be determined based on previously captured images, for example, to replicate or supplement the lighting found in previously selected or associated images. These various examples are not intended to be limiting, and any number of illumination distribution models may be envisioned.
The method may then proceed to step 2410, where, if the determined model identifies a foreground object, that foreground object may be located within the scene. This determination may be performed in a variety of ways. For example, sensors (such as sensor 200 shown in fig. 2) may be employed to locate various regions and objects within a scene. However, this example is not intended to be limiting and other approaches may be employed.
Once the illumination distribution model and associated objects are determined and identified, a controller (such as controller 130 of fig. 1) may configure a lighting pattern (such as lighting pattern 150 of fig. 1) to provide lighting of a scene to be captured according to artistic intent, as shown at step 2420. In some embodiments, once the illumination pattern has been configured, the method may proceed to step 2430, where the scene may be illuminated according to the determined illumination pattern.
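As a rough illustration of selecting and rendering a preconfigured illumination distribution model, consider the sketch below; the model names, the gradient and spotlight formulas, and the array size are invented solely for illustration.

# Hypothetical sketch: look up a preconfigured illumination distribution model by
# name and render it into per-element drive levels.
def render_model(name, rows, cols):
    if name == "gradient_left_to_right":
        return [[c / max(cols - 1, 1) for c in range(cols)] for _ in range(rows)]
    if name == "spotlight_center":
        cr, cc = (rows - 1) / 2, (cols - 1) / 2
        return [[max(1.0 - 0.3 * (abs(r - cr) + abs(c - cc)), 0.0)
                 for c in range(cols)] for r in range(rows)]
    raise ValueError("unknown illumination distribution model: " + name)

pattern = render_model("spotlight_center", 4, 4)
# Drive levels fall off away from the array center, concentrating light on the subject.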
Exemplary lens and reflector
In some embodiments, the light source module may include a Total Internal Reflection (TIR) lens and/or a reflector. The TIR lens may be configured to reflect light such that the light is directed in a particular direction. For example, instead of light generated by a non-TIR light source spreading over 360 degrees or 180 degrees, a TIR lens may concentrate the light into a concentrated beam in a particular direction. In some implementations, a TIR lens may be included in the light source module between the lighting element and the dimmable diffusing material. The dimmable diffusing material may diffuse the concentrated light exiting the TIR lens. However, for long distance lighting scenes, the dimmable diffusing material may apply minimal diffusion, and the concentrated light beam from the TIR lens may propagate to farther scenes and illuminate those scenes to a greater extent than unconcentrated light from light sources that do not include a TIR lens. Thus, a light source module having both a TIR lens and a dimmable diffusing material may be configured to provide diffuse light to illuminate near to mid distance scenes and may provide a concentrated beam of light to reach distant scenes.
Fig. 25A illustrates an exemplary TIR lens. The lens 2502 receives light from the lighting element 2504 and provides a concentrated beam of light 2506. As can be seen in the cut-away view, the lens 2502 includes grooves 2508 that are angled such that light 2510 from the lighting element 2504 passes through a portion of the lens 2502 and reflects off of the grooves 2508 such that the reflected light is parallel to other light reflecting off other portions of the grooves 2508. Thus, while light 2510 from the lighting element 2504 is initially directed in multiple directions, light 2506 exiting the lens 2502 is concentrated and directed generally in the same direction.
Fig. 25B shows an exemplary reflector. The reflector comprises a reflector body 2552 having a curved shape designed such that light 2554 from the lighting element 2550 is reflected out of the reflector body such that the reflected light is parallel to other light reflected out of the reflector body. This results in a concentrated beam 2556 exiting reflector body 2552.
In some embodiments, the light source module may include both a TIR lens and a reflector (such as the reflector described in fig. 25B). Further, a dimmable diffusing material may be placed adjacent to the TIR lens such that light exiting the TIR lens passes through the dimmable diffusing material before exiting the light source module.
Additional uses of the light source module
In addition to illuminating a scene to be captured by a camera or video recorder, the light source module may be used as a flashlight, as an indicator to send visual notifications to a user, as a transmitter to transmit information via a modulated light signal, or for other purposes. When used as a flashlight, a dimmable diffusive material may be used to adjust the light beam emitted from the light source module. For example, a user of a mobile device with an embedded light source module may wish to have a wide light beam when searching an area, and a focused light beam when operating in a fixed location. When used in flashlight mode, the light beam may be adjusted using a light source module, such as any of the light source modules described above.
In some embodiments, a dimmable diffusive material may be used in a flashlight mode to adjust the light beam from the light source module between a broad beam and a concentrated or narrow beam.
In some embodiments, a controller of the mobile device may interact with one or more other components of the mobile device to determine whether the light source module in flashlight mode should emit a wide beam or a concentrated or narrow beam. For example, the controller may interact with signals from one or more gyroscopes, accelerometers, or other motion detection devices to determine whether the mobile device is scanning a wide area or is relatively stationary and focused on a single location. In response to determining that the mobile device is focused on a single location, the controller may switch from a wide beam mode to a narrow beam or a concentrated beam mode. In some embodiments, the controller may interact with a camera of the mobile device to detect objects in the scene and focus the light beam on one or more objects detected in the scene. For example, figs. 26A and 26B illustrate a light source module embedded in a mobile device in a flashlight mode. In fig. 26A, the light source module 2600 is in a flashlight mode and a narrow beam or a concentrated beam mode. The light source module 2600 emits a narrow light beam 2602. In fig. 26B, the light source module 2604 is embedded in the mobile device and is in a flashlight mode and a wide beam mode. The light source module 2604 emits a wide light beam 2606. In some embodiments, the light source module may be embedded in various devices (including mobile computing devices such as phones, tablets, etc.) and may be used in flashlight mode as described above.
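A simple illustration of the motion-based beam selection described above is sketched below; the gyroscope readings, the 0.5 rad/s threshold, and the function name are hypothetical and not tied to any particular sensor API.

# Hypothetical sketch: choose a wide or narrow flashlight beam from recent angular
# rate samples; sustained rotation suggests the user is sweeping an area.
def choose_beam(angular_rates_rad_s, threshold=0.5):
    mean_rate = sum(abs(w) for w in angular_rates_rad_s) / len(angular_rates_rad_s)
    return "wide" if mean_rate > threshold else "narrow"

beam = choose_beam([0.9, 1.1, 0.8])   # device is being swept across an area -> "wide"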
Fig. 27 is a flow diagram illustrating a method for enabling flashlight mode using an illumination array and a projection lens, according to some embodiments. In some embodiments, the method begins at step 2700 in which a controller (such as controller 130 in fig. 1) receives a request to enable an illumination array (such as illumination array 110 in fig. 1) in a flashlight mode, the request including a target illumination field (such as illumination field 140 in fig. 1).
In some embodiments, in response to the request, the controller (as shown in 2710) may then configure the lighting pattern of the lighting array (such as lighting pattern 150 of fig. 1) according to the target lighting field specified in the request.
In some embodiments, once the illumination pattern has been configured, the method may then proceed to step 2720, where the illumination array may be enabled according to the determined illumination pattern.
Multifunction device examples
Embodiments of electronic devices in which embodiments of the light source modules, camera modules, light diffusion modules, and the like described herein may be used, user interfaces for such devices, and associated processes for using such devices are described herein. As described above, in some embodiments, the light source module, camera module, light diffusion module, etc. may be included in a mobile computing device that may include a camera device. In some embodiments, the device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Other portable electronic devices may also be used, such as a laptop or tablet computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad). It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad). In some embodiments, the device is a gaming computer having an orientation sensor (e.g., an orientation sensor in a game controller). In other embodiments, the device is not a portable communication device, but is a camera device.
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
Devices typically support a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a web page creation application, a disc editing application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the device may use one or more common physical user interface devices, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture of the device (such as a touch-sensitive surface) may support various applications with a user interface that is intuitive and transparent to the user.
Fig. 28 shows a schematic representation of an exemplary device 4000 that may include, for example, a camera and illumination array as described herein with reference to fig. 1 through fig. 27, according to some embodiments. In some embodiments, device 4000 may be a mobile device and/or a multifunction device. In various embodiments, the apparatus 4000 may be any of various types of apparatuses, including but not limited to: personal computer systems, desktop computers, laptop computers, notebook computers, tablet computers, all-in-one computers, tablet or netbook computers, mainframe computer systems, handheld computers, workstations, network computers, cameras, set-top boxes, mobile devices, augmented Reality (AR) and/or Virtual Reality (VR) headsets, consumer devices, video game controllers, handheld video game devices, application servers, storage devices, televisions, video recording devices, peripheral devices (such as switches, modems, routers), or generally any type of computing or electronic device.
In some embodiments, device 4000 may include a display system 4002 (e.g., including a display and/or a touch-sensitive surface) and/or one or more cameras 4004. In some non-limiting embodiments, the display system 4002 and/or one or more front-facing cameras 4004a can be disposed on a front side of the apparatus 4000, for example, as indicated in fig. 28. Additionally or alternatively, one or more rear-facing cameras 4004b can be disposed at a rear side of the device 4000. In some implementations including multiple cameras 4004, some or all of the cameras may be the same or similar to each other. Additionally or alternatively, some or all of the cameras may be different from one another. In various implementations, the location and/or arrangement of the cameras 4004 may be different than those indicated in fig. 28. Additionally, apparatus 4000 may include light source modules 4018a and/or 4018b, which may be similar to light source module 100 described in fig. 1 and light source module 220 described in fig. 2. In some embodiments, the controller for the light source module may be implemented in software or hardware on the apparatus 4000.
Device 4000 may include memory 4006 (e.g., including an operating system 4008 and/or application/program instructions 4010), one or more processors and/or controllers 4012 (e.g., including a CPU, memory controller, display controller, and/or camera controller, etc.), and/or one or more sensors 4016 (e.g., orientation sensor, proximity sensor, and/or position sensor, etc.), among others. In some embodiments, the device 4000 may communicate with one or more other devices and/or services (such as computing device 4018, cloud services 4020, and so on) via one or more networks 4022. For example, device 4000 may include a network interface (e.g., network interface 4210) that enables device 4000 to transmit data to and receive data from network 4022. Additionally or alternatively, device 4000 may be capable of communicating with other devices via wireless communication using any of a variety of communication standards, protocols, and/or technologies.
Fig. 29 illustrates a schematic block diagram of an exemplary computing device, referred to as computer system 4200, that may include or host embodiments of a camera having a lighting array module, e.g., as described herein with respect to fig. 1-28, according to some embodiments. Further, computer system 4200 may implement methods for controlling the operation of a camera and/or for performing image processing on images captured with the camera. In some embodiments, apparatus 4000 (described herein with reference to fig. 28) may additionally or alternatively include some or all of the functional components of computer system 4200 described herein.
The computer system 4200 may be configured to perform any or all of the embodiments described above. In different embodiments, computer system 4200 may be any of a variety of types of devices, including but not limited to: personal computer systems, desktop computers, laptop computers, notebook computers, tablet computers, all-in-one computers, tablet or netbook computers, mainframe computer systems, handheld computers, workstations, network computers, cameras, set-top boxes, mobile devices, augmented Reality (AR) and/or Virtual Reality (VR) headsets, consumer devices, video game controllers, handheld video game devices, application servers, storage devices, televisions, video recording devices, peripheral devices (such as switches, modems, routers), or generally any type of computing or electronic device.
In the illustrated embodiment, computer system 4200 includes one or more processors 4202 coupled to system memory 4204 via an input/output (I/O) interface 4206. Computer system 4200 also includes one or more cameras 4208 coupled to I/O interface 4206 (and associated light source modules). Computer system 4200 also includes a network interface 4210 coupled to I/O interface 4206, and one or more input/output devices 4212 such as cursor control device 4214, keyboard 4216, and display 4218.
In various embodiments, computer system 4200 may be a single-processor system that includes one processor 4202, or a multi-processor system that includes several processors 4202 (e.g., two, four, eight, or another suitable number). Processor 4202 may be any suitable processor capable of executing instructions. For example, in various embodiments, processors 4202 may be general-purpose or embedded processors implementing any of a variety of Instruction Set Architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. Further, in some embodiments, one or more of processors 4202 may include additional types of processors, such as a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), and so forth. In a multiprocessor system, each of processors 4202 may typically (but not necessarily) implement the same ISA. In some embodiments, computer system 4200 may be implemented as a system on a chip (SoC). For example, in some embodiments, processor 4202, memory 4204, I/O interface 4206 (e.g., fabric), etc., may be implemented in a single SoC that includes multiple components integrated into a single chip. For example, a SoC may include multiple CPU cores, multi-core GPUs, multi-core neural engines, caches, one or more memories, etc., integrated into a single chip. In some embodiments, SoC implementations implement a Reduced Instruction Set Computing (RISC) architecture or any other suitable architecture.
The system memory 4204 may be configured to store program instructions 4220 accessible by the processor 4202. In various embodiments, the system memory 4204 may be implemented using any suitable memory technology such as Static Random Access Memory (SRAM), synchronous Dynamic RAM (SDRAM), non-volatile/flash type memory, or any other type of memory. Additionally, camera control data 4222 stored in memory 4204 may include any of the information or data structures described above. In some embodiments, the program instructions 4220 and/or data 4222 may be received, transmitted or stored on a different type of computer-accessible medium, or similar medium, separate from the system memory 4204 or the computer system 4200. In various embodiments, some or all of the functionality described herein may be implemented via such a computer system 4200.
In one embodiment, the I/O interface 4206 may be configured to coordinate I/O communications between the processor 4202, the system memory 4204, and any peripheral devices in the device, including the network interface 4210 or other peripheral device interfaces, such as the input/output devices 4212. In some embodiments, I/O interface 4206 may perform any necessary protocol, timing, or other data transformations to convert data signals from one component (e.g., system memory 4204) into a format suitable for use by another component (e.g., processor 4202). In some embodiments, I/O interface 4206 may include support for devices attached, for example, over various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard. In some embodiments, the functionality of I/O interface 4206 may be divided into two or more separate components, such as a north bridge and a south bridge. Further, in some embodiments, some or all of the functionality of I/O interface 4206 (such as an interface to system memory 4204) may be incorporated directly into processor 4202.
Network interface 4210 may be configured to allow data to be exchanged between computer system 4200 and other devices (e.g., carriers or proxy devices) attached to network 4224, or between nodes of computer system 4200. In various embodiments, network 4224 may comprise one or more networks including, but not limited to, a Local Area Network (LAN) (e.g., ethernet or an enterprise network), a Wide Area Network (WAN) (e.g., the internet), a wireless data network, some other electronic data network, or some combination thereof. In various embodiments, the network interface 4210 may support communication via, for example, a wired or wireless general purpose data network (such as any suitable type of ethernet network); communication via a telecommunications/telephony network (such as an analog voice network or a digital fiber optic communication network); communication via a storage area network (such as a fibre channel SAN), or via any other suitable type of network and/or protocol.
In some embodiments, the input/output devices 4212 may include one or more display terminals, keyboards, keypads, touch pads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 4200. Multiple input/output devices 4212 may be present in computer system 4200 or may be distributed on various nodes of computer system 4200. In some embodiments, similar input/output devices may be separate from computer system 4200 and may interact with one or more nodes of computer system 4200 through wired or wireless connections (such as through network interface 4210).
Those skilled in the art will appreciate that computer system 4200 is merely illustrative and is not intended to limit the scope of embodiments. In particular, the computer systems and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, internet appliances, PDAs, wireless telephones, pagers, and the like. Computer system 4200 may also be connected to other devices not shown or may operate as a standalone system. Further, the functionality provided by the illustrated components may be combined in fewer components or distributed in additional components in some embodiments. Similarly, in some embodiments, the functionality of some of the illustrated components may not be provided, and/or other additional functionality may be available.
Those skilled in the art will also recognize that while various items are illustrated as being stored in memory or on storage during use, these items, or portions thereof, may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments, some or all of these software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or portable article to be read by an appropriate drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 4200 may be transmitted to computer system 4200 via transmission media or signals (such as electrical, electromagnetic, or digital signals) transmitted via a communication medium such as a network and/or a wireless link. Various embodiments may also include receiving, transmitting or storing instructions and/or data implemented in accordance with the above description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include a non-transitory computer-readable storage medium or memory medium, such as a magnetic or optical medium, e.g., disk or DVD/CD-ROM, a volatile or non-volatile medium such as RAM (e.g., SDRAM, DDR, RDRAM, SRAM, etc.), ROM, or the like. In some embodiments, a computer-accessible medium may include transmission media or signals, such as electrical, electromagnetic, or digital signals, transmitted via a communication medium such as a network and/or a wireless link.
The following clauses describe exemplary embodiments consistent with the accompanying drawings and the above description.
1. A mobile computing device, the mobile computing device comprising:
a camera arrangement, the camera arrangement comprising:
an image capturing device;
a plurality of lighting elements configured to emit light;
a lens configured to project emitted light of the plurality of lighting elements according to a lighting field; and
a controller, wherein during capture of an image by the image capture device, the controller is configured to:
determining the illumination field and associated illumination pattern based at least in part on a profile of an imaging device, the profile comprising a field of view and an illumination sensitivity model; and
causing respective ones of the plurality of lighting elements to emit light through the lens to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the illumination sensitivity model of the profile of the imaging device.
2. The mobile computing device of clause 1, wherein:
the image capturing device includes an imaging lens and an imaging sensor;
the field of view of the profile is determined by a focal length of the imaging lens and a size of the imaging sensor; and
the illumination sensitivity model of the profile is based at least in part on one or more features of an image rendered by the imaging lens onto the imaging sensor.
3. The mobile computing device of clause 2, wherein the one or more features of the image rendered by the imaging lens onto the imaging sensor include at least a vignetting feature associated with the imaging lens.
4. The mobile computing device of clause 1, wherein:
the image capture device is configured to provide a plurality of focal lengths;
the profile of the imaging device is one of a plurality of profiles determined from respective configured focal lengths of the image capture devices;
the field of view of the profile is determined by a configured focal length of the image capture device; and
the illumination sensitivity model of the profile is based at least in part on the one or more features of the image rendered onto the imaging sensor by the imaging lens at the configured focal length; wherein respective sensitivity models of respective profiles of the plurality of profiles determined at different focal lengths of the imaging lens are different.
5. The mobile computing device of any of clauses 1-4, wherein the plurality of lighting elements is a two-dimensional array of lighting elements.
6. The mobile computing device of clause 1, wherein the controller is further configured to determine the illumination pattern from one or more objects identified in the field of view of the imaging device.
7. A light source module, the light source module comprising:
a plurality of lighting elements configured to emit light;
a lens configured to project emitted light of the plurality of lighting elements according to a lighting field; and
a controller, wherein during capture of an image by an image capture device, the controller is configured to:
determining the illumination field and associated illumination pattern of the light source module based at least in part on a profile of the imaging device, the profile comprising a field of view and an illumination sensitivity model; and
causing respective ones of the plurality of lighting elements to emit light through the lens to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the illumination sensitivity model of the profile of the imaging device.
8. The light source module according to clause 7,
wherein: the lens is an adjustable lens; and
during image capture by an image capture device, the controller is further configured to adjust the adjustable lens to the determined illumination field based at least in part on the field of view of the profile of the imaging device.
9. The light source module of clause 7 or clause 8, wherein:
the image capturing device includes an imaging lens and an imaging sensor;
the field of view of the profile is determined by a focal length of the imaging lens and a size of the imaging sensor; and
the illumination sensitivity model of the profile is based at least in part on one or more features of an image rendered by the imaging lens onto the imaging sensor.
10. The light source module of clause 9, wherein:
the imaging lens is configured to provide a plurality of focal lengths;
the profile of the imaging device is one of a plurality of profiles determined from respective configured focal lengths of the imaging lenses;
the field of view of the profile is determined by a configured focal length of the imaging lens; and
the illumination sensitivity model of the profile is based at least in part on the one or more features of the image rendered onto the imaging sensor by the imaging lens at the configured focal length; wherein respective sensitivity models of respective profiles of the plurality of profiles determined at different focal lengths of the imaging lens are different.
11. The light source module of clause 9 or clause 10, wherein the one or more features of the image rendered by the imaging lens onto the imaging sensor comprise at least a vignetting feature associated with the imaging lens.
12. The light source module according to any of clauses 7 to 11, wherein to determine the illumination field and the associated illumination pattern of the light source module, the controller is configured to evaluate a focus distance of the imaging lens.
13. The light source module of any one of clauses 7 to 12, wherein the plurality of lighting elements is a two-dimensional array of lighting elements.
14. The light source module of clause 7, wherein the controller is further configured to determine the illumination pattern from one or more objects identified in the field of view of the imaging device.
15. A method, the method comprising:
configuring a light source module during capture of an image by an image capture device, wherein the light source module comprises a plurality of lighting elements configured to emit light and a lens configured to project the emitted light of the plurality of lighting elements, and wherein the configuring comprises:
determining an illumination field and an associated illumination pattern of the light source module based at least in part on a profile of the imaging device, the profile comprising a field of view and an illumination sensitivity model; and
causing respective ones of the plurality of lighting elements to emit light through the lens to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the illumination sensitivity model of the profile of the imaging device.
16. The method of clause 15, wherein:
the image capturing device includes an imaging lens and an imaging sensor;
the field of view of the profile is determined by the focal length of the imaging lens and the size of the imaging sensor; and
the illumination sensitivity model of the profile is based at least in part on one or more features of an image rendered by the imaging lens onto the imaging sensor.
17. The method of clause 15, wherein:
the imaging lens is configured to provide a plurality of focal lengths;
the profile of the imaging device is one of a plurality of profiles determined from respective configured focal lengths of the imaging lenses;
the field of view of the profile is determined by a configured focal length of the imaging lens; and
the illumination sensitivity model of the profile is based at least in part on the one or more features of the image rendered onto the imaging sensor by the imaging lens at the configured focal length; wherein respective sensitivity models of respective profiles of the plurality of profiles determined at different focal lengths of the imaging lens are different.
18. The method of any of clauses 15-17, wherein the one or more features of the image rendered by the imaging lens onto the imaging sensor include at least a vignetting feature associated with the imaging lens.
19. The method of any of clauses 15-18, wherein to determine the illumination field and the associated illumination pattern of the light source module, the controller is configured to evaluate a focus distance of the imaging lens.
20. The method of any of clauses 15-19, wherein the plurality of lighting elements is a two-dimensional array of lighting elements.
21. A mobile computing device, the mobile computing device comprising: a camera arrangement, the camera arrangement comprising:
an image capturing device;
a plurality of lighting elements configured to emit light;
a controller for the plurality of lighting elements, wherein during image capture by an image capture device, the controller is configured to:
evaluating light emitted by the plurality of lighting elements, reflected by one or more objects, and detected at the image capture device to determine respective reflectance values of the one or more objects;
determining a lighting pattern of the light source module based at least in part on the respective reflectance values determined for the one or more objects; and
causing respective ones of the plurality of lighting elements to emit light to generate the determined lighting pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the respective reflectance values determined for the one or more objects.
22. The mobile computing device of clause 21, wherein:
the one or more objects include a foreground object and a bounce object;
light emitted by a first portion of the plurality of lighting elements is reflected by the bounce object to the foreground object and then reflected from the foreground object to the image capture device;
additional light emitted by a second portion of the plurality of lighting elements is reflected by the foreground object to the image capture device;
the illumination pattern of the light source module illuminates the foreground object;
the first portion of the plurality of lighting elements is configured to emit a first amount of light based at least in part on a reflectance value of the bounce object; and
the second portion of the plurality of lighting elements is configured to emit a second amount of light different from the first amount of light.
23. The mobile computing device of clause 22, wherein:
the light source module further comprises an adjustable lens configured to project the emitted light of the plurality of lighting elements;
the bouncing object is outside a field of view of the image capture device; and
the controller is further configured to adjust the adjustable lens to an illumination field that includes the bouncing object.
24. The mobile computing device of clause 22 or clause 23, wherein:
identifying the bouncing object using respective depth values determined for the bouncing object and the foreground object.
25. The mobile computing device of any of clauses 22-24, wherein:
the first amount of light and the second amount of light are determined according to a configured ratio of direct illumination and indirect illumination, respectively.
26. The mobile computing device of clause 21, wherein:
light emitted by a first portion of the plurality of lighting elements is reflected by a first object of the one or more objects to the image capture device;
light emitted by a second portion of the plurality of lighting elements is reflected by a second object of the one or more objects to the image capture device;
the first portion of the plurality of lighting elements is configured to emit a first amount of light based at least in part on the determined reflectance value of the first object;
the second portion of the plurality of lighting elements is configured to emit a second amount of light based at least in part on the determined reflectance value of the second object; and
the first amount of light and the second amount of light are different.
27. The mobile computing device of clause 26, wherein:
the first amount of light is inversely proportional to the determined reflectance value of the first object; and
the second amount of light is inversely proportional to the determined reflectance value of the second object.
28. A light source module, the light source module comprising:
a plurality of lighting elements configured to emit light;
a controller for the plurality of lighting elements, wherein during image capture by an image capture device, the controller is configured to:
evaluate light emitted by the plurality of lighting elements, reflected by one or more objects, and detected at the image capture device to determine respective reflectance values of the one or more objects;
determine a lighting pattern of the light source module based at least in part on the respective reflectance values determined for the one or more objects; and
cause respective ones of the plurality of lighting elements to emit light to generate the determined lighting pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the respective reflectance values determined for the one or more objects.
29. The light source module of clause 28, wherein:
the one or more objects include a foreground object and a bounce object;
light emitted by a first portion of the plurality of lighting elements is reflected by the bounce object to the foreground object and then reflected from the foreground object to the image capture device;
additional light emitted by a second portion of the plurality of lighting elements is reflected by the foreground object to the image capture device;
the illumination pattern of the light source module illuminates the foreground object;
the first portion of the plurality of lighting elements is configured to emit a first amount of light based at least in part on a reflectance value of the bounce object; and
the second portion of the plurality of lighting elements is configured to emit a second amount of light different from the first amount of light.
30. The light source module of clause 29, wherein:
the light source module further comprises an adjustable lens configured to project emitted light of the plurality of lighting elements;
the bounce object is outside a field of view of the image capture device; and
the controller is further configured to adjust the adjustable lens to provide an illumination field that includes the bounce object.
31. The light source module of clause 29 or clause 30, wherein:
the bounce object is identified using respective depth values determined for the bounce object and the foreground object.
32. The light source module of any one of clauses 29 to 31, wherein:
the first amount of light and the second amount of light are determined according to a configured ratio of direct illumination and indirect illumination, respectively.
33. The light source module of clause 28, wherein:
light emitted by a first portion of the plurality of lighting elements is reflected by a first object of the one or more objects to the image capture device;
light emitted by a second portion of the plurality of lighting elements is reflected by a second object of the one or more objects to the image capture device;
the first portion of the plurality of lighting elements is configured to emit a first amount of light based at least in part on the determined reflectance value of the first object;
the second portion of the plurality of lighting elements is configured to emit a second amount of light based at least in part on the determined reflectance value of the second object; and
the first amount of light and the second amount of light are different.
34. The light source module of clause 33, wherein:
the first amount of light is inversely proportional to the determined reflectance value of the first object; and
the second amount of light is inversely proportional to the determined reflectance value of the second object.
35. A method, the method comprising:
configuring a light source module during capture of an image by an image capture device, wherein the light source module comprises a plurality of lighting elements configured to emit light, and wherein the configuring comprises:
evaluating light emitted by the plurality of lighting elements, reflected by one or more objects, and detected at the image capture device to determine respective reflectance values of the one or more objects;
determining a lighting pattern of the light source module based at least in part on the respective reflectance values determined for the one or more objects; and
causing respective ones of the plurality of lighting elements to emit light to generate the determined lighting pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the respective reflectance values determined for the one or more objects.
36. The method of clause 35, wherein:
the one or more objects include a foreground object and a bounce object;
light emitted by a first portion of the plurality of lighting elements is reflected by the bounce object to the foreground object and then from the foreground object to the image capture device;
additional light emitted by a second portion of the plurality of lighting elements is reflected by the foreground object to the image capture device;
the illumination pattern of the light source module illuminates the foreground object;
the first portion of the plurality of lighting elements is configured to emit a first amount of light based at least in part on a reflectance value of the bounce object; and
the second portion of the plurality of lighting elements is configured to emit a second amount of light different from the first amount of light.
37. The method of clause 36, wherein:
the light source module further comprises an adjustable lens configured to project emitted light of the plurality of lighting elements;
the bounce object is outside a field of view of the image capture device; and
the method further comprises adjusting the adjustable lens to provide an illumination field that includes the bounce object.
38. The method of clause 36 or clause 37, wherein:
the bounce object is identified using respective depth values determined for the bounce object and the foreground object.
39. The method of any of clauses 36-38, wherein:
the first amount of light and the second amount of light are determined according to a configured ratio of direct illumination and indirect illumination, respectively.
40. The method of clause 35, wherein:
light emitted by a first portion of the plurality of lighting elements is reflected by a first object of the one or more objects to the image capture device;
light emitted by a second portion of the plurality of lighting elements is reflected by a second object of the one or more objects to the image capture device;
the first portion of the plurality of lighting elements is configured to emit a first amount of light inversely proportional to the determined reflectance value of the first object;
the second portion of the plurality of lighting elements is configured to emit a second amount of light inversely proportional to the determined reflectance value of the second object; and
the first amount of light and the second amount of light are different.
In various embodiments, the methods described herein may be implemented in software, hardware, or a combination thereof. Additionally, the order of the blocks of a method may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes will become apparent to those skilled in the art having the benefit of this disclosure. The various embodiments described herein are intended to be illustrative and not restrictive. Many variations, modifications, additions, and improvements are possible. Thus, multiple examples may be provided for components described herein as a single example. The boundaries between the various components, operations and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific example configurations. Other allocations of functionality are contemplated that may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the exemplary configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of the embodiments as defined in the claims that follow.
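As an informal illustration of the reflectance-driven control recited in clauses 21 through 40 above, the following Python sketch shows one way a controller might scale per-element output inversely with an object's estimated reflectance and apportion it between direct and bounce (indirect) illumination. It is a minimal sketch under simplifying assumptions: the names SceneObject and plan_intensities, the 0-to-1 reflectance scale, and the one-element-per-object mapping are inventions of this example, not part of the disclosure or the claims.

```python
# Illustrative sketch only -- not the claimed implementation. SceneObject and
# plan_intensities are invented names; reflectance is assumed to be a 0-1
# estimate derived from comparing emitted light with light detected at the sensor.
from dataclasses import dataclass


@dataclass
class SceneObject:
    name: str
    reflectance: float        # estimated reflectance, 0 (dark) to 1 (highly reflective)
    is_bounce_surface: bool   # True if the object is used for indirect (bounce) lighting


def plan_intensities(objects, target_exposure=1.0, indirect_ratio=0.3):
    """Return a per-object drive level in [0, 1] for the associated lighting elements.

    The drive level is inversely proportional to the object's estimated
    reflectance, then split between bounce surfaces and directly lit objects
    according to a configured indirect/direct ratio.
    """
    levels = {}
    for obj in objects:
        reflectance = max(obj.reflectance, 1e-3)       # guard against divide-by-zero
        level = target_exposure / reflectance           # inverse proportionality
        share = indirect_ratio if obj.is_bounce_surface else 1.0 - indirect_ratio
        levels[obj.name] = min(level * share, 1.0)      # clamp to the element maximum
    return levels


if __name__ == "__main__":
    scene = [
        SceneObject("face", reflectance=0.45, is_bounce_surface=False),
        SceneObject("ceiling", reflectance=0.85, is_bounce_surface=True),
    ]
    print(plan_intensities(scene))   # e.g. {'face': 1.0, 'ceiling': 0.35...}
```

A real controller would, of course, also weigh object distance, ambient brightness, and the adjustable lens field described elsewhere in the specification.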

Claims (21)

1. A mobile computing device, the mobile computing device comprising:
a camera arrangement, the camera arrangement comprising:
an image capture device;
a plurality of lighting elements configured to emit light;
a plurality of background lighting control schemes; and
a controller for the plurality of lighting elements, wherein during image capture by the image capture device, the controller is configured to:
identify, within a field of view of the image capture device, a foreground object and a background region distinct from the foreground object;
determine an illumination pattern of a light source module based at least in part on:
a distance of the foreground object to the image capture device;
an ambient brightness level of the foreground object;
an ambient brightness level of the background region; and
a selected one of the plurality of background lighting control schemes; and
cause respective ones of the plurality of lighting elements to emit light to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the distance of the foreground object to the image capture device, the ambient brightness level of the foreground object, and the ambient brightness level of the background region.
2. The mobile computing device of claim 1, wherein:
the selected background lighting control scheme is a background preservation control scheme; and
the illumination pattern of the light source module is configured to:
maintain the brightness of the background region at the ambient brightness level of the background region; and
increase the brightness of the foreground object above the ambient brightness level of the foreground object.
3. The mobile computing device of claim 1, wherein:
the selected background lighting control scheme is a background compensation control scheme;
the controller is further configured to identify respective ambient brightness levels of a plurality of background objects in the background region; and
the illumination pattern of the light source module is configured to:
provide different illumination to the foreground object and the plurality of background objects based at least in part on the respective ambient brightness levels of the plurality of background objects.
4. The mobile computing device of claim 3, wherein:
the controller is further configured to identify respective distances of the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the plurality of background objects to the image capture device.
5. The mobile computing device of claim 3 or claim 4, wherein:
the controller is further configured to identify respective reflectance values of the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective reflectance values of the plurality of background objects.
6. The mobile computing device of claim 1, wherein:
the controller is further configured to identify respective distances of one or more additional foreground objects within the field of view of the image capture device; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the one or more additional foreground objects.
7. The mobile computing device of claim 1, wherein:
the controller is further configured to identify respective ambient brightness levels of one or more additional foreground objects within the field of view of the image capture device; and
the illumination pattern of the light source module is determined based at least in part on the respective ambient brightness levels of the one or more additional foreground objects.
8. The mobile computing device of claim 1, wherein:
the illumination pattern of the light source module is determined based at least in part on a template image or a predetermined light distribution designed with artistic intent.
9. A light source module, the light source module comprising:
a plurality of lighting elements configured to emit light;
a plurality of background lighting control schemes; and
a controller for the plurality of lighting elements, wherein during image capture by an image capture device, the controller is configured to:
identify a foreground object and a background region different from the foreground object within a field of view of the image capture device;
determine an illumination pattern of the light source module based at least in part on:
a distance of the foreground object to the image capture device;
an ambient brightness level of the foreground object;
an ambient brightness level of the background region; and
a selected one of the plurality of background lighting control schemes; and
cause respective ones of the plurality of lighting elements to emit light to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the distance of the foreground object to the image capture device, the ambient brightness level of the foreground object, and the ambient brightness level of the background region.
10. The light source module of claim 9, wherein:
the selected background lighting control scheme is a background preservation control scheme; and
the illumination pattern of the light source module is configured to:
maintain the brightness of the background region at the ambient brightness level of the background region; and
increase the brightness of the foreground object above the ambient brightness level of the foreground object.
11. The light source module of claim 9, wherein:
the selected background lighting control scheme is a background compensation control scheme;
the controller is further configured to identify respective ambient brightness levels of a plurality of background objects in the background region; and
the illumination pattern of the light source module is configured to:
provide different illumination to the foreground object and the plurality of background objects based at least in part on the respective ambient brightness levels of the plurality of background objects.
12. The light source module of claim 11, wherein:
the controller is further configured to identify respective distances of the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the plurality of background objects to the image capture device.
13. The light source module of claim 11 or claim 12, wherein:
the controller is further configured to identify respective reflectance values of the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective reflectance values of the plurality of background objects.
14. The light source module of claim 9, wherein:
the controller is further configured to identify respective distances of one or more additional foreground objects within the field of view of the image capture device; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the one or more additional foreground objects.
15. The light source module of claim 9, wherein:
the controller is further configured to identify respective ambient brightness levels of one or more additional foreground objects within the field of view of the image capture device; and
the illumination pattern of the light source module is determined based at least in part on the respective ambient brightness levels of the one or more additional foreground objects.
16. A method, the method comprising:
configuring a light source module during capture of an image by an image capture device, wherein the light source module comprises a plurality of lighting elements configured to emit light and a plurality of background lighting control schemes, and wherein the configuring comprises:
identifying, within a field of view of the image capture device, a foreground object and a background region distinct from the foreground object;
determining an illumination pattern of the light source module based at least in part on:
a distance of the foreground object to the image capture device;
an ambient brightness level of the foreground object;
an ambient brightness level of the background region; and
a selected one of the plurality of background lighting control schemes; and
causing respective ones of the plurality of lighting elements to emit light to generate the determined illumination pattern, wherein the respective ones of the plurality of lighting elements are respectively configured to emit different amounts of light based at least in part on the distance of the foreground object to the image capture device, the ambient brightness level of the foreground object, and the ambient brightness level of the background region.
17. The method of claim 16, wherein:
the selected background lighting control scheme is a background preservation control scheme; and
the illumination pattern of the light source module maintains the brightness of the background region at the ambient brightness level of the background region and increases the brightness of the foreground object above the ambient brightness level of the foreground object.
18. The method of claim 16, wherein:
the selected background lighting control scheme is a background compensation control scheme;
the configuring further comprises identifying respective ambient brightness levels of a plurality of background objects in the background region; and
the illumination pattern of the light source module provides different illumination to the foreground object and the plurality of background objects based at least in part on the respective ambient brightness levels of the plurality of background objects.
19. The method of claim 18, wherein:
the configuring further comprises identifying respective distances of the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the plurality of background objects to the image capture device.
20. The method of claim 18 or claim 19, wherein:
the configuring further comprises identifying respective reflectance values for the plurality of background objects; and
the illumination pattern of the light source module is determined based at least in part on the respective reflectance values of the plurality of background objects.
21. The method of claim 16, wherein:
the configuring further comprises identifying respective distances of one or more additional foreground objects within the field of view of the image capture device; and
the illumination pattern of the light source module is determined based at least in part on the respective distances of the one or more additional foreground objects.
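For readers following the claims above from an implementation standpoint, the sketch below (in the same hypothetical Python style as the earlier example) shows how the two named background lighting control schemes could translate into a per-zone illumination decision. The scheme identifiers, the brightness units, and the function background_zone_level are assumptions of this sketch, not terms defined by the claims.

```python
# Hypothetical helper; names and units are invented for illustration only.
def background_zone_level(scheme, ambient_brightness, target_brightness):
    """Return the extra illumination to add to a background zone.

    Under the background preservation scheme the zone stays at its ambient
    level; under the background compensation scheme dimmer zones receive
    more fill light so the background is evened out toward the target.
    """
    if scheme == "preservation":
        return 0.0                                        # keep the ambient level unchanged
    if scheme == "compensation":
        return max(target_brightness - ambient_brightness, 0.0)
    raise ValueError(f"unknown background lighting control scheme: {scheme}")


if __name__ == "__main__":
    for zone_ambient in (0.2, 0.6, 0.9):
        print(zone_ambient, background_zone_level("compensation", zone_ambient, 0.7))
```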
CN202211165800.XA 2021-09-24 2022-09-23 Adaptive flash photography, video and/or flash using camera, scene or user input parameters Pending CN115866112A (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US202163248398P 2021-09-24 2021-09-24
US63/248,398 2021-09-24
US17/934,491 US20230101548A1 (en) 2021-09-24 2022-09-22 Adaptive-Flash Photography, Videography, and/or Flashlight Using Camera, Scene, or User Input Parameters
US17/934,491 2022-09-22
US17/934,506 2022-09-22
US17/934,494 US12132996B2 (en) 2021-09-24 2022-09-22 Adaptive-flash photography, videography, and/or flashlight using camera, scene, or user input parameters
US17/934,494 2022-09-22
US17/934,506 US12219267B2 (en) 2021-09-24 2022-09-22 Adaptive-flash photography, videography, and/or flashlight using camera, scene, or user input parameters

Publications (1)

Publication Number Publication Date
CN115866112A true CN115866112A (en) 2023-03-28

Family

ID=85477378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211165800.XA Pending CN115866112A (en) 2021-09-24 2022-09-23 Adaptive flash photography, video and/or flash using camera, scene or user input parameters

Country Status (2)

Country Link
CN (1) CN115866112A (en)
DE (1) DE102022210127A1 (en)

Also Published As

Publication number Publication date
DE102022210127A1 (en) 2023-03-30

Similar Documents

Publication Publication Date Title
US11636641B2 (en) Electronic device for displaying avatar corresponding to external object according to change in position of external object
US20240064419A1 (en) Systems and methods for digital photography
US12219267B2 (en) Adaptive-flash photography, videography, and/or flashlight using camera, scene, or user input parameters
US20230101548A1 (en) Adaptive-Flash Photography, Videography, and/or Flashlight Using Camera, Scene, or User Input Parameters
CN113612930A (en) System and method for capturing digital images
US9582083B2 (en) Directional light sensors
CN105794196A (en) Method, apparatus and computer program product for modifying illuminance in an image
US10249076B2 (en) Image processing apparatus, image capturing apparatus, image processing method and storage medium storing image processing program
WO2019019904A1 (en) White balance processing method and apparatus, and terminal
US11393410B2 (en) Method of acquiring outside luminance using camera sensor and electronic device applying the method
US12132996B2 (en) Adaptive-flash photography, videography, and/or flashlight using camera, scene, or user input parameters
CN107533275A (en) Spatially adjustable flash of light for imaging device
US11736677B2 (en) Projector for active stereo depth sensors
CN115866112A (en) Adaptive flash photography, video and/or flash using camera, scene or user input parameters
CN111935388A (en) Flash lamp control method of mobile terminal, storage medium and equipment
CN112995628B (en) Projection control method and device, storage medium and projection equipment
CN114549607A (en) Method and device for determining main body material, electronic equipment and storage medium
CN113532800A (en) Analysis method of light-transmitting area and related equipment and device
US11985739B2 (en) Low light exposure control using infrared light-emitting diode zones during night vision
US12238399B2 (en) Electronic device including camera
CN117541478B (en) Image processing method and related device
CN117170616A (en) Display output control method and electronic equipment
WO2024129131A1 (en) Image noise reduction based on human vision perception
KR20240052581A (en) Method and electronic device for controlling camera
WO2024129130A1 (en) Per-tile selective processing for video bokeh power reduction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination