WO2024170598A1 - Behind OLED authentication
- Publication number
- WO2024170598A1 (PCT/EP2024/053682)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern
- illumination source
- flood
- light
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/40—Spoof detection, e.g. liveness detection
- G06V40/45—Detection of the body part being alive
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Definitions
- the invention relates to an optoelectronic apparatus, a use of the optoelectronic apparatus, a device for authenticating a user, and a method for authenticating a user of a device.
- the present invention further relates to a computer program, a computer-readable storage medium and a non-transient computer-readable medium.
- the devices, methods and uses according to the present invention specifically may be employed for example in various areas of daily life, security technology, gaming, traffic technology, production technology, photography such as digital photography or video photography for arts, documentation or technical purposes, safety technology, information technology, agriculture, crop protection, maintenance, cosmetics, medical technology or in the sciences. However, other applications are also possible.
- Available authentication systems in mobile devices, such as smartphones, tablets and the like, include cameras.
- Said mobile devices usually have a front display such as an organic light-emitting diode (OLED) area.
- At the position of the camera, a cutout of the display is required. These cutouts, the so-called notch, decrease the available display area and thus the display area available to the user.
- Since the notch is an unpleasant feature to the user, the camera is restricted to the outermost position possible in order to avoid dark areas, for example in the middle of the display.
- Active methods may use laser-based projection technologies for enriching scenes with additional information.
- In such methods, the camera acts as the receiver (Rx) and the laser projector as the transmitter (Tx).
- established active methods such as structured light, active stereo, or time of flight use light typically passing through a maximally transparent protective glass to reconstruct the three-dimensional structure of the illuminated scene from the captured images.
- A CMOS camera sensitive in the near infrared spectral range and a laser projector emitting light in the near infrared spectral range may be used, e.g. for ensuring that the method is invisible to the user.
- Such a setup may be used in a smartphone, e.g. in order to perform a secure biometric unlocking of the smartphone.
- such a setup requires additional cutouts in the display, e.g. in order to be used together with a selfie camera, which may work in the visible spectral range.
- the laser projector may be configured for operating behind the semitransparent OLED display. This may allow reducing the number and size of cutouts within the display.
- the semitransparent OLED-display typically comprises an OLED-pixel-structure.
- the OLED-pixel-structure may in particular be defined by the optically non-transparent cathode.
- the use of transparent conductive tracks and the design of the drive electronics can enable (semi-)transparent areas between the OLED pixels. Whereas suitable transparency can be ensured for passive operation, e.g. of the selfie camera, light from the laser projector undergoes diffraction loss.
- active methods may have an effective transmission of only 3 % to 20 %.
- Such diffraction loss occurs for the light emission (Tx) as well as for the light detection (Rx).
- the diffraction loss can be visible as artefacts, e.g. decrease of contrast, in the recorded images.
- known methods and devices face the following drawbacks: the resulting light at the light detection is significantly decreased due to double passage through the display; the remaining image quality may be further reduced by diffraction artifacts; the usable laser power, which would be necessary for compensating such losses, is limited due to requirements of eye safety (maximum dose after emission from the display) and display stability (maximum thermal dose).
- WO 2021/259923 A1 describes a projector and illumination module configured for scene illumination and pattern projection.
- the projector and illumination module comprises at least one array of a plurality of individual emitters and at least one optical system. Each of the individual emitters is configured for generating at least one illumination light beam.
- the optical system comprises at least one array of a plurality of transfer devices.
- the array of transfer devices comprises at least one transfer device for each of the individual emitters.
- the array of transfer devices comprises at least two groups of transfer devices. The transfer devices of the two groups differ in at least one property.
- the transfer devices of one of the groups are configured for generating at least one illumination pattern in response to illumination light beams impinging on said transfer devices.
- the transfer devices of the other group are configured for generating diverging light beams in response to illumination light beams impinging on said transfer devices.
- CN 114 779 491 A describes a terminal display module comprising a display panel and a three-dimensional imaging module.
- WO 2022/101429 A1 describes a display device which comprises: at least one illumination source configured for projecting at least one illumination beam on at least one scene; at least one optical sensor having at least one light sensitive area, wherein the optical sensor is configured for measuring at least one reflection light beam generated by the scene in response to illumination by the illumination beam; at least one translucent display configured for displaying information, wherein the illumination source and the optical sensor are placed in direction of propagation of the illumination light beam in front of the display; at least one control unit configured for turning off the display in an area of the illumination source during illumination and/or in an area of the optical sensor during measuring.
- an optoelectronic apparatus comprising: at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal to 4000 spots; at least one flood illumination source configured for emitting infrared flood light; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light.
- a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm.
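For illustration only, the following minimal Python sketch encodes the two numeric constraints stated above as a hypothetical configuration object with a validation step; the class and field names are assumptions and not part of the disclosed apparatus.

```python
# Illustrative sketch only: a hypothetical configuration object encoding the numeric
# constraints stated above (spot count and emitter spacing). All names are assumptions.
from dataclasses import dataclass

@dataclass
class BehindOledIlluminationConfig:
    num_pattern_spots: int        # spots in the infrared light pattern
    emitter_distance_mm: float    # flood-to-pattern illumination source distance

    def validate(self) -> None:
        # Claimed upper bound on the number of infrared light spots.
        if self.num_pattern_spots > 4000:
            raise ValueError("pattern must contain at most 4000 spots")
        # Claimed upper bound on the relative distance between the two sources.
        if self.emitter_distance_mm >= 3.0:
            raise ValueError("flood/pattern source distance must be below 3.0 mm")

# Example: a 2000-spot pattern with the two sources combined into one module.
BehindOledIlluminationConfig(num_pattern_spots=2000, emitter_distance_mm=1.5).validate()
```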
- the term “optoelectronic apparatus” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a device or system operating on light and electrical currents.
- the term “light” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to electromagnetic radiation in one or more of the infrared, the visible and the ultraviolet spectral range.
- the term “ultraviolet spectral range” generally, refers to electromagnetic radiation having a wavelength of 1 nm to 380 nm, preferably of 100 nm to 380 nm.
- the term “infrared spectral range” (IR) generally refers to electromagnetic radiation of 760 nm to 1000 μm, wherein the range of 760 nm to 1.5 μm is usually denominated as “near infrared spectral range” (NIR), while the range from 1.5 μm to 15 μm is denoted as “mid infrared spectral range” (MidIR) and the range from 15 μm to 1000 μm as “far infrared spectral range” (FIR).
- light used for the typical purposes of the present invention is light in the infrared (IR) spectral range, more preferred in the near infrared (NIR) and/or the mid infrared spectral range (MidIR), especially light having a wavelength of 1 μm to 5 μm, preferably of 1 μm to 3 μm.
- the term “illuminate”, as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to the process of exposing at least one element to light.
- the term “illumination source” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary device configured for generating or providing light in the sense of the above-mentioned definition.
- pattern illumination source as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary device configured for generating or providing at least one light pattern, in particular at least one infrared light pattern.
- the term “light pattern” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one arbitrary pattern comprising a plurality of light spots.
- the light spot may be at least partially spatially extended. At least one spot or any spot may have an arbitrary shape.
- a circular shape of at least one spot or any spot may be preferred.
- the spots may be arranged by considering a structure of a display comprised by a device that further comprises the optoelectronic apparatus. Typically, an arrangement of an OLED-pixel-structure of the display may be considered.
- the term “infrared light pattern” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a light pattern comprising spots in the infrared spectral range.
- the infrared light pattern may be a near infrared light pattern.
- the infrared light may be coherent.
- the infrared light pattern may be a coherent infrared light pattern.
- the pattern illumination source may be configured for emitting light at a single wavelength, e.g. in the near infrared region. In other embodiments, the pattern illumination source may be adapted to emit light with a plurality of wavelengths, e.g. for allowing additional measurements in other wavelength channels.
- the infrared light pattern may comprise at least one regular and/or constant and/or periodic pattern such as a triangular pattern, a rectangular pattern, a hexagonal pattern or a pattern comprising further convex tilings.
- the infrared light pattern is a hexagonal pattern, preferably a hexagonal infrared light pattern, preferably a 2/5 hexagonal infrared light pattern.
- the terms “triangular”, “rectangular” and “hexagonal” as used herein are broad terms and are to be given their ordinary and customary meaning to a person of ordinary skill in the art and are not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a two-dimensional distribution of spots, in particular unit cells of the pattern.
- the unit cell in a triangular pattern is a triangle.
- a triangular pattern may comprise a plurality of groups of spots, wherein each group comprises at least three spots forming a triangle.
- in a rectangular pattern, the spots of rows and columns are arranged at right angles to each other, e.g. forming square unit cells, rectangular unit cells or centered rectangular unit cells.
- a hexagonal pattern comprises a plurality of groups of spots, wherein each group has at least four spots forming a hexagonal unit cell, e.g. wherein three hexagonal cells form a hexagonal prism.
- an angle of 120° is formed between neighboring spots of rows and columns.
- Different packing densities or fractions are conceivable for the pattern, e.g. for the hexagonal pattern, e.g. a 2/5 packing density. Using a periodic 2/5 hexagonal pattern can allow distinguishing between artefacts and usable signal.
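As an illustration of one possible reading of a “2/5 hexagonal pattern” (two of every five sites of a hexagonal lattice carrying a spot), the Python sketch below generates such a thinned lattice; this interpretation, the pitch and the grid size are assumptions made for illustration only.

```python
# Sketch of a thinned hexagonal spot lattice. The "2/5" reading (two of every five
# lattice sites carry a spot) is an assumption for illustration, not a definition.
import math

def hexagonal_spots(rows: int, cols: int, pitch: float, keep: int = 2, period: int = 5):
    """Return (x, y) spot centres of a hexagonal lattice, keeping `keep` of every
    `period` sites along each row."""
    spots = []
    for r in range(rows):
        # Odd rows are shifted by half a pitch; row spacing is pitch * sqrt(3)/2.
        x_offset = 0.5 * pitch if r % 2 else 0.0
        y = r * pitch * math.sqrt(3) / 2
        for c in range(cols):
            if c % period < keep:           # 2/5 thinning of the full lattice
                spots.append((c * pitch + x_offset, y))
    return spots

pattern = hexagonal_spots(rows=40, cols=50, pitch=1.0)
print(len(pattern), "spots")               # 800 spots, well below the 4000-spot bound
```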
- the infrared light pattern may comprise at least one point pattern.
- the infrared light pattern has a low point density.
- the number of infrared light spots is below or equal to 4000 spots.
- the infrared light pattern may comprise equal to or less than 3000 spots, preferably equal to or less than 2000 spots.
- the number of spots may be below 2000 and/or above 0, preferably above 5, more preferably above 10, most preferably above 100.
- the infrared light pattern may have a low point density, in particular in comparison with other structured light techniques, which typically have a point density of 10k to 30k points in a field of view of 55°x38°. Using such a low point density may allow compensating for the above-mentioned diffraction loss.
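The following back-of-the-envelope Python sketch compares the angular point density of such a dense structured-light pattern with a low-density pattern within the claimed spot bound; the 2000-spot value is merely one example, not a preferred embodiment.

```python
# Sketch: comparing angular point densities. The 10k-30k points over a 55 x 38 degree
# field come from the text; the 2000-spot case is one illustrative value.
def spots_per_square_degree(num_spots: int, fov_deg=(55.0, 38.0)) -> float:
    return num_spots / (fov_deg[0] * fov_deg[1])

print(round(spots_per_square_degree(30000), 2))  # ~14.35 spots/deg^2 (dense reference)
print(round(spots_per_square_degree(2000), 2))   # ~0.96 spots/deg^2 (low-density pattern)
```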
- a contrast in the pattern image may be increased.
- Increasing the number of points would decrease the irradiance per point.
- the decreased number of spots may lead to an increase in irradiance of a spot and thus, to an increase in contrast in the pattern image of the projection of the infrared light pattern.
- the infrared light pattern may have a periodic point pattern with a reduced number of spots, wherein each of the spots has a high irradiance.
- Such a light pattern can ensure improved authentication using illumination sources and image generation unit behind a display.
- the low number of spots can ensure complying with eye safety requirements and stability requirements.
- the allowed dose may be divided between the spots of the light pattern.
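Purely as illustrative arithmetic, the sketch below divides a hypothetical allowed power budget evenly across the spots of the pattern to show why fewer spots yield more power (and hence irradiance) per spot; the budget value is a placeholder, not an eye-safety figure.

```python
# Illustrative arithmetic only: splitting a hypothetical allowed optical power budget
# across the spots of the pattern. The budget value is a placeholder.
def power_per_spot(total_budget_mw: float, num_spots: int) -> float:
    return total_budget_mw / num_spots

# Fewer spots -> more power per spot for the same overall budget.
print(power_per_spot(total_budget_mw=100.0, num_spots=2000))   # 0.05 mW per spot
print(power_per_spot(total_budget_mw=100.0, num_spots=20000))  # 0.005 mW per spot
```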
- At least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
- beam divergence as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one measure of an increase in at least one diameter and/or at least one diameter equivalent, such as a radius, with a distance from an optical aperture from which the beam emerges.
- the measure may be an angle or an angle equivalent.
- a beam divergence may be determined at 1/e².
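The following Python sketch illustrates how a spot diameter grows with distance for a given full divergence angle (taken at 1/e²); the initial diameter and working distance are assumptions chosen for illustration.

```python
# Sketch: growth of a spot diameter with distance for a given full divergence angle.
# The initial diameter (0.5 mm) and working distance (300 mm) are illustrative.
import math

def spot_diameter_mm(initial_diameter_mm: float, full_divergence_deg: float,
                     distance_mm: float) -> float:
    half_angle = math.radians(full_divergence_deg) / 2.0
    return initial_diameter_mm + 2.0 * distance_mm * math.tan(half_angle)

# A 0.3 degree full divergence over an assumed 300 mm working distance:
print(round(spot_diameter_mm(0.5, 0.3, 300.0), 2))  # ~2.07 mm spot diameter
```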
- the pattern illumination source may comprise at least one pattern projector configured for generating the infrared light pattern.
- the pattern illumination source e.g. the pattern projector, may comprise at least one emitter, in particular a plurality of emitters.
- emitter as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one arbitrary device configured for providing at least one light beam.
- the light beam may generate the infrared light pattern.
- the emitter may comprise at least one element selected from the group consisting of at least one laser source such as at least one semi-conductor laser, at least one double heterostructure laser, at least one external cavity laser, at least one separate confinement heterostructure laser, at least one quantum cascade laser, at least one distributed Bragg reflector laser, at least one polariton laser, at least one hybrid silicon laser, at least one extended cavity diode laser, at least one quantum dot laser, at least one volume Bragg grating laser, at least one Indium Arsenide laser, at least one Gallium Arsenide laser, at least one transistor laser, at least one diode pumped laser, at least one distributed feedback laser, at least one quantum well laser, at least one interband cascade laser, at least one semiconductor ring laser, at least one vertical cavity surface emitting laser (VCSEL); at least one non-laser light source such as at least one LED or at least one light bulb.
- the pattern projector comprises at least one VCSEL, preferably a plurality of VCSELs.
- the plurality of VCSELs may be arranged in at least one array, e.g. comprising a matrix of VCSELs.
- the VCSELs may be arranged on the same substrate, or on different substrates.
- the term “vertical-cavity surface-emitting laser” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a semiconductor laser diode configured for laser beam emission perpendicular with respect to a top surface. Examples of VCSELs are generally known to the skilled person, e.g. from WO 2017/222618 A. Each of the VCSELs is configured for generating at least one light beam.
- the VCSEL or the plurality of VCSELs may be configured for generating the desired spot number equal to or below 4000 spots, preferably equal to or below 3000 spots, more preferably equal to or below 2000 spots.
- the plurality of generated spots may be associated with the infrared light pattern.
- the VCSELs may be configured for emitting light beams at a wavelength range from 800 to 1000 nm.
- the VCSELs may be configured for emitting light beams at 808 nm, 850 nm, 940 nm, and/or 980 nm.
- the VCSELs emit light at 940 nm, since terrestrial sun radiation has a local minimum in irradiance at this wavelength, e.g. as described in CIE 085-1989 “Solar Spectral Irradiance”.
- the pattern illumination source may comprise at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the pattern projector.
- the pattern illumination source, particularly the optical element, may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element.
- the DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam.
- a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots.
- Further arrangements, particularly comprising a different number of projecting VCSEL and/or at least one different optical element configured for increasing the number of spots may be possible.
- Other multiplication factors are possible.
- a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
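A trivial Python sketch of the resulting spot count when each emitter beam is replicated by a DOE or metasurface element; the emitter count and multiplication factor below are illustrative assumptions.

```python
# Sketch: total projected spots when a DOE (or metasurface) replicates each emitter
# beam. The emitter count and multiplication factor are illustrative assumptions.
def total_spots(num_emitter_beams: int, doe_multiplication: int) -> int:
    return num_emitter_beams * doe_multiplication

# e.g. 2000 emitter beams duplicated once by the optical element:
assert total_spots(2000, 2) == 4000          # still within the claimed bound
```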
- the pattern illumination source may comprise at least one transfer device.
- transfer device also denoted as “transfer system”, as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to one or more optical elements which are adapted to modify the light beam, particularly the light beam used for generating at least a portion of the infrared light pattern, such as by modifying one or more of a beam parameter of the light beam, a width of the light beam or a direction of the light beam.
- the transfer device may comprise at least one imaging optical device.
- the transfer device specifically may comprise one or more of: at least one lens, for example at least one lens selected from the group consisting of at least one focus-tunable lens, at least one aspheric lens, at least one spherical lens, at least one Fresnel lens; at least one diffractive optical element; at least one concave mirror; at least one beam deflection element, preferably at least one mirror; at least one beam splitting element, preferably at least one of a beam splitting cube or a beam splitting mirror; at least one multilens system; at least one holographic optical element; at least one meta optical element.
- the transfer device comprises at least one refractive optical lens stack.
- the transfer device may comprise a multi-lens system having refractive properties.
- the light beam or light beams generated by the pattern illumination source may propagate parallel to an optical axis.
- the pattern illumination source may comprise at least one reflective element, preferably at least one prism, for deflecting the light beam onto the optical axis.
- the light beam or light beams, such as the laser light beam, and the optical axis may include an angle of less than 10°, preferably less than 5° or even less than 2°. Other embodiments, however, are feasible. Further, the light beam or light beams may be on the optical axis or off the optical axis.
- the light beam or light beams may be parallel to the optical axis, having a distance of less than 10 mm to the optical axis, preferably less than 5 mm to the optical axis or even less than 1 mm to the optical axis, or may even coincide with the optical axis.
- the term “flood illumination source” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one arbitrary device configured for providing substantially continuous spatial illumination.
- the term “flood light” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to substantially continuous spatial illumination, in particular diffuse and/or uniform illumination.
- the flood light has a wavelength in the infrared range, in particular in the near infrared range.
- the flood illumination source may comprise at least one LED or at least one VCSEL, preferably a plurality of VCSELs.
- substantially continuous spatial illumination as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to uniform spatial illumination, wherein areas of non-uniformity are possible.
- the area e.g. covering a user, a portion of the user and/or a face of the user, illuminated from the flood illumination source, may be contiguous. Power may be spread over a whole field of illumination.
- illumination provided by the light pattern may comprise at least two contiguous areas, in particular a plurality of contiguous areas, and/or power may be concentrated in small (compared to the whole field of illumination) areas of the field of illumination.
- the infrared flood illumination may be suitable for illuminating a contiguous area, in particular one contiguous area.
- the infrared pattern illumination may be suitable for illuminating at least two contiguous areas.
- the flood illumination source may illuminate a measurement area, such as a user, a portion of the user and/or a face of the user, with a substantially constant illumination intensity.
- the term “constant” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a time aspect during an exposure time. Flood light may vary temporally and/or may be substantially constant over time.
- substantially constant as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a completely constant illumination and embodiments in which deviations from a constant illumination of ± 10 %, preferably ± 5 %, more preferably ± 2 %, are possible.
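The sketch below shows one hypothetical way to test whether sampled flood-illumination intensities stay within such a relative deviation band; it is an illustration only and not part of the disclosed apparatus.

```python
# Sketch: checking whether sampled flood-illumination intensities stay within a given
# relative deviation from their mean level (e.g. +/- 10 %). Purely illustrative.
def is_substantially_constant(samples, tolerance: float = 0.10) -> bool:
    mean = sum(samples) / len(samples)
    return all(abs(s - mean) <= tolerance * mean for s in samples)

print(is_substantially_constant([0.98, 1.02, 1.05, 0.95]))  # True  (within +/- 10 %)
print(is_substantially_constant([0.80, 1.20, 1.00]))        # False (exceeds +/- 10 %)
```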
- a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm.
- the distance between the flood and pattern illumination sources may be defined as the distance between the most distant points of the flood and pattern illumination sources.
- the relative distance between the flood illumination source and the pattern illumination source may be below 2.5 mm, preferably below 2.0 mm.
- a lower boundary for the relative distance may be 50 μm, preferably 60 μm, more preferably 70 μm, even more preferably 80 μm, most preferably 100 μm.
- the pattern illumination source and the flood illumination source may be combined into one module.
- the pattern illumination source and the flood illumination source may be arranged on the same substrate, in particular having a minimum relative distance.
- the minimum relative distance may be defined by a physical extension of the flood illumination source and the pattern illumination source.
- Arranging the pattern illumination source and the flood illumination source having a relative distance below 3.0 mm can result in decreased space requirement of the two illumination sources.
- said illumination sources can even be combined into one module. Such a reduced space requirement can allow reducing the transparent area(s) in a display necessary for operation of the illumination source(s) behind the display.
- the emitting of the infrared flood light and the illumination of the infrared light pattern may be performed sequentially or at at least partially overlapping times.
- the infrared flood light and the infrared light pattern may be emitted at the same time.
- one of the flood light or the infrared light pattern may be emitted with a lower intensity compared to the other one.
- the pattern illumination source and the flood illumination source may comprise at least one VCSEL, preferably a plurality of VCSELs.
- the pattern illumination source may comprise a plurality of first VCSELs mounted on a first platform.
- the flood illumination source may comprise a plurality of second VCSELs mounted on a second platform.
- the second platform may be beside the first platform.
- the optoelectronic apparatus may comprise a heat sink. Above the heat sink a first increment comprising the first platform may be attached. Above the heat sink a second increment comprising the second platform may be attached. The second increment may be different from the first increment.
- the first platform may be more distant to the optical element configured for increasing, e.g. duplicating, the number of spots.
- the second platform may be closer to the optical element.
- the beams emitted from the second VCSELs may be defocused and thus form overlapping spots. This leads to a substantially continuous illumination and, thus, to flood illumination.
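As a simple geometric illustration of when defocused neighbouring spots merge into flood-like illumination, the sketch below compares the projected spot diameter with the spot pitch; both values are illustrative assumptions.

```python
# Sketch: a simple geometric criterion for when defocused neighbouring spots merge into
# substantially continuous (flood-like) illumination. All numbers are illustrative.
def spots_overlap(spot_diameter_mm: float, spot_pitch_mm: float) -> bool:
    # Neighbouring spots start to merge once their diameter reaches the pitch.
    return spot_diameter_mm >= spot_pitch_mm

print(spots_overlap(spot_diameter_mm=2.0, spot_pitch_mm=10.0))   # False: discrete pattern
print(spots_overlap(spot_diameter_mm=12.0, spot_pitch_mm=10.0))  # True: flood-like
```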
- the VCSELs of the pattern illumination source mounted on the first platform form a first VCSEL chip.
- the VCSELs of the flood illumination source mounted on the second platform form a second VCSEL chip.
- the optoelectronic apparatus may comprise at least one optical element having deflection properties.
- the VCSELs of the pattern illumination source are mounted on the first platform and form the first VCSEL chip.
- the VCSELs of the flood illumination source are mounted on the second platform and form the second VCSEL chip.
- the optical element may be configured for deflecting light emitted by the pattern illumination source and light emitted by the flood illumination source differently.
- the optical element may comprise at least two different areas associated with at least two different deflection behaviors.
- the light emitted from the VCSEL chips may be deflected by the optical element depending on the area of the optical element the light illuminates.
- the first VCSEL chip may illuminate a first area of the optical element and may be deflected by a first angle.
- the second VCSEL chip may illuminate a second area of the optical element and may be deflected by a second angle.
- the optoelectronic apparatus comprises at least two optical elements, e.g.
- the optical element may be wavelength dependent.
- the optical element may be structured to deflect light of different wavelengths differently.
- the VCSEL emitter associated with the pattern illumination source may be associated with a different wavelength than the VCSEL emitter associated with the flood illumination source.
- the two light beams may differ in the beam width.
- the optical element may be configured for deflecting light of different beam widths differently.
- the two light beams generated by the pattern illumination source and the flood illumination source may differ in the beam width. Thus, the light of the pattern illumination source and the flood illumination source may be deflected differently.
- the VCSEL of the pattern illumination source and the VCSEL of the flood illumination source may be arranged on opposing sides.
- the optoelectronic apparatus may comprise a first Vertical-Cavity Surface-Emitting Laser and a second Vertical-Cavity Surface-Emitting Laser, wherein the first Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a bottom surface of the first Vertical-Cavity Surface-Emitting Laser;
- the second Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a top surface of the second Vertical-Cavity Surface-Emitting Laser, wherein the top surface is opposite of a bottom surface of the second Vertical-Cavity Surface-Emitting Laser;
- the first Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member; and the second Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member.
- active area is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a region in which the illumination light is generated.
- the active area may comprise one or more quantum wells and/or one or more quantum dots in which the illumination light is generated.
- an electric current may be applied to the active area.
- top surface as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an area that is located at the uppermost layer of an element.
- the top surface may be the part of the element that is exposed or visible in a top view.
- bottom surface as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an area that is located at the lowermost layer of an element.
- the bottom surface may be the part of the element that is exposed or visible in a bottom view.
- a Vertical-Cavity Surface-Emitting Laser comprises a plurality of layers.
- the top surface of the Vertical-Cavity Surface-Emitting Laser is on a different side of the Vertical-Cavity Surface-Emitting Laser than the bottom surface of the Vertical-Cavity Surface-Emitting Laser.
- the top surface and the bottom surface may be counterparts.
- the optical element comprises at least one supporting member; wherein the first Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member; and wherein the second Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member.
- the first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be arranged on the same supporting member.
- supporting member as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one structural element that is designed for bearing loads and/or providing support to at least one further structural element.
- the supporting element may carry and/or support at least one of: the first Vertical-Cavity Surface-Emitting Laser; the second Vertical-Cavity Surface-Emitting Laser.
- the bottom surface of the first Vertical-Cavity Surface-Emitting Laser may be secured to the supporting member.
- the top surface of the second Vertical-Cavity Surface-Emitting Laser may be secured to the supporting member.
- the term “securing”, or any grammatical variation thereof, as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to stabilizing and/or fixing a first element to a second element.
- the stabilized and/or fixated first element may be in a predetermined relative position in relation to the second element in a manner that the first element and the second element hold the relative position and can withstand at least one force and/or at least one stress acting on at least one of the first element and the second element.
- securing the first Vertical-Cavity Surface-Emitting Laser and/or the second Vertical-Cavity Surface-Emitting Laser to the supporting member may be performed by at least one of:
- the supporting member may be at least one of: a heat sink; an electrical connector; a stiffening element; a printed circuit board.
- heat sink as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary heat exchanger configured for dissipating heat generated by at least one component.
- the heat sink may be a passive heat exchanger.
- the heat sink may transfer the heat from the components to a fluid medium, typically air.
- the heat sink may dissipate the heat from one or more of: the first Vertical-Cavity Surface-Emitting Laser; and the second Vertical-Cavity Surface-Emitting Laser.
- electrical connector as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary electronic device configured for creating at least one electrical connection between a plurality of parts of an electrical circuit or between different electrical circuits.
- stiffening element as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary element configured for reinforcing a further element, particularly the supporting member.
- the term “printed circuit board” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an element configured for connecting a plurality of electronic components to one another.
- the printed circuit board comprises a laminated sandwich structure of conductive and insulating layers.
- the electric components may be secured to conductive pads on an outer layer of the printed circuit board in a manner that the electric components are electrically connected to the conductive pads.
- the printed circuit board may comprise at least one conductive pad, wherein the electrical wire may be electrically connected to the conductive pad.
- the optical element further may comprise at least one optical lens.
- the first Vertical-Cavity Surface-Emitting Laser and/or the second Vertical-Cavity Surface-Emitting Laser may be configured for emitting illumination light through the optical lens.
- a first distance between the first Vertical-Cavity Surface-Emitting Laser, specifically the active area of the first Vertical-Cavity Surface-Emitting Laser, and the at least one optical lens may differ from a second distance between the second Vertical-Cavity Surface-Emitting Laser, specifically the active area of the second Vertical-Cavity Surface-Emitting Laser, and the at least one optical lens.
- optical lens as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an at least partially transparent medium configured for bending and/or focusing light rays.
- the optical lens may have at least one focal length, wherein the first distance or the second distance equals the focal length.
- the respective Vertical-Cavity Surface-Emitting Laser may be in the focus of the optical lens and generate a sharp image, particularly for generating illumination light comprising an infrared light pattern.
- the other Vertical-Cavity Surface-Emitting Laser may not be in the focus of the optical lens and may generate a blurred image, particularly for generating illumination light comprising infrared flood light.
- focal length is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a degree of how strongly an optical system, specifically an optical lens, converges and/or diverges light rays.
- the focal length may refer to the inverse of the optical power of the optical system.
- the focal length of an optical system may be calculated by considering and/or evaluating the image formation of an object.
- the focal length f may be evaluated by using the known lens formula 1/f = 1/v + 1/u, wherein v relates to the image distance and u relates to the object distance.
- the optical lens may have one or more different focal lengths.
- the first distance and the second distance may differ by a thickness of the first Vertical-Cavity Surface-Emitting Laser or a thickness of the second Vertical-Cavity Surface-Emitting Laser.
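To illustrate how the lens formula above distinguishes the two emitters, the following Python sketch computes the image distance for an emitter near the focal plane and for one displaced by an assumed chip thickness; the focal length and thickness values are placeholders chosen for illustration.

```python
# Sketch: using the thin-lens relation 1/f = 1/v + 1/u to contrast an emitter placed
# near the focal distance (well-defined pattern image) with one displaced by an assumed
# VCSEL thickness (shifted image plane, hence blurred, flood-like output).
def image_distance_mm(focal_length_mm: float, object_distance_mm: float) -> float:
    # Solve 1/f = 1/v + 1/u for the image distance v.
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

f = 2.0                     # assumed lens focal length in mm
u_pattern = 2.2             # first VCSEL: near the focal plane -> sharp projection
u_flood = u_pattern + 0.2   # second VCSEL: displaced by an assumed chip thickness

print(round(image_distance_mm(f, u_pattern), 1))  # 22.0 mm
print(round(image_distance_mm(f, u_flood), 1))    # 12.0 mm (shifted image plane -> blur)
```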
- thickness is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a distance between two opposite surfaces of an element, such as the Vertical-Cavity Surface-Emitting Laser.
- the thickness may be the shortest distance between the top surface and the bottom surface of a respective Vertical-Cavity Surface- Emitting Laser.
- the distance may be a length.
- the first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be parallel.
- a geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and a geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be parallel.
- the term “geometrical central axis” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an imaginary axis that runs through the center of an object, such as the Vertical-Cavity Surface-Emitting Laser.
- the center may be the geometrical center of the object.
- the first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be coaxial.
- the geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and the geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be coaxial.
- particularly, the geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and/or the geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be parallel to a geometrical central axis of the at least one optical lens.
- the first Vertical-Cavity Surface-Emitting Laser comprising the active area situated on the bottom surface may be configured for emitting illumination light at least partially in direction of a top surface of the first Vertical-Cavity Surface-Emitting Laser.
- the illumination light may be emitted in a manner that the illumination light is directed towards the optical lens, particularly in a manner that the illumination light is at least partially propagating through the optical lens.
- the first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may emit illumination light in the same direction.
- in the process of illuminating the object with illumination light, detection light will be generated.
- the generated detection light may comprise at least one of illumination light reflected by the object, illumination light scattered by the object, illumination light transmitted by the object, luminescence light generated by the object, e.g. phosphorescence or fluorescence light generated by the object after optical, electrical or acoustic excitation of the object by the illumination light or the like.
- the detection light may directly or indirectly be generated through the illumination of the object by the illumination light.
- At least one further component of the first Vertical-Cavity Surface-Emitting Laser may be at least partially transparent for the illumination light emitted by the first Vertical-Cavity Surface-Emitting Laser.
- the at least one further component of the first Vertical-Cavity Surface-Emitting Laser may be a substrate of the first Vertical-Cavity Surface-Emitting Laser.
- the pattern illumination source and the flood illumination source may be elements of a light emitter structure.
- the pattern illumination source may comprise a first array of light emitters and the flood illumination source may comprise a second array of light emitters.
- the optoelectronic apparatus may comprise a base providing a single plane for mounting the light emitters.
- the optoelectronic apparatus may comprise at least one system of optical elements comprising a plurality of optical elements.
- the system of optical elements may be configured for focusing the emitted infrared light pattern onto a focal plane.
- the system of optical elements may cover the light emitter structure.
- the optoelectronic apparatus may further comprise at least one flood light optical element configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots, wherein the flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced.
- the term “base” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a carrier on which at least one further element, in particular the light emitter structure, can be mounted.
- the base may comprise a plurality of cavities into which the light emitter structure can be mounted.
- the base may have an arbitrary shape such as a rectangular, a circular, a hexagonal shape. The shape may refer to a side of the base oriented perpendicular to the direction in which height is measured.
- the base may comprise at least one semiconductor substrate.
- the base may be an element of the light emitter structure and/or an additional element.
- the base may be and/or may comprise a thermally conducting printed circuit board (PCB).
- the light emitters of the light emitter structure may form a chip of light emitters, e.g. a VCSEL die, e.g. as sawn out of a wafer.
- a chip of light emitters may be mounted on the base, e.g. by using at least one heat conductive adhesive.
- the base may comprise at least one thermally conducting material.
- the base can be the bottom of the optoelectronic apparatus, e.g. of a housing of the optoelectronic apparatus.
- dimensions of the base may be defined by the dimension of the optics and the housing.
- the base and the housing may be separate elements.
- the chip of light emitters may be mounted, e.g. by using at least one heat conductive adhesive, on the base, e.g. the PCB, and the housing may be applied to this combined element.
- the base may comprise at least one thermally conducting material, in particular a plurality of thermally conducting materials.
- the thermally conducting materials may be configured as a heat exchanger.
- the thermally conducting materials may be configured for regulating the temperature of the light emitter.
- the thermally conducting materials may be configured for transferring heat generated by a light emitter away from the light emitter.
- the thermally conducting materials may comprise at least one composite material.
- the light emitter structure may be mounted on the thermally conducting materials.
- plane as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a surface of the base.
- the surface may be continuous.
- the plane may be a flat surface.
- the plane may be designed without curvatures and/or steps.
- the term “provide” a single plane as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to one or more of comprising, having, being usable as, functioning as a surface on which the light emitter structure is mountable.
- the term “light emitter structure” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an assembly of at least four light emitters.
- the light emitter structure comprises a plurality of light emitters.
- the term “light emitter”, also abbreviated as emitter, as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one arbitrary device configured for providing at least one light beam.
- the light beam may generate the infrared light pattern.
- array as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a one-dimensional array, e.g. a row, or a two-dimensional array, in particular a matrix having m rows and n columns, with m, n, independently, being positive integers.
- the light emitters of the light emitter structure may be arranged in a periodic pattern.
- the light emitters of the light emitter structure may be arranged in one or more of a grid pattern, a hexagonal pattern, a shifted hexagonal pattern or the like.
- a plurality of light emitters of the light emitter structure may form the first array of light emitters, and a plurality of light emitters of the light emitter structure, different from the light emitters of the first array, may form the second array of light emitters.
- the light emitter structure may comprise two light emitter arrays, e.g. two VCSEL arrays.
- the arrays are located in one plane.
- the first array and the second array of emitters are produced as a single die directly on the plane or the first array and the second array of emitters are produced separately and are installed, e.g. side by side, on the plane.
- embodiments are possible, in which even more arrays are used, e.g. configured for providing different functions.
- the first array and the second array may be arranged side by side on the plane, in particular adjacent.
- along a direction perpendicular to an optical axis of the optoelectronic apparatus, the plane comprises firstly the first array and subsequently the second array.
- cavities of the light emitters of the first array and cavities of the light emitters of the second array may form a combined pattern, in which the cavities of light emitters of the first array and cavities of the light emitters of the second array alternate, e.g. line by line or row by row.
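- As a purely illustrative sketch, not taken from this disclosure, the line-by-line alternation of pattern-source and flood-source cavities described above can be modelled as an assignment mask over a shared emitter grid; the grid size and the labels are assumptions chosen only for the example.

```python
import numpy as np

# Hypothetical emitter grid: label each cavity position with the array it
# belongs to, alternating line by line as described above.
ROWS, COLS = 8, 10          # assumed grid dimensions, for illustration only
PATTERN, FLOOD = 0, 1       # 0 = pattern illumination source, 1 = flood source

layout = np.fromfunction(lambda r, c: r % 2, (ROWS, COLS), dtype=int)

print(layout)               # rows of 0s and 1s alternate line by line
print("pattern emitters:", np.count_nonzero(layout == PATTERN))
print("flood emitters:  ", np.count_nonzero(layout == FLOOD))
```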
- the light emitters of the pattern illumination source and of the flood illumination source may be active at different points in time.
- system as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary set of interacting or interdependent component parts forming a whole. Specifically, the components may interact with each other in order to fulfill at least one common function. The at least two components may be handled independently or may be coupled or connectable.
- system of optical elements as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a system comprising at least two optical elements.
- the system of optical elements may comprise one or more of at least one refractive lens, a plurality of refractive lenses; at least one diffractive optical element (DOE), a plurality of DOEs, a plurality of meta lenses.
- the system of optical elements may comprise at least one refractive lens and at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the light emitters of the pattern illumination source.
- the system of optical elements may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element.
- the DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam.
- a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots.
- Further arrangements, particularly comprising a different number of projecting VCSELs and/or at least one different optical element configured for increasing the number of spots, may be possible.
- Other multiplication factors are possible.
- a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
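- A minimal numerical sketch of the spot multiplication described above; the emitter spot count and the duplication factor are assumptions used only for illustration, and the check against 4000 spots merely mirrors the limit mentioned herein.

```python
# Illustrative only: emitter spot count and multiplication factor are assumptions.
def projected_spot_count(emitter_spots: int, multiplication_factor: int) -> int:
    """Total spots after a DOE/metasurface duplicates each incoming beam."""
    return emitter_spots * multiplication_factor

spots = projected_spot_count(emitter_spots=2000, multiplication_factor=2)
print(spots, "spots,", "within limit" if spots <= 4000 else "above limit")
```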
- the system of optical elements covers the light emitter structure.
- the term "covers the light emitter structure" as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to completely or at least partially covering the light emitting structure, e.g. at least the first array of light emitters.
- the system of optical elements may be designed and/or arranged such that it covers the light emitter structure.
- the first array may be covered by the system of optical elements only.
- both of the first and the second array may be covered by the system of optical elements.
- the light emitter structure may be located at the focal point of the system of optical elements. Such an arrangement may allow that the emitted light of the light emitters, in particular of the light emitters of the pattern illumination source, is collimated.
- flood light optical element is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an optical element assigned to the flood illumination source and configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots.
- the flood light source in combination with the flood light optical element is configured for generating flood light, in particular diffuse illumination.
- the flood light optical element may comprise at least one element selected from the group consisting of: at least one plate with a refractive index larger than 1.4, e.g.
- the second array may be completely covered by the flood light optical element.
- the optoelectronic apparatus comprises a single flood light optical element covering all light emitters of the second array.
- the optoelectronic apparatus comprises a plurality of flood light optical elements.
- each light emitter of the second array comprises at least one assigned flood light optical element.
- the second array may be located at the focal point of the system of optical elements. Thus, the light generated by the second array would also be in focus. However, the optical imaging is changed by the additional flood light optical element such that the cavities are not collimated properly. This can allow generating a diffuse flood illumination.
- the flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced.
- the term "uninfluenced” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to the fact that the flood light optical element is arranged and/or designed such that the light generated by the pattern illumination source does not interact with the flood light optical element.
- the flood light optical element may be arranged and/or designed such that the pattern illumination source is omitted and/or excluded from being covered by the flood light optical element. In particular, the pattern illumination source is not covered by the flood light optical element.
- image generation unit is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one unit of the optoelectronic apparatus configured for generating at least one image.
- the image may be generated via a hardware and/or a software interface, which may be considered as the image generation unit.
- image generation as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to capturing and/or generating and/or determining and/or recording at least one image by using the image generation unit.
- the image generation may comprise imaging and/or recording the image.
- the image generation may comprise capturing a single image and/or a plurality of images such as a sequence of images.
- the capturing and/or generating and/or determining and/or recording of the image may be caused and/or initiated by the hardware and/or the software interface.
- the image generation may comprise recording continuously a sequence of images such as a video or a movie.
- the image generation may be initiated by a user action or may automatically be initiated, e.g. once the presence of at least one object or user within a field of view and/or within a predetermined sector of the field of view of the image generation unit is automatically detected.
- the image generation unit may comprise at least one optical sensor, in particular at least one pixelated optical sensor.
- the image generation unit may comprise at least one CMOS sensor or at least one CCD chip.
- the image generation unit may comprise at least one CMOS sensor, which may be sensitive in the infrared spectral range.
- image as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to data recorded by using the optical sensor, such as a plurality of electronic readings from the CMOS or CCD chip.
- the image may comprise raw image data or may be a pre-processed image.
- the pre-processing may comprise applying at least one filter to the raw image data and/or at least one background correction and/or at least one background subtraction.
- the image generation unit may comprise one or more of at least one monochrome camera e.g. comprising monochrome pixels, at least one color (e.g. RGB) camera e.g. comprising color pixels, at least one IR camera.
- the camera may be a CMOS camera.
- the camera may comprise at least one monochrome camera chip, e.g. a CMOS chip.
- the camera may comprise at least one color camera chip, e.g. an RGB CMOS chip.
- the camera may comprise at least one IR camera chip, e.g. an IR CMOS chip.
- the camera may comprise monochrome, e.g. black and white, pixels and color pixels.
- the color pixels and the monochrome pixels may be combined internally in the camera.
- the camera generally may comprise a one-dimensional or two-dimensional array of image sensors, such as pixels.
- the image generation unit may be at least one camera.
- the camera may be an internal and/or external camera of a device comprising the optoelectronic apparatus.
- the internal and/or external camera of the device may be accessed via a hardware and/or a software interface comprised by the optoelectronic apparatus, which is used as the image generation unit.
- in case the device is or comprises a smartphone, the image generation unit may be a front camera, such as a selfie camera, and/or a back camera of the smartphone.
- the image generation unit may have a field of view between 10°x10° and 75°x75°, preferably 55°x65°.
- the field of view is between 20°x20° and 65°x65°, more preferably between 30°x30° and 60°x60°, most preferably 55°x65°.
- the image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
- the image generation unit may comprise further elements, such as one or more optical elements, e.g. one or more lenses.
- the optical sensor may be a fix-focus camera, having at least one lens which is fixedly adjusted with respect to the camera.
- the camera may also comprise one or more variable lenses which may be adjusted, automatically or manually.
- the camera may comprise at least one optical filter, e.g. at least one bandpass filter.
- the bandpass filter may be matched to the spectrum of the light emitters. Other cameras, however, are feasible.
- pattern image is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an image generated by the image generation unit while illuminating with the infrared light pattern, e.g. on an object and/or a user.
- the pattern image may comprise an image showing a user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern, particularly on a respective area of interest comprised by the image.
- the pattern image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the infrared light pattern.
- the pattern image showing the user may comprise at least a portion of the illuminated infrared light pattern on at least a portion of the user.
- the illumination by the pattern illumination source and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus.
- the term “flood image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an image generated by the image generation unit while the flood illumination source is emitting infrared flood light, e.g. on an object and/or a user.
- the flood image may comprise an image showing a user, in particular the face of the user, while the user is being illuminated with the flood light.
- the flood image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the flood light.
- the flood image showing the user may comprise at least a portion of the flood light on at least a portion of the user.
- the illumination by the flood illumination source and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus.
- the image generation unit may be configured for imaging and/or recording the pattern image and the flood image at the same time or at different times.
- the image generation unit may be configured for imaging and/or recording the pattern image and the flood image at at least partially overlapping measurement areas or equivalents of the measurement areas.
- the optoelectronic apparatus may be comprised in a device.
- the optoelectronic apparatus is part of the device.
- the device may comprise at least one display, wherein the infrared light pattern traverses the display while being emitted from the pattern illumination source and/or the infrared flood light traverses the display while being emitted from the flood illumination source.
- the term “display” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an arbitrary shaped device configured for displaying an item of information.
- the item of information may be arbitrary information such as at least one image, at least one diagram, at least one histogram, at least one graphic, text, numbers, at least one sign, an operating menu, and the like.
- the display may be or may comprise at least one screen.
- the display may have an arbitrary shape, e.g. a rectangular shape.
- the display may be a front display of the device.
- the display may be or may comprise at least one organic light-emitting diode (OLED) display.
- the term “organic light emitting diode” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a light-emitting diode (LED) in which an emissive electroluminescent layer is a film of organic compound configured for emitting light in response to an electric current.
- the OLED display may be configured for emitting visible light.
- the display, particularly a display area, may be made of and/or may be covered by glass.
- the display may comprise at least one glass cover.
- the display may be at least partially transparent.
- the term “at least partially transparent” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a property of the display to allow light, in particular of a certain wavelength range, e.g. in the infrared spectral region, in particular in the near infrared spectral region, to pass at least partially through.
- the display may be semitransparent in the near infrared region.
- the display may have a transparency of 20 % to 50 % in the near infrared region.
- the display may have a different transparency for other wavelength ranges.
- the present invention may propose an optoelectronic apparatus comprising the image generation unit and two illumination sources that can be placed behind the display of a device.
- the transparent area(s) of the display can allow for operation of the optoelectronic apparatus behind the display.
- the display can be an at least partially transparent display, as described above.
- the display may have a reduced pixel density and/or a reduced pixel size and/or may comprise at least one transparent conducting path.
- the transparent area(s) of the display may have a pixel density of 300-440 PPI (pixels per inch), more preferably 350 to 450 PPI.
- Other areas of the display, e.g. non-transparent areas, may have pixel densities higher than 400 PPI, e.g. a pixel density of 450-500 PPI.
- the display comprises a display area.
- display area as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to an active area of the display, in particular an area which is activatable.
- the display may have additional areas such as recesses or cutouts.
- the display may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas. At least one of the continuous areas may cover the image generation unit and/or the pattern illumination source and/or the flood illumination source at least partially.
- the pattern illumination source, the flood illumination source and the image generation unit may be placed, in the direction of propagation of the infrared light pattern, in front of the display. As outlined above, the pattern illumination source and the flood illumination source may be combined into one module. This may allow reducing the transparent area(s) of the display.
- the display may have a first area associated with a first pixel density (Pixels per inch (PPI)) value and a second area associated with a second pixel density value.
- the first pixel density value may be lower than the second pixel density value.
- the first pixel density value may be equal to or below 450 PPI, preferably from 300 to 440 PPI, more preferably 350 to 450 PPI.
- the second pixel density value may be 400 to 500 PPI, preferably 450 to 500 PPI.
- the first pixel density value may be associated with the at least one continuous area being at least partially transparent.
- the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value is below 350 pixels per inch and the second pixel density value is equal to or above 400 pixels per inch.
- the contrast may be defined as a difference between signal and backlight. In this case, the contrast can be considered to be reduced by the transmission. However, the contrast may also be defined as a ratio of signal to background. In this case, the contrast may not be reduced by the transmission only, but also by the diffraction in the spot pattern projection, as the higher orders bring additional intensity besides the zeroth order. Diffraction may further take place due to the structure of the display wiring, leading to a further decrease in irradiance.
- the spot pattern projected through a display can result in main spots corresponding to the pattern before traversing the display area, but with reduced intensity, because higher-order spots are generated. Both effects lead to a reduction in irradiance to about 3-5% of the initial irradiance and to the presence of undesired additional spots in the pattern. In particular, this effect may take place while leaving the device (first pass through the display) and while entering the device (second pass through the display).
- the present invention allows for providing a sufficient irradiance and, thus, contrast in the pattern image.
- the contrast can be increased by decreasing the number of spots projected onto the user. This can lead to an increase in irradiance of a spot and thus, to an increase in contrast in an image of the projection of the spot pattern.
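- The following hedged sketch illustrates the two contrast notions discussed above and the effect of the spot count on the per-spot irradiance; the numerical values (per-pass display transmission, zero-order efficiency, total optical power) are assumptions for illustration and are not taken from this disclosure.

```python
# Illustrative numbers only; real values depend on the display stack and optics.
signal, background = 0.8, 0.2                   # arbitrary units
contrast_difference = signal - background       # "signal minus backlight"
contrast_ratio = signal / background            # "signal over background"

# Double pass through a semitransparent display plus diffraction losses:
transmission_per_pass = 0.35        # assumed NIR transmission per pass
zero_order_efficiency = 0.35        # assumed fraction remaining in main spots
remaining = transmission_per_pass**2 * zero_order_efficiency
print(f"remaining irradiance: {remaining:.1%}")   # a few percent of the initial value

# Fewer projected spots -> more power per spot -> higher per-spot contrast.
total_power_mw = 100.0              # assumed total emitted optical power
for n_spots in (4000, 2000, 1000):
    print(n_spots, "spots ->", total_power_mw / n_spots, "mW per spot")
```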
- the pattern illumination source and the flood illumination source used for authentication of a user can be combined into one module to reduce the transparent area in displays.
- Using such an optoelectronic apparatus can allow for a rearrangement of the position of a camera, e.g. away from the outermost position of the display. Such a rearrangement can allow for optimized wiring, since a further position, the position of the image generation unit, can be optimized. This can decrease the wiring used or allow for improved battery operation.
- a use of an optoelectronic apparatus according to the present invention for authenticating a user of a device comprising the apparatus is disclosed.
- a device for authenticating a user of a device to perform at least one operation on the device that requires authentication is disclosed.
- the device comprising: at least one flood illumination source configured for emitting infrared flood light; at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal to 4000 spots, wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light; at least one display covering the flood illumination source, the pattern illumination source and the image generation unit at least partially; and at least one authentication unit configured for performing at least one authentication process of a user using the flood image and the pattern image.
- the device in particular, may comprise at least one optoelectronic apparatus according to the present invention.
- the device may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone and/or a smart phone and/or a tablet computer and/or a laptop and/or a tablet and/or a virtual reality device and/or a wearable, such as a smart watch; or another type of portable computer.
- authentication is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to verifying an identity of a user.
- the authentication may comprise distinguishing the user from other humans or objects, in particular distinguishing authorized access from non-authorized access.
- the authentication may comprise verifying identity of a respective user and/or assigning identity to a user.
- the authentication may comprise generating and/or providing identity information, e.g. to other devices or units such as to at least one authorization unit for authorization for providing access to the device.
- the identity information may be proven by the authentication.
- the identity information may be and/or may comprise at least one identity token.
- an image of a face recorded by the image generation unit may be verified to be an image of the user’s face and/or the identity of the user may be verified.
- the authenticating may be performed using at least one authentication process.
- the authentication process may comprise a plurality of steps such as at least one face detection on the flood image and at least one identification step in which an identity is assigned to the detected face and/or at least one identity check and/or verifying an identity of the user is performed.
- authentication unit is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to at least one unit configured for performing at least one authentication process of a user.
- the authentication unit may be or may comprise at least one processor.
- the processor may be an arbitrary logic circuitry configured for performing basic operations of a computer or system, and/or, generally, may be a device which is configured for performing calculations or logic operations.
- the processor may be configured for processing basic instructions that drive the computer or system.
- the processor may comprise at least one arithmetic logic unit (ALU), at least one floating-point unit (FPU), such as a math co-processor or a numeric co-processor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory.
- the processor may be a multi-core processor.
- the processor may be or may comprise a central processing unit (CPU).
- the processor may be or may comprise a microprocessor, thus specifically the processor’s elements may be contained in one single integrated circuitry (IC) chip.
- the processor may be or may comprise one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs) and/or one or more tensor processing units (TPUs) and/or one or more chips, such as a dedicated machine learning optimized chip, or the like.
- the processor specifically may be configured, such as by software programming, for performing one or more evaluation operations.
- At least one or any component of a computer program configured for performing the authentication process may be executed by the processing device.
- the authentication unit may be or may comprise a connection interface.
- the connection interface may be configured to transfer data from the device to a remote device; or vice versa.
- At least one or any component of a computer program configured for performing the authentication process may be executed by the remote device.
- the authentication unit may perform at least one face detection using the flood image.
- the face detection may be performed locally on the device.
- Face identification, i.e. assigning an identity to the detected face, however, may be performed remotely, e.g. in the cloud, especially when identification needs to be done and not only verification.
- User templates can be stored at the remote device, e.g. in the cloud, and would not need to be stored locally. This can be an advantage in view of storage space and security.
- the authentication unit may be configured for identifying the user based on the flood image.
- the authentication unit may forward data to a remote device.
- the authentication unit may perform the identification of the user based on the flood image, particularly by running an appropriate computer program having a respective functionality.
- identifying as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user.
- the authentication process may comprise a plurality of steps.
- the authentication process may comprise performing at least one face detection.
- the face detection step may comprise analyzing the flood image.
- the authentication process may comprise identifying.
- the identifying may comprise assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user.
- the identifying may comprise performing a face verification of the imaged face to be the user’s face.
- the identifying the user may comprise matching the flood image, e.g. showing a contour of parts of the user, in particular parts of the user’s face, with a template.
- the identifying of the user may comprise determining if the imaged face is the face of the user, in particular if the imaged face corresponds to at least one image of the user’s face stored in at least one memory, e.g. of the device.
- the analyzing of the flood image may comprise one or more of the following: a filtering; a selection of at least one region of interest; a formation of a difference image between the flood image and at least one offset; an inversion of the flood image; a background correction; a decomposition into color channels; a decomposition into hue, saturation, and brightness channels; a frequency decomposition; a singular value decomposition; applying a Canny edge detector; applying a Laplacian of Gaussian filter; applying a Difference of Gaussian filter; applying a Sobel operator; applying a Laplace operator; applying a Scharr operator; applying a Prewitt operator; applying a Roberts operator; applying a Kirsch operator; applying a high-pass filter; applying a low-pass filter; applying a Fourier transformation; applying a Radon transformation; applying a Hough transformation; applying a wavelet transformation; a thresholding; creating a binary image.
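- As a hedged sketch, the snippet below applies a small subset of the operations listed above (a difference image against an offset, a Sobel operator, a thresholding creating a binary image) to a grayscale flood image; the concrete choice and order of operations is an assumption made only for illustration.

```python
import numpy as np
from scipy import ndimage

def preprocess_flood_image(flood: np.ndarray, offset: float = 10.0) -> np.ndarray:
    """Illustrative subset of the listed operations on a grayscale flood image."""
    # difference image between the flood image and an offset (background correction)
    corrected = np.clip(flood.astype(float) - offset, 0, None)
    # Sobel operator along both axes as a simple edge filter
    edges = np.hypot(ndimage.sobel(corrected, axis=0), ndimage.sobel(corrected, axis=1))
    # thresholding, creating a binary image
    return (edges > edges.mean()).astype(np.uint8)

binary = preprocess_flood_image(np.random.randint(0, 255, (480, 640)).astype(np.uint8))
```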
- the region of interest may be determined manually by a user or may be determined automatically, such as by recognizing the user within the image.
- the analyzing of the flood image may comprise using at least one image recognition technique, in particular a face recognition technique.
- An image recognition technique comprises at least one process of identifying the user in an image.
- the image recognition may comprise using at least one technique selected from the group consisting of: color-based image recognition, e.g. using features such as template matching; segmentation and/or blob analysis, e.g. using size or shape; machine learning and/or deep learning, e.g. using at least one convolutional neural network.
- the analyzing of the flood image may comprise determining a plurality of facial features.
- the analyzing may comprise comparing, in particular matching, the determined facial features with template features.
- the template features may be features extracted from at least one template.
- the template may be or may comprise at least one image generated in an enrollment process, e.g. when initializing the device. The template may be an image of an authorized user.
- the template features and/or the facial features may comprise a vector.
- Matching of the features may comprise determining a distance between the vectors.
- the identifying of the user may comprise comparing the distance of the vectors to at least one predefined limit, wherein the user is successfully identified in case the distance is less than or equal to the predefined limit, at least within tolerances. The user may be declined and/or rejected otherwise.
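- A minimal sketch of this matching step, assuming the facial features and the template features are already available as fixed-length embedding vectors and that a Euclidean distance with a predefined limit is used; the threshold value is an assumption.

```python
import numpy as np

def identify(facial_features: np.ndarray, template_features: np.ndarray,
             limit: float = 1.0) -> bool:
    """User is identified if the embedding distance is <= the predefined limit."""
    distance = np.linalg.norm(facial_features - template_features)
    return distance <= limit          # otherwise the user is declined/rejected

rng = np.random.default_rng(0)
probe, template = rng.normal(size=128), rng.normal(size=128)
print(identify(probe, template))
```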
- the image recognition may comprise using at least one model, in particular a trained model comprising at least one face recognition model.
- the analyzing of the flood image may be performed by using a face recognition system, such as FaceNet, e.g. as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv: 1503.03832.
- the trained model may comprise at least one convolutional neural network.
- the convolutional neural network may be designed as described in M. D. Zeiler and R. Fergus, “Visualizing and understanding convolutional networks”, CoRR, abs/1311.2901, 2013, and may be trained using, for example, the Labeled Faces in the Wild database as described in Learned-Miller, “Labeled faces in the wild: A database for studying face recognition in unconstrained environments”, Technical Report 07-49, University of Massachusetts, Amherst, October 2007, the Youtube® Faces Database as described in L. Wolf, T. Hassner, and I. Maoz, “Face recognition in unconstrained videos with matched background similarity”, in IEEE Conf. on CVPR, 2011, or the Google® Facial Expression Comparison dataset.
- the training of the convolutional neural network may be performed as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv: 1503.03832.
- the authentication unit may be further configured for determining material data based on the pattern image. Particularly therefore, the authentication unit may forward data to a remote device. Alternatively or in addition, the authentication unit may perform the material determination based on the pattern image, particularly by running an appropriate computer program having a respective functionality. Particularly by considering the material as a parameter for validating the authentication process, the authentication process may be robust against being outwitted by using a recorded image of the user.
- the authentication unit may be configured for extracting the material data from the pattern image by beam profile analysis of the light spots.
- for beam profile analysis, reference is made to WO 2018/091649 A1, WO 2018/091638 A1 and WO 2018/091640 A1, the full content of which is included by reference.
- Beam profile analysis can allow for providing a reliable classification of scenes based on a few light spots.
- Each of the light spots of the pattern image may comprise a beam profile.
- the term “beam profile” may generally refer to at least one intensity distribution of the light spot on the optical sensor as a function of the pixel.
- the beam profile may be selected from the group consisting of a trapezoid beam profile; a triangle beam profile; a conical beam profile and a linear combination of Gaussian beam profiles.
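- As a loose illustration only (the beam profile analysis itself is defined in the applications referenced above and is not reproduced here), one simple beam-profile feature of a light spot is the ratio between the intensity integrated over an inner region of the spot and the total spot intensity; the patch size and the inner radius are assumptions.

```python
import numpy as np

def center_to_total_ratio(spot_patch: np.ndarray, inner_radius: float = 3.0) -> float:
    """Ratio of intensity inside an inner disc to the total spot intensity."""
    h, w = spot_patch.shape
    y, x = np.ogrid[:h, :w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    inner = (y - cy) ** 2 + (x - cx) ** 2 <= inner_radius ** 2
    total = spot_patch.sum()
    return float(spot_patch[inner].sum() / total) if total > 0 else 0.0

profile = np.exp(-((np.arange(15) - 7) ** 2) / 8.0)   # 1-D Gaussian
patch = np.outer(profile, profile)                    # 2-D Gaussian-like spot
print(round(center_to_total_ratio(patch), 3))
```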
- the authentication unit may be configured for outsourcing at least one step of the authentication process, such as the identifying of the user, and/or at least one step of the validation of the authentication process, such as the consideration of the material data, to a remote device, specifically a server and/or a cloud server.
- the device and the remote device may be part of a computer network, particularly the internet.
- the device may be used as a field device that is used by the user for generating data required in the authentication process and/or its validation.
- the device may transmit the generated data and/or data associated to an intermediate step of the authentication process and/or its validation to the remote device.
- the authentication unit may be and/or may comprise a connection interface configured for transmitting information to the remote device.
- the connection interface may specifically be configured for transmitting or exchanging information.
- the connection interface may provide a data transfer connection.
- the connection interface may be or may comprise at least one port comprising one or more of a network or internet port, a USB-port, and a disk drive.
- data from the device may be transmitted to a specific remote device depending on at least one circumstance, such as a date, a day, a load of the specific remote device, and so on.
- the specific remote device may not be selected by the field device. Rather a further device may select to which specific remote device the data may be transmitted.
- the authentication process and/or the generation of validation data may involve a use of several different entities of the remote device. At least one entity may generate intermediate data and transmit the intermediate data to at least one further entity.
- the authentication unit is configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
- the authentication unit may be configured for extracting material data from the pattern image.
- extracting material data from the pattern image may comprise generating the material type and/or data derived from the material type.
- extracting material data may be based on the pattern image.
- Providing the image to a model may comprise, and may be followed by, receiving the pattern image at an input layer of the model or via a model loss function.
- the model may be a data-driven model.
- the data-driven model may comprise a convolutional neural network and/or an encoder-decoder structure such as an autoencoder.
- Other examples for generating a representation may be FFT, wavelets, or deep-learning approaches like CNNs, energy-based models, normalizing flows, GANs, vision transformers or transformers used for natural language processing, autoregressive image modeling, and deep autoencoders.
- Supervised or unsupervised schemes may be applicable to generate a representation, also referred to as an embedding, e.g. in a cosine or Euclidean metric, in machine-learning language.
- the data-driven model may be parametrized according to a training data set including at least one image and material data, preferably at least one pattern image and material data.
- extracting material data may include providing the image to a model and/or receiving material data from the model.
- the data-driven model may be trained according to a training data set including at least one image and material data.
- the data-driven model may be parametrized according to a training data set including at least one image and material data.
- the data-driven model may be parametrized according to a training data set to receive the image and provide material data based on the received image.
- the data-driven model may be trained according to a training data set to receive the image and provide material data as output based on the received image.
- the training data set may comprise at least one image and material data, preferably material data associated with the at least one image.
- the image may comprise a representation of the image.
- the representation may be a lower dimensional representation of the image.
- the representation may comprise at least a part of the data or the information associated with the image.
- the representation of an image may comprise a feature vector.
- determining a representation, in particular a lower-dimensional representation may be based on principal component analysis (PCA) mapping or radial basis function (RBF) mapping. Determining a representation may also be referred to as generating a representation. Generating a representation based on PCA mapping may include clustering based on features in the pattern image and/or partial image. Additionally or alternatively, generating a representation may be based on neural network structures suitable for reducing dimensionality. Neural network structures suitable for reducing dimensionality may comprise encoder and/or decoder.
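- A brief sketch of generating a lower-dimensional representation via PCA mapping, assuming a stack of flattened pattern-image patches as input; the dimensions and the random data are illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA

# Assumed input: 200 flattened spot patches of 15x15 pixels each (random here).
rng = np.random.default_rng(0)
patches = rng.random((200, 15 * 15))

pca = PCA(n_components=16)              # keep a 16-dimensional representation
embedding = pca.fit_transform(patches)  # shape (200, 16)
print(embedding.shape, pca.explained_variance_ratio_.sum())
```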
- neural network structure may be an autoencoder.
- neural network structure may comprise a convolutional neural network (CNN).
- the CNN may comprise at least one convolutional layer and/or at least one pooling layer.
- CNNs may reduce the dimensionality of a partial image and/or an image by applying a convolution, e.g. based on a convolutional layer, and/or by pooling. Applying a convolution may be suitable for selecting features related to material information of the pattern image.
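- A minimal convolutional encoder sketch, shown only to illustrate how convolutional and pooling layers reduce a pattern-image patch to an embedding followed by a classification head; the layer sizes, the input size and the two material classes are assumptions and do not represent the disclosed model.

```python
import torch
import torch.nn as nn

class MaterialEncoder(nn.Module):
    """Toy encoder: convolution + pooling reduce a 1x64x64 patch to an embedding."""
    def __init__(self, embedding_dim: int = 32, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.embed = nn.Linear(16 * 16 * 16, embedding_dim)
        self.classify = nn.Linear(embedding_dim, n_classes)  # e.g. skin vs. non-skin

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.embed(torch.flatten(self.features(x), 1))   # embedding / feature vector
        return self.classify(z)

logits = MaterialEncoder()(torch.rand(1, 1, 64, 64))
print(logits.shape)   # torch.Size([1, 2])
```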
- a model may be suitable for determining an output based on an input.
- model may be suitable for determining material data based on an image as input.
- a model may be a deterministic model, a data-driven model or a hybrid model.
- the deterministic model, preferably, reflects physical phenomena in mathematical form, e.g. including first-principles models.
- a deterministic model may comprise a set of equations that describe an interaction between the material and the patterned electromagnetic radiation thereby resulting in a condition measure, a vital sign measure or the like.
- a data-driven model may be a classification model.
- a hybrid model may be a classification model comprising at least one machine-learning architecture with deterministic or statistical adaptations and model parameters. Statistical or deterministic adaptations may be introduced to improve the quality of the results since those provide a systematic relation between empiricism and theory.
- the data-driven model may be a classification model.
- the classification model may comprise at least one machine-learning architecture and model parameters.
- the machine-learning architecture may be or may comprise one or more of: linear regression, logistic regression, random forest, piecewise linear, nonlinear classifiers, support vector machines, naive Bayes classifications, nearest neighbors, neural networks, convolutional neural networks, generative adversarial networks, or gradient boosting algorithms or the like.
- the model can be a multi-scale neural network or a recurrent neural network (RNN) such as, but not limited to, a gated recurrent unit (GRU) recurrent neural network or a long short-term memory (LSTM) recurrent neural network.
- the data-driven model may be trained based on the training data set. Training the model may include parametrizing the model.
- the term training may also be denoted as learning.
- the term specifically may refer, without limitation, to a process of building the classification model, in particular determining and/or updating parameters of the classification model. Updating parameters of the classification model may also be referred to as retraining. Retraining may be included when referring to training herein.
- the training data set may include at least one image and material information.
- extracting material data from the image with a data-driven model may comprise providing the image to a data-driven model. Additionally or alternatively, extracting material data from the image with a data-driven model may comprise generating an embedding associated with the image based on the data-driven model.
- An embedding may refer to a lower-dimensional representation associated with the image, such as a feature vector. The feature vector may be suitable for suppressing the background while maintaining the material signature indicating the material data.
- background may refer to information independent of the material signature and/or the material data. Further, background may refer to information related to biometric features such as facial features.
- Material data may be determined with the data-driven model based on the embedding associated with the image.
- extracting material data from the image by providing the image to a data-driven model may comprise transforming the image into material data, in particular a material feature vector indicating the material data.
- material data may comprise further the material feature vector and/or material feature vector may be used for determining material data.
- authentication process may be validated based on the extracted material data.
- the validating based on the extracted material data may comprise determining if the extracted material data corresponds to desired material data. Determining if the extracted material data matches the desired material data may be referred to as validating. Allowing or declining the user and/or object to perform at least one operation on the device that requires authentication based on the material data may comprise validating the authentication or authentication process. Validating may be based on material data and/or the image. Determining if the extracted material data corresponds to the desired material data may comprise determining a similarity of the extracted material data and the desired material data. Determining a similarity of the extracted material data and the desired material data may comprise comparing the extracted material data with the desired material data. Desired material data may refer to predetermined material data.
- the desired material data may be skin. It may be determined whether the material data corresponds to the desired material data.
- skin as desired material data may be compared with non-skin material or silicon as material data, and the result may be a declination since silicon or non-skin material may be different from skin.
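- A minimal validation sketch along the lines of the skin example above; the label strings and the decision rule (exact match of the predicted material class with the desired material) are assumptions for illustration.

```python
DESIRED_MATERIAL = "skin"   # predetermined / desired material data (assumption)

def validate_material(extracted_material: str) -> bool:
    """Validation succeeds only if the extracted material matches the desired one."""
    return extracted_material == DESIRED_MATERIAL

print(validate_material("skin"))      # True  -> validation passes
print(validate_material("silicon"))   # False -> authentication is declined
```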
- the authentication process or its validation may include generating at least one feature vector from the material data and matching the material feature vector with an associated reference template vector for the material.
- the authentication unit may be configured for authenticating the user in case the user can be identified and/or if the material data matches the desired material data.
- the device may comprise at least one authorization unit configured for allowing the user to perform at least one operation on the device, e.g. unlocking the device, in case of successful authentication of the user or declining the user to perform at least one operation on the device in case of nonsuccessful authentication. Thereby, the user may become aware of the result of the authentication.
- the present invention discloses a method for authenticating a user of a device to perform at least one operation on the device that requires authentication.
- the device comprising a display and the method comprising: a. illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source of the device, wherein the number of infrared light spots is below or equal to 4000 spots; b. illuminating the user with infrared flood light from at least one flood illumination source of the device, wherein the distance between the flood illumination source and the pattern illumination source is below 3.0 mm; c. generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and at least one flood image while the flood illumination source is emitting the infrared flood light by using at least one image generation unit of the device, wherein
- the image generation unit and/or the flood and pattern illumination sources are covered at least partially by the display of the device; d. identifying the user based on the flood image by using at least one authentication unit of the device; e. extracting material data from the at least one pattern image by using the authentication unit; and f. allowing the user to perform at least one operation on the device that requires authentication based on the material data and the identifying.
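- A hedged end-to-end sketch of method steps a. to f.; all device interfaces (illumination control, image capture, identification and material extraction back-ends) are left as hypothetical callables, and the sketch only mirrors the ordering and the final decision rule described above.

```python
from typing import Callable
import numpy as np

def authenticate(project_pattern: Callable[[], np.ndarray],      # steps a + c (pattern image)
                 project_flood: Callable[[], np.ndarray],        # steps b + c (flood image)
                 identify_user: Callable[[np.ndarray], bool],    # step d
                 extract_material: Callable[[np.ndarray], str],  # step e
                 desired_material: str = "skin") -> bool:
    """Returns True if the operation requiring authentication may be allowed (step f)."""
    pattern_image = project_pattern()
    flood_image = project_flood()
    identified = identify_user(flood_image)
    material_ok = extract_material(pattern_image) == desired_material
    return identified and material_ok
```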
- the method steps may be performed in the given order or may be performed in a different order. Further, one or more additional method steps may be present which are not listed. Further, one, more than one or even all of the method steps may be performed repeatedly.
- the method may comprise using the device according to the present invention, such as according to one or more of the embodiments given above or given in further detail below.
- the identifying of the user may comprise matching the flood image with a template.
- the method may comprise using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
- the pattern image and/or the image showing the user while the user is being illuminated with the infrared flood light may be showing at least a portion of a face of the user.
- the single processing device may be configured to exclusively perform at least one computer program, in particular at least one line of computer program code configured to execute at least one algorithm, as used in at least one of the embodiments of the method according to the present invention.
- the computer program as executed on the single processing device may comprise all instructions causing the computer to carry out the method.
- at least one method step may be performed by using at least one remote device, especially selected from at least one of a server or a cloud server, particularly when the device and the remote device may be part of a computer network.
- the computer program may comprise at least one remote component to be executed by the at least one remote processing device to carry out the at least one method step.
- the remote component may have the functionality of performing the identifying of the user and/or the extraction of the material data.
- the computer program may comprise at least one interface configured to forward to and/or receive data from the at least one remote component of the computer program.
- the method may comprise allowing or declining the user to perform at least one operation on the device.
- allowing or declining the user to perform at least one operation on the device that requires authentication based on the material data may include allowing the user to perform at least one operation on the device that requires authentication if the material data matches desired material data and/or authentication may be successful.
- Desired material data may be predetermined material data.
- Authentication may be successful if the user can be identified and/or if the material data matches desired material data.
- allowing or declining the object to perform at least one operation on the device that requires authentication based on the material data may include declining to perform at least one operation on the device that requires authentication if the material data does not match desired material data and/or authentication may be unsuccessful. Authentication may be unsuccessful if the pattern image cannot be matched with an image template and/or if the material data does not match the desired material data.
- At least one operation on the device that requires authentication may be access to the device, e.g. unlocking the device, and/or access to an application, preferably associated with the device and/or access to a part of an application, preferably associated with the device.
- allowing the user to access a resource may include allowing the user to perform at least one operation with a device and/or system.
- the resource may be a device, a system, a function of a device, a function of a system and/or an entity.
- allowing the user to access a resource may include allowing the user to access an entity.
- the entity may be physical entity and/or virtual entity.
- the virtual entity may be a database for example.
- the physical entity may be an area with restricted access.
- the area with restricted access may be one of the following: security areas, rooms, apartments, vehicles, parts of the before mentioned examples, or the like.
- Device and/or system may be locked. The device and/or the system may only be unlocked by authorized user.
- the term “user” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a person intending to use and/or using the device.
- the method may be computer implemented.
- computer implemented as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
- the term specifically may refer, without limitation, to a method involving at least one computer and/or at least one computer network.
- the computer and/or computer network may comprise at least one processor which is configured for performing at least one of the method steps of the method according to the present invention. Specifically, each of the method steps is performed by the computer and/or computer network. The method may be performed completely automatically, specifically without user interaction.
- a computer program including computer-executable instructions for performing the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network.
- the computer program may be stored on a computer-readable data carrier and/or on a computer-readable storage medium.
- the computer program may be executed on at least one processor comprised by the optoelectronic apparatus and/or the device.
- the computer program may generate input data by accessing and/or controlling at least one unit of the optoelectronic apparatus and/or the device, such as the pattern illumination source and/or the flood illumination source and/or the image generation unit.
- the computer program may generate outcome data based on the input data, particularly by using the authentication unit.
- the terms “computer-readable data carrier” and “computer-readable storage medium” specifically may refer to non-transitory data storage means, such as a hardware storage medium having stored thereon computer-executable instructions.
- the stored computer-executable instructions may be associated with the computer program.
- the computer-readable data carrier or storage medium specifically may be or may comprise a storage medium such as a random-access memory (RAM) and/or a read-only memory (ROM).
- one, more than one or even all of method steps a. to f. as indicated above may be performed by using a computer or a computer network, preferably by using a computer program.
- program code means for performing the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network.
- the program code means may be stored on a computer-readable data carrier and/or on a computer-readable storage medium.
- a data carrier having a data structure stored thereon, which, after loading into a computer or computer network, such as into a working memory or main memory of the computer or computer network, may execute the method according to one or more of the embodiments disclosed herein.
- a computer program product with program code means stored on a machine-readable carrier, in order to perform the method according to one or more of the embodiments disclosed herein, when the program is executed on a computer or computer network.
- a computer program product refers to the program as a tradable product.
- the product may generally exist in an arbitrary format, such as in a paper format, or on a computer-readable data carrier and/or on a computer-readable storage medium.
- the computer program product may be distributed over a data network.
- Non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to one or more of the embodiments disclosed herein.
- modulated data signal which contains instructions readable by a computer system or computer network, for performing the method according to one or more of the embodiments disclosed herein.
- one or more of the method steps or even all of the method steps of the method according to one or more of the embodiments disclosed herein may be performed by using a computer or computer network.
- any of the method steps including provision and/or manipulation of data may be performed by using a computer or computer network.
- these method steps may include any of the method steps, typically except for method steps requiring manual work, such as providing the samples and/or certain aspects of performing the actual measurements.
- a computer or computer network comprising at least one processor, wherein the processor is adapted to perform the method according to one of the embodiments described in this description.
- a computer loadable data structure that is adapted to perform the method according to one of the embodiments described in this description while the data structure is being executed on a computer.
- a computer program, wherein the computer program is adapted to perform the method according to one of the embodiments described in this description while the program is being executed on a computer.
- a computer program comprising program means for performing the method according to one of the embodiments described in this description while the computer program is being executed on a computer or on a computer network.
- a computer program comprising program means according to the preceding embodiment, wherein the program means are stored on a storage medium readable by a computer.
- a storage medium, wherein a data structure is stored on the storage medium and wherein the data structure is adapted to perform the method according to one of the embodiments described in this description after having been loaded into a main and/or working storage of a computer.
- the terms “have”, “comprise” or “include” or any arbitrary grammatical variations thereof are used in a non-exclusive way. Thus, these terms may both refer to a situation in which, besides the feature introduced by these terms, no further features are present in the entity described in this context and to a situation in which one or more further features are present.
- the expressions “A has B”, “A comprises B” and “A includes B” may both refer to a situation in which, besides B, no other element is present in A (i.e. a situation in which A solely and exclusively consists of B) and to a situation in which, besides B, one or more further elements are present in entity A, such as element C, elements C and D or even further elements.
- the terms “at least one”, “one or more” or similar expressions indicating that a feature or element may be present once or more than once typically are used only once when introducing the respective feature or element. In most cases, when referring to the respective feature or element, the expressions “at least one” or “one or more” are not repeated, notwithstanding the fact that the respective feature or element may be present once or more than once.
- the terms “preferably”, “more preferably”, “particularly”, “more particularly”, “specifically”, “more specifically” or similar terms are used in conjunction with optional features, without restricting alternative possibilities.
- features introduced by these terms are optional features and are not intended to restrict the scope of the claims in any way.
- the invention may, as the skilled person will recognize, be performed by using alternative features.
- features introduced by "in an embodiment of the invention” or similar expressions are intended to be optional features, without any restriction regarding alternative embodiments of the invention, without any restrictions regarding the scope of the invention and without any restriction regarding the possibility of combining the features introduced in such way with other optional or non-optional features of the invention.
- an optoelectronic apparatus comprising: at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal to 4000 spots; at least one flood illumination source configured for emitting infrared flood light; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light; wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm.
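- Purely as a hedged illustration of the parameter ranges recited above (spot count and source spacing), the following sketch models them as a simple configuration check; the class and field names are assumptions and do not appear in the disclosure.

```python
from dataclasses import dataclass


@dataclass
class ApparatusConfig:
    """Hypothetical configuration model for the recited parameter ranges."""
    n_pattern_spots: int       # number of infrared light spots in the pattern
    source_distance_mm: float  # relative distance between flood and pattern source

    def check(self) -> None:
        # number of infrared light spots: below or equal to 4000
        assert self.n_pattern_spots <= 4000, "too many pattern spots"
        # relative distance between the two illumination sources: below 3.0 mm
        assert self.source_distance_mm < 3.0, "illumination sources too far apart"


# Example values consistent with the preferred ranges mentioned in the text.
ApparatusConfig(n_pattern_spots=2000, source_distance_mm=2.0).check()
```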
- the optoelectronic apparatus may be comprised in a device, wherein the device may comprise at least one display, wherein the infrared light pattern may traverse the display while being emitted from the pattern illumination source and/or the infrared flood light may traverse the display while being emitted from the flood illumination source.
- the relative distance between the flood illumination source and the pattern illumination source may be below 2.5 mm, preferably below 2.0 mm.
- the infrared light may be coherent, wherein the infrared light pattern may be a coherent infrared light pattern.
- the infrared light pattern may be a hexagonal pattern, preferably a 2/5 hexagonal infrared light pattern.
- At least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
- the infrared light pattern may be a near infrared light pattern.
- the infrared light pattern may comprise equal to or less than 3000 spots, preferably equal to or less than 2000 spots.
- the infrared light pattern may comprise at least one point pattern, wherein the infrared light pattern may have a low point density, wherein the infrared light pattern may comprise equal to or less than 2000 spots and/or above 0, preferably above 5, more preferably above 10, most preferably above 100.
- the image generation unit may have a field of view between 10°x10° and 75°x75°, preferably the field of view is between 20°x20° and 65°x65°, more preferably between 30°x30° and 60°x60°, most preferably 55°x65°.
- the image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
- the image generation unit may comprise at least one CMOS sensor or at least one CCD chip.
- the pattern illumination source may comprise at least one pattern projector configured for generating the infrared light pattern.
- the pattern illumination source such as the pattern projector, may comprise at least one vertical cavity surface-emitting laser (VCSEL), preferably a plurality of VCSELs.
- the pattern illumination source may comprise at least one optical element configured for increasing the number of spots, wherein the optical element may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element.
- the flood illumination source may comprise at least one VCSEL, preferably a plurality of VCSELs.
- the pattern illumination source may comprise a plurality of first VCSELs mounted on a first platform, wherein the flood illumination source may comprise a plurality of second VCSELs mounted on a second platform.
- the optoelectronic apparatus may comprise a heat sink, wherein above the heat sink a first increment comprising the first platform may be attached, wherein above the heat sink a second increment comprising the second platform may be attached.
- the emitting of the infrared flood light and the illumination of the infrared light pattern may be performed subsequently or at at least partially overlapping times.
- the optoelectronic apparatus may comprise at least one optical element with at least two different areas associated with two different deflection behaviors, wherein light emitted from the pattern illumination source and light emitted from the flood illumination source may be deflected differently by the optical element depending on the area of the optical element the light illuminates.
- the pattern illumination source may emit light having a first wavelength and the flood illumination source may emit light having a second wavelength different from the first wavelength.
- the optoelectronic apparatus may comprise at least one optical element, wherein the optical element is wavelength dependent.
- the pattern illumination source may emit at least one light beam having a first beam width and the flood illumination source may emit at least one light beam having a second beam width different from the first beam width.
- the optoelectronic apparatus may comprise at least one optical element, wherein the optical element may be configured for deflecting light of different beam widths differently.
- the optoelectronic apparatus may comprise: at least one first Vertical-Cavity Surface-Emitting Laser, wherein the first Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a bottom surface of the first Vertical-Cavity Surface-Emitting Laser; at least one second Vertical-Cavity Surface-Emitting Laser, wherein the second Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a top surface of the second Vertical-Cavity Surface-Emitting Laser, wherein the top surface is opposite of a bottom surface of the second Vertical-Cavity Surface-Emitting Laser; and at least one supporting member; wherein the first Vertical-Cavity Surface-Emitting Laser may be arranged with the bottom surface on the supporting member; and wherein the second Vertical-Cavity Surface-Emitting Laser may be arranged with the bottom surface on the supporting member.
- the optoelectronic apparatus may comprise: a light emitter structure comprising a plurality of light emitters, wherein a first array of light emitters of the plurality of light emitters forms the pattern illumination source, wherein a second array of light emitters of the plurality of light emitters, different from the light emitters of the first array, forms the flood illumination source; a base providing a single plane for mounting the light emitters; at least one system of optical elements comprising a plurality of optical elements, wherein the system of optical elements is configured for focusing the emitted infrared light pattern onto a focal plane, wherein the system of optical elements covers the light emitter structure; at least one flood light optical element configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots, wherein the flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced.
- an optoelectronic apparatus such as disclosed in any one of embodiments, for authenticating a user of a device comprising the apparatus is disclosed.
- a device for authenticating a user of a device to perform at least one operation on the device that requires authentication comprising: at least one flood illumination source configured for emitting infrared flood light; at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal to 4000 spots, wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light; at least one display covering the flood illumination source, the pattern illumination source and the image generation unit at least partially; and at least one authentication unit configured for performing at least one authentication process of a user using the flood image and the pattern image.
- the display may be or may comprise at least one organic light-emitting diode (OLED) display.
- the display may be at least partially transparent.
- the display may comprise a display area.
- the display may be made of and/or is covered by glass.
- the display of the device may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas, wherein at least one of the continuous areas may cover the image generation unit and/or the pattern illumination source and/or the flood illumination source at least partially.
- the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value may be lower than the second pixel density value, preferably the first pixel density value is equal to or below 450 PPI.
- the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value may be below 350 pixels per inch and the second pixel density value may be equal to or above 400 pixels per inch.
- the first pixel density value may be associated with the at least one continuous area being at least partially transparent.
- the first pixel density value may be below 350 pixels per inch and the second pixel density value may be equal to or above 400 pixels per inch.
- the device may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
- the authentication unit may be configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
- the device may comprise at least one optoelectronic device according to the present invention, such as according to any one of the preceding embodiments referring to an optoelectronic device.
- a method for authenticating a user of a device to perform at least one operation on the device that requires authentication, the device comprising a display and the method comprising: a. illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source of the device, wherein the number of infrared light spots is below or equal to 4000 spots; b. illuminating the user with infrared flood light from at least one flood illumination source of the device, wherein the distance between the flood illumination source and the pattern illumination source is below 3.0 mm; c. generating at least one pattern image with an image generation unit of the device showing the user while the user is being illuminated with the infrared light pattern, and generating at least one flood image with the image generation unit of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit and/or the flood and pattern illumination sources are covered at least partially by the display of the device; d. identifying the user based on the flood image by using at least one authentication unit of the device; e. extracting material data from the at least one pattern image by using the authentication unit; and f. allowing the user to perform at least one operation on the device that requires authentication based on the material data and the identifying.
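- The method steps a. to f. can be summarized, purely as a hedged sketch, in the following pseudo-Python; every helper (pattern_source, flood_source, camera, authentication_unit and their methods) is a hypothetical stand-in for device-specific components and not an API of the disclosure.

```python
def authenticate_user(device) -> bool:
    """Illustrative end-to-end flow of method steps a. to f."""
    # a. + c. illuminate with the infrared light pattern and capture the pattern image
    device.pattern_source.on()
    pattern_image = device.camera.capture()
    device.pattern_source.off()

    # b. + c. illuminate with infrared flood light and capture the flood image
    device.flood_source.on()
    flood_image = device.camera.capture()
    device.flood_source.off()

    # d. identify the user based on the flood image (e.g. template matching)
    identified = device.authentication_unit.identify(flood_image)

    # e. extract material data from the pattern image (e.g. beam profile analysis)
    material_data = device.authentication_unit.extract_material(pattern_image)

    # f. allow the operation only if the user is identified and the material
    #    data matches the desired material data (e.g. skin)
    return identified and material_data == device.authentication_unit.desired_material
```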
- the method may comprise using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
- the identifying of the user may comprise matching the flood image with a template.
- a computer program comprising instructions which, when the program is executed by the device according to any one of the preceding embodiments referring to a device, cause the device to perform the method according to any one of the preceding embodiments referring to a method.
- a computer-readable storage medium comprising instructions which, when the instructions are executed by the device according to any one of the preceding embodiments referring to a device, cause the device to perform the method according to any one of the preceding embodiments referring to a method.
- a non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to any one of the preceding embodiments referring to a method.
- Figure 1 shows an embodiment of a device according to the present invention
- Figure 2 shows an embodiment of a method according to the present invention
- Figures 3A and 3B show embodiments of an optical element having deflection properties
- Figure 4 shows a schematic of an exemplary optical element in a side view
- Figures 5A and 5B show an embodiment of the optoelectronic apparatus
- Figures 6A and 6B show an embodiment of the optoelectronic apparatus.
- Figure 1 shows an embodiment of a device 110 of the present invention in a highly schematic fashion.
- the device 110 may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
- the device 110 comprises an optoelectronic apparatus 112 according to the present invention.
- the optoelectronic apparatus 112 comprises at least one pattern illumination source 114 configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots.
- the number of infrared light spots is below or equal to 4000 spots.
- the pattern illumination source 114 may be configured for generating or providing at least one light pattern, in particular at least one infrared light pattern.
- the light pattern may comprise a plurality of light spots.
- the light spot may be at least partially spatially extended.
- the infrared light pattern may be a near infrared light pattern.
- the infrared light may be coherent.
- the infrared light pattern may be a coherent infrared light pattern.
- the pattern illumination source 114 may be configured for emitting light at a single wavelength, e.g. in the near infrared region. In other embodiments, the pattern illumination source 114 may be adapted to emit light with a plurality of wavelengths, e.g. for allowing additional measurements in other wavelength channels.
- the infrared light pattern may comprise at least one regular and/or constant and/or periodic pattern such as a triangular pattern, a rectangular pattern, a hexagonal pattern or a pattern comprising further convex tilings.
- the infrared light pattern is a hexagonal pattern, preferably a hexagonal infrared light pattern, preferably a 2/5 hexagonal infrared light pattern. Using a periodical 2/5 hexagonal pattern can allow distinguishing between artefacts and usable signal.
- the infrared light pattern may comprise at least one point pattern.
- the infrared light pattern has a low point density.
- the number of infrared light spots is below or equal to 4000 spots.
- the infrared light pattern may comprise equal to or less than 3000 spots, preferably equal to or less than 2000 spots.
- the spots may have a circular shape. Further shapes are possible.
- the infrared light pattern may have a low point density, in particular in comparison with other structured light techniques typically having a point density of 10k to 30k spots in a field of view of 55°x38°. Using such a low point density may allow compensating for the above-mentioned diffraction loss. By decreasing the number of spots projected onto an object and/or a user, a contrast in the pattern image may be increased.
- the decreased number of spots may lead to an increase in irradiance of a spot and thus, to an increase in contrast in the pattern image of the projection of the infrared light pattern.
- the infrared light pattern may have a periodic point pattern with a reduced number of spots, wherein each of the spots has a high irradiance.
- Such a light pattern can ensure improved authentication using the pattern illumination source 114 and, as described above, at least one flood illumination source 116, at least one image generation unit 118, behind a display 120.
- the low number of spots can ensure complying with eye safety requirements and stability requirements.
- the allowed dose may be divided between the spots of the light pattern.
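- A rough, purely illustrative calculation (the power value below is an assumption, not taken from the disclosure) of how dividing a fixed eye-safety budget between fewer spots raises the power, and hence irradiance, available per spot:

```python
P_allowed_mW = 10.0  # assumed total optical power permitted by eye safety (illustrative)

for n_spots in (30000, 4000, 2000):
    print(f"{n_spots:>6d} spots -> {P_allowed_mW / n_spots:.4f} mW per spot")
# Reducing the spot count from tens of thousands to a few thousand increases
# the per-spot share of the allowed dose by roughly an order of magnitude.
```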
- At least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
- the pattern illumination source 114 may comprise at least one pattern projector configured for generating the infrared light pattern.
- the pattern illumination source 114 e.g. the pattern projector, may comprise at least one emitter, in particular a plurality of emitters.
- the emitter may comprise at least one element selected from the group consisting of at least one laser source such as at least one semiconductor laser, at least one double heterostructure laser, at least one external cavity laser, at least one separate confinement heterostructure laser, at least one quantum cascade laser, at least one distributed Bragg reflector laser, at least one polariton laser, at least one hybrid silicon laser, at least one extended cavity diode laser, at least one quantum dot laser, at least one volume Bragg grating laser, at least one Indium Arsenide laser, at least one Gallium Arsenide laser, at least one transistor laser, at least one diode pumped laser, at least one distributed feedback laser, at least one quantum well laser, at least one interband cascade laser, and at least one semiconductor ring laser.
- the pattern illumination source 114 e.g. the pattern projector comprises at least one VCSEL, preferably a plurality of VCSELs.
- the plurality of VCSELs may be arranged in at least one array, e.g. comprising a matrix of VCSELs.
- the VCSELs may be arranged on a common substrate or on different substrates. Examples for VCSELs can be found e.g. in en.wikipedia.org/wiki/Vertical-cavity_surface-emitting_laser.
- VCSELs are generally known to the skilled person such as from WO 2017/222618 A.
- Each of the VCSELs is configured for generating at least one light beam.
- the VCSEL or the plurality of VCSELs may be configured for generating the desired spot number equal to or below 4000 spots, preferably equal to or below 3000 spots, more preferably equal to or below 2000 spots.
- the VCSELs may be configured for emitting light beams at a wavelength range from 800 to 1000 nm.
- the VCSELs may be configured for emitting light beams at 808 nm, 850 nm, 940 nm, or 980 nm.
- the VCSELs emit light at 940 nm, since terrestrial sun radiation has a local minimum in irradiance at this wavelength, e.g. as described in CIE 085-1989 “Solar Spectral Irradiance”.
- the pattern illumination source 114 may comprise at least one optical element, not shown here, configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the pattern projector.
- the pattern illumination source 114 may comprises at least one diffractive optical element (DOE) and/or at least one metasurface element.
- the DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam.
- a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots.
- Other duplications are possible.
- a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
- the pattern illumination source 114 may comprise at least one transfer device, not shown here.
- the transfer device may comprise at least one imaging optical device.
- the transfer device specifically may comprise one or more of: at least one lens, for example at least one lens selected from the group consisting of at least one focus-tunable lens, at least one aspheric lens, at least one spheric lens, at least one Fresnel lens; at least one diffractive optical element; at least one concave mirror; at least one beam deflection element, preferably at least one mirror; at least one beam splitting element, preferably at least one of a beam splitting cube or a beam splitting mirror; at least one multilens system; at least one holographic optical element; at least one meta optical element.
- the transfer device comprises at least one refractive optical lens stack.
- the transfer device may comprise a multi-lens system having refractive properties.
- the optoelectronic apparatus 112 comprises the at least one flood illumination source 116 configured for emitting infrared flood light.
- the flood illumination source 116 may be configured for providing substantially continuous spatial illumination.
- the flood light may be substantially continuous spatial illumination, in particular diffuse and/or uniform illumination.
- the flood light has a wavelength in the infrared range, in particular in the near infrared range.
- the flood illumination source 116 may comprise at least one VCSEL, preferably a plurality of VCSELs.
- a relative distance between the flood illumination source 116 and the pattern illumination source 114 is below 3.0 mm.
- the relative distance between the flood illumination source 116 and the pattern illumination source 114 is below 2.5 mm, preferably below 2.0 mm.
- the pattern illumination source 114 and the flood illumination source 116 may be combined into one module.
- the pattern illumination source 114 and the flood illumination source 116 may be arranged on the same substrate, in particular having a minimum relative distance.
- the minimum relative distance may be defined by a physical extension of the flood illumination source 116 and the pattern illumination source 114.
- Arranging the pattern illumination source 114 and the flood illumination source 116 having a relative distance below 3.0 mm can result in decreased space requirement of the two illumination sources 114, 116.
- said illumination sources 114, 116 can even be combined into one module.
- Such a reduced space requirement can allow reducing the transparent area(s) in a display necessary for operation of the illumination source(s) 114, 116 behind the display 120.
- the pattern illumination source 114 and the flood illumination source 116 may comprise at least one VCSEL, preferably a plurality of VCSELs.
- the pattern illumination source 114 may comprise a plurality of first VCSELs mounted on a first platform.
- the flood illumination source 116 may comprise a plurality of second VCSELs mounted on a second platform.
- the second platform may be beside the first platform.
- the optoelectronic apparatus 112 may comprise a heat sink. Above the heat sink a first increment comprising the first platform may be attached. Above the heat sink a second increment comprising the second platform may be attached. The second increment may be different from the first increment.
- the first platform may be more distant from the optical element configured for increasing, e.g. duplicating, the number of spots.
- the second platform may be closer to the optical element.
- the beam emitted from the second VCSEL may be defocused and thus, form overlapping spots. This leads to a substantially continuous illumination and, thus, to flood illumination.
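- As general optics background only (the relation below is the standard thin-lens equation, not text from the disclosure), the defocusing effect of the platform offset can be understood as follows: for an emitter at object distance d_o from projection optics of focal length f,

```latex
\frac{1}{d_\mathrm{o}} + \frac{1}{d_\mathrm{i}} = \frac{1}{f},
\qquad d_\mathrm{o} = f \;\Rightarrow\; d_\mathrm{i} \to \infty \ \text{(collimated output, sharp spots)},
\qquad d_\mathrm{o} < f \;\Rightarrow\; d_\mathrm{i} < 0 \ \text{(diverging output, overlapping spots).}
```

- In this picture the first (pattern) platform would sit approximately in the focal plane, while the second (flood) platform, being closer to the optical element, would lie inside the focal length, so its beams diverge and merge into substantially continuous flood illumination; this is an interpretation offered for illustration, not a limitation of the disclosure.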
- the optoelectronic apparatus 112 comprises the at least one image generation unit 118 configured for generating at least one pattern image while the pattern illumination source 114 is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source 116 is emitting infrared flood light.
- the image generation unit 118 may be at least one unit of the optoelectronic apparatus 112 configured for generating at least one image.
- the image generation may comprise capturing and/or generating and/or determining and/or recording at least one image by using the image generation unit 118.
- the image generation may comprise imaging and/or recording the image.
- the image generation may comprise capturing a single image and/or a plurality of images such as a sequence of images.
- the image generation may comprise recording continuously a sequence of images such as a video or a movie.
- the image generation may be initiated by a user action or may automatically be initiated, e.g. once the presence of at least one object or user within a field of view and/or within a predetermined sector of the field of view of the image generation unit is automatically detected.
- the image generation unit 118 may comprise at least one optical sensor, in particular at least one pixelated optical sensor.
- the image generation unit 118 may comprise at least one CMOS sensor or at least one CCD chip.
- the image generation unit 118 may comprise at least one CMOS sensor which may be sensitive in the infrared spectral range.
- the image may be image data recorded by using the optical sensor, such as a plurality of electronic readings from the CMOS or CCD chip.
- the image may comprise raw image data or may be a pre-processed image.
- the pre-processing may comprise applying at least one filter to the raw image data and/or at least one background correction and/or at least one background subtraction.
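- A minimal sketch of such pre-processing (background subtraction followed by one possible low-pass filter); the filter choice and all parameter values are assumptions:

```python
import numpy as np
from scipy.signal import convolve2d


def preprocess(raw: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Background correction/subtraction plus a simple 3x3 mean filter."""
    corrected = raw.astype(np.float32) - background.astype(np.float32)
    corrected = np.clip(corrected, 0.0, None)           # background subtraction
    kernel = np.ones((3, 3), dtype=np.float32) / 9.0    # one possible smoothing filter
    return convolve2d(corrected, kernel, mode="same", boundary="symm")
```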
- the image generation unit 118 may comprise a monochrome camera, e.g. comprising monochrome pixels.
- the image generation unit 118 may comprise a color camera, e.g. comprising color pixels.
- the image generation unit may comprise a color CMOS camera.
- the camera may comprise monochrome pixels and color pixels. The color pixels and the monochrome pixels may be combined internally in the camera.
- the image generation unit 118 may comprise at least one color camera (e.g. RGB) and/or at least one monochrome camera, such as a monochrome CMOS.
- the camera may comprise at least one monochrome CMOS chip.
- the camera generally may comprise a one-dimensional or two-dimensional array of image sensors, such as pixels.
- the image generation unit 118 may be at least one color, e.g. RGB, camera.
- the color camera may be a selfie camera of a smartphone.
- the image generation unit 118 may have a field of view between 10°x10° and 75°x75°, preferably 55°x65°.
- the image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
- the image generation unit 118 may comprise further elements, such as one or more optical elements, e.g. one or more lenses.
- the optical sensor may be a fix-focus camera, having at least one lens which is fixedly adjusted with respect to the camera.
- the camera may also comprise one or more variable lenses which may be adjusted, automatically or manually. Other cameras, however, are feasible.
- the pattern image may be an image generated by the image generation unit 118 while illuminating with the infrared light pattern, e.g. on an object and/or a user.
- the pattern image may comprise an image showing a user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern.
- the pattern image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the infrared light pattern.
- the illumination by the pattern illumination source 114 and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus 112.
- the flood image may be an image generated by the image generation unit 118 while the flood illumination source 116 is emitting infrared flood light, e.g. onto an object and/or a user.
- the flood image may comprise an image showing a user, in particular the face of the user, while the user is being illuminated with the flood light.
- the flood image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the flood light.
- the illumination by the flood illumination source 116 and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus 112.
- the image generation unit 118 may be configured for imaging and/or recording the pattern image and the flood image at the same time or at different times.
- the optoelectronic apparatus 112 may be comprised in the device 110.
- the device 110 may comprise the at least one display 120, wherein the infrared light pattern traverses the display 120 while being emitted from the pattern illumination source 114 and/or the infrared flood light traverses the display 120 while being emitted from the flood illumination source 116.
- the display 120 may be an arbitrary shaped device configured for displaying an item of information.
- the item of information may be arbitrary information such as at least one image, at least one diagram, at least one histogram, at least one graphic, text, numbers, at least one sign, an operating menu, and the like.
- the display 120 may be or may comprise at least one screen.
- the display 120 may have an arbitrary shape, e.g. a rectangular shape.
- the display 120 may be a front display of the device 110.
- the display 120 may be or may comprise at least one organic light-emitting diode (OLED) display.
- the OLED display may be configured for emitting visible light.
- the display 120 may be made of and/or may be covered by glass.
- the display 120 may comprise at least one glass cover.
- the display 120 may be at least partially transparent.
- the display 120 may be semitransparent in the near infrared region.
- the display 120 may have a transparency of 20 % to 50 % in the near infrared region.
- the display 120 may have a different transparency for other wavelength ranges.
- the present invention may propose an optoelectronic apparatus 112 comprising the image generation unit 118 and two illumination sources 114, 116 that can be placed behind the display 120 of a device 110.
- the transparent area(s) of the display 120 can allow for operation of the optoelectronic apparatus 112 behind the display 120.
- the display 120 can be an at least partially transparent display, as described above.
- the display 120 may have a reduced pixel density and/or a reduced pixel size and/or may comprise at least one transparent conducting path.
- the transparent area(s) of the display 120 may have a pixel density of 300-440 PPI (pixels per inch), more preferably 350 to 450 PPI.
- Other areas of the display 120 e.g. non-transparent areas, may have pixel densities higher than 400 PPI, e.g. a pixel density of 450-500 PPI.
- the display 120 comprises a display area.
- the display area may be an active area of the display 120, in particular an area which is activatable.
- the display 120 may have additional areas such as recesses or cutouts.
- the display 120 may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas. At least one of the continuous areas may cover the image generation unit and/or the pattern illumination source 114 and/or the flood illumination source 116 at least partially.
- the pattern illumination source 114, the flood illumination source 116 and the image generation unit 118 may be placed in direction of propagation of the infrared light pattern in front of the display 120. As outlined above, the pattern illumination source 114 and the flood illumination source 116 may be combined into one module. This may allow reducing the transparent area(s) of the display 120.
- the display 120 may have a first area associated with a first pixel density value and a second area associated with a second pixel density value.
- the first pixel density value may be lower than the second pixel density value.
- the first pixel density value may be equal to or below 450 PPI, preferably from 300 to 440 PPI, more preferably 350 to 450 PPI.
- the second pixel density value may be 400 to 500 PPI, preferably 450 to 500 PPI.
- the first pixel density value is associated with the at least one continuous area being at least partially transparent.
- the contrast may be defined as a difference between signal and backlight. In this case, the contrast can be considered to be reduced by the transmission. However, the contrast may also be defined as a ratio of signal to background. In this case, the contrast may not be reduced by the transmission only, but also by the diffraction in the spot pattern projection, as the higher orders carry additional intensity besides the zeroth order. Diffraction may further take place due to the structure of the display wiring, leading to a further decrease in irradiance.
- the spot pattern projected through a display can result in main spots corresponding to the pattern before traversing the display, but with reduced intensity, because higher order spots are generated. Both effects lead to a reduction in irradiance to about 3-5% of the initial irradiance and to the presence of undesired additional spots in the pattern.
- the present invention allows for providing a sufficient irradiance and, thus, contrast in the pattern image.
- the contrast can be increased by decreasing the number of spots projected onto the user. This can lead to an increase in irradiance of a spot and thus, to an increase in contrast in an image of the projection of the spot pattern. Moreover, reducing of transparent areas in displays can be possible.
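- The contrast argument can be made concrete with rough, purely illustrative numbers (all values below are assumptions chosen only to be consistent with the 3-5% figure mentioned above, not measurements):

```python
# Assumed, illustrative loss factors:
transmission = 0.30           # NIR transmission of the display (single pass)
zeroth_order_fraction = 0.15  # fraction of spot power remaining in the main (zeroth-order) spot

surviving_fraction = transmission * zeroth_order_fraction
print(f"irradiance surviving per main spot: {surviving_fraction:.1%}")  # about 4.5 %

# Treating contrast as a signal-to-background ratio: for a fixed background
# level, halving the number of projected spots at constant total emitted power
# roughly doubles the per-spot signal and hence the ratio.
background_level = 0.01       # arbitrary background level for illustration
print(f"signal-to-background ratio: {surviving_fraction / background_level:.1f}")
```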
- the pattern illumination source 114 and the flood illumination source 116 used for authentication of a user can be combined into one module to reduce the transparent area in displays.
- Using such an optoelectronic apparatus 112 can allow for a rearrangement of the position of a camera, e.g. away from the outermost position of the display.
- Such a rearrangement can allow for optimized wiring since a further position, the position of the image generation unit 118, can be optimized. This can reduce the required wiring or allow for improved battery operation.
- the device 110 is configured for authenticating a user of a device 110 to perform at least one operation on the device 110 that requires authentication.
- the authenticating may comprise verifying an identity of a user.
- the authentication may comprise distinguishing the user from other humans or objects, in particular distinguishing authorized access from non-authorized access.
- the authentication may comprise verifying identity of a respective user and/or assigning identity to a user.
- the authentication may comprise generating and/or providing identity information, e.g. to other devices or units such as to at least one authorization unit for authorization for providing access to the device 110.
- the identity information may be proven by the authentication.
- the identity information may be and/or may comprise at least one identity token.
- an image of a face recorded by the image generation unit 118 may be verified to be an image of the user’s face and/or the identity of the user may be verified.
- the authenticating of the user may be performed using at least one authentication unit 122.
- the authentication unit 122 may be configured for performing at least one authentication process of a user.
- the authentication unit 122 may comprise at least one processor.
- the processor may be an arbitrary logic circuitry configured for performing basic operations of a computer or system, and/or, generally, to a device which is configured for performing calculations or logic operations. In particular, the processor may be configured for processing basic instructions that drive the computer or system.
- the processor may comprise at least one arithmetic logic unit (ALU), at least one floating-point unit (FPU), such as a math co-processor or a numeric co-processor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory.
- the processor may be a multi-core processor.
- the processor may be or may comprise a central processing unit (CPU).
- the processor may be or may comprise a microprocessor, thus specifically the processor’s elements may be contained in one single integrated circuitry (IC) chip.
- the processor may be or may comprise one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs) and/or one or more tensor processing units (TPUs) and/or one or more chips, such as a dedicated machine learning optimized chip, or the like.
- the processor specifically may be configured, such as by software programming, for performing one or more evaluation operations.
- the authentication unit 122 is configured for identifying the user based on the flood image.
- the identifying may comprise assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user.
- the authentication process may comprise a plurality of steps.
- the authentication process may comprise performing at least one face detection.
- the face detection step may comprise analyzing the flood image.
- the authentication process may comprise identifying.
- the identifying may comprise assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user.
- the identifying may comprise performing a face verification of the imaged face to be the user’s face.
- the identifying of the user may comprise matching the flood image, e.g. with at least one template.
- the identifying of the user may comprise determining if the imaged face is the face of the user, in particular if the imaged face corresponds to at least one image of the user’s face stored in at least one memory, e.g. of the device.
- the analyzing of the flood image may comprise one or more of the following: a filtering; a selection of at least one region of interest; a formation of a difference image between the flood image and at least one offset; an inversion of the flood image; a background correction; a decomposition into color channels; a decomposition into hue, saturation, and brightness channels; a frequency decomposition; a singular value decomposition; applying a Canny edge detector; applying a Laplacian of Gaussian filter; applying a Difference of Gaussian filter; applying a Sobel operator; applying a Laplace operator; applying a Scharr operator; applying a Prewitt operator; applying a Roberts operator; applying a Kirsch operator; applying a high-pass filter; applying a low-pass filter; applying a Fourier transformation; applying a Radon transformation; applying a Hough transformation; applying a wavelet transformation; a thresholding; creating a binary image.
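- A short, hedged sketch of a few of the operations listed above (region-of-interest selection, low-pass filtering, Canny edge detection, thresholding), here expressed with OpenCV; the specific parameter values are arbitrary assumptions:

```python
import cv2
import numpy as np


def analyze_flood_image(flood_image: np.ndarray):
    """Apply a small subset of the listed operations to an 8-bit grayscale image."""
    h, w = flood_image.shape[:2]
    roi = flood_image[h // 4:3 * h // 4, w // 4:3 * w // 4]  # region of interest
    blurred = cv2.GaussianBlur(roi, (5, 5), 0)               # low-pass filter
    edges = cv2.Canny(blurred, 50, 150)                      # Canny edge detector
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # thresholding / binary image
    return edges, binary
```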
- the region of interest may be determined manually by a user or may be determined automatically, such as by recognizing the user within the image.
- the analyzing of the flood image may comprise using at least one image recognition technique, in particular a face recognition technique.
- An image recognition technique comprises at least one process of identifying the user in an image.
- the image recognition may comprise using at least one technique selected from the group consisting of: color-based image recognition, e.g. using features such as template matching; image segmentation and/or blob analysis, e.g. using size or shape; machine learning and/or deep learning, e.g. using at least one convolutional neural network.
- the analyzing of the flood image may comprise determining a plurality of facial features.
- the analyzing may comprise comparing, in particular matching, the determined facial features with template features.
- the template features may be features extracted from at least one template.
- the template may be or may comprise at least one image generated in an enrollment process, e.g. when initializing the device 110. Template may be an image of an authorized user.
- the template features and/or the facial feature may comprise a vector.
- Matching of the features may comprise determining a distance between the vectors.
- the identifying of the user may comprise comparing the distance of the vectors to at least one predefined limit, wherein the user is successfully identified in case the distance is less than or equal to the predefined limit, at least within tolerances. The user is declined and/or rejected otherwise.
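- The vector-matching step can be sketched as follows (hedged illustration; the Euclidean metric and the limit value are assumptions, the text above only requires some distance measure and a predefined limit):

```python
import numpy as np


def user_identified(facial_features: np.ndarray,
                    template_features: np.ndarray,
                    predefined_limit: float = 1.0) -> bool:
    """The user is identified if the distance between the extracted facial
    feature vector and the enrolled template feature vector does not exceed
    the predefined limit; otherwise the user is rejected."""
    distance = float(np.linalg.norm(facial_features - template_features))
    return distance <= predefined_limit
```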
- the image recognition may comprise using at least one model, in particular a trained model comprising at least one face recognition model.
- the analyzing of the flood image may be performed by using a face recognition system, such as FaceNet, e.g. as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv: 1503.03832.
- the trained model may comprise at least one convolutional neural network.
- the convolutional neural network may be designed as described in M. D. Zeiler and R. Fergus, “Visualizing and understanding convolutional networks”, CoRR, abs/1311.2901, 2013, and may be trained using known face datasets, such as the Labeled Faces in the Wild database described in Learned-Miller, “Labeled faces in the wild: A database for studying face recognition in unconstrained environments”, Technical Report 07-49, University of Massachusetts, Amherst, October 2007, the Youtube® Faces Database as described in L. Wolf, T. Hassner, and I. Maoz, “Face recognition in unconstrained videos with matched background similarity”, in IEEE Conf. on CVPR, 2011, or the Google® Facial Expression Comparison dataset.
- the training of the convolutional neural network may be performed as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv: 1503.03832.
- the authentication unit 122 is configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
- the authentication unit 122 may be configured for extracting material data from the pattern image.
- the authentication unit 122 may be configured for extracting the material data from the pattern image by beam profile analysis of the light spots.
- beam profile analysis can allow for providing a reliable classification of scenes based on a few light spots.
- Each of the light spots of the pattern image may comprise a beam profile.
- the extracting of the material data based on the pattern image may be performed by using at least one model.
- the authentication process may comprise validating based on the extracted material data.
- the validating based on the extracted material data may comprise determining if the extracted material data corresponds to desired material data.
- skin as desired material data may be compared with non-skin material or silicon as extracted material data, and the result may be a declination since silicon or non-skin material is different from skin.
- the authentication process or its validation may include generating at least one feature vector from the material data and matching the material feature vector with an associated reference template vector for the material.
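- As a hedged sketch only: the exact beam-profile features are not spelled out here, so the two features below (peak-to-total ratio and a width proxy per spot) are assumptions used merely to illustrate forming a material feature vector and matching it against a reference template vector:

```python
import numpy as np


def material_feature(spot_profile: np.ndarray) -> np.ndarray:
    """Derive an illustrative two-element feature vector from one reflected spot."""
    total = float(spot_profile.sum()) + 1e-9
    peak_ratio = float(spot_profile.max()) / total
    width_proxy = float((spot_profile > 0.5 * spot_profile.max()).sum()) / spot_profile.size
    return np.array([peak_ratio, width_proxy])


def material_matches(spot_profiles, reference_template: np.ndarray, tol: float = 0.1) -> bool:
    # Average the per-spot features and compare the resulting material feature
    # vector with the reference template vector for the desired material.
    features = np.mean([material_feature(p) for p in spot_profiles], axis=0)
    return bool(np.linalg.norm(features - reference_template) <= tol)
```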
- the authentication unit 122 may be configured for authenticating the user in case the user can be identified and/or if the material data matches the desired material data.
- the device 110 may comprise at least one authorization unit 124 configured for allowing the user to perform at least one operation on the device 110, e.g. unlocking the device 110, in case of successful authentication of the user or declining the user to perform at least one operation on the device 110 in case of non-successful authentication.
- Figure 2 shows an exemplary embodiment of a method for authenticating a user of a device 110 to perform at least one operation on the device 110 that requires authentication.
- the method comprising: a. (reference number 126) illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source 114 of the device 110, wherein the number of infrared light spots is below or equal to 4000 spots; b. (reference number 128) illuminating the user with infrared flood light from at least one flood illumination source 116 of the device 110, wherein the distance between the flood illumination source 116 and the pattern illumination source 114 is below 3.0 mm; c. (reference number 130) generating at least one pattern image with an image generation unit 118 of the device 110 showing the user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern, and generating at least one flood image with the image generation unit 118 of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit 118 and/or the illumination sources 114, 116 are covered at least partially by the display 120 of the device 110; d. (reference number 132) identifying the user based on the flood image by using at least one authentication unit 122 of the device 110; e. (reference number 134) extracting material data from the at least one pattern image by using the authentication unit 122; and f. (reference number 136) allowing the user to perform at least one operation on the device 110 that requires authentication based on the material data and the identifying.
- the method steps may be performed in the given order or may be performed in a different order. Further, one or more additional method steps may be present which are not listed. Further, one, more than one or even all of the method steps may be performed repeatedly.
- the pattern image and/or the image showing the user while the user is being illuminated with the infrared flood light may be showing at least a portion of a face of the user.
- the method may be computer implemented.
- the optoelectronic apparatus 112 may comprise at least one optical element 138 having deflection properties.
- the VCSELs of the pattern illumination source 114 are mounted on the first platform and form the first VCSEL chip.
- the VCSELs of the flood illumination source 116 are mounted on the second platform and form the second VCSEL chip.
- the optical element 138 may be configured such that light emitted by the pattern illumination source 114 and light emitted by the flood illumination source 116 are deflected differently.
- Figures 3A and 3B show embodiments of an optical element 138 having deflection properties.
- the optical element 138 may comprise at least two different areas associated with at least two different deflection behaviors.
- the light emitted from the VCSEL chips may be deflected by the optical element 138 depending on the area of the optical element 138 the light illuminates.
- the first VCSEL chip associated with the pattern illumination source 114 may illuminate a first area 140 of the optical element 138 and may be deflected by a first angle.
- the second VCSEL chip associated with the flood illumination source 116 may illuminate a second area 142 of the optical element 138 and may be deflected by a second angle.
- the optical element 138 may be embodied as one optical element having two different deflection properties or the optoelectronic apparatus 112 may comprise at least two optical elements 138, e.g. one of the optical elements 140 is arranged for being illuminated by the pattern illumination source 114, in particular the first VCSEL chip, and the other one of the optical elements 142 is arranged for being illuminated by the flood illumination source 116, in particular the second VCSEL chip.
- the optical element 138 may be wavelength dependent.
- the optical element 138 may be structured to deflect light of different wavelengths differently.
- the VCSEL emitter associated with the pattern illumination source 114 may be associated with a different wavelength than the VCSEL emitter associated with the flood illumination source 116.
- the two light beams may differ in the beam width.
- the optical element 138 may be configured for deflecting light of different beam widths differently.
- the two light beams generated by the pattern illumination source 114 and the flood illumination source 116 may differ in the beam width.
- the light of the pattern illumination source 114 and the flood illumination source 116 may be deflected differently.
- Figure 4 shows a schematic of an exemplary optoelectronic apparatus 210 in a side view.
- For the optoelectronic apparatus 210, reference is made to the optoelectronic apparatus 112 described with respect to and shown in Figures 1 to 3. In the following, only particularities in comparison with the optoelectronic apparatus 112 are described.
- the optoelectronic apparatus 210 may comprise:
- at least one first Vertical-Cavity Surface-Emitting Laser 212, wherein the first Vertical-Cavity Surface-Emitting Laser 212 comprises an active area 214 situated on a bottom surface 216 of the first Vertical-Cavity Surface-Emitting Laser 212;
- at least one second Vertical-Cavity Surface-Emitting Laser 218, wherein the second Vertical-Cavity Surface-Emitting Laser 218 comprises an active area 220 situated on a top surface 222 of the second Vertical-Cavity Surface-Emitting Laser 218, wherein the top surface 222 is opposite of a bottom surface 224 of the second Vertical-Cavity Surface-Emitting Laser 218;
- at least one supporting member 226, wherein the first Vertical-Cavity Surface-Emitting Laser 212 is arranged with the bottom surface 216 on the supporting member 226, and wherein the second Vertical-Cavity Surface-Emitting Laser 218 is arranged with the bottom surface 224 on the supporting member 226.
- the first Vertical-Cavity Surface-Emitting Laser 212 and the second Vertical-Cavity Surface-Emitting Laser 218 may be arranged on the same supporting member 226.
- the supporting member 226 may be at least one of:
- an electrical connector 229, specifically a printed circuit board
- the optoelectronic apparatus 210 further may comprise at least one optical lens 228.
- the first Vertical-Cavity Surface-Emitting Laser 212 and/or the second Vertical-Cavity Surface-Emitting Laser 218 may be configured for emitting illumination light 244, 246 through the optical lens.
- a first distance 230 between the first Vertical-Cavity Surface-Emitting Laser 212, specifically the active area 214 of the first Vertical-Cavity Surface-Emitting Laser 212, and the at least one optical lens 228 differs from a second distance 232 between the second Vertical-Cavity Surface-Emitting Laser 218, specifically the active area 220 of the second Vertical-Cavity Surface-Emitting Laser 218, and the at least one optical lens 228.
- the optical lens 228 may have at least one focal length, wherein the first distance 230 or the second distance 232 equals the focal length.
- the first distance 230 and the second distance 232 may differ by a thickness 232 of the first Vertical-Cavity Surface-Emitting Laser 212 or a thickness of the second Vertical-Cavity Surface-Emitting Laser 218.
- the thickness of the first Vertical-Cavity Surface-Emitting Laser 212 and the thickness of the second Vertical-Cavity Surface-Emitting Laser 218 may be the same.
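- The effect of this thickness offset can be illustrated with the known lens formula 1/f = 1/v + 1/u. The following sketch uses assumed example values for the focal length and the chip thickness; it only illustrates that the emitter located exactly one focal length from the optical lens 228 is collimated, while the emitter offset by one chip thickness is defocused.

```python
"""Illustrative only: assumed focal length and chip thickness, thin-lens model."""

def image_distance(u_mm: float, f_mm: float) -> float:
    """Solve the lens formula 1/f = 1/v + 1/u for the image distance v."""
    if abs(u_mm - f_mm) < 1e-12:
        return float("inf")              # emitter in the focal plane -> collimated
    return u_mm * f_mm / (u_mm - f_mm)

f_mm = 2.0     # assumed focal length of the optical lens 228
t_mm = 0.15    # assumed chip thickness of a Vertical-Cavity Surface-Emitting Laser

print(image_distance(f_mm, f_mm))          # inf: collimated, sharp light pattern
print(image_distance(f_mm + t_mm, f_mm))   # finite: defocused, flood-like output
```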
- a geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and a geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel.
- a normal vector to an emitting surface of the first Vertical-Cavity Surface-Emitting Laser 212 and a normal vector to an emitting surface of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel.
- the normal vector to the emitting surface of the first Vertical-Cavity Surface-Emitting Laser 212 and/or the normal vector to the emitting surface of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel to a central light ray of a light emitting cone of the respective Vertical-Cavity Surface-Emitting Laser 212, 218.
- the emitting surface may comprise the active area.
- a geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and a geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be coaxial.
- the geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and/or the geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel to a geometrical central axis 242 of the at least one optical lens 228.
- the first Vertical-Cavity Surface-Emitting Laser 212, comprising the active area 214 situated on the bottom surface 216, may be configured for emitting illumination light 244 at least partially in the direction of a top surface 247 of the first Vertical-Cavity Surface-Emitting Laser 212. At least one further component 248 of the first Vertical-Cavity Surface-Emitting Laser 212 may be at least partially transparent for the illumination light 244 emitted by the first Vertical-Cavity Surface-Emitting Laser 212. The at least one further component 248 of the first Vertical-Cavity Surface-Emitting Laser 212 may be a substrate 250 of the first Vertical-Cavity Surface-Emitting Laser 212.
- Figures 5A and 5B as well as Figures 6A and 6B show, highly schematically, further embodiments of the optoelectronic apparatus 312.
- For the optoelectronic apparatus 312, reference is made to the optoelectronic apparatus 112 and the optoelectronic apparatus 210 as shown and described with respect to Figures 1 to 4. In the following, only particularities in comparison with the optoelectronic apparatus 112 and the optoelectronic apparatus 210 are described.
- the optoelectronic apparatus 312 may comprise: a light emitter structure 314 comprising a plurality of light emitters 316, wherein a first array 318 of light emitters of the plurality of light emitters 316 forms a pattern illumination source 320 configured for emitting the infrared light pattern, wherein a second array 322 of light emitters of the plurality of light emitters 316, different from the light emitters of the first array 318, forms a flood illumination source 324 configured for emitting the infrared flood light; a base 325 providing a single plane for mounting the light emitters 316; at least one system of optical elements 326 comprising a plurality of optical elements, wherein the system of optical elements 326 is configured for focusing the emitted infrared light pattern onto a focal plane, wherein the system of optical elements 326 covers the light emitter structure 314; at least one flood light optical element 330 configured for defocusing light emitted by the light emitters of the flood illumination source 324, thereby forming overlapping light spots.
- the base 325 may comprise a plurality of cavities into which the light emitter structure 314 can be mounted.
- the base 325 may have an arbitrary shape such as a rectangular, a circular or a hexagonal shape. The shape may refer to a side of the base oriented perpendicular to the direction in which the height is measured.
- the base 325 may comprise at least one semiconductor substrate.
- the base 325 may be an element of the light emitter structure and/or an additional element.
- the base 325 may be and/or may comprise a thermally conducting printed circuit board (PCB).
- the light emitters 316 of the light emitter structure 314 may form a chip of light emitters, e.g. a VCSEL die, e.g. as sawn out of a wafer. Such a chip of light emitters 316 may be mounted on the base, e.g. by using at least one heat conductive adhesive.
- the base 325 may comprise at least one thermally conducting material.
- the base 325 can be the bottom of the optoelectronic apparatus 312, e.g. of a housing of the optoelectronic apparatus 312. Thus, dimensions of the base 325 may be defined by the dimension of the optics and the housing. Alternatively, the base 325 and the housing may be separate elements.
- the chip of light emitters 316 may be mounted, e.g. by using at least one heat conductive adhesive, on the base 325, e.g. the PCB, and the housing may be applied to this combined element.
- the base 325 may comprise at least one thermally conducting material, in particular thermally conducting materials.
- the thermally conducting materials may be configured as a heat exchanger.
- the thermally conducting materials may be configured for regulating the temperature of the light emitter.
- the thermally conducting materials may be configured for transferring heat generated by a light emitter away from the light emitter.
- the thermally conducting materials may comprise at least one composite material.
- the light emitter structure may be mounted on the thermally conducting materials.
- Each of the light emitters 316 may comprise at least one vertical cavity surface emitting laser (VCSEL).
- the light emitters 316 may be configured for emitting light in the near infrared spectral range, preferably a wavelength of the emitted light is from 760 nm to 1.5 µm, preferably 940 nm, 1140 nm or >1400 nm.
- the light emitters 316 of the pattern illumination source 320 and of the flood illumination source 324 may be active at different points in time.
- the light emitters 316 of the light emitter structure 314 may be arranged in a periodic pattern.
- the light emitters 316 of the light emitter structure 314 may be arranged in one or more of a grid pattern, a hexagonal pattern, a shifted hexagonal pattern or the like.
- a plurality of light emitters 316 of the light emitter structure 314 may form the first array 318 of light emitters 316 and a plurality of light emitters 316, different from the light emitters 316 of the first array 318, of the light emitter structure 314 form the second array 322 of light emitters 316.
- the light emitter structure 314 may comprise two light emitter arrays (318, 322), e.g. two VCSEL arrays. The arrays (318, 322) are located in one plane.
- the first array 318 and the second array 322 of emitters 316 may be produced as a single die directly on the plane or the first array 318 and the second array 322 of emitters 316 are produced separately and are installed, e.g. side by side, on the plane. However, embodiments are possible, in which even more arrays are used, e.g. configured for providing different functions.
- the system of optical elements 326 may comprise one or more of at least one refractive lens, a plurality of refractive lenses; at least one diffractive optical element (DOE), a plurality of DOEs, a plurality of meta lenses.
- the system of optical elements 326 may comprise at least one refractive lens and at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the light emitters of the pattern illumination source.
- the system of optical elements 326 may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element.
- the DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam.
- a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots.
- Further arrangements, particularly comprising a different number of projecting VCSEL and/or at least one different optical element configured for increasing the number of spots may be possible.
- Other multiplication factors are possible.
- a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
- the system of optical elements 326 covers the light emitter structure 314.
- the system of optical elements 326 may be designed and/or arranged such that it covers the light emitter structure 314.
- the first array 318 may be covered by the system of optical elements 326 only.
- both of the first array 318 and the second array 322 may be covered by the system of optical elements 326.
- the light emitter structure 314 may be located at the focal point of the system of optical elements 326. Such an arrangement may allow that the emitted light of the light emitters 316, in particular of the light emitters 316 of the pattern illumination source 320, is collimated.
- the flood light optical element 330 may be configured for defocusing light emitted by the light emitters 316 of the flood illumination source 324 thereby forming overlapping light spots.
- the flood light source 324 in combination with the flood light optical element 330 is configured for generating flood light, in particular diffuse illumination.
- the flood light optical element 330 may comprise at least one element selected from the group consisting of: at least one plate with a refractive index larger than 1.4, e.g. a glass plate, at least one diffusor plate, at least one lens, at least one micro lens, at least one prism, at least one Fresnel lens, at least one diffractive optical element (DOE), at least one meta lens.
- the second array 322 may be completely covered by the flood light optical element 330.
- Figures 5A and 5B show an example in which the first array 318 and the second array 322 may be arranged side by side on the plane, in particular adjacent.
- Figure 5A shows a top view of an exemplary layout of the light emitter structure 314.
- the light emitter structure 314 may comprise two VCSEL arrays 318, 322.
- the VCSEL arrays 318, 322 are located on one plane, in this case side by side.
- Figure 5B shows for the layout of the light emitter structure 314 of Figure 5A, an embodiment of operation of the optoelectronic apparatus 312. On the left side of Figure 5B, the optoelectronic apparatus 312 is shown with activated illumination source 320.
- On the right side of Figure 5B, the optoelectronic apparatus 312 is shown with activated flood illumination source 324.
- along a direction perpendicular to an optical axis of the optoelectronic apparatus 312, the plane of the base 325 may comprise firstly the first array 318 and subsequently the second array 322.
- the VCSEL array 318 may define a dot pattern, e.g. a periodic regular pattern such as a simple grid pattern, hexagon pattern, shifted hexagon pattern and the like.
- the VCSEL array 318 may be located at the focal point of the system of optical elements 326, i.e. the light from all cavities is collimated.
- the second array 322 may be responsible for flood illumination, in particular for diffuse illumination. This array 322 may be completely covered by flood light optical element 330.
- the second array 322 may be located at the focal point of the system of optical elements 326, too.
- without further measures, the light generated by the second array 322 would thus also be in focus.
- However, the optical imaging is changed by the additional flood light optical element 330 such that the light from the cavities is no longer properly collimated. This can allow generating a diffuse flood illumination.
- the flood light optical element 330 is configured for leaving the emitted infrared light pattern uninfluenced.
- the VCSEL array 318 may not be covered by this flood light optical element 330.
- the optoelectronic apparatus 312 comprises a single flood light optical element 330 covering all light emitters of the second array 322.
- the optoelectronic apparatus 312 comprises a plurality of flood light optical elements 330.
- each light emitter 316 of the second array 322 comprises at least one assigned flood light optical element 330.
- Figure 6A shows a top view of an exemplary layout of the light emitter structure 314.
- the light emitter structure 314 may comprise two VCSEL arrays 318, 322.
- the VCSEL arrays 318, 322 are located on one plane.
- Figures 6A and 6B show an example, in which cavities of the light emitters 316 of the first array 318 and cavities of the light emitters 316 of the second array 322 may form a combined pattern, in which the cavities of light emitters 316 of the first array 318 and cavities of the light emitters 316 of the second array 322 alternate, e.g. line by line.
- White circles in Figure 6A denote cavities of the illumination source 320 and black circles denote cavities of the flood illumination source 324, e.g. each having an additional micro lens.
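- For illustration, the alternating, line-by-line arrangement of Figure 6A can be described by a simple coordinate scheme. The pitch, the array size and the hexagonal-like row shift in the following sketch are assumed example values and are not taken from the figures.

```python
"""Sketch of an interleaved emitter layout as in Figure 6A; all dimensions
are assumed example values."""
import numpy as np

def interleaved_layout(rows: int = 8, cols: int = 8, pitch_um: float = 40.0):
    """Return (pattern_xy, flood_xy): even rows belong to the pattern array 318
    ("white circles"), odd rows to the flood array 322 ("black circles")."""
    ys, xs = np.mgrid[0:rows, 0:cols].astype(float)
    xs[1::2] += 0.5                        # optional shift towards a hexagonal packing
    xy = np.stack([xs, ys], axis=-1) * pitch_um
    return xy[0::2].reshape(-1, 2), xy[1::2].reshape(-1, 2)

pattern_xy, flood_xy = interleaved_layout()
print(len(pattern_xy), "pattern cavities,", len(flood_xy), "flood cavities")
```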
- the optoelectronic apparatus 312 is shown with activated illumination source 320.
- the optoelectronic apparatus 312 is shown with activated flood illumination source 324.
Abstract
An optoelectronic apparatus (112) is disclosed, comprising: - at least one pattern illumination source (114) configured for illuminating at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, - at least one flood illumination source (116) configured for emitting infrared flood light, - at least one image generation unit (118) configured for generating at least one pattern image while the pattern illumination source (114) is illuminating the infrared light pattern and configured for generating at least one flood image while the flood illumination source (116) is emitting the infrared flood light. A relative distance between the flood illumination source (116) and the pattern illumination source (114) is below 3.0 mm. (Figure 1)
Description
Behind OLED authentication
Description
Field of the invention
The invention relates to an optoelectronic apparatus, use of the optoelectronic apparatus, a device for authenticating a user, a method for authenticating a user of a device. The present invention further relates to a computer program, a computer-readable storage medium and a non-transient computer-readable medium. The devices, methods and uses according to the present invention specifically may be employed for example in various areas of daily life, security technology, gaming, traffic technology, production technology, photography such as digital photography or video photography for arts, documentation or technical purposes, safety technology, information technology, agriculture, crop protection, maintenance, cosmetics, medical technology or in the sciences. However, other applications are also possible.
Prior art
Available authentication systems in mobile devices, such as smartphones, tablets and the like, include cameras. Said mobile devices usually have a front display such as an organic light-emitting diode (OLED) area. For the integration of a camera, a cutout of the display at the position of the camera is required. These cutouts, the so-called notch, decrease the available display area and thus the display area available to the user. Usually, because a notch is an unpleasant feature to the user, the camera is restricted to the outermost position possible in order to avoid dark areas, for example in the middle of the display.
Thus, currently available mobile devices have the drawback that the display area is decreased due to the camera and that the position of the camera cannot be adapted in order to improve process efficiency. There is a desire to rearrange the position of the camera while improving the user experience.
Moreover, in order to further improve security of biometric authentication methods, such as facial recognition, passive image processing methods used for this purpose are being supplemented with active methods. This can allow improved detection of forged authentication attempts. Active methods may use laser-based projection technologies for enriching scenes with additional information. For such an approach, generally an unaffected optical path for the camera (Rx) and the laser projector (Tx) would be required. For example, established active methods such as structured light, active stereo, or time of flight use light typically passing through a maximally transparent protective glass to reconstruct the three-dimensional structure of the illuminated scene from the captured images. For example, a CMOS camera sensitive in the near infrared spectral range and a laser projector emitting light in the near infrared spectral range may be used, e.g. for ensuring invisibility of the method for the user. As outlined above, in
case of using a smartphone, e.g. in order to perform a secure biometric unlocking of the smartphone, such a setup requires additional cutouts in the display, e.g. in order to be used together with a selfie camera, which may work in the visible spectral range.
The laser projector may be configured for operating behind the semitransparent OLED display. This may allow reducing the number and size of cutouts within the display. The semitransparent OLED-display, typically, comprises an OLED-pixel-structure. The OLED-pixel structure may be in particular defined by the optically non-transparent cathode. The use of transparent conductive tracks and design of the drive electronics can enable (semi-) transparent areas between the OLED pixels. Whereas for the passive operation, e.g. of the selfie camera, suitable transparency can be ensured, light from the laser projector undergoes diffraction loss. Thus, active methods may have an effective transmission of only 3 % to 20 %. Such diffraction loss occurs for the light emission (Tx) as well as for the light detection (Rx). In addition, the diffraction loss can be visible as artefacts, e.g. decrease of contrast, in the recorded images. Thus, known methods and devices face the following drawbacks: the resulting light at the light detection is significantly decreased due to double passage through the display; the remaining image quality may be further reduced by diffraction artifacts; the usable laser power, which would be necessary for compensating such losses, is limited due to requirements of eye safety (maximum dose after emission from the display) and display stability (maximum thermal dose).
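The severity of the double passage can be illustrated with a short calculation. Assuming, purely for illustration, that the transmission on the detection path is similar to the 3 % to 20 % quoted above for the emission path, the round-trip efficiency is the product of the two:

```python
# Round-trip efficiency through the semitransparent display: the light passes
# the display once on emission (Tx) and once on detection (Rx), so the
# transmissions multiply. A similar Tx and Rx transmission is assumed here.
for t_single in (0.03, 0.10, 0.20):
    print(f"single pass {t_single:.0%} -> round trip {t_single ** 2:.2%}")
```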
WO 2021/259923 A1 describes a projector and illumination module configured for scene illumination and pattern projection. The projector and illumination module comprises at least one array of a plurality of individual emitters and at least one optical system. Each of the individual emitters is configured for generating at least one illumination light beam. The optical system comprises at least one array of a plurality of transfer devices. The array of transfer devices comprises at least one transfer device for each of the individual emitters. The array of transfer devices comprises at least two groups of transfer devices. The transfer devices of the two groups differ in at least one property. The transfer devices of one of the groups are configured for generating at least one illumination pattern in response to illumination light beams impinging on said transfer devices. The transfer devices of the other group are configured for generating diverging light beams in response to illumination light beams impinging on said transfer devices.
CN 114 779 491 A describes a terminal display module comprising a display panel and a three- dimensional imaging module.
WO 2022/101429 A1 describes a display device which comprises: at least one illumination source configured for projecting at least one illumination beam on at least one scene; at least one optical sensor having at least one light sensitive area, wherein the optical sensor is configured for measuring at least one reflection light beam generated by the scene in response to illumination by the illumination beam; at least one translucent display configured for displaying information, wherein the illumination source and the optical sensor are placed in direction of propagation of the illumination light beam in front of the display; at least one control
unit configured for turning off the display in an area of the illumination source during illumination and/or in an area of the optical sensor during measuring.
Problem addressed by the invention
It is therefore an object of the present invention to provide devices and methods addressing the above-mentioned technical challenges of known devices and methods. Specifically, it is an object of the present invention to provide devices and methods which allow a rearrangement of the position of the camera as well as improved operation of active authentication techniques using devices arranged behind a display.
Summary of the invention
This problem is solved by the invention with the features of the independent patent claims. Advantageous developments of the invention, which can be realized individually or in combination, are presented in the dependent claims and/or in the following specification and detailed embodiments.
In a first aspect of the present invention, an optoelectronic apparatus is disclosed. The optoelectronic apparatus comprises: at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, at least one flood illumination source configured for emitting infrared flood light, at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light.
A relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm.
The term “optoelectronic apparatus” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a device or system operating on light and electrical currents.
The term “light” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to electromagnetic radiation in one or more of the infrared, the visible and the ultraviolet spectral range. Herein, the term “ultraviolet spectral range”, generally, refers to electromagnetic radiation having a wavelength of 1 nm to 380 nm, preferably of 100 nm to 380 nm. Further, in partial accordance with standard ISO-21348 in a valid version at the date of this document, the term “visible spectral range”, generally, refers to a spectral range of 380 nm to 760 nm. The term “infrared spectral range” (IR) generally refers to electromagnetic radiation of 760 nm to 1000 µm, wherein the range of 760 nm to 1.5 µm is usually denominated as “near infrared spectral range” (NIR) while the range from 1.5 µm to 15 µm is denoted as “mid infrared spectral range” (MidIR) and the range from 15 µm to 1000 µm as “far infrared spectral range” (FIR). Preferably, light used for the typical purposes of the present invention is light in the infrared (IR) spectral range, more preferred, in the near infrared (NIR) and/or the mid infrared spectral range (MidIR), especially light having a wavelength of 1 µm to 5 µm, preferably of 1 µm to 3 µm.
The term “illuminate”, as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to the process of exposing at least one element to light. The term “illumination source” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary device configured for generating or providing light in the sense of the above-mentioned definition.
The term “pattern illumination source” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary device configured for generating or providing at least one light pattern, in particular at least one infrared light pattern. The term “light pattern” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one arbitrary pattern comprising a plurality of light spots. The light spot may be at least partially spatially extended. At least one spot or any spot may have an arbitrary shape. In some cases a circular shape of at least one spot or any spot may be preferred. The spots may be arranged by considering a structure of a display comprised by a device that is further comprising the optoelectronic apparatus. Typically, an arrangement of an OLED-pixel- structure of the display may be considered. The term “infrared light pattern” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a light pattern comprising spots in the infrared spectral range. The infrared light pattern may be a near infrared light pattern.
The infrared light may be coherent. The infrared light pattern may be a coherent infrared light pattern.
The pattern illumination source may be configured for emitting light at a single wavelength, e.g. in the near infrared region. In other embodiments, the pattern illumination source may be
adapted to emit light with a plurality of wavelengths, e.g. for allowing additional measurements in other wavelengths channels.
The infrared light pattern may comprise at least one regular and/or constant and/or periodic pattern such as a triangular pattern, a rectangular pattern, a hexagonal pattern or a pattern comprising further convex tilings. For example, the infrared light pattern is a hexagonal pattern, preferably a hexagonal infrared light pattern, preferably a 2/5 hexagonal infrared light pattern. The terms “triangular”, “rectangular” and “hexagonal” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a two-dimensional distribution of spots, in particular unit cells of the pattern. The unit cell in a triangular pattern is a triangle. A triangular pattern may comprise a plurality of groups of spots, wherein each group comprises at least three spots forming a triangle. In a rectangular pattern, the spots of rows and columns are rectangular to each other, e.g. square unit cells, rectangular unit cells or centered rectangular unit cells. A hexagonal pattern comprises a plurality of groups of spots, wherein each group has at least four spots forming a hexagonal unit cell, e.g. wherein three hexagonal cells form a hexagonal prism. In a hexagonal pattern an angle of 120° is formed between neighboring spots of rows and columns. Different packing density or fraction are thinkable for the pattern, e.g. for the hexagonal pattern, e.g. a 2/5 packing density. Using a periodical 2/5 hexagonal pattern can allow distinguishing between artefacts and usable signal.
The infrared light pattern may comprise at least one point pattern. The infrared light pattern has a low point density. The number of infrared light spots is below or equal 4000 spots. The infrared light pattern may comprise equal to or less than 3000 spots, preferably equal to or less than 2000 spots. The number of spots may be below 2000 and/or above 0, preferably above 5, more preferably above 10, most preferably above 100. The infrared light pattern may have a low point density, in particular in comparison with other structured light techniques having typically a point density of 10k - 30k in a field of view of 55x38°. Using such a low point density may allow compensating for the above-mentioned diffraction loss. By decreasing the number of spots projected onto an object and/or a user, a contrast in the pattern image may be increased. Increasing the number of points would decrease the irradiance per point. The decreased number of spots may lead to an increase in irradiance of a spot and thus, to an increase in contrast in the pattern image of the projection of the infrared light pattern. The infrared light pattern may have a periodic point pattern with a reduced number of spots, wherein each of the spots has a high irradiance. Such a light pattern can ensure improved authentication using illumination sources and image generation unit behind a display. Moreover, the low number of spots can ensure complying with eye safety requirements and stability requirements. The allowed dose may be divided between the spots of the light pattern.
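The benefit of the reduced spot count can be quantified with a simple division. The total optical power in the following sketch is an assumed placeholder standing in for whatever dose is allowed by eye safety and display stability; only the scaling with the number of spots matters:

```python
# With the total allowed optical power fixed, the power available per spot
# scales as P_total / N. The 250 mW figure is an assumed placeholder.
P_TOTAL_MW = 250.0
for n_spots in (30_000, 10_000, 4_000, 2_000):
    print(f"{n_spots:>6} spots -> {1000.0 * P_TOTAL_MW / n_spots:7.1f} microwatt per spot")
```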
At least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°. The term “beam divergence” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one measure of an increase in at least one diameter and/or at least one diameter equivalent, such as a radius, with a distance from an optical aperture from which the beam emerges. The measure may be an angle or an angle equivalent. In the context of the present invention, typically, a beam divergence may be determined at 1/e².
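Such a divergence translates into a well-confined spot at typical working distances. The exit aperture and the working distance in the following estimate are assumed example values:

```python
import math

def spot_diameter_mm(full_divergence_deg: float, distance_mm: float,
                     aperture_mm: float = 1.0) -> float:
    """Approximate 1/e² spot diameter at a given distance, assuming the stated
    divergence is the full angle and starting from an assumed exit aperture."""
    half_angle = math.radians(full_divergence_deg) / 2.0
    return aperture_mm + 2.0 * distance_mm * math.tan(half_angle)

print(f"{spot_diameter_mm(0.3, 300.0):.2f} mm")   # roughly 2.6 mm at 300 mm distance
```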
The pattern illumination source may comprise at least one pattern projector configured for generating the infrared light pattern. The pattern illumination source, e.g. the pattern projector, may comprise at least one emitter, in particular a plurality of emitters. The term “emitter” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one arbitrary device configured for providing at least one light beam. The light beam may generate the infrared light pattern. The emitter may comprise at least one element selected from the group consisting of at least one laser source such as at least one semi-conductor laser, at least one double heterostructure laser, at least one external cavity laser, at least one separate confinement heterostructure laser, at least one quantum cascade laser, at least one distributed Bragg reflector laser, at least one polariton laser, at least one hybrid silicon laser, at least one extended cavity diode laser, at least one quantum dot laser, at least one volume Bragg grating laser, at least one Indium Arsenide laser, at least one Gallium Arsenide laser, at least one transistor laser, at least one diode pumped laser, at least one distributed feedback laser, at least one quantum well laser, at least one interband cascade laser, at least one semiconductor ring laser, at least one vertical cavity surface emitting laser (VCSEL); at least one non-laser light source such as at least one LED or at least one light bulb. For example, the pattern projector comprises at least one VCSEL, preferably a plurality of VCSELs. The plurality of VCSELs may be arranged in at least one array, e.g. comprising a matrix of VCSELs. The VCSELs may be arranged on the same substrate, or on different substrates. The term “vertical-cavity surface-emitting laser” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a semiconductor laser diode configured for laser beam emission perpendicular with respect to a top surface. Examples for VCSELs can be found e.g. in en.wikipedia.org/wiki/Verticalcavity_surface-emitting_laser. VCSELs are generally known to the skilled person such as from WO 2017/222618 A. Each of the VCSELs is configured for generating at least one light beam. The VCSEL or the plurality of VCSELs may be configured for generating the desired spot number equal to or below 4000 spots, preferably equal to or below 3000 spots, more preferably equal to or below 2000 spots. The plurality of generated spots may be associated with the infrared light pattern. The VCSELs may be configured for emitting light beams at a wavelength range from 800 to 1000 nm. For example, the VCSELs may be configured for emitting light beams at 808 nm, 850 nm, 940 nm, and/or 980 nm. Preferably the VCSELs emit light at 940 nm, since terrestrial sun radiation has a local minimum in irradiance at this wavelength, e.g. as described in CIE 085-1989 „Solar spectral Irradiance”.
The pattern illumination source may comprise at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the pattern projector. The pattern illumination source, particularly the optical element, may comprises at least one diffractive optical element (DOE) and/or at least one metasurface element. The DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam. For example, a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots. Further arrangements, particularly comprising a different number of projecting VCSEL and/or at least one different optical element configured for increasing the number of spots may be possible. Other multiplication factors are possible. For example, a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
The pattern illumination source may comprise at least one transfer device. The term “transfer device”, also denoted as “transfer system”, as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to one or more optical elements which are adapted to modify the light beam, particularly the light beam used for generating at least a portion of the infrared light pattern, such as by modifying one or more of a beam parameter of the light beam, a width of the light beam or a direction of the light beam. The transfer device may comprise at least one imaging optical device .The transfer device specifically may comprise one or more of: at least one lens, for example at least one lens selected from the group consisting of at least one focus-tunable lens, at least one aspheric lens, at least one spherical lens, at least one Fresnel lens; at least one diffractive optical element; at least one concave mirror; at least one beam deflection element, preferably at least one mirror; at least one beam splitting element, preferably at least one of a beam splitting cube or a beam splitting mirror; at least one multilens system; at least one holographic optical element; at least one meta optical element. Specifically, the transfer device comprises at least one refractive optical lens stack. Thus, the transfer device may comprise a multi-lens system having refractive properties.
The light beam or light beams generated by the pattern illumination source may propagate parallel to an optical axis. The pattern illumination source may comprise at least one reflective element, preferably at least one prism, for deflecting the light beam onto the optical axis. As an example, the light beam or light beams, such as the laser light beam, and the optical axis may include an angle of less than 10°, preferably less than 5° or even less than 2°. Other embodiments, however, are feasible. Further, the light beam or light beams may be on the optical axis or off the optical axis. As an example, the light beam or light beams may be parallel to the optical axis having a distance of less than 10 mm to the optical axis, preferably less than 5 mm to the optical axis or even less than 1 mm to the optical axis or may even coincide with the optical axis.
The term “flood illumination source” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one arbitrary device configured for providing substantially continuous spatial illumination. The term “flood light” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to substantially continuous spatial illumination, in particular diffuse and/or uniform illumination. The flood light has a wavelength in the infrared range, in particular in the near infrared range. The flood illumination source may comprise at least one LED or at least one VCSEL, preferably a plurality of VCSELs. The term “substantially continuous spatial illumination” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to uniform spatial illumination, wherein areas of non-uniform are possible. The area, e.g. covering a user, a portion of the user and/or a face of the user, illuminated from the flood illumination source, may be contiguous. Power may be spread over a whole field of illumination. In contrast, illumination provided by the light pattern may comprise at least two contiguous areas, in particular a plurality of contiguous areas, and/or power may be concentrated in small (compared to the whole field of illumination) areas of the field of illumination. The infrared flood illumination may be suitable for illuminating a contiguous area, in particular one contiguous area. The infrared pattern illumination may be suitable for illuminating at least two contiguous areas.
The flood illumination source may illuminate a measurement area, such as a user, a portion of the user and/or a face of the user, with a substantially constant illumination intensity. The term “constant” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a time aspect during an exposure time. Flood light may vary temporally and/or may be substantially constant over time. The term “substantially constant” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a completely constant illumination and embodiments in which deviations from a constant illumination of < ± 10 %, preferably < ± 5 %, more preferably < ± 2 % are possible.
A relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm. The distance between the flood and pattern illumination sources may be defined as the distance between the most distant points of the flood and pattern illumination sources. The relative distance between the flood illumination source and the pattern illumination source may be below 2.5 mm, preferably below 2.0 mm. A lower boundary for the relative distance may be 50 µm, preferably 60 µm, more preferably 70 µm, even more preferably 80 µm, most preferably 100 µm.
The pattern illumination source and the flood illumination source may be combined into one module. For example, the pattern illumination source and the flood illumination source may be
arranged on the same substrate, in particular having a minimum relative distance. The minimum relative distance may be defined by a physical extension of the flood illumination source and the pattern illumination source. Arranging the pattern illumination source and the flood illumination source having a relative distance below 3.0 mm can result in decreased space requirement of the two illumination sources. In particular, said illumination sources can even be combined into one module. Such a reduced space requirement can allow reducing the transparent area(s) in a display necessary for operation of the illumination source(s) behind the display.
The emitting of the infrared flood light and the illumination of the infrared light pattern may be performed sequentially or at at least partially overlapping times. For example, the infrared flood light and the infrared light pattern may be emitted at the same time. For example, one of the flood light or the infrared light pattern may be emitted with a lower intensity compared to the other one.
In an embodiment, the pattern illumination source and the flood illumination source may comprise at least one VCSEL, preferably a plurality of VCSELs. The pattern illumination source may comprise a plurality of first VCSELs mounted on a first platform. The flood illumination source may comprise a plurality of second VCSELs mounted on a second platform. The second platform may be beside the first platform. The optoelectronic apparatus may comprise a heat sink. Above the heat sink a first increment comprising the first platform may be attached. Above the heat sink a second increment comprising the second platform may be attached. The second increment may be different from the first increment. Thus, the first platform may be more distant to the optical element configured for increasing, e.g. duplicating, the number of spots. The second platform may be closer to the optical element. The beam emitted from the second VCSEL may be defocused and thus, form overlapping spots. This leads to a substantially continuous illumination and, thus, to flood illumination.
The VCSELs of the pattern illumination source mounted on the first platform form a first VCSEL chip. The VCSELs of the flood illumination source mounted on the second platform form a second VCSEL chip.
The optoelectronic apparatus may comprise at least one optical element having deflection properties. As outlined above, the VCSELs of the pattern illumination source are mounted on the first platform and form the first VCSEL chip. The VCSELs of the flood illumination source are mounted on the second platform and form the second VCSEL chip. The optical element may be configured for deflecting light emitted by the pattern illumination source and light emitted by the flood illumination source may be deflected differently.
For example, the optical element may comprise at least two different areas associated with at least two different deflection behaviors. The light emitted from the VCSEL chips may be deflected by the optical element depending on the area of the optical element the light illuminates. Thus, for example, the first VCSEL chip may illuminate a first area of the optical element and may be deflected by a first angle. The second VCSEL chip may illuminate a
second area of the optical element and may be deflected by a second angle. For example, the optoelectronic apparatus comprises at least two optical elements, e.g. one of the optical elements is arranged for being illuminated by the pattern illumination source, in particular the first VCSEL chip, and the other one of the optical elements is arranged for being illuminated by the flood illumination source, in particular the second VCSEL chip. For example, the optical element may be wavelength dependent. The optical element may be structured to deflect light of different wavelengths differently. The VCSEL emitter associated with the pattern illumination source may be associated with a different wavelength than the VCSEL emitter associated with the flood illumination source. Additionally or alternatively, the two light beams may differ in the beam width. For example, the optical element may be configured for deflecting light of different beam widths differently. The two light beams generated by the pattern illumination source and the flood illumination source may differ in the beam width. Thus, the light of the pattern illumination source and the flood illumination source may be deflected differently.
Different embodiments for the pattern illumination source and the flood illumination source may be possible. For example, the VCSEL of the pattern illumination source and the VCSEL of the flood illumination source may be arranged on opposing sides. For example, the optoelectronic apparatus may comprise
(1) at least one first Vertical-Cavity Surface-Emitting Laser, wherein the first Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a bottom surface of the first Vertical-Cavity Surface-Emitting Laser;
(2) at least one second Vertical-Cavity Surface-Emitting Laser, wherein the second Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a top surface of the second Vertical-Cavity Surface-Emitting Laser, wherein the top surface is opposite of a bottom surface of the second Vertical-Cavity Surface-Emitting Laser;
(3) at least one supporting member; wherein the first Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member; and wherein the second Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member.
The term “active area” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a region in which the illumination light is generated. Typically, the active area may comprise one or more quantum wells and/or one or more quantum dots in which the illumination light is generated. For generating the illumination light, an electric current may be applied to the active area.
The term “top surface” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a
special or customized meaning. The term specifically may refer, without limitation, to an area that is located at the uppermost layer of an element. The top surface may be the part of the element that is exposed or visible in a top view. The term “bottom surface” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an area that is located at the lowermost layer of an element. The bottom surface may be the part of the element that is exposed or visible in a bottom view.
Typically, a Vertical-Cavity Surface-Emitting Laser comprises a plurality of layers. The top surface of the Vertical-Cavity Surface-Emitting Laser is on a different side of the Vertical-Cavity Surface-Emitting Laser than the bottom surface of the Vertical-Cavity Surface-Emitting Laser. The top surface and the bottom surface may be counterparts.
The optoelectronic apparatus comprises at least one supporting member; wherein the first Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member; and wherein the second Vertical-Cavity Surface-Emitting Laser is arranged with the bottom surface on the supporting member. The first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be arranged on the same supporting member.
The term “supporting member” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one structural element that is designed for bearing loads and/or providing support to at least one further structural element. In accordance with the present invention, the supporting member may in particular carry and/or support at least one of: the first Vertical-Cavity Surface-Emitting Laser; the second Vertical-Cavity Surface-Emitting Laser.
For the bottom surface of the first Vertical-Cavity Surface-Emitting Laser to be arranged on the supporting member, the bottom surface of the first Vertical-Cavity Surface-Emitting Laser may be secured to the supporting member. For the bottom surface of the second Vertical-Cavity Surface-Emitting Laser to be arranged on the supporting member, the bottom surface of the second Vertical-Cavity Surface-Emitting Laser may be secured to the supporting member.
The term “securing”, or any grammatical variation thereof, as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to stabilizing and/or fixing a first element to a second element. The stabilized and/or fixated first element may be in a predetermined relative position in relation
to the second element in a manner that the first element and the second element hold the relative position and can withstand at least one force and/or at least one stress acting on at least one of the first element and the second element.
Arranging and/or securing at least one of:
- the first Vertical-Cavity Surface-Emitting Laser; and
- the second Vertical-Cavity Surface-Emitting Laser; to the supporting member may be performed by at least one of:
- a soldering process;
- a glueing process; and
- a diffusion bonding process.
The supporting member may be at least one of:
- a heat sink;
- an electrical connector, specifically a printed circuit board; and
- a stiffening element.
The term “heat sink” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary heat exchanger configured for dissipating heat generated by at least one component. Typically, the heat sink may be a passive heat exchanger. The heat sink may transfer the heat from the components to a fluid medium, typically air. The heat sink may dissipate the heat from one or more of the first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser.
The term “electrical connector” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary electronic device configured for creating at least one electrical connection between a plurality of parts of an electrical circuit or between different electrical circuits.
The term “stiffening element” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary element configured for reinforcing, particularly for reinforcing the supporting member.
The term “printed circuit board” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an element configured for connecting a plurality of electronic components to one another. Typically, the printed circuit board comprises a laminated sandwich structure of conductive and insulating layers. The electric components may be secured to conductive pads on an outer layer of the printed circuit board in a manner that the electric components are electrically connected to the conductive pads. Particularly therefore, the printed circuit board may comprise at least one conductive pad, wherein the electrical wire may be electrically connected to the conductive pad.
The optoelectronic apparatus further may comprise at least one optical lens. The first Vertical-Cavity Surface-Emitting Laser and/or the second Vertical-Cavity Surface-Emitting Laser may be configured for emitting illumination light through the optical lens. A first distance between the first Vertical-Cavity Surface-Emitting Laser, specifically the active area of the first Vertical-Cavity Surface-Emitting Laser, and the at least one optical lens may differ from a second distance between the second Vertical-Cavity Surface-Emitting Laser, specifically the active area of the second Vertical-Cavity Surface-Emitting Laser, and the at least one optical lens.
The term “optical lens” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an at least partially transparent medium configured for bending and/or focusing light rays.
The optical lens may have at least one focal length, wherein the first distance or the second distance equals the focal length. Particularly thereby, the respective Vertical-Cavity Surface-Emitting Laser is in the focus of the optical lens and generates a sharp image, particularly for generating illumination light comprising an infrared light pattern. Particularly further thereby, the other Vertical-Cavity Surface-Emitting Laser may not be in the focus of the optical lens and generates a blurred image, particularly for generating illumination light comprising infrared flood light.
The term “focal length” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a degree of how strongly an optical system, specifically an optical lens, converges and/or diverges light rays. The focal length may refer to the inverse of the optical power of the optical system. Typically, the focal length of an optical system may be calculated by considering and/or evaluating the image formation of an object. The focal length f may be evaluated by using the known lens formula
1/f = 1/v + 1/u,
wherein v relates to the image distance and u relates to the object distance. The optical lens may have one or more different focal lengths.
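Purely for illustration and not forming part of the disclosed subject-matter, a short worked example of the lens formula above may be given; the focal length of 3 mm and the chip thickness of 0.15 mm are hypothetical values. Rearranging the formula gives
\[ \frac{1}{v} = \frac{1}{f} - \frac{1}{u}, \qquad u = f = 3\,\mathrm{mm} \;\Rightarrow\; v \to \infty \quad (\text{collimated, sharp pattern}), \]
\[ u = 3.15\,\mathrm{mm} \;\Rightarrow\; \frac{1}{v} = \frac{1}{3\,\mathrm{mm}} - \frac{1}{3.15\,\mathrm{mm}} \approx 0.0159\,\mathrm{mm}^{-1} \;\Rightarrow\; v \approx 63\,\mathrm{mm}. \]
An emitter whose active area lies at the focal length is thus collimated and can project a sharp pattern, whereas an emitter offset by the chip thickness is imaged at a finite distance, so that its spots are defocused at typical working distances and overlap into flood-like illumination.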
The first distance and the second distance may differ by a thickness of the first Vertical- Cavity Surface-Emitting Laser or a thickness of the second Vertical-Cavity Surface-Emitting Laser.
The term “thickness” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a distance between two opposite surfaces of an element, such as the Vertical-Cavity Surface- Emitting Laser. The thickness may be the shortest distance between the top surface and the bottom surface of a respective Vertical-Cavity Surface- Emitting Laser. The distance may be a length.
The first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be parallel. Particularly thus, a geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and a geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be parallel. The term “geometrical central axis” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an imaginary axis that runs through the center of an object, such as the Vertical-Cavity Surface-Emitting Laser. The center may be the geometrical center of the object.
The first Vertical-Cavity Surface-Emitting Laser and the second Vertical-Cavity Surface-Emitting Laser may be coaxial. Particularly thus, the geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and the geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be coaxial.
The geometrical central axis of the first Vertical-Cavity Surface-Emitting Laser and/or the geometrical central axis of the second Vertical-Cavity Surface-Emitting Laser may be parallel to a geometrical central axis of the at least one optical lens.
The first Vertical-Cavity Surface-Emitting Laser comprising the active area situated on the bottom surface may be configured for emitting illumination light at least partially in direction of a top surface of the first Vertical-Cavity Surface-Emitting Laser. By emitting illumination light at least partially in direction of the top surface of the Vertical-Cavity Surface-Emitting Laser, the illumination light may be emitted in a manner that the illumination light is directed towards the optical lens, particularly in a manner that the illumination light is at least partially propagating through the optical lens. The first Vertical-Cavity Surface-Emitting Laser and
the second Vertical-Cavity Surface-Emitting Laser may emit illumination light in the same direction.
Various sources and paths of light are to be distinguished. In the context of the present invention, a nomenclature is used which, firstly, denotes light propagating from the optical element, particularly the Vertical-Cavity Surface-Emitting Laser, to the object as “illuminating light” or “illumination light”. Secondly, light propagating from the object to the detector is denoted as “detection light”. In the process of illuminating the object with illumination light, detection light will be generated. The generated detection light may comprise at least one of illumination light reflected by the object, illumination light scattered by the object, illumination light transmitted by the object, luminescence light generated by the object, e.g. phosphorescence or fluorescence light generated by the object after optical, electrical or acoustic excitation of the object by the illumination light or the like. Thus, the detection light may directly or indirectly be generated through the illumination of the object by the illumination light.
At least one further component of the first Vertical-Cavity Surface- Emitting Laser may be at least partially transparent for the illumination light emitted by the first Vertical-Cavity Surface- Emitting Laser.
The at least one further component of the first Vertical-Cavity Surface-Emitting Laser may be a substrate of the first Vertical-Cavity Surface-Emitting Laser.
For example, the pattern illumination source and the flood illumination source may be elements of a light emitter structure. The pattern illumination source may comprise a first array of light emitters and the flood illumination source may comprise a second array of light emitters. The optoelectronic apparatus may comprise a base providing a single plane for mounting the light emitters. The optoelectronic apparatus may comprise at least one system of optical elements comprising a plurality of optical elements. The system of optical elements may be configured for focusing the emitted infrared light pattern onto a focal plane. The system of optical elements may cover the light emitter structure. The optoelectronic apparatus may further comprise at least one flood light optical element configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots, wherein the flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced.
The term “base” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a carrier on which at least one further element, in particular the light emitter structure, can be mounted. The base may comprise a plurality of cavities into which the light emitter structure
can be mounted. The base may have an arbitrary shape such as a rectangular, a circular, a hexagonal shape. The shape may refer to a side of the base oriented perpendicular to the direction in which height is measured. The base may comprise at least one semiconductor substrate. The base may be an element of the light emitter structure and/or an additional element. The base may be and/or may comprise a thermally conducting printed circuit board (PCB).
The light emitters of the light emitter structure may form a chip of light emitters, e.g. a VCSEL die, e.g. as sawn out of a wafer. Such a chip of light emitters may be mounted on the base, e.g. by using at least one heat conductive adhesive. The base may comprise at least one thermally conducting material. The base can be the bottom of the optoelectronic apparatus, e.g. of a housing of the optoelectronic apparatus. Thus, dimensions of the base may be defined by the dimension of the optics and the housing. Alternatively, the base and the housing may be separate elements. For example, the chip of light emitters may be mounted, e.g. by using at least one heat conductive adhesive, on the base, e.g. the PCB, and the housing may be applied to this combined element.
The base may comprise at least one thermally conducting material, in particular thermally conducting materials. The thermally conducting materials may be configured as a heat exchanger. The thermally conducting materials may be configured for regulating the temperature of the light emitter. The thermally conducting materials may be configured for transferring heat generated by a light emitter away from the light emitter. For example, the thermally conducting materials may comprise at least one composite material. The light emitter structure may be mounted on the thermally conducting materials.
The term “plane” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a surface of the base. The surface may be continuous. The plane may be a flat surface. The plane may be designed without curvatures and/or steps. The term “provide” a single plane as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to one or more of comprising, having, being usable as, functioning as a surface on which the light emitter structure is mountable.
The term “light emitter structure” as used herein, is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an assembly of at least four light emitters. The light emitter structure comprises a plurality of light emitters. The term “light emitter”, also abbreviated as emitter, as used
herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one arbitrary device configured for providing at least one light beam. The light beam may generate the infrared light pattern.
The term “array” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a one-dimensional array, e.g. a row, or a two-dimensional array, in particular a matrix having m rows and n columns, with m, n, independently, being positive integers. The light emitters of the light emitter structure may be arranged in a periodic pattern. The light emitters of the light emitter structure may be arranged in one or more of a grid pattern, a hexagonal pattern, a shifted hexagonal pattern or the like. A plurality of light emitters of the light emitter structure may form the first array of light emitters and a plurality of light emitters of the light emitter structure, different from the light emitters of the first array, may form the second array of light emitters. The light emitter structure may comprise two light emitter arrays, e.g. two VCSEL arrays. The arrays are located in one plane. For example, the first array and the second array of emitters are produced as a single die directly on the plane or the first array and the second array of emitters are produced separately and are installed, e.g. side by side, on the plane. However, embodiments are possible, in which even more arrays are used, e.g. configured for providing different functions.
For example, the first array and the second array may be arranged side by side on the plane, in particular adjacent. For example, the plane comprises along a direction perpendicular to an optical axis of the optoelectronic apparatus firstly the first array and subsequently the second array. Other arrangements, however, are possible.
For example, cavities of the light emitters of the first array and cavities of the light emitters of the second array may form a combined pattern, in which the cavities of light emitters of the first array and cavities of the light emitters of the second array alternate, e.g. line by line or row by row.
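Purely as an illustrative sketch, and not as part of the disclosure, the following Python snippet shows one possible way of generating such a combined, row-wise alternating layout of pattern emitters and flood emitters on a single plane; the array size and the pitch value are hypothetical:

# Hypothetical sketch: coordinates for two interleaved emitter arrays sharing
# one mounting plane, alternating row by row.
def interleaved_layout(rows=8, cols=8, pitch_um=40.0):
    pattern_emitters = []  # first array: feeds the pattern illumination source
    flood_emitters = []    # second array: feeds the flood illumination source
    for r in range(rows):
        for c in range(cols):
            xy = (c * pitch_um, r * pitch_um)
            # even rows belong to the first array, odd rows to the second array
            (pattern_emitters if r % 2 == 0 else flood_emitters).append(xy)
    return pattern_emitters, flood_emitters

pattern_xy, flood_xy = interleaved_layout()
print(len(pattern_xy), "pattern emitters,", len(flood_xy), "flood emitters")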
The light emitters of the pattern illumination source and of the flood illumination source may be active at different points in time.
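As a minimal, hypothetical sketch of such time-multiplexed operation, the two sources could be activated alternately and each exposure of the image generation unit could be synchronized with the active source; the driver and camera objects below are placeholders and do not represent an actual interface of the disclosure:

import time

class DummyDriver:
    # placeholder for a hypothetical illumination driver
    def enable(self, source):
        print("enable", source)
    def disable(self, source):
        print("disable", source)

class DummyCamera:
    # placeholder for a hypothetical image generation unit
    def expose(self, seconds):
        time.sleep(seconds)
        return [[0]]  # dummy frame

def capture_frame_pair(driver, camera, exposure_s=0.01):
    # pattern and flood sources are active at different points in time,
    # each synchronized with one exposure of the image generation unit
    driver.enable("pattern")
    pattern_image = camera.expose(exposure_s)
    driver.disable("pattern")
    driver.enable("flood")
    flood_image = camera.expose(exposure_s)
    driver.disable("flood")
    return pattern_image, flood_image

pattern_img, flood_img = capture_frame_pair(DummyDriver(), DummyCamera())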
The term "system" as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrary set of interacting or interdependent component parts forming a whole. Specifically, the components may interact with each other in order to fulfill at least one common function. The at least two components may be handled independently or may be
coupled or connectable. The term “system of optical elements” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a system comprising at least two optical elements. The system of optical elements may comprise one or more of at least one refractive lens, a plurality of refractive lenses; at least one diffractive optical element (DOE), a plurality of DOEs, a plurality of meta lenses. For example, the system of optical elements may comprise at least one refractive lens and at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the light emitters of the pattern illumination source. The system of optical elements may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element. The DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam. For example, a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots. Further arrangements, particularly comprising a different number of projecting VCSEL and/or at least one different optical element configured for increasing the number of spots may be possible. Other multiplication factors are possible. For example, a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
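For illustration only, the relation between the number of emitter spots and the multiplication factor of a DOE or metasurface element can be expressed as a one-line calculation; the example values follow the figures mentioned above (2000 emitter spots, duplication) and the upper bound of 4000 spots used elsewhere in this disclosure:

# Hypothetical sketch: total projected spots = emitter spots x multiplication
# factor of the diffractive or metasurface element.
def total_spots(emitter_spots, multiplication_factor):
    return emitter_spots * multiplication_factor

print(total_spots(2000, 2))  # 4000 spots, i.e. at the stated upper bound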
The system of optical elements covers the light emitter structure. The term "covers the light emitter structure" as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to completely or at least partially covering the light emitting structure, e.g. at least the first array of light emitters. The system of optical elements may be designed and/or arranged such that it covers the light emitter structure. For example, the first array may be covered by the system of optical elements only. For example, both of the first and the second array may be covered by the system of optical elements. The light emitter structure may be located at the focal point of the system of optical elements. Such an arrangement may allow that the emitted light of the light emitters, in particular of the light emitters of the pattern illumination source, is collimated.
The term "flood light optical element" as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an optical element assigned to the flood illumination source and configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots. The flood light source in combination with the flood light optical element is configured for generating flood light, in particular diffuse illumination. The flood light optical element may comprise at least one element selected from the group consisting
of: at least one plate with a refractive index larger than 1.4, e.g. a glass plate, at least one diffusor plate, at least one lens, at least one micro lens, at least one prism, at least one Fresnel lens, at least one diffractive optical element (DOE), at least one meta lens. The second array may be completely covered by the flood light optical element. For example, the optoelectronic apparatus comprises a single flood light optical element covering all light emitters of the second array. For example, the optoelectronic apparatus comprises a plurality of flood light optical elements. For example, each light emitter of the second array comprises at least one assigned flood light optical element. As outlined above, the second array may be located at the focal point of the system of optical elements. Thus, the light generated by the second array would also be in focus. However, the optical imaging is changed by the additional flood light optical element such that the cavities are not collimated properly. This can allow generating a diffuse flood illumination.
The flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced. The term "uninfluenced" as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to the fact that the flood light optical element is arranged and/or designed such that the light generated by the pattern illumination source does not interact with the flood light optical element. For example, the flood light optical element may be arranged and/or designed such that the pattern illumination source is omitted and/or excluded from being covered by the flood light optical element. In particular, the pattern illumination source is not covered by the flood light optical element.
The term “image generation unit” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one unit of the optoelectronic apparatus configured for generating at least one image. The image may be generated via a hardware and/or a software interface, which may be considered as the image generation unit. The term “image generation” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to capturing and/or generating and/or determining and/or recording at least one image by using the image generation unit. The image generation may comprise imaging and/or recording the image. The image generation may comprise capturing a single image and/or a plurality of images such as a sequence of images. For generating an image via a hardware and/or a software interface, the capturing and/or generating and/or determining and/or recording of the image may be caused and/or initiated by the hardware and/or the software interface. For example, the image generation may comprise recording continuously a sequence of images such as a video or a movie. The image generation may be initiated by a user action or may automatically be initiated, e.g. once the presence of at least one object or user within a field of
view and/or within a predetermined sector of the field of view of the image generation unit is automatically detected.
The image generation unit may comprise at least one optical sensor, in particular at least one pixelated optical sensor. The image generation unit may comprise at least one CMOS sensor or at least one CCD chip. For example, the image generation unit may comprise at least one CMOS sensor, which may be sensitive in the infrared spectral range. The term “image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to data recorded by using the optical sensor, such as a plurality of electronic readings from the CMOS or CCD chip. The image may comprise raw image data or may be a pre-processed image. For example, the pre-processing may comprise applying at least one filter to the raw image data and/or at least one background correction and/or at least one background subtraction.
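A minimal, purely illustrative Python sketch of such pre-processing is given below; the dark-frame background and the 3x3 box filter are assumptions and are not prescribed by the disclosure:

import numpy as np

# Hypothetical pre-processing sketch: background subtraction followed by a
# simple smoothing filter applied to the raw readings of the optical sensor.
def preprocess(raw, background):
    corrected = np.clip(raw.astype(np.float32) - background, 0, None)
    kernel = np.ones((3, 3), dtype=np.float32) / 9.0  # assumed 3x3 box filter
    padded = np.pad(corrected, 1, mode="edge")
    filtered = np.zeros_like(corrected)
    for dy in range(3):
        for dx in range(3):
            filtered += kernel[dy, dx] * padded[dy:dy + corrected.shape[0],
                                                dx:dx + corrected.shape[1]]
    return filtered

raw = np.random.randint(0, 1024, (8, 8)).astype(np.float32)
background = np.full((8, 8), 12.0)
print(preprocess(raw, background).shape)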
For example, the image generation unit may comprise one or more of at least one monochrome camera e.g. comprising monochrome pixels, at least one color (e.g. RGB) camera e.g. comprising color pixels, at least one IR camera. The camera may be a CMOS camera. The camera may comprise at least one monochrome camera chip, e.g. a CMOS chip. The camera may comprise at least one color camera chip, e.g. an RGB CMOS chip. The camera may comprise at least one IR camera chip, e.g. an IR CMOS chip. For example, the camera may comprise monochrome, e.g. black and white, pixels and color pixels. The color pixels and the monochrome pixels may be combined internally in the camera. The camera generally may comprise a one-dimensional or two-dimensional array of image sensors, such as pixels.
As outlined above, the image generation unit may be at least one camera. For example, the camera may be an internal and/or external camera of a device comprising the optoelectronic apparatus. As described above, the internal and/or external camera of the device may be accessed via a hardware and/or a software interface comprised by the optoelectronic apparatus, which is used as the image generation unit. In case the device is or comprises a smartphone, the image generation unit may be a front camera, such as a selfie camera, and/or a back camera of the smartphone.
The image generation unit may have a field of view between 10°x10° and 75°x75°, preferably 55°x65°. For example, the field of view is between 20°x20° and 65°x65°, more preferably 30°x30° and 60°x60°, most preferably 55°x65°. The image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
The image generation unit may comprise further elements, such as one or more optical elements, e.g. one or more lenses. As an example, the optical sensor may be a fixed-focus camera, having at least one lens which is fixedly adjusted with respect to the camera. Alternatively, however, the camera may also comprise one or more variable lenses which may be adjusted, automatically or manually. The camera may comprise at least one optical filter, e.g.
at least one bandpass filter. The bandpass filter may be matched to the spectrum of the light emitters. Other cameras, however, are feasible.
The term “pattern image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an image generated by the image generation unit while illuminating with the infrared light pattern, e.g. on an object and/or a user. The pattern image may comprise an image showing a user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern, particularly on a respective area of interest comprised by the image. The pattern image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the infrared light pattern. The pattern image showing the user may comprise at least a portion of the illuminated infrared light pattern on at least a portion of the user. For example, the illumination by the pattern illumination source and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus.
The term “flood image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an image generated by the image generation unit while the flood illumination source is emitting infrared flood light, e.g. on an object and/or a user. The flood image may comprise an image showing a user, in particular the face of the user, while the user is being illuminated with the flood light. The flood image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the flood light. The flood image showing the user may comprise at least a portion of the flood light on at least a portion of the user. For example, the illumination by the flood illumination source and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus.
The image generation unit may be configured for imaging and/or recording the pattern image and the flood image at the same time or at different times. The image generation unit may be configured for imaging and/or recording the pattern image and the flood image at at least partially overlapping measurement areas or equivalents of the measurement areas.
The optoelectronic apparatus may be comprised in a device. In particular, the optoelectronic apparatus is part of the device. The device may comprise at least one display, wherein the infrared light pattern traverses the display while being emitted from the pattern illumination source and/or the infrared flood light traverses the display while being emitted from the flood illumination source.
The term “display” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an arbitrarily shaped device
configured for displaying an item of information. The item of information may be arbitrary information such as at least one image, at least one diagram, at least one histogram, at least one graphic, text, numbers, at least one sign, an operating menu, and the like. The display may be or may comprise at least one screen. The display may have an arbitrary shape, e.g. a rectangular shape. The display may be a front display of the device.
The display may be or may comprise at least one organic light-emitting diode (OLED) display. As used herein, the term “organic light emitting diode” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a light-emitting diode (LED) in which an emissive electroluminescent layer is a film of organic compound configured for emitting light in response to an electric current. The OLED display may be configured for emitting visible light.
The display, particularly a display area, may be made of and/or may be covered by glass. In particular, the display may comprise at least one glass cover.
The display may be at least partially transparent. The term “at least partially transparent” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a property of the display to allow light, in particular of a certain wavelength range, e.g. in the infrared spectral region, in particular in the near infrared spectral region, to pass at least partially through. For example, the display may be semitransparent in the near infrared region. For example, the display may have a transparency of 20 % to 50 % in the near infrared region. The display may have a different transparency for other wavelength ranges. The present invention may propose an optoelectronic apparatus comprising the image generation unit and two illumination sources that can be placed behind the display of a device. The transparent area(s) of the display can allow for operation of the optoelectronic apparatus behind the display. The display can be an at least partially transparent display, as described above. The display may have a reduced pixel density and/or a reduced pixel size and/or may comprise at least one transparent conducting path. The transparent area(s) of the display may have a pixel density of 300-440 PPI (pixels per inch), more preferably 350 to 450 PPI. Other areas of the display, e.g. non-transparent areas, may have pixel densities higher than 400 PPI, e.g. a pixel density of 450-500 PPI.
The display comprises a display area. The term “display area” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to an active area of the display, in particular an area which is activatable. The display may have additional areas such as recesses or cutouts. The display may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas. At least one of the continuous areas may cover the image generation unit and/or the pattern illumination source and/or the flood illumination source at least partially. The pattern illumination source, the
flood illumination source and the image generation unit may be placed in direction of propagation of the infrared light pattern in front of the display. As outlined above, the pattern illumination source and the flood illumination source may be combined into one module. This may allow reducing the transparent area(s) of the display.
The display may have a first area associated with a first pixel density (Pixels per inch (PPI)) value and a second area associated with a second pixel density value. The first pixel density value may be lower than the second pixel density value. The first pixel density value may be equal to or below 450 PPI, preferably from 300 to 440 PPI, more preferably 350 to 450 PPI. The second pixel density value may be 400 to 500 PPI, preferably 450 to 500 PPI. The first pixel density value may be associated with the at least one continuous area being at least partially transparent. In an embodiment, the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value is below 350 pixels per inch and the second pixel density value is equal to or above 400 pixels per inch.
As outlined above, known displays fulfilling these requirements are less appealing to users and the transparent area is desired to be decreased. Moreover, usually, covering the image generation unit with such a display can result in a low contrast because of the light transmission of 15 % to 50 % in transparent regions. The contrast may be defined as a difference between signal and backlight. In this case, the contrast can be considered to be reduced by the transmission. However, the contrast may be defined as a ratio of signal to background. In this case, the contrast may not be reduced by the transmission only, but by the diffraction in the spot pattern projection as the higher orders bring additional intensity besides the zero-th order. Diffraction may further take place due to the structure of a display wiring leading to a further decrease in irradiance. Moreover, the spot pattern projected through a display can result in main spots corresponding to the pattern before traversing the display area with reduced intensity because higher order spots are generated. Both effects lead to a reduction in irradiance to about 3-5% of the initial irradiance and the presence of undesired additional spots in the pattern. In particular, this effect may take place while leaving the device (first pass through the display) and entering the device (second pass through the display). The present invention allows for providing a sufficient irradiance and, thus, contrast in the pattern image. The contrast can be increased by decreasing the number of spots projected onto the user. This can lead to an increase in irradiance of a spot and thus, to an increase in contrast in an image of the projection of the spot pattern. Moreover, reducing transparent areas in displays can be possible. The pattern illumination source and the flood illumination source used for authentication of a user can be combined into one module to reduce the transparent area in displays. Using such an optoelectronic apparatus can allow for a rearrangement of the position of a camera, e.g. away from the outermost position of the display. Such a rearrangement can allow for an optimized wiring since a further position, the position of the image generation unit, can be optimized. This can decrease the used wiring or allow for an improved battery operation.
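For illustration only, and using the transmission range stated above, the effect of the double pass through the display on irradiance can be estimated as follows; diffraction losses come on top of this purely geometric attenuation:
\[ I_{\mathrm{out}} \approx T^{2} \cdot I_{0}, \qquad T = 0.15 \ldots 0.5 \;\Rightarrow\; I_{\mathrm{out}} \approx (0.02 \ldots 0.25)\, I_{0}. \]
Together with the redistribution of energy into higher diffraction orders, this is consistent with the figure of about 3-5 % of the initial irradiance given above and illustrates why concentrating the available power into fewer, brighter spots increases the contrast of the pattern image.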
In a further aspect of the invention, a use of an optoelectronic apparatus according to the present invention for authenticating a user of a device comprising the apparatus is disclosed.
In a further aspect of the invention, a device for authenticating a user of a device to perform at least one operation on the device that requires authentication is disclosed.
The device comprising: at least one flood illumination source configured for emitting infrared flood light; at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal to 4000 spots, wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting the infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light; at least one display covering the flood illumination source, the pattern illumination source and the image generation unit at least partially; and at least one authentication unit configured for performing at least one authentication process of a user using the flood image and the pattern image.
The device, in particular, may comprise at least one optoelectronic apparatus according to the present invention. Thus, for details, options and definitions, reference may be made to the device and the optoelectronic apparatus as discussed above or as described in further detail below.
The device may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
The term “authenticating” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to verifying an identity of a user. Specifically, the authentication may comprise distinguishing the user from other humans or objects, in particular distinguishing authorized access from non-authorized accesses. The authentication may comprise verifying the identity of a respective user and/or assigning an identity to a user. The authentication may comprise generating and/or providing identity information, e.g. to other devices or units such as to at least one authorization unit for authorization for providing access to the device. The identity information may be proven by the authentication. For example, the identity information may be and/or may comprise at least one identity token. In case of successful authentication an image of a face recorded by the image
generation unit may be verified to be an image of the user’s face and/or the identity of the user is verified. The authenticating may be performed using at least one authentication process. The authentication process may comprise a plurality of steps such as at least one face detection on the flood image and at least one identification step in which an identity is assigned to the detected face and/or at least one identity check and/or verifying an identity of the user is performed.
The term “authentication unit” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to at least one unit configured for performing at least one authentication process of a user. The authentication unit may be or may comprise at least one processor. The processor may be an arbitrary logic circuitry configured for performing basic operations of a computer or system, and/or, generally, to a device which is configured for performing calculations or logic operations. In particular, the processor may be configured for processing basic instructions that drive the computer or system. As an example, the processor may comprise at least one arithmetic logic unit (ALU), at least one floating-point unit (FPU), such as a math co-processor or a numeric co-processor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory. In particular, the processor may be a multi-core processor. Specifically, the processor may be or may comprise a central processing unit (CPU). Additionally or alternatively, the processor may be or may comprise a microprocessor, thus specifically the processor’s elements may be contained in one single integrated circuitry (IC) chip. Additionally or alternatively, the processor may be or may comprise one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs) and/or one or more tensor processing unit (TPU) and/or one or more chip, such as a dedicated machine learning optimized chip, or the like. The processor specifically may be configured, such as by software programming, for performing one or more evaluation operations. At least one or any component of a computer program configured for performing the authentication process may be executed by the processing device. Alternatively or in addition, the authentication unit may be or may comprise a connection interface. The connection interface may be configured to transfer data from the device to a remote device; or vice versa. At least one or any component of a computer program configured for performing the authentication process may be executed by the remote device.
For example, the authentication unit may perform at least one face detection using the flood image. The face detection may be performed locally on the device. Face identification, i.e. assigning an identity to the detected face, however, may be performed remotely, e.g. in the cloud, especially when identification needs to be done and not only verification. User templates can be stored at the remote device, e.g. in the cloud, and would not need to be stored locally. This can be an advantage in view of storage space and security.
The authentication unit may be configured for identifying the user based on the flood image.
Particularly therefore, the authentication unit may forward data to a remote device. Alternatively
or in addition, the authentication unit may perform the identification of the user based on the flood image, particularly by running an appropriate computer program having a respective functionality. The term “identifying” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user.
The authentication process may comprise a plurality of steps. For example, the authentication process may comprise performing at least one face detection. The face detection step may comprise analyzing the flood image. In addition, for example, the authentication process may comprise identifying. The identifying may comprise assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user. The identifying may comprise performing a face verification of the imaged face to be the user’s face. The identifying the user may comprise matching the flood image, e.g. showing a contour of parts of the user, in particular parts of the user’s face, with a template. The identifying of the user may comprise determining if the imaged face is the face of the user, in particular if the imaged face corresponds to at least one image of the user’s face stored in at least one memory, e.g. of the device.
The analyzing of the flood image may comprise one or more of the following: a filtering; a selection of at least one region of interest; a formation of a difference image between the flood image and at least one offset; an inversion of the flood image; a background correction; a decomposition into color channels; a decomposition into hue, saturation and brightness channels; a frequency decomposition; a singular value decomposition; applying a Canny edge detector; applying a Laplacian of Gaussian filter; applying a Difference of Gaussian filter; applying a Sobel operator; applying a Laplace operator; applying a Scharr operator; applying a Prewitt operator; applying a Roberts operator; applying a Kirsch operator; applying a high-pass filter; applying a low-pass filter; applying a Fourier transformation; applying a Radon-transformation; applying a Hough-transformation; applying a wavelet-transformation; a thresholding; creating a binary image. The region of interest may be determined manually by a user or may be determined automatically, such as by recognizing the user within the image. In particular, the analyzing of the flood image may comprise using at least one image recognition technique, in particular a face recognition technique. An image recognition technique comprises at least one process of identifying the user in an image. The image recognition may comprise using at least one technique selected from the group consisting of: color-based image recognition, e.g. using features such as template matching; segmentation and/or blob analysis, e.g. using size or shape; machine learning and/or deep learning, e.g. using at least one convolutional neural network.
The analyzing of the flood image may comprise determining a plurality of facial features. The analyzing may comprise comparing, in particular matching, the determined facial features with template features. The template features may be features extracted from at least one template.
The template may be or may comprise at least one image generated in an enrollment process, e.g. when initializing the device. The template may be an image of an authorized user. The template features and/or the facial features may comprise a vector. Matching of the features may comprise determining a distance between the vectors. The identifying of the user may comprise comparing the distance of the vectors to at least one predefined limit, wherein the user is successfully identified in case the distance is below the predefined limit, at least within tolerances. The user may be declined and/or rejected otherwise.
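As a minimal sketch of such a comparison, assuming a Euclidean distance and a hypothetical predefined limit (the concrete metric, dimensionality and threshold are not specified here):

import math

# Hypothetical sketch: identify a user by comparing a facial feature vector
# extracted from the flood image with a template feature vector from enrollment.
def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(features, template_features, limit=0.8):
    # successfully identified if the distance stays below the predefined limit
    distance = euclidean_distance(features, template_features)
    return distance < limit

template = [0.12, 0.55, 0.31, 0.88]   # assumed enrollment embedding
probe = [0.10, 0.57, 0.30, 0.85]      # assumed embedding from the flood image
print("identified" if identify(probe, template) else "declined")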
For example, the image recognition may comprise using at least one model, in particular a trained model comprising at least one face recognition model. The analyzing of the flood image may be performed by using a face recognition system, such as FaceNet, e.g. as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832. The trained model may comprise at least one convolutional neural network. For example, the convolutional neural network may be designed as described in M. D. Zeiler and R. Fergus, “Visualizing and understanding convolutional networks”, CoRR, abs/1311.2901, 2013, or C. Szegedy et al., “Going deeper with convolutions”, CoRR, abs/1409.4842, 2014. For more details with respect to the convolutional neural network for the face recognition system, reference is made to Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832. As training data, labelled image data from an image database may be used. Specifically, labeled faces may be used from one or more of G. B. Huang, M. Ramesh, T. Berg, and E. Learned-Miller, “Labeled faces in the wild: A database for studying face recognition in unconstrained environments”, Technical Report 07-49, University of Massachusetts, Amherst, October 2007, the Youtube® Faces Database as described in L. Wolf, T. Hassner, and I. Maoz, “Face recognition in unconstrained videos with matched background similarity”, in IEEE Conf. on CVPR, 2011, or the Google® Facial Expression Comparison dataset. The training of the convolutional neural network may be performed as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832.
The authentication unit may be further configured for determining material data based on the pattern image. Particularly therefore, the authentication unit may forward data to a remote device. Alternatively or in addition, the authentication unit may perform the material determination based on the pattern image, particularly by running an appropriate computer program having a respective functionality. Particularly by considering the material as a parameter for validating the authentication process, the authentication process may be robust against being outwitted by using a recorded image of the user.
The authentication unit may be configured for extracting the material data from the pattern image by beam profile analysis of the light spots. With respect to beam profile analysis, reference is made to WO 2018/091649 A1, WO 2018/091638 A1 and WO 2018/091640 A1, the full content of which is included by reference. Beam profile analysis can allow for providing a reliable classification of scenes based on a few light spots. Each of the light spots of the pattern
image may comprise a beam profile. As used herein, the term “beam profile” may generally refer to at least one intensity distribution of the light spot on the optical sensor as a function of the pixel. The beam profile may be selected from the group consisting of a trapezoid beam profile; a triangle beam profile; a conical beam profile and a linear combination of Gaussian beam profiles.
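The actual beam profile analysis is described in the documents referenced above; purely as a hypothetical stand-in, the sketch below computes a simple centre-to-total intensity ratio for a light spot cropped from the pattern image, which could serve as one possible beam-profile feature for a subsequent material classification:

import numpy as np

# Hypothetical sketch: derive a crude beam-profile feature (central intensity
# divided by total intensity) for a light spot cropped from the pattern image.
def beam_profile_feature(spot, inner_radius=2):
    h, w = spot.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    center_sum = spot[r <= inner_radius].sum()
    total_sum = spot.sum() + 1e-9
    return float(center_sum / total_sum)

grid_y, grid_x = np.mgrid[0:9, 0:9]
spot = np.exp(-((grid_y - 4) ** 2 + (grid_x - 4) ** 2) / 4.0)  # synthetic Gaussian spot
print(round(beam_profile_feature(spot), 3))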
The authentication unit may be configured for outsourcing at least one step of the authentication process, such as the identifying of the user, and/or at least one step of the validation of the authentication process, such as the consideration of the material data, to a remote device, specifically a server and/or a cloud server. The device and the remote device may be part of a computer network, particularly the internet. Thereby, the device may be used as a field device that is used by the user for generating data required in the authentication process and/or its validation. The device may transmit the generated data and/or data associated to an intermediate step of the authentication process and/or its validation to the remote device. In such a scenario, the authentication unit may be and/or may comprise a connection interface configured for transmitting information to the remote device. Data generated by the remote device used in the authentication process and/or its validation may further be transmitted to the device. This data may be received by the connection interface comprised by the device. The connection interface may specifically be configured for transmitting or exchanging information. In particular, the connection interface may provide a data transfer connection. As an example, the connection interface may be or may comprise at least one port comprising one or more of a network or internet port, a USB-port, and a disk drive.
It is emphasized that data from the device may be transmitted to a specific remote device depending on at least one circumstance, such as a date, a day, a load of the specific remote device, and so on. The specific remote device may not be selected by the field device. Rather, a further device may select to which specific remote device the data may be transmitted. The authentication process and/or the generation of validation data may involve a use of several different entities of the remote device. At least one entity may generate intermediate data and transmit the intermediate data to at least one further entity.
The authentication unit is configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data. The authentication unit may be configured for extracting material data from the pattern image.
In an embodiment, extracting material data from the pattern image may comprise generating the material type and/or data derived from the material type. Preferably, extracting material data may be based on the pattern image. Material data may be extracted by using at least one model. Extracting material data may include providing the pattern image to a model and/or receiving material data from the model. Providing the image to a model may comprise, and may be followed by, receiving the pattern image at an input layer of the model or via a model loss function. The model may be a data-driven model. The data-driven model may comprise a convolutional neural network and/or an encoder-decoder structure such as an autoencoder. Other examples for generating a representation may be FFT, wavelets, deep learning, like CNNs, energy models, normalizing flows, GANs, vision transformers, or transformers used for natural language processing, Autoregressive Image Modeling, Deep Autoencoders, Deep Energy-Based Models. Supervised or unsupervised schemes may be applicable to generate a representation, also referred to as an embedding in, e.g., a cosine or Euclidean metric in machine learning language. The data-driven model may be parametrized according to a training data set including at least one image and material data, preferably at least one pattern image and material data. In another embodiment, extracting material data may include providing the image to a model and/or receiving material data from the model. In another embodiment, the data-driven model may be trained according to a training data set including at least one image and material data. In another embodiment, the data-driven model may be parametrized according to a training data set including at least one image and material data. The data-driven model may be parametrized according to a training data set to receive the image and provide material data based on the received image. The data-driven model may be trained according to a training data set to receive the image and provide material data as output based on the received image. The training data set may comprise at least one image and material data, preferably material data associated with the at least one image. The image may comprise a representation of the image. The representation may be a lower dimensional representation of the image. The representation may comprise at least a part of the data or the information associated with the image. The representation of an image may comprise a feature vector. In an embodiment, determining a representation, in particular a lower-dimensional representation, may be based on principal component analysis (PCA) mapping or radial basis function (RBF) mapping. Determining a representation may also be referred to as generating a representation. Generating a representation based on PCA mapping may include clustering based on features in the pattern image and/or partial image. Additionally or alternatively, generating a representation may be based on neural network structures suitable for reducing dimensionality. Neural network structures suitable for reducing dimensionality may comprise an encoder and/or a decoder. In an example, the neural network structure may be an autoencoder. In an example, the neural network structure may comprise a convolutional neural network (CNN). The CNN may comprise at least one convolutional layer and/or at least one pooling layer. CNNs may reduce the dimensionality of a partial image and/or an image by applying a convolution, e.g.
based on a convolutional layer, and/or by pooling. Applying a convolution may be suitable for selecting features related to material information of the pattern image.
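The following PyTorch sketch is given purely for illustration of the kind of convolutional encoder and classification head described above; the layer sizes, the embedding dimension and the two material classes are assumptions and do not represent the claimed model:

import torch
import torch.nn as nn

# Hypothetical sketch: a small CNN encoder reduces the pattern image to an
# embedding (feature vector); a linear head maps the embedding to material
# classes (e.g. skin vs. non-skin). All sizes and classes are assumptions.
class MaterialNet(nn.Module):
    def __init__(self, embedding_dim=32, num_materials=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # pooling reduces dimensionality
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, embedding_dim),
        )
        self.head = nn.Linear(embedding_dim, num_materials)

    def forward(self, pattern_image):
        embedding = self.encoder(pattern_image)   # lower-dimensional representation
        return self.head(embedding), embedding

model = MaterialNet()
dummy_pattern_image = torch.rand(1, 1, 64, 64)    # placeholder pattern image
logits, embedding = model(dummy_pattern_image)
print(logits.shape, embedding.shape)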
In an embodiment, a model may be suitable for determining an output based on an input. In particular, the model may be suitable for determining material data based on an image as input. A model may be a deterministic model, a data-driven model or a hybrid model. The deterministic model, preferably, reflects physical phenomena in mathematical form, e.g., including first-principles models. A deterministic model may comprise a set of equations that describe an interaction between the material and the patterned electromagnetic radiation thereby resulting in a condition measure, a vital sign measure or the like. A data-driven model may be a classification model. A hybrid model may be a classification model comprising at least one machine-learning architecture with deterministic or statistical adaptations and model parameters. Statistical or deterministic adaptations may be introduced to improve the quality of the results since those provide a systematic relation between empiricism and theory. In an embodiment, the data-driven model may be a classification model. The classification model may comprise at least one machine-learning architecture and model parameters. For example, the machine-learning architecture may be or may comprise one or more of: linear regression, logistic regression, random forest, piecewise linear, nonlinear classifiers, support vector machines, naive Bayes classifications, nearest neighbors, neural networks, convolutional neural networks, generative adversarial networks, or gradient boosting algorithms or the like. In the case of a neural network, the model can be a multi-scale neural network or a recurrent neural network (RNN) such as, but not limited to, a gated recurrent unit (GRU) recurrent neural network or a long short-term memory (LSTM) recurrent neural network. The data-driven model may be parametrized according to a training data set. The data-driven model may be trained based on the training data set. Training the model may include parametrizing the model. The term training may also be denoted as learning. The term specifically may refer, without limitation, to a process of building the classification model, in particular determining and/or updating parameters of the classification model. Updating parameters of the classification model may also be referred to as retraining. Retraining may be included when referring to training herein. In an embodiment, the training data set may include at least one image and material information.
In an embodiment, extracting material data from the image with a data-driven model may comprise providing the image to a data-driven model. Additionally or alternatively, extracting material data from the image with a data-driven model may comprise generating an embedding associated with the image based on the data-driven model. An embedding may refer to a lower-dimensional representation associated with the image, such as a feature vector. The feature vector may be suitable for suppressing the background while maintaining the material signature indicating the material data. In this context, background may refer to information independent of the material signature and/or the material data. Further, background may refer to information related to biometric features such as facial features. Material data may be determined with the data-driven model based on the embedding associated with the image. Additionally or alternatively, extracting material data from the image by providing the image to a data-driven model may comprise transforming the image into material data, in particular a material feature vector indicating the material data. Hence, material data may further comprise the material feature vector and/or the material feature vector may be used for determining material data.
In an embodiment, the authentication process may be validated based on the extracted material data.
In an embodiment, the validating based on the extracted material data may comprise determining if the extracted material data corresponds to desired material data. Determining if extracted material data matches the desired material data may be referred to as validating. Allowing or declining the user and/or object to perform at least one operation on the device that requires authentication based on the material data may comprise validating the authentication or authentication process. Validating may be based on material data and/or image. Determining if the extracted material data corresponds to desired material data may comprise determining a similarity of the extracted material data and the desired material data. Determining a similarity of the extracted material data and the desired material data may comprise comparing the extracted material data with the desired material data. Desired material data may refer to predetermined material data. In an example, desired material data may be skin. It may be determined if material data corresponds to the desired material data. In the example, material data may be non-skin material or silicon. Determining if material data corresponds to a desired material data may comprise comparing material data with desired material data. A comparison of material data with desired material data may result in allowing and/or declining the user and/or object to perform at least one operation that requires authentication. In the example, skin as desired material data may be compared with non-skin material or silicon as material data and the result may be a declination since silicon or non-skin material is different from skin.
In an embodiment, the authentication process or its validation may include generating at least one feature vector from the material data and matching the material feature vector with an associated reference template vector for the material.
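As a purely illustrative sketch, matching a material feature vector against an enrolled reference template vector could, for example, be implemented as a cosine-similarity comparison against a threshold; the threshold value and the function names below are assumptions and not part of the disclosure.

```python
# Minimal sketch: matching an extracted material feature vector against an
# enrolled reference template vector via cosine similarity. The threshold value
# is an illustrative assumption.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def validate_material(feature_vector: np.ndarray,
                      reference_template: np.ndarray,
                      threshold: float = 0.8) -> bool:
    # Returns True (allow) if the vectors are sufficiently similar,
    # False (decline) otherwise.
    return cosine_similarity(feature_vector, reference_template) >= threshold
```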
The authentication unit may be configured for authenticating the user in case the user can be identified and/or if the material data matches the desired material data. The device may comprise at least one authorization unit configured for allowing the user to perform at least one operation on the device, e.g. unlocking the device, in case of successful authentication of the user, or declining the user to perform at least one operation on the device in case of unsuccessful authentication. Thereby, the user may become aware of the result of the authentication.
In a further aspect, the present invention discloses a method for authenticating a user of a device to perform at least one operation on the device that requires authentication.
The device comprising a display and the method comprising:
a. illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source of the device, wherein the number of infrared light spots is below or equal 4000 spots;
b. illuminating the user with infrared flood light from at least one flood illumination source of the device, wherein the distance between the flood illumination source and the pattern illumination source is below 3.0 mm;
c. generating at least one pattern image with an image generation unit of the device showing the user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern, and generating at least one image with an image generation unit of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit and/or the flood and pattern illumination sources are covered at least partially by the display of the device;
d. identifying the user based on the flood image by using at least one authentication unit of the device;
e. extracting material data from the at least one pattern image by using the authentication unit; and
f. allowing the user to perform at least one operation on the device that requires authentication based on the material data and the identifying.
The method steps may be performed in the given order or may be performed in a different order. Further, one or more additional method steps may be present which are not listed. Further, one, more than one or even all of the method steps may be performed repeatedly. For details, options and definitions, reference may be made to the optoelectronic apparatus and the device as discussed above. Thus, specifically, the method may comprise using the device according to the present invention, such as according to one or more of the embodiments given above or given in further detail below.
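The following is a minimal, purely illustrative sketch of the sequence of method steps a. to f.; the device interface (pattern_source, flood_source, camera, auth_unit) and all helper names are hypothetical and not defined by the disclosure.

```python
# Minimal sketch: the overall flow of method steps a. to f., with hypothetical
# helper functions standing in for the device units (illumination sources,
# image generation unit, authentication unit).

def authenticate_user(device):
    device.pattern_source.emit_pattern()            # a. illuminate with IR light pattern
    device.flood_source.emit_flood()                # b. illuminate with IR flood light
    pattern_image = device.camera.capture()         # c. generate pattern image ...
    flood_image = device.camera.capture()           #    ... and flood image through the display
    identified = device.auth_unit.identify(flood_image)            # d. identify the user
    material = device.auth_unit.extract_material(pattern_image)    # e. extract material data
    if identified and material == "skin":           # f. allow the operation if both checks pass
        return device.allow_operation()
    return device.decline_operation()
```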
The identifying of the user may comprise matching the flood image with a template.
The method may comprise using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data. The pattern image and/or the image showing the user while the user is being illuminated with the infrared flood light may be showing at least a portion of a face of the user.
All described method steps may be performed by using the device. Therefore, the single processing device may be configured to exclusively perform at least one computer program, in particular at least one line of computer program code configured to execute at least one algorithm, as used in at least one of the embodiments of the method according to the present invention. Herein, the computer program as executed on the single processing device may comprise all instructions causing the computer to carry out the method. Alternatively, or in addition, at least one method step may be performed by using at least one remote device, especially selected from at least one of a server or a cloud server, particularly when the device and the remote device may be part of a computer network. In this case, the computer program
may comprise at least one remote component to be executed by the at least one remote processing device to carry out the at least one method step. The remote component may have the functionality of performing the identifying of the user and/or the extraction of the material data. Further, the computer program may comprise at least one interface configured to forward to and/or receive data from the at least one remote component of the computer program.
The method may comprise allowing or declining the user to perform at least one operation on the device. In an embodiment, allowing or declining the user to perform at least one operation on the device that requires authentication based on the material data may include allowing the user to perform at least one operation on the device that requires authentication if the material data matches desired material data and/or the authentication is successful. Desired material data may be predetermined material data. Authentication may be successful if the user can be identified and/or if the material data matches desired material data. Further, allowing or declining the user and/or object to perform at least one operation on the device that requires authentication based on the material data may include declining the user to perform at least one operation on the device that requires authentication if the material data does not match desired material data and/or the authentication is unsuccessful. Authentication may be unsuccessful if the pattern image cannot be matched with an image template and/or if the material data does not match the desired material data.
At least one operation on the device that requires authentication may be access to the device, e.g. unlocking the device, and/or access to an application, preferably associated with the device, and/or access to a part of an application, preferably associated with the device. In an embodiment, allowing the user to access a resource may include allowing the user to perform at least one operation with a device and/or system. The resource may be a device, a system, a function of a device, a function of a system and/or an entity. Additionally and/or alternatively, allowing the user to access a resource may include allowing the user to access an entity. The entity may be a physical entity and/or a virtual entity. The virtual entity may be a database, for example. The physical entity may be an area with restricted access. The area with restricted access may be one of the following: security areas, rooms, apartments, vehicles, parts of the aforementioned examples, or the like. The device and/or system may be locked. The device and/or the system may only be unlocked by an authorized user.
The term “user” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a person intended to and/or using the device.
The method may be computer implemented. The term "computer implemented " as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a method involving at least one computer and/or at least one
computer network. The computer and/or computer network may comprise at least one processor which is configured for performing at least one of the method steps of the method according to the present invention. Specifically, each of the method steps is performed by the computer and/or computer network. The method may be performed completely automatically, specifically without user interaction.
Further disclosed and proposed herein is a computer program including computer-executable instructions for performing the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network. Specifically, the computer program may be stored on a computer-readable data carrier and/or on a computer-readable storage medium. The computer program may be executed on at least one processor comprised by the optoelectronic apparatus and/or the device. The computer program may generate input data by accessing and/or controlling at least one unit of the optoelectronic apparatus and/or the device, such as the pattern illumination source and/or the flood illumination source and/or the image generation unit. The computer program may generate outcome data based on the input data, particularly by using the authentication unit.
As used herein, the terms “computer-readable data carrier” and “computer-readable storage medium” specifically may refer to non-transitory data storage means, such as a hardware storage medium having stored thereon computer-executable instructions. The stored computer-executable instructions may be associated with the computer program. The computer-readable data carrier or storage medium specifically may be or may comprise a storage medium such as a random-access memory (RAM) and/or a read-only memory (ROM).
Thus, specifically, one, more than one or even all of method steps a. to f. as indicated above may be performed by using a computer or a computer network, preferably by using a computer program.
Further disclosed and proposed herein is a computer program product having program code means, in order to perform the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network. Specifically, the program code means may be stored on a computer-readable data carrier and/or on a computer-readable storage medium.
Further disclosed and proposed herein is a data carrier having a data structure stored thereon, which, after loading into a computer or computer network, such as into a working memory or main memory of the computer or computer network, may execute the method according to one or more of the embodiments disclosed herein.
Further disclosed and proposed herein is a computer program product with program code means stored on a machine-readable carrier, in order to perform the method according to one or more of the embodiments disclosed herein, when the program is executed on a computer or computer network. As used herein, a computer program product refers to the program as a
tradable product. The product may generally exist in an arbitrary format, such as in a paper format, or on a computer-readable data carrier and/or on a computer-readable storage medium. Specifically, the computer program product may be distributed over a data network.
Further disclosed and proposed herein is a non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to one or more of the embodiments disclosed herein.
Finally, disclosed and proposed herein is a modulated data signal which contains instructions readable by a computer system or computer network, for performing the method according to one or more of the embodiments disclosed herein.
Referring to the computer-implemented aspects of the invention, one or more of the method steps or even all of the method steps of the method according to one or more of the embodiments disclosed herein may be performed by using a computer or computer network. Thus, generally, any of the method steps including provision and/or manipulation of data may be performed by using a computer or computer network. Generally, these method steps may include any of the method steps, typically except for method steps requiring manual work, such as providing the samples and/or certain aspects of performing the actual measurements.
Specifically, further disclosed herein are:
a computer or computer network comprising at least one processor, wherein the processor is adapted to perform the method according to one of the embodiments described in this description,
a computer loadable data structure that is adapted to perform the method according to one of the embodiments described in this description while the data structure is being executed on a computer,
a computer program, wherein the computer program is adapted to perform the method according to one of the embodiments described in this description while the program is being executed on a computer,
a computer program comprising program means for performing the method according to one of the embodiments described in this description while the computer program is being executed on a computer or on a computer network,
a computer program comprising program means according to the preceding embodiment, wherein the program means are stored on a storage medium readable to a computer,
a storage medium, wherein a data structure is stored on the storage medium and wherein the data structure is adapted to perform the method according to one of the embodiments described in this description after having been loaded into a main and/or working storage of a computer or of a computer network, and
a computer program product having program code means, wherein the program code means can be stored or are stored on a storage medium, for performing the method according to one of the embodiments described in this description, if the program code means are executed on a computer or on a computer network.
As used herein, the terms “have”, “comprise” or “include” or any arbitrary grammatical variations thereof are used in a non-exclusive way. Thus, these terms may both refer to a situation in which, besides the feature introduced by these terms, no further features are present in the entity described in this context and to a situation in which one or more further features are present. As an example, the expressions “A has B”, “A comprises B” and “A includes B” may both refer to a situation in which, besides B, no other element is present in A (i.e. a situation in which A solely and exclusively consists of B) and to a situation in which, besides B, one or more further elements are present in entity A, such as element C, elements C and D or even further elements.
Further, it shall be noted that the terms “at least one”, “one or more” or similar expressions indicating that a feature or element may be present once or more than once typically are used only once when introducing the respective feature or element. In most cases, when referring to the respective feature or element, the expressions “at least one” or “one or more” are not repeated, notwithstanding the fact that the respective feature or element may be present once or more than once.
Further, as used herein, the terms "preferably", "more preferably", "particularly", "more particularly", "specifically", "more specifically" or similar terms are used in conjunction with optional features, without restricting alternative possibilities. Thus, features introduced by these terms are optional features and are not intended to restrict the scope of the claims in any way. The invention may, as the skilled person will recognize, be performed by using alternative features. Similarly, features introduced by "in an embodiment of the invention" or similar expressions are intended to be optional features, without any restriction regarding alternative embodiments of the invention, without any restrictions regarding the scope of the invention and without any restriction regarding the possibility of combining the features introduced in such way with other optional or non-optional features of the invention.
Overall, in the context of the present invention, the following embodiments are regarded as preferred:
In an embodiment, an optoelectronic apparatus is disclosed, comprising: at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, at least one flood illumination source configured for emitting infrared flood light, at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light,
wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm.
In an embodiment, the optoelectronic apparatus may be comprised in a device, wherein the device may comprise at least one display, wherein the infrared light pattern may traverse the display while being emitted from the pattern illumination source and/or the infrared flood light may traverse the display while being emitted from the flood illumination source.
In an embodiment, the relative distance between the flood illumination source and the pattern illumination source may be below 2.5 mm, preferably below 2.0 mm.
In an embodiment, the infrared light may be coherent, wherein the infrared light pattern may be a coherent infrared light pattern.
In an embodiment, the infrared light pattern may be a hexagonal pattern, preferably a 2/5 hexagonal infrared light pattern.
In an embodiment, at least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
In an embodiment, the infrared light pattern may be a near infrared light pattern.
In an embodiment, the infrared light pattern may comprise equal or less than 3000 spots, preferably equal or less than 2000 spots.
In an embodiment, the infrared light pattern may comprise at least one point pattern, wherein the infrared light pattern may have a low point density, wherein the infrared light pattern may comprise equal to or less than 2000 spots and/or above 0, preferably above 5, more preferably above 10, most preferably above 100.
In an embodiment, the image generation unit may have a field of view between 10°x10° and 75°x75°, preferably the field of view is between 20°x20° and 65°x65°, more preferably 30°x30° and 60°x60°, most preferably 55°x65°.
In an embodiment, the image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
In an embodiment, the image generation unit may comprise at least one CMOS sensor or at least one CCD chip.
In an embodiment, the pattern illumination source may comprise at least one pattern projector configured for generating the infrared light pattern.
In an embodiment, the pattern illumination source, such as the pattern projector, may comprise at least one vertical cavity surface-emitting laser (VCSEL), preferably a plurality of VCSELs.
In an embodiment, the pattern illumination source may comprise at least one optical element configured for increasing the number of spots, wherein the optical element may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element.
In an embodiment, the flood illumination source may comprise at least one VCSEL, preferably a plurality of VCSELs.
In an embodiment, the pattern illumination source may comprise a plurality of first VCSELs mounted on a first platform, wherein the flood illumination source may comprise a plurality of second VCSELs mounted on a second platform.
In an embodiment, the optoelectronic apparatus may comprise a heat sink, wherein above the heat sink a first increment comprising the first platform may be attached, wherein above the heat sink a second increment comprising the second platform may be attached.
In an embodiment, the emitting of the infrared flood light and the illumination of the infrared light pattern may be performed subsequently or at at least partially overlapping times.
In an embodiment, the optoelectronic apparatus may comprise at least one optical element with at least two different areas associated with two different deflection behaviors, wherein light emitted from the pattern illumination source and light emitted from the flood illumination source may be deflected differently by the optical element depending on the area of the optical element the light illuminates.
In an embodiment, the pattern illumination source may emit light having a first wavelength and the flood illumination source may emit light having a second wavelength different from the first wavelength, wherein the optoelectronic apparatus may comprise at least one optical element, wherein the optical element is wavelength dependent.
In an embodiment, the pattern illumination source may emit at least one light beam having a first beam width and the flood illumination source may emit at least one light beam having a second beam width different from the first beam width, wherein the optoelectronic apparatus may comprise at least one optical element, wherein the optical element may be configured for deflecting light of different beam widths differently.
In an embodiment, the optoelectronic apparatus may comprise:
(1) at least one first Vertical-Cavity Surface-Emitting Laser, wherein the first Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a bottom surface of the first Vertical-Cavity Surface-Emitting Laser;
(2) at least one second Vertical-Cavity Surface-Emitting Laser, wherein the second Vertical-Cavity Surface-Emitting Laser comprises an active area situated on a top surface of the second Vertical-Cavity Surface-Emitting Laser, wherein the top surface is opposite of a bottom surface of the second Vertical-Cavity Surface-Emitting Laser;
(3) at least one supporting member; wherein the first Vertical-Cavity Surface-Emitting Laser may be arranged with the bottom surface on the supporting member; and wherein the second Vertical-Cavity Surface-Emitting Laser may be arranged with the bottom surface on the supporting member.
In an embodiment, the optoelectronic apparatus may comprise: a light emitter structure comprising a plurality of light emitters, wherein a first array of light emitters of the plurality of light emitters forms the pattern illumination source, wherein a second array of light emitters of the plurality of light emitters, different from the light emitters of the first array, forms the flood illumination source; a base providing a single plane for mounting the light emitters; at least one system of optical elements comprising a plurality of optical elements, wherein the system of optical elements is configured for focusing the emitted infrared light pattern onto a focal plane, wherein the system of optical elements covers the light emitter structure; at least one flood light optical element configured for defocusing light emitted by the light emitters of the flood illumination source thereby forming overlapping light spots, wherein the flood light optical element is configured for leaving the emitted infrared light pattern uninfluenced.
In an embodiment, a use of an optoelectronic apparatus according to the present invention, such as disclosed in any one of the embodiments, for authenticating a user of a device comprising the apparatus is disclosed.
In an embodiment, a device for authenticating a user of a device to perform at least one operation on the device that requires authentication, the device comprising: at least one flood illumination source configured for emitting infrared flood light; at least one pattern illumination source configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, wherein a relative distance between the flood illumination source and the pattern illumination source is below 3.0 mm; at least one image generation unit configured for generating at least one pattern image while the pattern illumination source is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source is emitting infrared flood light; at least one display covering the flood illumination source, the pattern illumination source and the image generation unit at least partially,
at least one authentication unit configured for performing at least one authentication process of a user using the flood image and the pattern image.
In an embodiment, the display may be or may comprise at least one organic light-emitting diode (OLED) display.
In an embodiment, the display may be at least partially transparent.
In an embodiment, the display may comprise a display area.
In an embodiment, the display may be made of and/or may be covered by glass.
In an embodiment, the display of the device may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas, wherein at least one of the continuous areas may cover the image generation unit and/or the pattern illumination source and/or the flood illumination source at least partially.
In an embodiment, the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value may be lower than the second pixel density value, preferably the first pixel density value is equal to or below 450 PPI.
In an embodiment, the display may have a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value may be below 350 pixels per inch and the second pixel density value may be equal to or above 400 pixels per inch.
In an embodiment, the first pixel density value may be associated with the at least one continuous area being at least partially transparent.
In an embodiment, the first pixel density value may be below 350 pixels per inch and the second pixel density value may be equal to or above 400 pixels per inch.
In an embodiment, the device may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
In an embodiment, the authentication unit may be configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
In an embodiment, the device may comprise at least one optoelectronic device according to the present invention, such as according to any one of the preceding embodiments referring to an optoelectronic device.
In an embodiment, a method for authenticating a user of a device to perform at least one operation on the device that requires authentication is disclosed, the device comprising a display and the method comprising:
a. illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source of the device, wherein the number of infrared light spots is below or equal 4000 spots;
b. illuminating the user with infrared flood light from at least one flood illumination source of the device, wherein the distance between the flood illumination source and the pattern illumination source is below 3.0 mm;
c. generating at least one pattern image with an image generation unit of the device showing the user while the user is being illuminated with the infrared light pattern, and generating at least one image with an image generation unit of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit and/or the flood and pattern illumination sources are covered at least partially by the display of the device;
d. identifying the user based on the flood image by using at least one authentication unit of the device;
e. extracting material data from the at least one pattern image by using the authentication unit; and
f. allowing the user to perform at least one operation on the device that requires authentication based on the material data and the identifying.
In an embodiment, the method may comprise using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data.
In an embodiment, the identifying of the user may comprise matching the flood image with a template.
In an embodiment, the method is computer-implemented.
In an embodiment, a computer program is disclosed, comprising instructions which, when the program is executed by the device according to any one of the preceding embodiments referring to a device, cause the device to perform the method according to any one of the preceding embodiments referring to a method.
In an embodiment, a computer-readable storage medium is disclosed, comprising instructions which, when the instructions are executed by the device according to any one of the preceding embodiments referring to a device, cause the device to perform the method according to any one of the preceding embodiments referring to a method.
In an embodiment, a non-transient computer-readable medium is disclosed, including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to any one of the preceding embodiments referring to a method.
Brief description of the figures
Further optional details and features of the invention are evident from the description of preferred exemplary embodiments which follows in conjunction with the dependent claims. In this context, the particular features may be implemented in an isolated fashion or in combination with other features. The invention is not restricted to the exemplary embodiments. The exemplary embodiments are shown schematically in the figures. Identical reference numerals in the individual figures refer to identical elements or elements with identical function, or elements which correspond to one another with regard to their functions.
Specifically, in the figures:
Figure 1 shows an embodiment of a device according to the present invention;
Figure 2 shows an embodiment of a method according to the present invention;
Figures 3A and 3B show embodiments of an optical element having deflection properties;
Figure 4 shows a schematic of an exemplary optical element in a side view;
Figures 5A and 5B show an embodiment of the optoelectronic apparatus; and
Figures 6A and 6B show an embodiment of the optoelectronic apparatus.
Detailed description of the embodiments:
Figure 1 shows an embodiment of a device 110 of the present invention in a highly schematic fashion. For example, the device 110 may be selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
In this embodiment, the device 110 comprises an optoelectronic apparatus 112 according to the present invention. The optoelectronic apparatus 112 comprises at least one pattern illumination
source 114 configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots. The number of infrared light spots is below or equal 4000 spots.
The pattern illumination source 114 may be configured for generating or providing at least one light pattern, in particular at least one infrared light pattern. The light pattern may comprise a plurality of light spots. The light spot may be at least partially spatially extended. The infrared light pattern may be a near infrared light pattern. The infrared light may be coherent. The infrared light pattern may be a coherent infrared light pattern. The pattern illumination source 114 may be configured for emitting light at a single wavelength, e.g. in the near infrared region. In other embodiments, the pattern illumination source 114 may be adapted to emit light with a plurality of wavelengths, e.g. for allowing additional measurements in other wavelength channels. The infrared light pattern may comprise at least one regular and/or constant and/or periodic pattern such as a triangular pattern, a rectangular pattern, a hexagonal pattern or a pattern comprising further convex tilings. For example, the infrared light pattern is a hexagonal pattern, preferably a 2/5 hexagonal infrared light pattern. Using a periodic 2/5 hexagonal pattern can allow distinguishing between artefacts and usable signal.
The infrared light pattern may comprise at least one point pattern. The infrared light pattern has a low point density. The number of infrared light spots is below or equal 4000 spots. The infrared light pattern may comprise equal to or less than 3000 spots, preferably equal to or less than 2000 spots. The spots may have a circular shape. Further shapes are possible. The infrared light pattern may have a low point density, in particular in comparison with other structured light techniques typically having a point density of 10k - 30k spots in a field of view of 55°x38°. Using such a low point density may allow compensating for the above-mentioned diffraction loss. By decreasing the number of spots projected onto an object and/or a user, a contrast in the pattern image may be increased. Increasing the number of points would decrease the irradiance per point. The decreased number of spots may lead to an increase in irradiance of a spot and thus, to an increase in contrast in the pattern image of the projection of the infrared light pattern. The infrared light pattern may have a periodic point pattern with a reduced number of spots, wherein each of the spots has a high irradiance. Such a light pattern can ensure improved authentication using the pattern illumination source 114 and, as described above, at least one flood illumination source 116, at least one image generation unit 118, behind a display 120. Moreover, the low number of spots can ensure complying with eye safety requirements and stability requirements. The allowed dose may be divided between the spots of the light pattern.
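A purely illustrative calculation of dividing an allowed dose between the spots follows; the total power budget below is an assumed value, chosen only to show how a lower spot count increases the irradiance per spot.

```python
# Minimal sketch: dividing a fixed allowed optical budget between the spots of
# the light pattern. The total power value is an illustrative assumption; it
# only shows that fewer spots yield a higher power (and hence irradiance) per spot.
allowed_total_power_mw = 10.0          # assumed eye-safe total budget

for spot_count in (30000, 4000, 2000):
    per_spot_mw = allowed_total_power_mw / spot_count
    print(f"{spot_count:>6} spots -> {per_spot_mw:.4f} mW per spot")

# Reducing the spot count from ~30k (typical structured light) to <= 4000
# increases the per-spot power, and therefore the contrast, accordingly.
```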
At least one of the infrared light spots may be associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
The pattern illumination source 114 may comprise at least one pattern projector configured for generating the infrared light pattern. The pattern illumination source 114, e.g. the pattern projector, may comprise at least one emitter, in particular a plurality of emitters. The emitter may comprise at least one element selected from the group consisting of at least one laser source such as at least one semiconductor laser, at least one double heterostructure laser, at least one external cavity laser, at least one separate confinement heterostructure laser, at least one quantum cascade laser, at least one distributed Bragg reflector laser, at least one polariton laser, at least one hybrid silicon laser, at least one extended cavity diode laser, at least one quantum dot laser, at least one volume Bragg grating laser, at least one Indium Arsenide laser, at least one Gallium Arsenide laser, at least one transistor laser, at least one diode pumped laser, at least one distributed feedback laser, at least one quantum well laser, at least one interband cascade laser, at least one semiconductor ring laser, at least one vertical cavity surface emitting laser (VCSEL); at least one non-laser light source such as at least one LED or at least one light bulb. For example, the pattern illumination source 114, e.g. the pattern projector, comprises at least one VCSEL, preferably a plurality of VCSELs. The plurality of VCSELs may be arranged in at least one array, e.g. comprising a matrix of VCSELs. The VCSELs may be arranged on a common substrate or on different substrates. Examples for VCSELs can be found e.g. in en.wikipedia.org/wiki/Vertical-cavity_surface-emitting_laser. VCSELs are generally known to the skilled person such as from WO 2017/222618 A. Each of the VCSELs is configured for generating at least one light beam. The VCSEL or the plurality of VCSELs may be configured for generating the desired spot number equal to or below 4000 spots, preferably equal to or below 3000 spots, more preferably equal to or below 2000 spots. The VCSELs may be configured for emitting light beams at a wavelength range from 800 to 1000 nm. For example, the VCSELs may be configured for emitting light beams at 808 nm, 850 nm, 940 nm, or 980 nm. Preferably the VCSELs emit light at 940 nm, since terrestrial sun radiation has a local minimum in irradiance at this wavelength, e.g. as described in CIE 085-1989 “Solar Spectral Irradiance”.
The pattern illumination source 114 may comprise at least one optical element, not shown here, configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the pattern projector. The pattern illumination source 114 may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element. The DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam. For example, a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots. Other duplications are possible. For example, a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
The pattern illumination source 114 may comprise at least one transfer device, not shown here. The transfer device may comprise at least one imaging optical device. The transfer device specifically may comprise one or more of: at least one lens, for example at least one lens selected from the group consisting of at least one focus-tunable lens, at least one aspheric lens, at least one spheric lens, at least one Fresnel lens; at least one diffractive optical element; at least one concave mirror; at least one beam deflection element, preferably at least one mirror; at least one beam splitting element, preferably at least one of a beam splitting cube or a beam splitting mirror; at least one multilens system; at least one holographic optical element; at least
one meta optical element. Specifically, the transfer device comprises at least one refractive optical lens stack. Thus, the transfer device may comprise a multi-lens system having refractive properties.
The optoelectronic apparatus 112 comprises the at least one flood illumination source 116 configured for emitting infrared flood light.
The flood illumination source 116 may be configured for providing substantially continuous spatial illumination. The flood light may be substantially continuous spatial illumination, in particular diffuse and/or uniform illumination. The flood light has a wavelength in the infrared range, in particular in the near infrared range. The flood illumination source 116 may comprise at least one VCSEL, preferably a plurality of VCSELs.
A relative distance between the flood illumination source 116 and the pattern illumination source 114 is below 3.0 mm. The relative distance between the flood illumination source 116 and the pattern illumination source 114 is below 2.5 mm, preferably below 2.0 mm. The pattern illumination source 114 and the flood illumination source 116 may be combined into one module. For example, the pattern illumination source 114 and the flood illumination source 116 may be arranged on the same substrate, in particular having a minimum relative distance. The minimum relative distance may be defined by a physical extension of the flood illumination source 116 and the pattern illumination source 114. Arranging the pattern illumination source 114 and the flood illumination source 116 having a relative distance below 3.0 mm can result in decreased space requirement of the two illumination sources 114, 116. In particular, said illumination sources 114, 116 can even be combined into one module. Such a reduced space requirement can allow reducing the transparent area(s) in a display necessary for operation of the illumination source(s) 114, 116 behind the display 120.
In an embodiment, the pattern illumination source 114 and the flood illumination source 116 may comprise at least one VCSEL, preferably a plurality of VCSELs. The pattern illumination source 114 may comprise a plurality of first VCSELs mounted on a first platform. The flood illumination source 116 may comprise a plurality of second VCSELs mounted on a second platform. The second platform may be beside the first platform. The optoelectronic apparatus 112 may comprise a heat sink. Above the heat sink a first increment comprising the first platform may be attached. Above the heat sink a second increment comprising the second platform may be attached. The second increment may be different from the first increment. Thus, the first platform may be more distant to the optical element configured for increasing, e.g. duplicating, the number of spots. The second platform may be closer to the optical element. The beam emitted from the second VCSEL may be defocused and thus, form overlapping spots. This leads to a substantially continuous illumination and, thus, to flood illumination.
The optoelectronic apparatus 112 comprises the at least one image generation unit 118 configured for generating at least one pattern image while the pattern illumination source 114 is
emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source 116 is emitting infrared flood light.
The image generation unit 118 may be at least one unit of the optoelectronic apparatus 112 configured for generating at least one image. The image generation may comprise capturing and/or generating and/or determining and/or recording at least one image by using the image generation unit 118. The image generation may comprise imaging and/or recording the image. The image generation may comprise capturing a single image and/or a plurality of images such as a sequence of images. For example, the image generation may comprise recording continuously a sequence of images such as a video or a movie. The image generation may be initiated by a user action or may automatically be initiated, e.g. once the presence of at least one object or user within a field of view and/or within a predetermined sector of the field of view of the image generation unit is automatically detected.
The image generation unit 118 may comprise at least one optical sensor, in particular at least one pixelated optical sensor. The image generation unit 118 may comprise at least one CMOS sensor or at least one CCD chip. For example, the image generation unit 118 may comprise at least one CMOS sensor which may be sensitive in the infrared spectral range. The image may be image data recorded by using the optical sensor, such as a plurality of electronic readings from the CMOS or CCD chip. The image may comprise raw image data or may be a pre-processed image. For example, the pre-processing may comprise applying at least one filter to the raw image data and/or at least one background correction and/or at least one background subtraction.
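As a minimal sketch, assuming OpenCV and a separately captured background frame of the same size and data type, the pre-processing mentioned above (filtering and background subtraction) could look as follows; the kernel size and the way the background frame is obtained are assumptions.

```python
# Minimal sketch: simple pre-processing of a raw sensor image (noise filtering
# followed by background subtraction). Kernel size and the background frame
# are illustrative assumptions.
import numpy as np
import cv2

def preprocess(raw_image: np.ndarray, background: np.ndarray) -> np.ndarray:
    # raw_image and background are assumed to be grayscale frames of equal
    # size and dtype, e.g. uint8 readings from the CMOS sensor.
    filtered = cv2.GaussianBlur(raw_image, (5, 5), 0)   # noise filter
    corrected = cv2.subtract(filtered, background)      # background subtraction
    return corrected
```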
For example, the image generation unit 118 may comprise a monochrome camera, e.g. comprising monochrome pixels. For example, the image generation unit 118 may comprise a color camera, e.g. comprising color pixels. The image generation unit may comprise a color CMOS camera. For example, the camera may comprise monochrome pixels and color pixels. The color pixels and the monochrome pixels may be combined internally in the camera. The image generation unit 118 may comprise at least one color camera (e.g. RGB) and/or at least one monochrome camera, such as a monochrome CMOS. The camera may comprise at least one monochrome CMOS chip. The camera generally may comprise a one-dimensional or two-dimensional array of image sensors, such as pixels.
As outlined above, the image generation unit 118 may be at least one color, e.g. RGB, camera. For example, the color camera may be a selfie camera of a smartphone.
The image generation unit 118 may have a field of view between 10°x10° and 75°x75°, preferably 55°x65°. The image generation unit may have a resolution below 2 MP, preferably between 0.3 MP and 1.5 MP.
The image generation unit 118 may comprise further elements, such as one or more optical elements, e.g. one or more lenses. As an example, the optical sensor may be a fix-focus camera, having at least one lens which is fixedly adjusted with respect to the camera. Alternatively, however, the camera may also comprise one or more variable lenses which may be adjusted, automatically or manually. Other cameras, however, are feasible.
The pattern image may be an image generated by the image generation unit 118 while illuminating with the infrared light pattern, e.g. on an object and/or a user. The pattern image may comprise an image showing a user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern. The pattern image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the infrared light pattern. For example, the illumination by the pattern illumination source 114 and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus 112.
The flood image may be an image generated by the image generation unit 118 while the flood illumination source 116 is emitting infrared flood light, e.g. on an object and/or a user. The flood image may comprise an image showing a user, in particular the face of the user, while the user is being illuminated with the flood light. The flood image may be generated by imaging and/or recording light reflected by an object and/or user which is illuminated by the flood light. For example, the illumination by the flood illumination source 116 and the imaging by using the optical sensor may be synchronized, e.g. by using at least one control unit of the optoelectronic apparatus 112.
The image generation unit 118 may be configured for imaging and/or recording the pattern image and the flood image at the same time or at different times.
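A minimal, purely illustrative sketch of a control unit alternating flood and pattern frames so that illumination and capture are synchronized; the driver interface (emit_flood, emit_pattern, capture, off) is hypothetical and not part of the disclosure.

```python
# Minimal sketch: capturing a flood image and a pattern image at different
# times, with illumination and capture synchronized by a control routine.
# The object interfaces used here are hypothetical.
def capture_frame_pair(flood_source, pattern_source, camera):
    flood_source.emit_flood()
    flood_image = camera.capture()        # flood image while flood light is on
    flood_source.off()

    pattern_source.emit_pattern()
    pattern_image = camera.capture()      # pattern image while pattern is projected
    pattern_source.off()

    return flood_image, pattern_image
```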
The optoelectronic apparatus 112 may be comprised in the device 110. The device 110 may comprise the at least one display 120, wherein the infrared light pattern traverses the display 120 while being emitted from the pattern illumination source 114 and/or the infrared flood light traverses the display 120 while being emitted from the flood illumination source 116.
The display 120 may be an arbitrary shaped device configured for displaying an item of information. The item of information may be arbitrary information such as at least one image, at least one diagram, at least one histogram, at least one graphic, text, numbers, at least one sign, an operating menu, and the like. The display 120 may be or may comprise at least one screen. The display 120 may have an arbitrary shape, e.g. a rectangular shape. The display 120 may be a front display of the device 110.
The display 120 may be or may comprise at least one organic light-emitting diode (OLED) display. The OLED display may be configured for emitting visible light.
The display 120 may be made of and/or may be covered by glass. In particular, the display 120 may comprise at least one glass cover.
The display 120 may be at least partially transparent. For example, the display 120 may be semitransparent in the near infrared region. For example, the display 120 may have a transparency of 20 % to 50 % in the near infrared region. The display 120 may have a different transparency for other wavelength ranges. The present invention may propose an optoelectronic apparatus 112 comprising the image generation unit 118 and two illumination sources 114, 116 that can be placed behind the display 120 of a device 110. The transparent area(s) of the display 120 can allow for operation of the optoelectronic apparatus 112 behind the display 120. The display 120 can be an at least partially transparent display, as described above. The display 120 may have a reduced pixel density and/or a reduced pixel size and/or may comprise at least one transparent conducting path. The transparent area(s) of the display 120 may have a pixel density of 300-440 PPI (pixels per inch), more preferably 350 to 450 PPI. Other areas of the display 120, e.g. non-transparent areas, may have pixel densities higher than 400 PPI, e.g. a pixel density of 450-500 PPI.
The display 120 comprises a display area. The display area may be an active area of the display 120, in particular an area which is activatable. The display 120 may have additional areas such as recesses or cutouts. The display 120 may be at least partially transparent in at least one continuous area, preferably in at least two continuous areas. At least one of the continuous areas may cover the image generation unit and/or the pattern illumination source 114 and/or the flood illumination source 116 at least partially. The pattern illumination source 114, the flood illumination source 116 and the image generation unit 118 may be placed in direction of propagation of the infrared light pattern in front of the display 120. As outlined above, the pattern illumination source 114 and the flood illumination source 116 may be combined into one module. This may allow reducing the transparent area(s) of the display 120.
The display 120 may have a first area associated with a first pixel density value and a second area associated with a second pixel density value. The first pixel density value may be lower than the second pixel density value. The first pixel density value may be equal to or below 450 PPI, preferably from 300 to 440 PPI, more preferably 350 to 450 PPI. The second pixel density value may be 400 to 500 PPI, preferably 450 to 500 PPI. The first pixel density value is associated with the at least one continuous area being at least partially transparent.
As outlined above, known displays fulfilling these requirements are less appealing to users and the transparent area is desired to be decreased. Moreover, usually, covering the image generation unit with such a display can result in a low contrast because of the light transmission of 15 % to 50 % in transparent regions. The contrast may be defined as a difference between signal and backlight. In this case, the contrast can be considered to be reduced by the transmission. However, the contrast may also be defined as a ratio of signal to background. In this case, the contrast may not be reduced by the transmission only, but also by the diffraction in the spot pattern projection, as the higher orders bring additional intensity besides the zero-th order. Diffraction may further take place due to the structure of the display wiring, leading to a further decrease in irradiance. Moreover, the spot pattern projected through a display can result in main spots, corresponding to the pattern before traversing the display area, with reduced intensity because higher order spots are generated. Both effects lead to a reduction in irradiance to about 3-5% of the initial irradiance and the presence of undesired additional spots in the pattern. The present invention allows for providing a sufficient irradiance and, thus, contrast in the pattern image. The contrast can be increased by decreasing the number of spots projected onto the user. This can lead to an increase in irradiance of a spot and thus, to an increase in contrast in an image of the projection of the spot pattern. Moreover, a reduction of transparent areas in displays can be possible. The pattern illumination source 114 and the flood illumination source 116 used for authentication of a user can be combined into one module to reduce the transparent area in displays. Using such an optoelectronic apparatus 112 can allow for a rearrangement of the position of a camera, e.g. from the outermost position of the display. Such a rearrangement can allow for an optimized wiring since a further position, the position of the image generation unit 118, can be optimized. This can decrease the used wiring or allow for an improved battery operation.
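For illustration only, the two contrast definitions mentioned above can be evaluated for assumed signal and background levels; the numbers below are not measured values and only indicate how both measures respond to a brighter per-spot signal.

```python
# Purely illustrative: contrast as a difference and as a ratio, for an assumed
# background level and two assumed per-spot signal levels (dense vs. sparse
# pattern). All numbers are assumptions for demonstration only.
background = 1.0                       # assumed backlight / ambient level (a.u.)

for label, per_spot_signal in (("dense pattern (many spots)", 1.5),
                               ("sparse pattern (<= 4000 spots)", 6.0)):
    contrast_difference = per_spot_signal - background   # contrast as difference
    contrast_ratio = per_spot_signal / background        # contrast as ratio
    print(f"{label}: difference={contrast_difference:.1f}, ratio={contrast_ratio:.1f}")
```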
The device 110 is configured for authenticating a user of the device 110 to perform at least one operation on the device 110 that requires authentication. The authenticating may comprise verifying an identity of a user. Specifically, the authentication may comprise distinguishing the user from other humans or objects, in particular distinguishing authorized access from non-authorized accesses. The authentication may comprise verifying the identity of a respective user and/or assigning an identity to a user. The authentication may comprise generating and/or providing identity information, e.g. to other devices or units such as to at least one authorization unit for authorization for providing access to the device 110. The identity information may be proven by the authentication. For example, the identity information may be and/or may comprise at least one identity token. In case of successful authentication an image of a face recorded by the image generation unit 118 may be verified to be an image of the user’s face and/or the identity of the user is verified.
The authenticating of the user may be performed using at least one authentication unit 122. The authentication unit 122 may be configured for performing at least one authentication process of a user. The authentication unit 122 may comprise at least one processor. The processor may be an arbitrary logic circuitry configured for performing basic operations of a computer or system, and/or, generally, may be a device which is configured for performing calculations or logic operations. In particular, the processor may be configured for processing basic instructions that drive the computer or system. As an example, the processor may comprise at least one arithmetic logic unit (ALU), at least one floating-point unit (FPU), such as a math co-processor or a numeric co-processor, a plurality of registers, specifically registers configured for supplying operands to the ALU and storing results of operations, and a memory, such as an L1 and L2 cache memory. In particular, the processor may be a multi-core processor. Specifically, the processor may be or may comprise a central processing unit (CPU). Additionally or alternatively, the processor may be or may comprise a microprocessor, thus specifically the processor’s elements may be contained in one single integrated circuit (IC) chip. Additionally or alternatively, the processor may be or may comprise one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs) and/or one or more tensor processing units (TPUs) and/or one or more chips, such as a dedicated machine learning optimized chip, or the like. The processor specifically may be configured, such as by software programming, for performing one or more evaluation operations.
The authentication unit 122 is configured for identifying the user based on the flood image. The identifying may comprise assigning an identity to a detected face and/or at least one identity check and/or verifying an identity of the user. The authentication process may comprise a plurality of steps. For example, the authentication process may comprise performing at least one face detection. The face detection step may comprise analyzing the flood image. In addition, for example, the authentication process may comprise the identifying. The identifying may comprise performing a face verification of the imaged face to be the user’s face. Identifying the user may comprise matching the flood image, e.g. showing a contour of parts of the user, in particular parts of the user’s face, with a template. The identifying of the user may comprise determining if the imaged face is the face of the user, in particular if the imaged face corresponds to at least one image of the user’s face stored in at least one memory, e.g. of the device.
The analyzing of the flood image may comprise one or more of the following: a filtering; a selection of at least one region of interest; a formation of a difference image between the flood image and at least one offset; an inversion of the flood image; a background correction; a decomposition into color channels; a decomposition into hue, saturation, and brightness channels; a frequency decomposition; a singular value decomposition; applying a Canny edge detector; applying a Laplacian of Gaussian filter; applying a Difference of Gaussian filter; applying a Sobel operator; applying a Laplace operator; applying a Scharr operator; applying a Prewitt operator; applying a Roberts operator; applying a Kirsch operator; applying a high-pass filter; applying a low-pass filter; applying a Fourier transformation; applying a Radon transformation; applying a Hough transformation; applying a wavelet transformation; a thresholding; creating a binary image. The region of interest may be determined manually by a user or may be determined automatically, such as by recognizing the user within the image. In particular, the analyzing of the flood image may comprise using at least one image recognition technique, in particular a face recognition technique. An image recognition technique comprises at least one process of identifying the user in an image. The image recognition may comprise using at least one technique selected from the group consisting of: color-based image recognition, e.g. using features such as template matching; image segmentation and/or blob analysis, e.g. using size or shape; machine learning and/or deep learning, e.g. using at least one convolutional neural network.
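The following is a minimal sketch of a few of the analysis steps listed above, using OpenCV in Python. The file names, thresholds and the Haar-cascade face detector are illustrative placeholders and not part of the disclosure; any of the listed techniques may be substituted.

```python
import cv2

# Background correction via a difference image, edge detection and thresholding,
# followed by a simple face detection defining a region of interest.
flood = cv2.imread("flood_image.png", cv2.IMREAD_GRAYSCALE)    # placeholder file name
offset = cv2.imread("offset_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder offset frame

corrected = cv2.subtract(flood, offset)                         # difference image / background correction
edges = cv2.Canny(corrected, 50, 150)                           # Canny edge detector
_, binary = cv2.threshold(corrected, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # thresholding, binary image

# One possible image recognition step: face detection defining the region of interest.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = detector.detectMultiScale(corrected, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    roi = corrected[y:y + h, x:x + w]                            # region of interest around the face
```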
The analyzing of the flood image may comprise determining a plurality of facial features. The analyzing may comprise comparing, in particular matching, the determined facial features with
template features. The template features may be features extracted from at least one template. The template may be or may comprise at least one image generated in an enrollment process, e.g. when initializing the device 110. The template may be an image of an authorized user. The template features and/or the facial features may comprise a vector. Matching of the features may comprise determining a distance between the vectors. The identifying of the user may comprise comparing the distance of the vectors to at least one predefined limit, wherein the user is successfully identified in case the distance is below the predefined limit, at least within tolerances. The user may be declined and/or rejected otherwise.
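A minimal sketch of such a vector-based matching step is given below; the Euclidean distance, the example vectors and the predefined limit are illustrative assumptions, not prescribed values.

```python
import numpy as np

def identify(facial_features: np.ndarray,
             template_features: np.ndarray,
             limit: float) -> bool:
    """Match extracted facial features against enrolled template features.

    Both inputs are feature vectors; 'limit' is the predefined distance limit.
    The user is identified if the distance between the vectors is below the limit.
    """
    distance = np.linalg.norm(facial_features - template_features)  # distance between the vectors
    return distance < limit

# Example with made-up numbers: a close probe is accepted, a distant one is declined.
template = np.array([0.12, -0.53, 0.88])
print(identify(np.array([0.10, -0.50, 0.90]), template, limit=0.6))  # True  -> identified
print(identify(np.array([0.95,  0.40, -0.7]), template, limit=0.6))  # False -> declined
```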
For example, the image recognition may comprise using at least one model, in particular a trained model comprising at least one face recognition model. The analyzing of the flood image may be performed by using a face recognition system, such as FaceNet, e.g. as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832. The trained model may comprise at least one convolutional neural network. For example, the convolutional neural network may be designed as described in M. D. Zeiler and R. Fergus, “Visualizing and understanding convolutional networks”, CoRR, abs/1311.2901, 2013, or C. Szegedy et al., “Going deeper with convolutions”, CoRR, abs/1409.4842, 2014. For more details with respect to the convolutional neural network for the face recognition system, reference is made to Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832. As training data, labelled image data from an image database may be used. Specifically, labeled faces may be used from one or more of G. B. Huang, M. Ramesh, T. Berg, and E. Learned-Miller, “Labeled faces in the wild: A database for studying face recognition in unconstrained environments”, Technical Report 07-49, University of Massachusetts, Amherst, October 2007, the Youtube® Faces Database as described in L. Wolf, T. Hassner, and I. Maoz, “Face recognition in unconstrained videos with matched background similarity”, in IEEE Conf. on CVPR, 2011, or the Google® Facial Expression Comparison dataset. The training of the convolutional neural network may be performed as described in Florian Schroff, Dmitry Kalenichenko, James Philbin, “FaceNet: A Unified Embedding for Face Recognition and Clustering”, arXiv:1503.03832.
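As a brief illustration of the triplet-based training objective used by FaceNet-style embedding networks (see the Schroff et al. reference cited above), the sketch below uses PyTorch; the embedding network is a stand-in placeholder and the margin value is only an example.

```python
import torch
import torch.nn as nn

# Placeholder embedding network; a real system would use a convolutional
# neural network as described in the cited references.
embedding_net = nn.Sequential(nn.Flatten(), nn.Linear(160 * 160, 128))
triplet_loss = nn.TripletMarginLoss(margin=0.2, p=2)  # margin value is illustrative

anchor   = embedding_net(torch.randn(8, 1, 160, 160))  # images of one identity
positive = embedding_net(torch.randn(8, 1, 160, 160))  # further images of the same identity
negative = embedding_net(torch.randn(8, 1, 160, 160))  # images of other identities

loss = triplet_loss(anchor, positive, negative)  # pulls matching faces together, pushes others apart
loss.backward()
```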
The authentication unit 122 is configured for using a facial recognition authentication process operating on the flood image, the pattern image and/or extracted material data. The authentication unit 122 may be configured for extracting material data from the pattern image.
The authentication unit 122 may be configured for extracting the material data from the pattern image by beam profile analysis of the light spots. With respect to beam profile analysis reference is made to WO 2018/091649 A1, WO 2018/091638 A1 and WO 2018/091640 A1, the full content of which is included by reference. Beam profile analysis can allow for providing a reliable classification of scenes based on a few light spots. Each of the light spots of the pattern image may comprise a beam profile. The extracting of the material data based on the pattern image may be performed by using at least one model.
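The sketch below illustrates, in a very generic way, how a per-spot feature could be derived from a beam profile; it is only a stand-in example, as the actual beam profile analysis is described in the WO publications cited above.

```python
import numpy as np

def beam_profile_feature(spot_patch: np.ndarray, inner_radius: int = 2) -> float:
    """Illustrative per-spot feature: ratio of central intensity to total spot intensity.

    'spot_patch' is a small image crop around one light spot of the pattern image.
    """
    h, w = spot_patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    center = (yy - cy) ** 2 + (xx - cx) ** 2 <= inner_radius ** 2
    total = float(spot_patch.sum())
    return float(spot_patch[center].sum()) / total if total > 0 else 0.0

# Features computed for the light spots of the pattern image could then be fed
# into the at least one model that outputs the material data.
```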
The authentication process may comprise validating based on the extracted material data. The validating based on the extracted material data may comprise determining if the extracted material data corresponds to desired material data. Determining if the extracted material data matches the desired material data may be referred to as validating. Allowing or declining the user and/or object to perform at least one operation on the device 110 that requires authentication based on the material data may comprise validating the authentication or authentication process. Validating may be based on the material data and/or on the image. Determining if the extracted material data corresponds to desired material data may comprise determining a similarity of the extracted material data and the desired material data. Determining a similarity of the extracted material data and the desired material data may comprise comparing the extracted material data with the desired material data. Desired material data may refer to predetermined material data. In an example, the desired material data may be skin. It may be determined if the material data corresponds to the desired material data. In the example, the material data may be non-skin material or silicon. Determining if the material data corresponds to desired material data may comprise comparing the material data with the desired material data. A comparison of the material data with the desired material data may result in allowing and/or declining the user and/or object to perform at least one operation that requires authentication. In the example, skin as desired material data may be compared with non-skin material or silicon as material data, and the result may be a declination since silicon or non-skin material differs from skin. In an embodiment, the authentication process or its validation may include generating at least one feature vector from the material data and matching the material feature vector with an associated reference template vector for material.
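The following sketch shows one possible way to validate extracted material data against desired material data; the cosine-similarity measure, the threshold and the example vectors are illustrative assumptions rather than prescribed choices.

```python
import numpy as np

def validate_material(material_vector: np.ndarray,
                      reference_vector: np.ndarray,
                      threshold: float = 0.8) -> bool:
    """Compare a material feature vector with an associated reference template vector."""
    similarity = float(np.dot(material_vector, reference_vector) /
                       (np.linalg.norm(material_vector) * np.linalg.norm(reference_vector)))
    return similarity >= threshold  # validated only if the extracted data matches the desired material

# Example: "skin" as desired material data vs. a non-skin probe -> declination.
skin_reference = np.array([0.9, 0.1, 0.3])
non_skin_probe = np.array([0.1, 0.9, 0.2])
print(validate_material(non_skin_probe, skin_reference))  # False -> user is declined
```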
The authentication unit 122 may be configured for authenticating the user in case the user can be identified and/or if the material data matches the desired material data. The device 110 may comprise at least one authorization unit 124 configured for allowing the user to perform at least one operation on the device 110, e.g. unlocking the device 110, in case of successful authentication of the user or declining the user to perform at least one operation on the device 110 in case of non-successful authentication.
Figure 2 shows an exemplary embodiment of a method for authenticating a user of a device 110 to perform at least one operation on the device 110 that requires authentication, the method comprising: a. (reference number 126) illuminating the user with at least one infrared light pattern comprising a plurality of infrared light spots from at least one pattern illumination source 114 of the device 110, wherein the number of infrared light spots is below or equal 4000 spots; b. (reference number 128) illuminating the user with infrared flood light from at least one flood illumination source 116 of the device 110, wherein the distance between the flood illumination source 116 and the pattern illumination source 114 is below 3.0 mm;
c. (reference number 130) generating at least one pattern image with an image generation unit 118 of the device 110 showing the user, in particular at least parts of the face of the user, while the user is being illuminated with the infrared light pattern, and generating at least one image with an image generation unit 118 of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit 118 and/or the illumination sources 114, 116 are covered at least partially by the display 120 of the device 110, d. (reference number 132) identifying the user based on the flood image by using at least one authentication unit 122 of the device 110, e. (reference number 134) extracting material data from the at least one pattern image by using the authentication unit 122; and f. (reference number 136) allowing the user to perform at least one operation on the device 110 that requires authentication based on the material data and the identifying.
The method steps may be performed in the given order or may be performed in a different order. Further, one or more additional method steps may be present which are not listed. Further, one, more than one or even all of the method steps may be performed repeatedly. The pattern image and/or the image showing the user while the user is being illuminated with the infrared flood light may be showing at least a portion of a face of the user.
The method may be computer implemented.
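A minimal sketch of such a computer-implemented flow is given below; all objects and methods are placeholders standing in for the device components and steps a. to f. described above, not an API of the device 110.

```python
# Placeholder flow for method steps a.-f. (reference numbers 126-136).
def authenticate_user(device) -> bool:
    device.pattern_illumination_source.emit_pattern()                      # a. (126) infrared light pattern
    pattern_image = device.image_generation_unit.capture()                 # c. (130) pattern image
    device.flood_illumination_source.emit_flood()                          # b. (128) infrared flood light
    flood_image = device.image_generation_unit.capture()                   # c. (130) flood image

    identified = device.authentication_unit.identify(flood_image)          # d. (132) identifying the user
    material = device.authentication_unit.extract_material(pattern_image)  # e. (134) extracting material data
    material_ok = device.authentication_unit.validate_material(material)

    if identified and material_ok:                                         # f. (136) allowing
        device.authorization_unit.allow_operation()
        return True
    return False
```

The interleaved order of steps a. to c. in this sketch reflects that the two illumination sources may be active at different points in time; as noted above, the steps may also be performed in a different order.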
The optoelectronic apparatus 112 may comprise at least one optical element 138 having deflection properties. As outlined above, the VCSELs of the pattern illumination source 114 are mounted on the first platform and form the first VCSEL chip. The VCSELs of the flood illumination source 116 are mounted on the second platform and form the second VCSEL chip. The optical element 138 may be configured such that light emitted by the pattern illumination source 114 and light emitted by the flood illumination source 116 are deflected differently. Figures 3A and 3B show embodiments of an optical element 138 having deflection properties.
In the embodiment of Figure 3A, the optical element 138 may comprise at least two different areas associated with at least two different deflection behaviors. The light emitted from the VCSEL chips may be deflected by the optical element 138 depending on the area of the optical element 138 the light illuminates. Thus, for example, the first VCSEL chip associated with the pattern illumination source 114 may illuminate a first area 140 of the optical element 138 and may be deflected by a first angle. The second VCSEL chip associated with the flood illumination source 116 may illuminate a second area 142 of the optical element 138 and may be deflected by a second angle. The optical element 138 may be embodied as one optical element having two different deflection properties or the optoelectronic apparatus 112 may comprise at least two optical elements 138, e.g. one of the optical elements 140 is arranged for being illuminated by the pattern illumination source 114, in particular the first VCSEL chip, and the other one of
the optical elements 142 is arranged for being illuminated by the flood illumination source 116, in particular the second VCSEL chip.
In the embodiment of Figure 3B, the optical element 138 may be wavelength dependent. The optical element 138 may be structured to deflect light of different wavelengths differently. The VCSEL emitter associated with the pattern illumination source 114 may be associated with a different wavelength than the VCSEL emitter associated with the flood illumination source 116. Additionally or alternatively, the two light beams may differ in the beam width. For example, the optical element 138 may be configured for deflecting light of different beam widths differently. The two light beams generated by the pattern illumination source 114 and the flood illumination source 116 may differ in the beam width. Thus, the light of the pattern illumination source 114 and the flood illumination source 116 may be deflected differently.
Figure 4 shows a schematic of an exemplary optoelectronic apparatus 210 in a side view. With respect to details of the optoelectronic apparatus 210 reference is made to the optoelectronic apparatus 112 described with respect to and shown in Figures 1 to 3. In the following only particularities in comparison with the optoelectronic apparatus 112 are described.
The optoelectronic apparatus 210 may comprise:
(1) at least one first Vertical-Cavity Surface-Emitting Laser 212, wherein the first Vertical-Cavity Surface-Emitting Laser comprises an active area 214 situated on a bottom surface 216 of the first Vertical-Cavity Surface-Emitting Laser 212;
(2) at least one second Vertical-Cavity Surface-Emitting Laser 218, wherein the second Vertical-Cavity Surface-Emitting Laser 218 comprises an active area 220 situated on a top surface 222 of the second Vertical-Cavity Surface-Emitting Laser 218, wherein the top surface 222 is opposite of a bottom surface 224 of the second Vertical-Cavity Surface-Emitting Laser 218;
(3) at least one supporting member 226; wherein the first Vertical-Cavity Surface-Emitting Laser 212 is arranged with the bottom surface 216 on the supporting member 226; and wherein the second Vertical-Cavity Surface-Emitting Laser 218 is arranged with the bottom surface 224 on the supporting member 226. The first Vertical-Cavity Surface-Emitting Laser 212 and the second Vertical-Cavity Surface-Emitting Laser 218 may be arranged on the same supporting member 226.
The supporting member 226 may be at least one of:
- a heat sink 227;
- an electrical connector 229, specifically a printed circuit board,
- a stiffening element.
The optoelectronic apparatus 210 further may comprise at least one optical lens 228. The first Vertical-Cavity Surface-Emitting Laser 212 and/or the second Vertical-Cavity Surface-Emitting Laser 218 may be configured for emitting illumination light 244, 246 through the optical lens. A first distance 230 between the first Vertical-Cavity Surface-Emitting Laser 212, specifically the
active area 214 of the first Vertical-Cavity Surface-Emitting Laser 212, and the at least one optical lens 228 differs from a second distance 232 between the second Vertical-Cavity Surface-Emitting Laser 218, specifically the active area 220 of the second Vertical-Cavity Surface-Emitting Laser 218, and the at least one optical lens 228.
The optical lens 228 may have at least one focal length, wherein the first distance 230 or the second distance 232 equals the focal length. The first distance 230 and the second distance 232 may differ by a thickness 234 of the first Vertical-Cavity Surface-Emitting Laser 212 or a thickness 236 of the second Vertical-Cavity Surface-Emitting Laser 218. The thickness 234 of the first Vertical-Cavity Surface-Emitting Laser 212 and the thickness 236 of the second Vertical-Cavity Surface-Emitting Laser 218 may be the same.
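Expressed as a simple relation (illustrative only), with $f$ denoting the focal length of the optical lens 228, $d_1$ the first distance 230, $d_2$ the second distance 232 and $t$ the chip thickness 234 (equal to the thickness 236 when both lasers have the same thickness):

$$d_1 = f \quad \text{or} \quad d_2 = f, \qquad |d_1 - d_2| = t.$$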
A geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and a geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel. A normal vector to an emitting surface of the first Vertical-Cavity Surface-Emitting Laser 212 and a normal vector to an emitting surface of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel. The normal vector to the emitting surface of the first Vertical-Cavity Surface-Emitting Laser 212 and/or the normal vector to the emitting surface of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel to a central ray light of a light emitting cone of the respective Vertical-Cavity Surface-Emitting Laser 212, 218. The emitting surface may comprise the active area. A geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and a geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be coaxial. The geometrical central axis 238 of the first Vertical-Cavity Surface-Emitting Laser 212 and/or the geometrical central axis 240 of the second Vertical-Cavity Surface-Emitting Laser 218 may be parallel to a geometrical central axis 242 of the at least one optical lens 228.
The first Vertical-Cavity Surface-Emitting Laser 212, which comprises the active area 214 situated on the bottom surface 216, may be configured for emitting illumination light 244 at least partially in the direction of a top surface 247 of the first Vertical-Cavity Surface-Emitting Laser 212. At least one further component 248 of the first Vertical-Cavity Surface-Emitting Laser 212 may be at least partially transparent for the illumination light 244 emitted by the first Vertical-Cavity Surface-Emitting Laser 212. The at least one further component 248 of the first Vertical-Cavity Surface-Emitting Laser 212 may be a substrate 250 of the first Vertical-Cavity Surface-Emitting Laser 212.
Figures 5A, 5B and Figures 6A and 6B show highly schematically further embodiments of the optoelectronic apparatus 312. With respect to details of the optoelectronic apparatus 312 reference is made to the optoelectronic apparatus 112 and the optoelectronic apparatus 212 as shown and described with respect to Figures 1 to 4. In the following only particularities in comparison with the optoelectronic apparatus 112 and the optoelectronic apparatus 212 are described.
The optoelectronic apparatus 312 may comprise: a light emitter structure 314 comprising a plurality of light emitters 316, wherein a first array 318 of light emitters of the plurality of light emitters 316 forms a pattern illumination source 320 configured for emitting the infrared light pattern, wherein a second array 322 of light emitters of the plurality of light emitters 316, different from the light emitters of the first array 318, forms a flood illumination source 324 configured for emitting the infrared flood light; a base 325 providing a single plane for mounting the light emitters 316; at least one system of optical elements 326 comprising a plurality of optical elements, wherein the system of optical elements 326 is configured for focusing the emitted infrared light pattern onto a focal plane, wherein the system of optical elements 326 covers the light emitter structure 314; at least one flood light optical element 330 configured for defocusing light emitted by the light emitters of the flood illumination source 324 thereby forming overlapping light spots, wherein the flood light optical element 330 is configured for leaving the emitted infrared light pattern uninfluenced.
The base 325 may comprise a plurality of cavities into which the light emitter structure 314 can be mounted. The base 325 may have an arbitrary shape such as a rectangular, a circular, a hexagonal shape. The shape may refer to a side of the base oriented perpendicular to the direction in which height is measured. The base 325 may comprise at least one semiconductor substrate. The base 325 may be an element of the light emitter structure and/or an additional element. The base 325 may be and/or may comprise a thermally conducting printed circuit board (PCB).
The light emitters 316 of the light emitter structure 314 may form a chip of light emitters, e.g. a VCSEL die, e.g. as sawn out of a wafer. Such a chip of light emitters 316 may be mounted on the base, e.g. by using at least one heat conductive adhesive. The base 325 may comprise at least one thermally conducting material. The base 325 can be the bottom of the optoelectronic apparatus 312, e.g. of a housing of the optoelectronic apparatus 312. Thus, dimensions of the base 325 may be defined by the dimension of the optics and the housing. Alternatively, the base 325 and the housing may be separate elements. For example, the chip of light emitters 316 may be mounted, e.g. by using at least one heat conductive adhesive, on the base 325, e.g. the PCB, and the housing may be applied to this combined element.
The base 325 may comprise at least one thermally conducting material, in particular thermally conducting materials. The thermally conducting materials may be configured as a heat exchanger. The thermally conducting materials may be configured for regulating the temperature of the light emitter. The thermally conducting materials may be configured for transferring heat generated by a light emitter away from the light emitter. For example, the thermally conducting materials may comprise at least one composite material. The light emitter structure may be mounted on the thermally conducting materials.
Each of the light emitters 316 may comprise at least one vertical cavity surface emitting laser (VCSEL). The light emitters 316 may be configured for emitting light in the near infrared spectral range, preferably a wavelength of the emitted light is from 760 nm to 1.5 µm, preferably 940 nm, 1140 nm or >1400 nm.
The light emitters 316 of the pattern illumination source 320 and of the flood illumination source 324 may be active at different points in time.
The light emitters 316 of the light emitter structure 314 may be arranged in a periodic pattern. The light emitters 316 of the light emitter structure 314 may be arranged in one or more of a grid pattern, a hexagonal pattern, a shifted hexagonal pattern or the like. A plurality of light emitters 316 of the light emitter structure 314 may form the first array 318 of light emitters 316, and a plurality of light emitters 316 of the light emitter structure 314, different from the light emitters 316 of the first array 318, may form the second array 322 of light emitters 316. The light emitter structure 314 may comprise two light emitter arrays (318, 322), e.g. two VCSEL arrays. The arrays (318, 322) are located in one plane.
The first array 318 and the second array 322 of emitters 316 may be produced as a single die directly on the plane or the first array 318 and the second array 322 of emitters 316 are produced separately and are installed, e.g. side by side, on the plane. However, embodiments are possible, in which even more arrays are used, e.g. configured for providing different functions.
The system of optical elements 326 may comprise one or more of: at least one refractive lens; a plurality of refractive lenses; at least one diffractive optical element (DOE); a plurality of DOEs; a plurality of meta lenses. For example, the system of optical elements 326 may comprise at least one refractive lens and at least one optical element configured for increasing, e.g. duplicating, the number of spots, e.g. the spots generated by the light emitters of the pattern illumination source. In particular, the system of optical elements 326 may comprise at least one diffractive optical element (DOE) and/or at least one metasurface element. The DOE and/or the metasurface element may be configured for generating multiple light beams from a single incoming light beam. For example, a VCSEL projecting up to 2000 spots and an optical element comprising a plurality of metasurface elements may be used to duplicate the number of spots. Further arrangements, particularly comprising a different number of projecting VCSELs and/or at least one different optical element configured for increasing the number of spots, may be possible. Other multiplication factors are possible. For example, a VCSEL or a plurality of VCSELs may be used and the generated laser spots may be duplicated by using at least one DOE.
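As a simple count (using the duplication factor of the example above; other factors are possible), the number of projected spots may be estimated as

$$N_{\mathrm{projected}} = N_{\mathrm{emitter\ spots}} \times M_{\mathrm{DOE/metasurface}}, \qquad \text{e.g.}\ 2000 \times 2 = 4000 \le 4000,$$

which stays within the limit of below or equal 4000 spots stated for the infrared light pattern.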
The system of optical elements 326 covers the light emitter structure 314. The system of optical elements 326 may be designed and/or arranged such that it covers the light emitter structure 314. For example, the first array 318 may be covered by the system of optical elements 326 only. For example, both of the first array 318 and the second array 322 may be covered by the
system of optical elements 326. The light emitter structure 314 may be located at the focal point of the system of optical elements 326. Such an arrangement may allow that the emitted light of the light emitters 316, in particular of the light emitters 316 of the pattern illumination source 320, is collimated.
The flood light optical element 330 may be configured for defocusing light emitted by the light emitters 316 of the flood illumination source 324, thereby forming overlapping light spots. The flood light source 324 in combination with the flood light optical element 330 is configured for generating flood light, in particular diffuse illumination. The flood light optical element 330 may comprise at least one element selected from the group consisting of: at least one plate with a refractive index larger than 1.4, e.g. a glass plate, at least one diffusor plate, at least one lens, at least one micro lens, at least one prism, at least one Fresnel lens, at least one diffractive optical element (DOE), at least one meta lens. The second array 322 may be completely covered by the flood light optical element 330.
Figures 5A and 5B show an example in which the first array 318 and the second array 322 may be arranged side by side on the plane, in particular adjacent. Figure 5A shows a top view of an exemplary layout of the light emitter structure 314. In this embodiment, the light emitter structure 314 may comprise two VCSEL arrays 318, 322. The VCSEL arrays 318, 322 are located on one plane, in this case side by side. Figure 5B shows, for the layout of the light emitter structure 314 of Figure 5A, an embodiment of the operation of the optoelectronic apparatus 312. On the left side of Figure 5B, the optoelectronic apparatus 312 is shown with the pattern illumination source 320 activated. On the right side of Figure 5B, the optoelectronic apparatus 312 is shown with the flood illumination source 324 activated. The plane of the base 325 may comprise, along a direction perpendicular to an optical axis of the optoelectronic apparatus 312, firstly the first array 318 and subsequently the second array 322. The VCSEL array 318 may define a dot pattern, e.g. a periodic regular pattern such as a simple grid pattern, hexagon pattern, shifted hexagon pattern and the like. The VCSEL array 318 may be located at the focal point of the system of optical elements 326, i.e. all cavities are collimated. The second array 322 may be responsible for flood illumination, in particular for diffuse illumination. This array 322 may be completely covered by the flood light optical element 330. As shown in Figure 5B, the second array 322 may be located at the focal point of the system of optical elements 326, too. Thus, the light generated by the second array 322 would also be in focus. However, the optical imaging is changed by the additional flood light optical element 330 such that the cavities are not collimated properly. This can allow for generating a diffuse flood illumination. The flood light optical element 330 is configured for leaving the emitted infrared light pattern uninfluenced. The VCSEL array 318 may not be covered by this flood light optical element 330.
As shown in Figure 5B, the optoelectronic apparatus 312 comprises a single flood light optical element 330 covering all light emitters of the second array 322. Other embodiments are feasible, e.g. as shown in Figure 6B.
In Figure 6B, for example, the optoelectronic apparatus 312 comprises a plurality of flood light optical elements 330. For example, each light emitter 316 of the second array 322 comprises at least one assigned flood light optical element 330. Figure 6A shows a top view of an exemplary layout of the light emitter structure 314. In this embodiment, the light emitter structure 314 may comprise two VCSEL arrays 318, 322. The VCSEL arrays 318, 322 are located on one plane. Figures 6A and 6B show an example in which cavities of the light emitters 316 of the first array 318 and cavities of the light emitters 316 of the second array 322 may form a combined pattern, in which the cavities of the light emitters 316 of the first array 318 and the cavities of the light emitters 316 of the second array 322 alternate, e.g. line by line. White circles in Figure 6A denote cavities of the pattern illumination source 320 and black circles denote cavities of the flood illumination source 324, e.g. each having an additional micro lens. On the left side of Figure 6B, the optoelectronic apparatus 312 is shown with the pattern illumination source 320 activated. On the right side of Figure 6B, the optoelectronic apparatus 312 is shown with the flood illumination source 324 activated.
List of reference numbers
110 device
112 optoelectronic apparatus
114 pattern illumination source
116 flood illumination source
118 image generation unit
120 display
122 authentication unit
124 authorization unit
126 illuminating the user with infrared light pattern
128 illuminating the user with infrared flood light
130 generating at least one pattern image and generating at least one flood image
132 identifying the user
134 extracting material data
136 allowing
138 optical element having deflection properties
140 first area
142 second area
210 optoelectronic apparatus
212 first Vertical-Cavity Surface-Emitting Laser
214 active area
216 bottom surface
218 second Vertical-Cavity Surface-Emitting Laser
220 active area
222 top surface
224 bottom surface
226 supporting member
227 heat sink
228 optical lens
229 electrical connector
230 first distance
232 second distance
234 thickness
236 thickness
238 central axis
240 central axis
242 central axis
244 illumination light
246 illumination light
247 top surface
248 component
250 substrate
312 optoelectronic apparatus
314 light emitter structure
316 light emitters
318 first array
320 pattern illumination source
322 second array
324 flood light source
325 base
326 system of optical elements
330 flood light optical element
Claims
1. An optoelectronic apparatus (112) comprising: at least one pattern illumination source (114) configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, at least one flood illumination source (116) configured for emitting infrared flood light, at least one image generation unit (118) configured for generating at least one pattern image while the pattern illumination source (114) is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source (116) is emitting infrared flood light, wherein a relative distance between the flood illumination source (116) and the pattern illumination source (114) is below 3.0 mm.
2. The optoelectronic apparatus (112) according to claim 1, wherein the optoelectronic apparatus (112) is comprised in a device (110), wherein the device (110) comprises at least one display (120), wherein the infrared light pattern traverses the display (120) while being emitted from the pattern illumination source (114) and/or the infrared flood light traverses the display while being emitted from the flood illumination source (116).
3. The optoelectronic apparatus (112) according to any one of claims 1 or 2, wherein the relative distance between the flood illumination source (116) and the pattern illumination source (114) is below 2.5 mm, preferably below 2.0 mm.
4. The optoelectronic apparatus (112) according to any one of claims 1 to 3, wherein the infrared light pattern is a hexagonal pattern, preferably a 2/5 hexagonal infrared light pattern, and/or wherein at least one of the infrared light spots is associated with a beam divergence of 0.2° to 0.5°, preferably 0.1° to 0.3°.
5. The optoelectronic apparatus (112) according to any one of claims 1 to 4, wherein the infrared light pattern comprises equal or less than 3000 spots, preferably equal or less than 2000 spots.
6. The optoelectronic apparatus (112) according to any one of claims 1 to 5, wherein the pattern illumination source (114) comprises at least one vertical cavity surface-emitting laser (VCSEL), preferably a plurality of VCSELs, and/or wherein the pattern illumination source comprises at least one optical element configured for increasing the number of spots, wherein the optical element comprises at least one diffractive optical element (DOE) and/or at least one metasurface element.
7. The optoelectronic apparatus (112) according to any one of claims 1 to 6, wherein the pattern illumination source (114) comprises a plurality of first VCSELs mounted on a first platform, wherein the flood illumination source (116) comprises a plurality of second VCSELs mounted on a second platform, wherein the optoelectronic apparatus (112) comprises a heat sink, wherein above the heat sink a first increment comprising the first platform is attached, wherein above the heat sink a second increment comprising the second platform is attached.
8. Use of an optoelectronic apparatus (112) according to any one of claims 1 to 7 for authenticating a user of a device (110) comprising the apparatus (112).
9. A device (110) for authenticating a user of a device (110) to perform at least one operation on the device (110) that requires authentication, the device (110) comprising: at least one flood illumination source (116) configured for emitting infrared flood light; at least one pattern illumination source (114) configured for emitting at least one infrared light pattern comprising a plurality of infrared light spots, wherein the number of infrared light spots is below or equal 4000 spots, wherein a relative distance between the flood illumination source (116) and the pattern illumination source (114) is below 3.0 mm; at least one image generation unit (118) configured for generating at least one pattern image while the pattern illumination source (114) is emitting infrared light pattern and configured for generating at least one flood image while the flood illumination source (116) is emitting infrared flood light; at least one display (120) covering the flood illumination source (116), the pattern illumination source (114) and the image generation unit (118) at least partially, at least one authentication unit configured for performing at least one authentication process of a user using the flood image and the pattern image.
10. The device (110) according to claims 2 or 9, wherein the display (120) is or comprises at least one organic light-emitting diode (OLED) display.
11. The device (110) according to claims 2 or 9, wherein the display (120) has a first area associated with a first pixel density value and a second area associated with a second pixel density value, wherein the first pixel density value is below 350 pixels per inch and the second pixel density value is equal to or above 400 pixels per inch.
12. The device (110) according to any one of claims 2, 9 or 10, wherein the device (110) is selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality
device, and/or a wearable, such as a smart watch; or another type of portable computer.
13. A method for authenticating a user of a device (110) to perform at least one operation on the device (110) that requires authentication, the device (110) comprising a display (120) and the method comprising: a. (126) illuminating the user with the infrared light pattern comprising a plurality of infrared light spots from a pattern illumination source (114) of the device (110), wherein the number of infrared light spots is below or equal 4000 spots; b. (128) illuminating the user with infrared flood light from a flood illumination source (116) of the device, wherein the distance between the flood illumination source (116) and the pattern illumination source (114) is below 3.0 mm; c. (130) generating at least one pattern image with an image generation unit (118) of the device (110) showing the user while the user is being illuminated with the infrared light pattern, and generating at least one image with an image generation unit (118) of the device showing the user while the user is being illuminated with the infrared flood light, wherein the image generation unit (118) and/or the flood and pattern illumination sources (114, 116) are covered at least partially by the display (120) of the device (110), d. (132) identifying the user based on the flood image by using at least one authentication unit (122) of the device (110), e. (134) extracting material data from the at least one pattern image by using the authentication unit (122); and f. (136) allowing the user to perform at least one operation on the device (110) that requires authentication based on the material data and the identifying.
14. A computer program comprising instructions which, when the program is executed by the device (110) according to any one of the preceding claims referring to a device (110), cause the device (110) to perform the method according to any one of the preceding claims referring to a method.
15. A computer-readable storage medium comprising instructions which, when the instructions are executed by the device according to any one of the preceding claims referring to a device (110), cause the device (110) to perform the method according to any one of the preceding claims referring to a method.
16. A non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to any one of the preceding claims referring to a method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP23156862 | 2023-02-15 | ||
EP23156862.7 | 2023-02-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024170598A1 true WO2024170598A1 (en) | 2024-08-22 |
Family
ID=85251741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2024/053682 WO2024170598A1 (en) | 2023-02-15 | 2024-02-14 | Behind oled authentication |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024170598A1 (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017222618A1 (en) | 2016-06-23 | 2017-12-28 | Apple Inc. | Top-emission vcsel-array with integrated diffuser |
WO2018091640A2 (en) | 2016-11-17 | 2018-05-24 | Trinamix Gmbh | Detector for optically detecting at least one object |
WO2018091638A1 (en) | 2016-11-17 | 2018-05-24 | Trinamix Gmbh | Detector for optically detecting at least one object |
WO2018091649A1 (en) | 2016-11-17 | 2018-05-24 | Trinamix Gmbh | Detector for optically detecting at least one object |
WO2021259923A1 (en) | 2020-06-23 | 2021-12-30 | Trinamix Gmbh | Projector for diffuse illumination and structured light |
US20220069048A1 (en) * | 2020-09-03 | 2022-03-03 | Samsung Display Co., Ltd. | Display panel and display apparatus including the same |
WO2022101429A1 (en) | 2020-11-13 | 2022-05-19 | Trinamix Gmbh | Depth measurement through display |
US20220190064A1 (en) * | 2020-12-11 | 2022-06-16 | Google Llc | Display structure for smooth transition between different pixel density regions in an oled display |
CN114779491A (en) | 2022-06-20 | 2022-07-22 | 安思疆科技(南京)有限公司 | Terminal display module and mobile terminal |
Non-Patent Citations (6)
Title |
---|
ANONYMOUS: "trinamiX Face Authentication - A new gold standard for biometric authentication", 15 November 2022 (2022-11-15), pages 1 - 2, XP093056080, Retrieved from the Internet <URL:https://trinamixsensing.com/media/trinamix_face_authentication_product_brief_snapdragon_summit_2022.pdf> [retrieved on 20230620] * |
C. SZEGEDY ET AL.: "Going deeper with convolutions", CORR, ABS/1409.4842, 2014 |
FLORIAN SCHROFF, DMITRY KALENICHENKO, JAMES PHILBIN: "FaceNet: A Unified Embedding for Face Recognition and Clustering", ARXIV: 1503.03832 |
G. B. HUANG, M. RAMESH, T. BERG, E. LEARNED-MILLER: "Labeled faces in the wild: A database for studying face recognition in unconstrained environments", TECHNICAL REPORT 07-49, UNIVERSITY OF MASSACHUSETTS, AMHERST, October 2007 (2007-10-01) |
L. WOLF, T. HASSNER, I. MAOZ: "Face recognition in unconstrained videos with matched background similarity", IEEE CONF. ON CVPR, 2011 |
M. D. ZEILER, R. FERGUS: "Visualizing and understanding convolutional networks", CORR, ABS/1311.2901, 2013 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10650540B2 (en) | Determining sparse versus dense pattern illumination | |
US20220026041A1 (en) | Illumination device and electronic apparatus including the same | |
US9504384B2 (en) | Information processing apparatus and information processing method | |
US12288421B2 (en) | Optical skin detection for face unlock | |
CN116438467A (en) | Depth measurement by display | |
US20240402342A1 (en) | Extended material detection involving a multi wavelength projector | |
WO2024170598A1 (en) | Behind oled authentication | |
US11906421B2 (en) | Enhanced material detection by stereo beam profile analysis | |
US20210264625A1 (en) | Structured light code overlay | |
WO2024170597A1 (en) | Behind oled authentication | |
WO2024200502A1 (en) | Masking element | |
EP4530666A1 (en) | 2in1 projector with polarized vcsels and beam splitter | |
US12205382B2 (en) | Apparatus and method for automatic license plate recognition of a vehicle | |
WO2025012337A1 (en) | A method for authenticating a user of a device | |
WO2025051733A1 (en) | Optical element for authenticating a user | |
WO2024231531A1 (en) | Projector with oled | |
WO2025046067A1 (en) | Optical elements on flood vcsels for 2in1 projectors | |
WO2025046063A1 (en) | Single vcsel combined dot and flood projector | |
WO2025040650A1 (en) | Face authentication reference measurement timing | |
WO2025003364A1 (en) | Rolling shutter rgb-ir sensor | |
WO2025045642A1 (en) | Biometric recognition system | |
WO2025036901A1 (en) | Interlock for an optical lens of an optical element | |
WO2025068196A1 (en) | Biometric recognition dataset generation | |
US20240027188A1 (en) | 8bit conversion | |
WO2025040591A1 (en) | Skin roughness as security feature for face unlock |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24704213 Country of ref document: EP Kind code of ref document: A1 |