EP3724674A1 - 3D sensor system with a free-form optic (3D Sensorsystem mit einer Freiformoptik) - Google Patents
3D sensor system with a free-form optic
- Publication number
- EP3724674A1 (application EP18826219.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- light
- illumination
- scene
- sensor system
- measuring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to group G01S17/00
- G01S7/4802—Using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4811—Constructional features common to transmitter and receiver
- G01S7/4814—Constructional features of transmitters alone
- G01S7/4815—Constructional features of transmitters alone using multiple transmitters
- G01S7/4816—Constructional features of receivers alone
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
- G01S7/4868—Controlling received signal intensity or exposure of sensor
- G01S7/491—Details of non-pulse systems
- G01S7/4912—Receivers
- G01S7/4913—Circuits for detection, sampling, integration or read-out
- G01S7/4914—Circuits for detection, sampling, integration or read-out of detector arrays, e.g. charge-transfer gates
- G01S7/4918—Controlling received signal intensity, gain or exposure of sensor
- G01S7/499—Using polarisation effects
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Distance measurement using transmission of interrupted, pulse-modulated waves
- G01S17/32—Distance measurement using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
- G01S17/88—Lidar systems specially adapted for specific applications
Definitions
- The present invention relates to a sensor system and a method for the three-dimensional acquisition of a scene based on transit-time measurements. Furthermore, the present invention relates to several uses of such a sensor system.
- In many applications, actuator-operated closure bodies are used which facilitate the handling of the respective closure body for operators, or which are operated automatically without any action, for example when an object to be passed through an opening enters the region of that opening.
- Such an opening may, for example, be a passageway in a building.
- A closure body may be, for example, a door or a gate.
- An object recognition may be used, for example a face recognition.
- Known is a 3D sensor system for the field of application of monitoring automatically opening doors and/or gates, which is based on the principle of transit-time measurement of light beams that are emitted by illumination sources and, after an at least partial reflection or 180° backscatter, are detected by a light receiver.
- Such sensor systems are commonly referred to as "time-of-flight" (TOF) sensor systems.
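The transit-time principle named above reduces to a simple relation: the light travels to the object and back, so the object distance is half the measured round-trip time multiplied by the speed of light. A minimal sketch in Python (constant and function names are illustrative, not taken from the patent):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance(transit_time_s: float) -> float:
    """Object distance from a round-trip light transit time.

    The light covers the distance twice (out and back), hence
    the factor 0.5.
    """
    return 0.5 * C * transit_time_s
```

For example, a round-trip time of about 6.67 ns corresponds to an object roughly one meter away.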
- TOF sensor systems have the disadvantage that the intensity of the (backscattered) measuring light to be detected by the light receiver of the TOF sensor is attenuated in two respects.
- First, in the case of a point-like illumination light source without special focusing, the illumination light emitted by the illumination source is attenuated with 1/d², where d is the distance from the illumination light source.
- Second, the illuminated point of the object, if it scatters the illumination light isotropically, itself acts as a point light source. As a result, this leads to a further 1/d² attenuation, and thus overall to a 1/d⁴ scaling of the intensity of the received measuring light.
- With beam shaping realized in any manner, for example focusing of the illumination light or of the measuring light, and/or in the case of non-isotropic scattering of the illumination light with a preferred direction, the intensity attenuation is correspondingly lower, but it still contributes a significant loss of light output. This in turn leads to a correspondingly poor energy efficiency of a TOF sensor system.
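The two-fold attenuation described above, 1/d² on the illumination leg and a further 1/d² on the return leg, combines to a 1/d⁴ scaling of the received measuring light. A small illustrative model, assuming an unfocused point source and isotropic backscatter (the function name is ours):

```python
def received_intensity(emitted_intensity: float, d: float) -> float:
    """Received measuring-light intensity for object distance d.

    Illumination leg: point source, so 1/d^2 attenuation.
    Return leg: the isotropically scattering object point acts as
    a point source itself, so another 1/d^2.
    Net scaling: 1/d^4.
    """
    return emitted_intensity / d**4
```

Doubling the distance thus reduces the received intensity by a factor of sixteen, which illustrates why uncompensated TOF systems waste so much light output on nearby parts of the scene.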
- According to a first aspect, the described sensor system comprises (a) an illumination device for illuminating the scene with illumination light along an illumination beam path; (b) a measuring device for receiving measuring light along a measuring light beam path, wherein the measuring light is illumination light at least partially backscattered by at least one object in the scene, and for measuring distances between the sensor system and the at least one object based on the light transit time of the illumination light and the measuring light; and (c) a free-form optic which is arranged in the illumination beam path and/or in the measuring light beam path.
- The free-form optic is configured in such a way that (d1), when arranged in the illumination beam path, an illumination light intensity of the illumination light depends on the solid angle of the illumination beam path, and (d2), when arranged in the measuring light beam path, a measuring light intensity of the measuring light depends on the solid angle of the measuring light beam path, so that a distance-based loss of intensity of the illumination light and of the measuring light is at least partially compensated.
- The described sensor system, a so-called time-of-flight (TOF) sensor system, is based on the insight that a suitably designed free-form optic can at least partially compensate the influences of the scene geometry and of the solid angle on the intensity distribution of the received measuring light.
- Descriptively stated, the free-form optic ensures that an image of the scene generated from the received measuring light contains neither (excessively) underexposed nor (excessively) overexposed portions.
- If the free-form optic is located in the beam path of the illumination light, a suitable solid-angle-dependent intensity distribution of the illumination light already ensures that the measuring light from each part of the scene strikes the light receiver of the measuring device with a homogeneous intensity distribution.
- If the free-form optic is located in the measuring light beam path, a suitable solid-angle-dependent collection of the measuring light beams ensures, or at least contributes to ensuring, that the measuring light strikes the light receiver with an intensity distribution that is as homogeneous as possible.
- In other words: the free-form optic in the illumination beam path ensures that the scene is illuminated with a solid-angle-dependent illumination intensity. In the measuring light beam path, the free-form optic ensures that the measuring light is collected differently depending on the solid angle, in particular with a solid-angle-dependent focusing.
- Both effects contribute to the measuring light received from all parts of the scene being at least approximately equal with respect to its intensity. This avoids underexposed and/or overexposed portions in an image of the captured scene.
- A solid-angle-dependent intensity distribution of the illumination light (a spatial illumination characteristic) and/or a solid-angle-dependent collection of the measuring light intensity (a spatial measuring light collection characteristic) can be set so that the intensity of the illumination light is in particular just as high as is required for a reliable detection of that subsection of the scene which is associated with the respective solid angle range.
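One way to picture this solid-angle-dependent intensity setting is to weight the emitted intensity per solid-angle bin with the fourth power of the expected object distance, so that the received measuring light is equalized across the scene. A hypothetical sketch; the 1/d⁴ assumption and the function name are ours, not formulas from the patent:

```python
def illumination_weights(distances):
    """Relative illumination intensity per solid-angle bin.

    Chosen so that, under a 1/d^4 scaling of the received
    measuring light, every bin returns the same intensity.
    Normalized so the farthest bin gets weight 1.0.
    """
    d_max = max(distances)
    return [(d / d_max) ** 4 for d in distances]
```

A bin at half the maximum distance then needs only 1/16 of the peak illumination intensity, which is exactly the saving the solid-angle-dependent distribution exploits.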
- In this document, "scene" may in particular be understood to mean that spatial area which is optically detected by the sensor system. Objects in the scene are recognized by a suitable image analysis.
- For this purpose, the data processing device can make use of known methods of image evaluation and/or image analysis; the data processing device can therefore be, or comprise, a processor specially configured for image processing.
- "Object" can be understood to mean any spatially physical structure which has a surface texture that leads to an at least partial reflection or scattering of the illumination light, and which is thus visible to the measuring device through the resulting measuring light.
- The object may be an item such as a motor vehicle, or a living being such as a human.
- The object may be static or at rest relative to the sensor system. Furthermore, the object may also move within the scene, leave it, or enter it.
- In the case of a moving object, its speed can be determined from the change of position between two scene detections and the time elapsed between them (speed = distance / time).
- In addition to the absolute value of the speed, the motion vector, i.e. the direction of the movement, can additionally be determined.
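The speed determination sketched above (speed = distance / time, plus the motion vector) can be illustrated as follows; the positions are assumed to come from two successive scene detections taken dt seconds apart, and the function name is ours:

```python
def velocity_vector(p1, p2, dt):
    """Motion vector and speed from two successive 3D positions.

    p1, p2: (x, y, z) positions from two scene detections.
    dt:     time between the detections in seconds.
    Returns the per-axis velocity tuple and its magnitude.
    """
    v = tuple((b - a) / dt for a, b in zip(p1, p2))
    speed = sum(c * c for c in v) ** 0.5
    return v, speed
```

This yields both pieces of information the text mentions: the absolute value of the speed and the direction of the movement.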
- "Illumination light" in this document means those electromagnetic waves which are emitted by a light source or a lighting unit of the illumination device and strike the relevant object of the scene.
- The "measuring light" is the electromagnetic waves scattered (back) by or at the object, which are received by the measuring device or by a light receiver of the measuring device and used, together with the measured light transit time, for the three-dimensional evaluation of the scene.
- The "characteristic of a scene" can be understood as the entirety of all spatial structures, and in particular all objects, which are detected by the sensor system.
- The characteristic of the scene changes when (i) new objects enter the scene, (ii) objects already in the scene change their position and/or their visual appearance, in particular their optical scattering behavior, and/or (iii) objects leave the scene.
- By means of image processing and/or image recognition, the data processing device can recognize some structures as relevant and other structures as less relevant or even irrelevant.
- "Free-form optic" can be understood as any optical structure that modifies the beam path of light beams and provides a solid-angle-dependent modification of the intensity of the illumination light and/or a solid-angle-dependent collection (of the intensity) of the measuring light.
- The free-form optic can be a static optic. This means that, regardless of the current characteristic of the scene and regardless of the operating state, across different scene captures (scene detections or scene evaluations by the data processing device) the scene is always illuminated with the same solid-angle-dependent modification of the illumination light and/or detected with the same solid-angle-dependent modification of the measuring light.
- A free-form optic can consist of a single optical element.
- A free-form optic can also consist of a plurality of optical elements connected in series, realized for example by means of a lens system.
- "Distance-based loss of intensity" can mean that loss of intensity which is caused by the divergence of the illumination light beams or of the measuring light beams (after a scattering or reflection at an object).
- For a point light source, this loss of illumination light scales with 1/d², where d is the distance from the point light source.
- The same applies to the measuring light when a point of the object at which the illumination light is scattered isotropically acts as a point light source, so that overall a 1/d⁴ scaling of the intensity of the received measuring light is given.
- With beam shaping, the "distance-based intensity loss" is correspondingly lower, but in practice it is nevertheless a significant loss which reduces the energy efficiency of a TOF sensor.
- According to the invention, these losses are at least partially reduced or compensated by the free-form optic.
- "Optical" and/or "light" may refer to electromagnetic waves having a particular wavelength or frequency, or a particular spectrum of wavelengths or frequencies.
- The electromagnetic waves used can be assigned to the spectral range visible to the human eye.
- Electromagnetic waves associated with the ultraviolet (UV) or infrared (IR) spectral ranges may also be used.
- The IR spectral range can extend into the long-wave IR range with wavelengths between 3.5 µm and 15 µm, which can be detected by means of the light receiver of the sensor.
- Free-form optics can also be used profitably in TOF sensor systems which sequentially illuminate the scene with a scanning light beam, such as a laser beam.
- According to an embodiment, the sensor system further comprises a further free-form optic, the (first) free-form optic being arranged in the illumination beam path and the further free-form optic being arranged in the measuring light beam path.
- This has the advantage that illumination light and measuring light can be optimally modified independently of each other; both "lights" can be adapted independently with respect to their solid-angle-dependent intensity distribution.
- According to a further embodiment, the free-form optic has at least one reflective optical element, at least one refractive optical element and/or at least one diffractive optical element.
- "Reflective" means the (at least partial) throwing back of electromagnetic waves at an interface, where, with respect to the normal of this interface, the angle of reflection equals the angle of incidence of the incident electromagnetic wave.
- The reflective interface of the free-form optic preferably has the highest possible reflectivity.
- A reflective element may in particular be a mirror; the free-form optic can thus comprise at least one mirror with a suitably curved surface.
- "Refractive" or "refraction" is the change of propagation direction of an electromagnetic wave due to a spatial change of its propagation velocity, which for light waves is described by the refractive index n of a medium.
- A refractive element of the described free-form optic may be a suitably shaped lens.
- Other optically refractive elements, for example a (modified) prism, can also be used in order to realize a suitable solid-angle-dependent beam shaping of the illumination light and/or of the measuring light.
- the term "diffractive” or “diffraction” (or diffraction) in this context refers generally to the spatial deflection of an electromagnetic wave at structural obstacles. Such obstacles may be an edge, a hole or a one-dimensional, a two-dimensional or even a three-dimensional grid.
- the diffractive optical element can be, for example and preferably, a diffractive optical element (DOE), which advantageously permits a dynamic adaptation or adaptation of the space angle-dependent characteristic of the illumination light and / or the space angle-dependent characteristic of the measurement light.
- DOE diffractive optical element
- According to a further embodiment, the free-form optic has at least one spatially and/or structurally changeable optical element, a change of the optical element resulting in a change of the solid-angle dependency of the illumination light intensity and/or of the measuring light intensity.
- This allows the solid-angle-dependent intensity distribution of illumination light and/or measuring light to be adapted to a change in the characteristic of the scene to be detected, so that all (relevant) objects in the scene can be detected with high accuracy.
- The described change of the free-form optic can thus be effected by the data processing device during the operation of the sensor system, depending on the scene.
- A suitable configuration of the free-form optic can also be found according to the "trial-and-error" principle or by other statistical optimization procedures.
- A spatial change of the optical element can, for example, be a simple displacement and/or rotation of this optical element in a coordinate system of the sensor system and/or relative to other optical elements of the free-form optic. This can preferably be done in an automatic manner by means of an actuator, the actuator being controlled by the data processing device or by any other free-form control device internal or external to the sensor device.
- The changeable optical element may also be an elastic element, for example a deformable mirror.
- The elastic element may be a pressure-deformable element of an elastic optical material which changes its optical properties (reflection, diffraction and/or refraction behavior) under pressure.
- Likewise, a structural change of a DOE can lead to a desired modification of the solid-angle dependence of the illumination light and/or of the measuring light.
- As known from digital light processing (DLP) projection technology, microelectromechanical systems (MEMS) can also be used for the targeted movement of a plurality of optical elements, in particular micromirrors, so that a desired solid-angle-dependent intensity distribution of the illumination light and/or of the measuring light results.
- According to a further embodiment, the free-form optic and/or the illumination device is or are configured such that the illumination light and/or the measuring light have a solid-angle-dependent intensity distribution which at least approximately compensates for an edge light falloff, in particular the natural edge light falloff (natural vignetting), according to which, when a uniformly bright subject is imaged by a lens, the brightness in the image decreases by the factor cos⁴(θ) towards the edge compared to the brightness in the center of the image, θ being the angle off the optical axis of the objective of the measuring device.
- The compensation of the natural edge light falloff by the free-form optic described here and/or by an illumination that is deliberately non-uniform across the scene contributes particularly to an improvement of the light intensity ratios at the light receiver.
- The compensation of the natural edge light falloff by a suitable solid-angle-dependent intensity modification of illumination light and/or measuring light can be at least 30%, preferably 50%, more preferably 80% and even more preferably 90% or even 95%. These percentages refer (for a fixed focal length of the objective used) to the ratio between the intensity at the edge of the image of the scene (for the measuring light) imaged on the measuring device and the intensity at the center of the image.
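The cos⁴ falloff and a partial compensation of it (for example the 30% to 95% levels mentioned above) can be modeled as below. The blending formula for partial compensation is our illustrative assumption, not a formula from the patent:

```python
import math


def vignetting_factor(theta_rad: float) -> float:
    """Natural edge light falloff: relative brightness cos^4(theta)
    of an off-axis image point compared to the image center."""
    return math.cos(theta_rad) ** 4


def compensation_gain(theta_rad: float, fraction: float = 1.0) -> float:
    """Illumination gain that compensates the given fraction of the
    falloff (fraction=0.9 corresponds to 90% compensation).

    Blends between no compensation (gain 1) and full compensation
    (gain 1/cos^4)."""
    f = vignetting_factor(theta_rad)
    return 1.0 + fraction * (1.0 / f - 1.0)
```

With full compensation the gain exactly cancels the falloff, so the edge of the image receives the same effective brightness as the center.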
- According to a further embodiment, the sensor system further comprises an illumination light controller which is configured such that the characteristic of the illumination light, which describes the dependence of the illumination intensity on the solid angle, is dynamically changeable during an operation of the sensor system.
- Thereby, one and the same scene can be recorded multiple times (successively) under different lighting conditions.
- Different data sets of one and the same scene are then available to the data processing device, so that by means of a suitable method of image analysis (by the data processing device) that data set can be used for determining the three-dimensional characteristic of the scene which reproduces the scene most accurately. If necessary, an "a priori knowledge" of the optical and/or geometric properties of the scene can be taken into account.
- An optimal lighting characteristic can also be determined according to the "trial-and-error" principle or by other statistical optimization procedures. This can be done dynamically during a real operation of the sensor system, or as part of a calibration by means of a recognition of suitable reference objects (as in the spatial and/or structural change of an optical element of the free-form optic described above).
- Alternatively, the 3D images of the scene recorded with different lighting characteristics can be processed together, so that a comprehensive data set is available for a final determination of the 3D characteristic of the scene.
- Different partial areas of the scene can then be characterized in that, for a first partial area, a first partial data set recorded with a first illumination characteristic is used and, for a second partial area, a second partial data set recorded with a second illumination characteristic is used for the determination of the 3D characteristic.
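A per-pixel merge of two captures taken with different illumination characteristics might look like the following sketch. The amplitude-validity criterion is an assumed, simplified selection rule (well-exposed pixels carry reliable depth), and all names are illustrative:

```python
def merge_scene_data(depth_a, amp_a, depth_b, amp_b, amp_min, amp_max):
    """Per-pixel merge of two 3D captures of the same scene.

    For each pixel, keep the depth value whose signal amplitude
    lies inside the usable range [amp_min, amp_max], preferring
    the better-exposed (larger valid amplitude) sample. Pixels
    with no valid sample are marked None.
    """
    merged = []
    for da, aa, db, ab in zip(depth_a, amp_a, depth_b, amp_b):
        ok_a = amp_min <= aa <= amp_max
        ok_b = amp_min <= ab <= amp_max
        if ok_a and (not ok_b or aa >= ab):
            merged.append(da)
        elif ok_b:
            merged.append(db)
        else:
            merged.append(None)  # under- or overexposed in both captures
    return merged
```

This realizes the idea of taking the first partial area from the first data set and the second partial area from the second data set, pixel by pixel.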
- According to a further embodiment, the data processing device is coupled to the illumination light controller and configured to evaluate the determined three-dimensional characteristic of the scene and, based on a result of this evaluation, to change the characteristic of the illumination light.
- Descriptively stated, the solid-angle-dependent illumination of a subsequent scene detection by the illumination device depends on measurement and evaluation results which have been determined from a previous scene detection. The characteristic of the lighting is thus dynamically adjusted on the basis of the results of a preceding scene evaluation, in the manner of a control loop.
- An appropriate control of the illumination device may also depend on current environmental conditions which influence the result of the scene evaluation.
- Such environmental conditions may be weather conditions such as the presence of rain, snow, hail, fog, smoke, suspended particles, etc. in the scene.
- the result of the evaluation depends on the optical scattering behavior of at least one object contained in the scene. This has the advantage that in addition to the distance-based
- Light receiver of the measuring device impinges.
- a brightness that is as uniform as possible over the light-sensitive surface of the light receiver favors a precise distance measurement by the described TOF sensor system.
- the scattering or reflection behavior depends on the wavelength or the frequency of the illumination light. Taking such a frequency dependence into account can advantageously contribute to a further improvement of the scene lighting and the subsequent scene evaluation.
- a laser beam scanning the scene can be directed in a known manner, via two rotatable mirrors with mutually non-parallel and preferably mutually perpendicular axes of rotation, to each illuminated point of the scene.
- for a (dynamically adaptive) deflection, non-mechanical optical elements such as diffractive optical elements (DOEs) can also be used.
- the deflection can be controlled in particular by the illumination light control device described above.
- a dynamic adaptation of the freeform optics by means of the above-described spatial and / or structural change of one of its optical elements can provide a suitable beam deflection of the laser beam.
- the at least approximately punctiform illumination light source may be a (sufficiently strong) semiconductor diode, for example a laser or light emitting diode.
- Beam shaping systems and in particular the described freeform optics are used.
- suitable optical elements for beam deflection, beam splitting and / or beam merging can also be used.
- the plurality of illumination light sources which are also in particular laser or light-emitting diodes, can be controlled (in particular individually) by the illumination light control device described above. This advantageously allows an adaptively controlled or even regulated adjustment of the characteristic of the illumination light.
- a flat light source can also be the source of a spatially dependent, non-homogeneous intensity distribution. If it is a spatially homogeneously illuminated surface, suitable optical elements for beam shaping, in particular (also) the described freeform optics, are used to realize a solid-angle-dependent uneven illumination of the scene.
- the free-form optical system and / or the illumination device is or are configured such that
- the illumination light has a rectangular beam cross section.
- the beam cross-section is adapted to the shape of the scene to be detected in order to achieve an illumination that is as homogeneous as possible.
- a suitable shaping of the beam cross section can be realized not only by a corresponding shaping of the luminous area of the illumination device and / or configuration of the freeform optics, but also by optical components such as mirrors and refractive optical elements (e.g. a lens system) which can be adapted in a suitable manner. Diffractive optical elements (DOE) can also be used, which optionally even one
- micromirror arrays known from the so-called Digital Light Processing (DLP) projection technology can also be used.
- DLP Digital Light Processing
- MEMS microelectromechanical systems
- the measuring device has a light receiver with a plurality of pixels for receiving the measuring light, wherein first pixels in a first portion of the light receiver have a first light sensitivity and second pixels in a second portion of the light receiver have a second light sensitivity.
- the second light sensitivity is different from the first one
- the unequal pixel sensitivity described may also help to ensure that in an image of the scene the difference between a brighter region and a darker region is not so great as to jeopardize reliable scene capture and / or scene evaluation by the computing device.
- light receivers are used in which the spatial distribution between first pixels and second pixels can be varied dynamically or adaptively. Such a variation may also be controlled by the data processing device and may depend on a result of a previous scene evaluation.
- the term light sensitivity relates to the ability of the pixel in question to accumulate incident photons within a short exposure time.
- a different pixel sensitivity may be achieved, e.g., by reducing the noise of individual pixels or zones of pixels. Since the noise is often correlated with the temperature of the sensor, a higher sensitivity can be achieved for a portion of the light receiver, e.g., by means of a heat pump (e.g., a Peltier element). The more locally this temperature change can be generated on the light receiver, the higher the energy efficiency of the sensor system can be.
- the measuring device is further coupled to a light receiver control device coupled to the light receiver, wherein the light receiver control device and the light receiver are configured such that, in a modified operation of the sensor system, at least two pixels of the plurality of pixels are combined to form a parent pixel.
- such an aggregation of pixels, also referred to as "binning", has the effect of increasing the number of photons of the measurement light collected during a scene acquisition per pixel, in proportion to the number of pixels summed up to a parent pixel, at the expense of spatial resolution. Due to the resulting increased photon accumulation per (superordinate) pixel, the light sensitivity per pixel is significantly increased. This reduces
- a "binning" is therefore particularly advantageous for a weak measuring light when a high spatial resolution is not required.
- the plurality of pixels are grouped together so that a certain number of pixels are combined into a higher-level pixel.
- the certain number may be, for example, two, three, four, six, eight, or nine, preferably two, four, eight, or nine. Of course, an even stronger aggregation of pixels is possible.
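The binning described above can be sketched as follows. This is an illustrative Python sketch and not part of the patent disclosure; the function name and the 2x2 block size are chosen for illustration only:

```python
import numpy as np

def bin_pixels(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Combine factor x factor blocks of pixels into one higher-level
    pixel by summing their photon counts ("binning"). The spatial
    resolution drops by `factor` per axis, while each higher-level
    pixel accumulates factor**2 times as many photons."""
    h, w = frame.shape
    # Crop so the frame is evenly divisible into blocks.
    h, w = h - h % factor, w - w % factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.sum(axis=(1, 3))

# A 4x4 frame of photon counts becomes a 2x2 frame; each
# higher-level pixel holds the sum of a 2x2 block.
frame = np.arange(16).reshape(4, 4)
binned = bin_pixels(frame, 2)
print(binned.shape)  # (2, 2)
print(binned[0, 0])  # 0 + 1 + 4 + 5 = 10
```

A local "binning" as described above would apply such a summation only to selected regions of the frame.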
- "binning" can also be carried out locally in only at least a partial area of the active surface of the light receiver. Although this leads to an inhomogeneous spatial resolution, which is not necessarily desired, the disadvantage of such an inhomogeneous spatial resolution is in many cases outweighed by the increased photon accumulation (per pixel).
- a local "binning" can be done at least in some known light receivers without special electronic or apparative elements simply by a corresponding control of the light receiver, which Control determines the "binning" and thus the operating mode of the sensor system.
- a local "binning" is performed in that, measured by the measuring device and / or by the
- Data processing device learns exactly those areas of the
- Light receiver which have received in at least one previous scene detection too little light energy, by a suitable control of the light receiver by the light receiver control device in subsequent scene captures in a suitable manner to higher-level pixels
- such a dynamically controlled or regulated "binning" can be learned during a normal operation of the sensor system and / or during the configuration of the sensor system, for example in
- the spatial resolution of the light receiver is then different along different directions, even if the individual pixels have a square shape. This can be exploited in some cases in an advantageous manner.
- Such an application is present, for example, when movement of an object of the scene along a previously known spatial direction is to be detected with increased accuracy.
- the number of (higher-level) pixels arranged along a line parallel to this known spatial direction may be larger than the number of pixels arranged along a line perpendicular thereto. Then the spatial resolution along the direction of movement is higher, and the motion profile of such a linearly moving object can be determined with particularly high accuracy even with a comparatively weak measuring light.
- the measuring device has (a) one or the light receiver for receiving the measuring light and (b) a measuring unit connected downstream of the light receiver, which is configured to measure the light transit time based on (i) a measurement of the time span between the emission of a pulse of the
- the sensor system is configured in such a way that it is possible to switch flexibly, as required, between the two different measurement principles "pulse mode" and "phase measurement".
- the light receiver has a light-sensitive surface which, as described above, is subdivided into a multiplicity of pixels. With or on each pixel those photons of the measuring light are accumulated, which from a certain
- the measuring unit is used to determine the runtime of the associated light beams of the illumination light and the measuring light for each pixel.
- the sensor system further comprises a holder which is mechanically coupled at least to the measuring device, wherein the holder is designed such that the sensor system can be attached to a holding structure that is stationary in relation to the scene to be detected.
- the holder ensures that the described sensor system can be a stationary system which has a certain spatially fixed detection range and therefore always monitors the same scene, which can of course have a different scene characteristic at different times.
- spatially stationary objects that are present in the scene can be detected in an image analysis and hidden in a further image analysis with respect to movement profiles. As a result, computing power can be saved and the energy efficiency of the described sensor system can be improved.
- the stationary support structure may be directly or indirectly mechanically coupled to a device for controlling a coverage characteristic of an opening to be passed through by the object by means of at least one closure body.
- this device has, in addition to a suitable guide or mounting for the closing body, in particular means for moving the closing body between a closed position and an open position (and vice versa).
- the opening can be an entrance
- the closure body may be a door, for example a front door or a garage door.
- the holding structure may be, for example, the stationary frame structure of an entrance, such as the frame of a door.
- the data processing device is further configured such that a covering characteristic of an opening to be passed through by an object can be controlled by means of at least one closing body. This allows the opening, which is, for example, an entrance (or an exit) of a building, to be monitored automatically in an energetically favorable manner, and by a suitable actuation of an actuator the closing body can be moved automatically between an open position and a closed position.
- a method comprises (a) illuminating the scene with an illumination light which is emitted by a lighting device along an illumination beam path; (b) receiving, by means of a measuring device, measuring light along a measuring light beam path, the measuring light being illumination light at least partially backscattered by at least one object in the scene; (c) measuring, by means of … wherein (i) in an arrangement in the illumination beam path an illumination light intensity of the illumination light depends on the solid angle of the illumination beam path and (ii) in an arrangement in the measuring light beam path a measuring light intensity of the measuring light depends on the solid angle of the measuring light beam path, so that a distance-based loss of intensity of the illumination light and the measuring light is at least partially compensated.
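The idea of a solid-angle-dependent intensity that compensates the distance-based loss can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the function name, the simple 1/r^2 loss model, and the normalization are assumptions made for illustration:

```python
def compensating_weights(distances_m):
    """For each solid-angle cell, weight the emitted illumination
    intensity with the square of the expected measuring distance, so
    that a 1/r^2 free-space loss of the backscattered measuring light
    is (at least partially) compensated and the light receiver sees a
    roughly homogeneous intensity. The weights are normalized so the
    total emitted power stays constant."""
    raw = [d * d for d in distances_m]
    total = sum(raw)
    return [r / total for r in raw]

# Cells looking at 1 m, 2 m and 4 m: the far cell gets 16 times
# the weight of the near cell.
w = compensating_weights([1.0, 2.0, 4.0])
print([round(x, 3) for x in w])
```

In practice the free-form optics would realize such a weighting optically rather than electronically, and the actual loss law depends on the scene and the receiving optics.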
- the described method is also based on the knowledge that a suitably designed or configured free-form optical system of the
- the free-form optics can provide a suitable space-angle-dependent intensity distribution of the illumination light so that the measurement light from each subarea of the scene strikes a light receiver of the measuring device with as homogeneous a distribution of intensity as possible.
- in the beam path of the measuring light, a corresponding solid-angle-dependent modification of the measuring light intensity can be ensured, so that (also) the measuring light hits the light receiver with a distribution of intensity that is as homogeneous as possible.
- the method further comprises (a) detecting an object in the scene; (b) comparing the detected object with at least one comparison object stored in a database; and (c) if the object agrees with a comparison object within predetermined permissible deviations, identifying the object as an object authorized for a particular action.
- the approved action may, for example, be an authorized passage through an opening in a building, which opening is closed by a closure body prior to identification as an approved object and is opened only after successful identification by a corresponding movement of the closure body.
- the objects to be identified may preferably be persons and / or vehicles.
- a successful identification may be used to control or activate a closure mechanism for a closure body, for example for opening a building.
- a covering characteristic of an opening to be passed through by an object by means of at least one closing body is a covering characteristic which is controlled, or at least co-controlled, by the described sensor system. Because with such a sensor system, larger distances can be monitored in an energetically efficient manner, which naturally leads to an earlier detection of an opening request for the closure body. This can be of great advantage, especially in the case of fast-moving objects. Furthermore, the scene can be detected in an energy-efficient manner with a wider detection angle, which can lead, for example, to an early detection of traffic moving transversely to the opening and thus to a more reliable detection of objects in the security area of the closure system. In the case of cross traffic, an unnecessary opening request can thus be suppressed.
- the opening is an entrance or an exit, in particular an emergency exit in a building.
- an object which is present, but possibly not moving, in the passage area of a monitored entrance or exit, in particular a blocked emergency exit, can be detected, and the corresponding information can be passed to an affiliated system, for example to a
- the object is a person or a vehicle.
- the building may in particular be a house or a garage.
- a sensor system described above is used for detecting and / or controlling traffic flows of objects moving through a scene of the sensor system, the scene being defined by a spatial detection range of the sensor system.
- This use is also based on the finding that an energy-efficient sensor system is important in traffic detection and / or traffic flow control, since this sensor system is typically constantly in operation and, moreover, a very high number of such sensor systems are typically used, in particular for larger traffic flows.
- the objects can be any objects relevant to the traffic stream in question.
- TOF-based sensor systems can generally be divided, both with regard to the illumination light and with regard to the measuring light, into two fundamentally different classes, which can be combined with each other.
- B1: The first alternative (B1) for the lighting is characterized by the fact that the scene is scanned sequentially by means of a single illumination light beam of high focusing and low divergence, i.e. high collimation. For each position of the illumination light beam in the scene, a measurement of the transit time of the illumination light and the measurement light is performed.
- the scanning can be realized using movable optical components, in particular mirrors. Alternatively or in combination, a solid-state body which manages without mechanically moving parts and has integrated photonic structures or circuits can be used for sequentially scanning the scene with the illumination light beam. With a suitable control of these structures, the illumination light beam is then directed to the desired location of the scene.
- such a solid-state body is known, for example, from US 2015/293224 A1.
- B2: The second alternative (B2) for the lighting is characterized by the fact that the entire scene is illuminated (all at once and areally). If necessary, the intensity of the illumination light in selected subregions of the scene can be (selectively) increased in order to enable improved 3D object detection at these locations. Such a spatially uneven distribution of the intensity of the illumination light can be achieved without moving optical
- DOE Diffractive optical element
- M1: A first alternative (M1) for the measurement is based on pulsed illumination light beams. The "travel time" of a light pulse is determined on the receiver side for each pixel within a time window, and the distance is derived from it.
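The distance derivation in pulse mode (M1) can be sketched as follows; this is an illustrative Python sketch, not part of the patent disclosure, and the function name is chosen for illustration:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_pulse(round_trip_s: float) -> float:
    """Pulse mode (M1): the light pulse travels to the object and
    back, so the distance is half the round-trip time times c."""
    return C * round_trip_s / 2.0

# A round trip of 20 ns corresponds to roughly 3 m.
d = distance_from_pulse(20e-9)
print(round(d, 3))  # 2.998
```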
- M2: The second alternative (M2) for the measurement is based on a temporal, preferably sinusoidal, modulation of the illumination light with a predetermined frequency, appropriate values for this frequency depending on the expected transit time or the maximum detection distance.
- the phase difference is measured for each pixel, and the distance information is derived therefrom.
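The phase-to-distance relation of M2 and the resulting maximum detection distance can be sketched as follows; an illustrative Python sketch, not part of the patent disclosure (function names chosen for illustration):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Phase measurement (M2): with sinusoidal modulation at frequency
    f, a measured phase shift dphi maps to the distance
    d = c * dphi / (4 * pi * f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum detection distance before the phase wraps around:
    d_max = c / (2 * f). This is why the appropriate modulation
    frequency depends on the expected transit time."""
    return C / (2.0 * mod_freq_hz)

# At 10 MHz the unambiguous range is about 15 m; a phase shift of
# pi corresponds to half that range.
f = 10e6
print(round(unambiguous_range(f), 2))             # 14.99
print(round(distance_from_phase(math.pi, f), 2))  # 7.49
```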
- both measuring principles M1 and M2 are based on an integration of the number of photons, or of the photoelectrons generated in the light receiver, which arrive at each pixel to be measured.
- an ever-present light or photon noise depends on the number of photons accumulated in a pixel. Therefore, the higher the number of accumulated photons, the more accurate the distance information obtained from the TOF measurement becomes.
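The dependence of the photon noise on the accumulated photon number can be sketched as follows; an illustrative Python sketch, not part of the patent disclosure, assuming Poisson (shot-noise) statistics:

```python
import math

def photon_shot_noise_snr(n_photons: float) -> float:
    """Photon (shot) noise follows Poisson statistics: for N
    accumulated photons the noise is sqrt(N), so the signal-to-noise
    ratio N / sqrt(N) grows as sqrt(N)."""
    return math.sqrt(n_photons)

# Quadrupling the accumulated photons (e.g. by 2x2 binning)
# doubles the SNR and thus the precision of the distance value.
print(photon_shot_noise_snr(100))  # 10.0
print(photon_shot_noise_snr(400))  # 20.0
```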
- Figure 1 shows the use of a sensor system for controlling a
- FIGS. 2a and 2b illustrate a deformation of a free-form optical system designed as an elastic optical element.
- FIG. 3 shows the use of a sensor system for detecting a
- Figures 4a and 4b illustrate a collection of single pixels of a light receiver.
- FIGS. 5a to 5c show different beam cross sections of a
- Light receiver to optimize the incident optical energy of the measuring light by illuminating the scene differently (intensively) depending on the characteristics of the scene.
- the energy used for the lighting can thereby be optimally utilized.
- a TOF sensor system is typically mounted above the average height of the objects to be observed (persons, products, vehicles, etc.) in order to prevent a plurality of objects from undesirably
- in the case of a light source which consists of several individual elements supplying illumination intensity, which together represent the entire illumination device (for example an array of laser or light-emitting diodes), a solid-angle-dependent intensity distribution of the illumination light can be supported by a variation of the brightness of individual elements relative to other elements.
- this variation can be realized both constructively in the design (e.g., laser or light-emitting diodes with different intensity) and via the type of control (via a variable current per laser or light-emitting diode by means of suitable electronics, for example set by a measurement at the (first) installation).
- a dynamic adjustment of the individual laser or light-emitting diodes during operation is also possible. In this case, simply those laser or light-emitting diodes which are assigned to areas of the scene that provide little measuring light are supplied with correspondingly more current. This is particularly suitable for the above-mentioned lighting principle B2 in combination with the above-mentioned measuring principle M1 or M2.
- depending on the scene geometry (and, with optional dynamic "result control", also depending on the amount of reflected or scattered light), the light intensity emitted by the illumination device is controlled for each solid angle.
- a static scene can be measured once, with the appropriate intensity of the illumination light being taught in for each solid angle.
- the adjustment of the intensity of the illumination light by the illumination device can be adaptive both in real time and from "frame to frame". In this case, for the parts of the scene from which too little measuring light is received, the intensity of the corresponding part of the illumination light is immediately increased. This means that, depending on the result values of a last scene capture, for the next scene capture those areas with "overexposure"
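Such a frame-to-frame adaptation can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the per-cell weights, the photon-count thresholds, and the adjustment step are assumptions chosen for illustration:

```python
def adapt_illumination(weights, returned_photons,
                       low=200, high=2000, step=1.25):
    """Frame-to-frame adaptation: per solid-angle cell, raise the
    illumination weight where too little measuring light came back
    and lower it where the corresponding pixel region was
    "overexposed"."""
    adapted = []
    for w, n in zip(weights, returned_photons):
        if n < low:       # too little measuring light: illuminate more
            adapted.append(w * step)
        elif n > high:    # overexposure: save illumination energy
            adapted.append(w / step)
        else:             # within the usable range: leave unchanged
            adapted.append(w)
    return adapted

# Three cells: underexposed, well exposed, overexposed.
w = adapt_illumination([1.0, 1.0, 1.0], [50, 900, 5000])
print(w)  # [1.25, 1.0, 0.8]
```

In a real system this loop would run per scene capture, closing the control loop between the light receiver and the illumination light control device.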
- An embodiment of the invention achieves energy saving by means of a dynamic lighting energy optimization, wherein illumination energies of different wavelengths or frequencies are used. Depending on the color of the object, for example, the wavelengths that contribute to the most intense reflections or scattering can be present in the wavelength spectrum with a higher intensity than other wavelengths or wavelength regions with less reflection or scattering.
- a red object can be illuminated primarily with a red light component, and the green and blue light components can be reduced (for the relevant solid angle), preferably to an intensity of at least approximately zero.
- the same principle can also be applied in the relationship between visible light and infrared (IR) light.
- a variable-frequency or variable-wavelength illumination with regard to reflection and scattering properties, with associated distance and solid angle, can be of great advantage in a subsequent scene analysis of moving objects, because it allows objects to be detected and tracked more easily, since the
- the sensor system described in this document may be used, for example, in passageways, especially in passageways having a shutter characteristic that is automatically controlled (e.g., by means of doors, gates, barriers, traffic lights, etc.). Since the sensors for a passage control are usually supplied with energy by the existing closure systems, as much sensory effect as possible should be achieved with a given amount of energy.
- the sensor system described in this document allows, in comparison with known sensor systems, (i) … (ii) … cross-traffic and / or (iii) a more reliable detection of objects in a security area of the closure system.
- for the lighting principle B2, an (additional) spatial variation of the illumination light can be achieved by DOEs, which are also used as part of the pattern projections required for 3D sensors based on the known principle of structured illumination or the so-called stripe projection.
- the scene to be detected is first illuminated conventionally, in particular according to the illumination principle B2.
- the sensor system can be operated at the lower limit of measurability as long as the scene is (still) static. However, if a change in the scene is (roughly) detected or at least suspected, the system can react immediately with an increase in the illumination intensity, so that the scene or the scene changes can then be detected and evaluated with high accuracy.
- This mechanism can be used both for IR sensor systems and for sensor systems with
- FIG. 1 shows the use of a sensor system 100 to control a coverage characteristic of an opening 184 depending on the characteristics of a scene 190 monitored by the sensor system 100.
- the opening 184 is an entrance for persons into a building or garage entrance for motor vehicles.
- the corresponding input structure is provided with the reference numeral 180.
- An object 195 located in the scene 190 is intended to be such a person or such a motor vehicle.
- the input structure 180 comprises a stationary support structure 182, which constitutes a frame and a guide for two closing bodies 186 designed as sliding doors.
- the sliding doors 186 can each be moved by means of a motor 187 along the directions indicated by the two thick double arrows.
- the actuation of the motors 187 takes place, as explained below, by means of the sensor system 100 described in this document.
- the sensor system 100 comprises (a) a TOF detection system 110, (b) a data processing device 150 and (c) a database 160.
- the TOF detection system 110 in turn has (a1) a lighting device 130 for emitting illumination light 131 and (a2) a measuring device 115, which is responsible for the detection and measurement of measuring light 196.
- the measurement light 196 is illumination light 131 at least partially backscattered by the object 195.
- the measuring device 115 comprises (a2-i) a light receiver 120 and (a2-ii) a measuring unit 125, which is connected downstream of the light receiver 120 and which is adapted to measure a light transit time between the illumination light 131 emitted by the lighting device 130 and the measuring light 196 received by the light receiver 120. Further, the measuring device 115 is associated with a light receiver controller 122, which controls the operation of the light receiver 120 with respect to different operating modes.
- the TOF detection system 110 has (a3) an illumination light control device 135 for controlling the operation of the illumination device 130 and (a4) a free-form optical control device 145 for actuating actuators 141, 143, by means of which the optical imaging properties of free-form optics 140 and 142 can be selectively adjusted.
- the sensor system 100 has two free-form optics 140 which are each assigned to one of two light sources or illumination units of the illumination device 130 and are located in the corresponding beam path of the respective illumination light 131.
- the two free-form optics 140 are not static but can be deformed by means of a respective schematically illustrated actuator 141. This changes the imaging properties of the free-form optics 140, and there results a change of the solid-angle-dependent intensity distribution of the illumination light as emitted by the respective lighting unit of the lighting device 130. By this deformation of the elastic free-form optics 140, which is initiated by the free-form optical control device 145 and performed by the respective actuators 141, the spatial characteristics of the illumination light 131 as it strikes the scene 190 can be adjusted in view of a desired solid-angle-dependent distribution of the intensity of the illumination light 131 in the scene 190.
- the desired solid-angle-dependent distribution of the illumination light intensity is typically that intensity distribution in which the detection of the scene by the light receiver 120 and the signal processing by the downstream measuring unit 125 and the data processing device 150 can be as reliable and / or accurate as possible.
- the optical properties of the free-form optics 142 with respect to a modification of the intensity distribution of the measuring light 196 are controlled by the free-form optical control device 145 via an actuator 143.
- the collection of the intensity of the measurement light 196 is modified depending on the solid angle. This preferably takes place in such a way that the measurement light 196 entering the free-form optics 142 (in FIG. 1 from below), which is inhomogeneously distributed with respect to its intensity, is "homogenized", so that the measurement light hits the light receiver 120 with a distribution of intensity that is as homogeneous as possible.
- the solid angle dependence of the illumination light and the measurement light described here can also be realized by only one type of free-form optics. For example, the whole
- the free-form optics 140 and 142 and the actuators 141 and 143 are mounted on a housing of the TOF detection system 110 via support structures which are not shown in Figure 1 for reasons of clarity.
- by means of a holder 111, at least the TOF detection system 110 is attached to the stationary support structure 182 in a mechanically stable and spatially fixed manner.
- the entire sensor system 100 (in contrast to the representation of FIG. 1) is constructed as a module which, in addition to the TOF detection system 110, also has the data processing device 150 and the database 160 within a compact design.
- the light receiver controller 122 may cause the light receiver 120 to operate in a particular one of at least two different modes of operation.
- individual pixels of the light receiver 120 may have a different sensitivity or a different efficiency with respect to an accumulation of photons. The "binning" of the light receiver 120 described below with reference to Figures 4a and 4b may likewise be initiated by the light receiver controller 122.
- An external control signal 152a transferred to the data processing device 150 via an interface 152 can be used to make the operation of the data processing device 150 at least partially dependent on external information.
- an "a priori knowledge" about an object 195 for an improved evaluation of the detected scene 190 and in particular for an improved object recognition are transmitted.
- the lighting device 130, which may be, for example, an array of individually controllable laser or light-emitting diodes, illuminates the scene 190, and thus also the object 195 located in the scene 190, with a pulsed and thus temporally modulated illumination light 131.
- the illumination light control device 135 is configured to control the lighting device 130 such that a characteristic of the illumination light 131 which describes the dependence of the illumination intensity of the illumination light 131 on the solid angle (at which the illumination light 131 strikes the free-form optics 140) can be changed dynamically during an operation of the sensor system 100.
- the solid-angle-dependent intensity distribution of the illumination light 131 is not shown in FIG. 1 for reasons of clarity.
- the intensity of the backscattered measuring light 196, as it strikes the light receiver, is homogenized in that solid angles of the scene 190 which are associated with a larger measuring distance are illuminated more strongly than other solid angles which are associated with a smaller measuring distance.
- apart from this solid-angle dependence, the illumination light controller 135 may also modify other (non-spatial) properties of the characteristic of the illumination light 131, such as its (a) wavelength, (b) spectral intensity distribution, (c) polarization direction, and (d) intensity distribution for different polarization directions. These can be selected such that they contribute to an object recognition that is as reliable and accurate as possible. Again, an "a priori knowledge" about optical properties of the object 195 can be taken into account.
- the illumination device 130 can, in addition to the illumination units shown in Figure 1, also have further lighting units that illuminate the scene 190 from a different angle. Likewise, the two lighting units can also be arranged outside the housing of the TOF detection system 110 and thus be spaced further from the light receiver 120. This does not change the principles of the TOF measurement.
- the acquired optical scene 190 is processed by the data processing device 150 using suitable methods of image evaluation.
- the sensor system 100 is able to perform an object recognition. For this purpose, the data processing device 150 accesses a data set of reference objects stored in the database 160, corresponding to selected objects that are authorized to pass through the opening 184. This means that, upon a suitable approach of the object 195 to the entrance 184, the sliding doors 186 are opened only when the detected object 195 at least approximately coincides with one of the stored reference objects. This clearly shows that, in the use described here of the
- FIGS. 2 a and 2 b illustrate a deformation of a free-form optical system designed as an elastic, optically refractive element.
- FIG. 2a shows the optical element 240a in a first operating state, which is characterized by a first spatial structure. In the embodiment shown here, the first operating state is characterized by the fact that, apart from any internal stresses that may be present, no force or pressure is exerted on the freeform optics 240a from outside (by an actuator).
- the corresponding (undistorted) coordinate system is shown inside the optical element 240a in Figure 2a.
- FIG. 2b shows the optical element, which is now designated by the reference numeral 240b, in a second (strained) operating state, which is characterized by a second spatial structure that is different from the first spatial structure. It is obvious that this also changes the light shaping properties, in particular with regard to a modification of the intensity distribution of illuminating light or measuring light passing through.
- the second operating state is set by exerting a force or a pressure on the optical element 240b from the outside (from an actuator, not shown).
- the corresponding (warped) coordinate system of the optical element is illustrated in FIG. 2b.
- the dimensions or lengths of the freeform optical system 240b that are changed in relation to the first state along the coordinate axes are illustrated by coordinate axes of different lengths.
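The effect sketched in FIGS. 2a and 2b, that stretching or compressing the elastic optical element modifies the intensity distribution of the transmitted light, can be illustrated with a simple energy-conservation argument (an illustrative model, not the patent's optics design): if the illuminated footprint is stretched by factors sx and sy along the two coordinate axes, the total optical power is spread over a larger area and the local intensity drops by 1/(sx*sy).

```python
def stretched_intensity(intensity: float, sx: float, sy: float) -> float:
    """Local intensity after the illuminated footprint is stretched by
    factors sx and sy; total optical power is conserved."""
    return intensity / (sx * sy)

# Stretching the footprint to twice its width halves the local intensity.
assert stretched_intensity(100.0, 2.0, 1.0) == 50.0
# Compressing along both axes concentrates the light.
assert stretched_intensity(100.0, 0.5, 0.5) == 400.0
```

In this picture, the coordinate axes of different lengths in FIG. 2b directly encode where the deformed element dilutes or concentrates the illumination.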
- FIG. 3 shows a further use or application of the invention.
- the TOF detection system 110 detects a traffic flow of (various) objects 395a, 395b, 395c, 395d, and 395e that are on a conveyor belt 398 and move through a scene 390 along the direction of movement represented by an arrow.
- Reliable knowledge of the number and/or type of the objects 395a to 395e may be used in the field of logistics for traffic flow control. One example of such traffic flow control is the control of luggage transport in an airport.
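Counting objects that move through the scene on the conveyor belt can be sketched as a line-crossing counter over successive captured frames. The tracked positions, object identifiers, and counting line below are illustrative assumptions, not the patent's method:

```python
def count_crossings(positions_per_frame, line_x: float) -> int:
    """Count objects whose tracked x-position crosses line_x between
    consecutive frames. Each frame maps object id -> x position."""
    crossings = 0
    previous = {}
    for frame in positions_per_frame:
        for obj_id, x in frame.items():
            # An object is counted once, when it first passes the line.
            if obj_id in previous and previous[obj_id] < line_x <= x:
                crossings += 1
        previous = frame
    return crossings

# Two tracked objects moving along the conveyor direction.
frames = [{"a": 0.0, "b": 1.0}, {"a": 2.0, "b": 1.5}, {"a": 3.0, "b": 2.5}]
assert count_crossings(frames, line_x=2.0) == 2
```

Keeping per-object identities between frames avoids double-counting an object that lingers near the counting line.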
- by means of labels on the respective objects 395a to 395e, the type of the respective object can also be determined. It should be noted, however, that use in an airport is merely one example of a variety of other uses in the field of logistics.
- FIGS. 4a and 4b illustrate a combining ("binning") of individual pixels of a light receiver 420a or 420b formed as a semiconductor or CCD chip.
- the light receiver 420a has a plurality of light-sensitive or photon-collecting pixels 422a. In the embodiment shown here, the pixels 422a are assigned to the full spatial resolution of the light receiver 420a, which is predetermined by the semiconductor architecture of the chip 420a.
- in the case of the light receiver 420b, four of the light-sensitive pixels (of the full resolution) are in each case combined into one higher-level pixel 422b (with reduced spatial resolution but increased photon accumulation). A pixel 422b therefore collects four times the amount of light compared to a single pixel 422a.
- Such "binning" of pixels reduces the (minimum) intensity of the detected measurement light that is needed to evaluate the corresponding image area of the scene. Since the intensity of the measurement light depends directly on the intensity of the illumination light, binning allows the intensity of the illumination light, and thus the energy consumption of the sensor system, to be reduced.
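The 2×2 binning described above, where four full-resolution pixels are summed into one photon-collecting pixel, can be sketched with NumPy (the array shapes are illustrative):

```python
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Sum non-overlapping 2x2 blocks: halves the resolution along each
    axis while quadrupling the light collected per combined pixel."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

frame = np.ones((4, 4))          # uniform single-pixel signal
binned = bin_2x2(frame)
assert binned.shape == (2, 2)    # half the spatial resolution per axis
assert np.all(binned == 4.0)     # four times the collected light
```

The fourfold signal per combined pixel is what permits a correspondingly weaker illumination for the same detection quality.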
- the described "binning" can also be realized dynamically by a corresponding activation of one and the same light receiver 420a or 420b.
- the light receiver is operated either in a first operating mode (with full resolution) or in a second operating mode (with photon-collecting combined pixels). Switching between the operating modes may be controlled by external control signals. Alternatively or in combination, such switching may also depend on the result of a scene evaluation, so that the "binning" operating mode is adjusted for the next scene capture.
- different operating modes, each with a different degree of pixel combination, can be used. Furthermore, it is possible to combine a different number of individual pixels into a higher-order pixel in different partial areas of the light receiver. Then individual portions of the scene can be captured with a higher spatial resolution (and less photon accumulation) and other portions of the scene with a lower spatial resolution (and higher photon accumulation).
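The region-dependent variant, different binning factors in different partial areas of the light receiver, can be sketched by evaluating one region of interest at full resolution while binning the remainder (the left/right split below is an illustrative assumption):

```python
import numpy as np

def mixed_resolution(frame: np.ndarray, roi_cols: int):
    """Return the left roi_cols columns at full resolution and the
    remaining columns binned 2x2 (higher photon accumulation)."""
    roi = frame[:, :roi_cols]            # full spatial resolution
    rest = frame[:, roi_cols:]
    h, w = rest.shape
    binned = rest.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
    return roi, binned

frame = np.ones((4, 8))
roi, binned = mixed_resolution(frame, roi_cols=4)
assert roi.shape == (4, 4)                           # fine detail region
assert binned.shape == (2, 2) and np.all(binned == 4.0)  # light-collecting region
```

This way, a region where fine detail matters keeps its resolution, while dimly lit or less relevant regions still yield an evaluable signal.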
- FIGS. 5a to 5c show different beam cross sections of an illumination light for adapting the illumination to the shape of the scene to be illuminated.
- a first illumination light 531a, illustrated in FIG. 5a, has a substantially circular beam cross section and is particularly suitable for "round scenes". However, for most applications, in which the scene to be detected (and evaluated) is not "round", a beam cross section deviating from a circular shape is more suitable.
- FIG. 5b shows an illumination light 531b with an elliptical beam cross section.
- FIG. 5c shows an illumination light 531c with a rectangular beam cross section.
- Diffractive optical elements can also be used, which optionally even allow a dynamic and / or scene-dependent shaping of the beam cross-section.
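The motivation for non-circular beam cross sections in FIGS. 5a to 5c is that illumination light falling outside the scene is wasted. A rough geometric comparison for a rectangular scene (purely illustrative, not a calculation from the patent):

```python
import math

def circular_beam_efficiency(scene_w: float, scene_h: float) -> float:
    """Fraction of a circular beam's power landing on a scene_w x scene_h
    rectangle, for the smallest circle that covers the rectangle."""
    radius = math.hypot(scene_w, scene_h) / 2.0  # half the diagonal
    return (scene_w * scene_h) / (math.pi * radius ** 2)

# An ideally matched rectangular beam wastes (in this model) no light.
rectangular_beam_efficiency = 1.0

# For a 4:3 scene, a covering circular beam wastes almost 40% of the light.
eff = circular_beam_efficiency(4.0, 3.0)
assert 0.60 < eff < 0.62
```

This is why a beam cross section matched to the scene shape, whether produced by freeform or diffractive optics, directly reduces the required illumination power.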
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017129641.7A DE102017129641A1 (de) | 2017-12-12 | 2017-12-12 | 3D Sensorsystem mit einer Freiformoptik |
PCT/EP2018/084421 WO2019115558A1 (de) | 2017-12-12 | 2018-12-11 | 3d sensorsystem mit einer freiformoptik |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3724674A1 true EP3724674A1 (de) | 2020-10-21 |
Family
ID=64899259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18826219.0A Pending EP3724674A1 (de) | 2017-12-12 | 2018-12-11 | 3d sensorsystem mit einer freiformoptik |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3724674A1 (de) |
DE (1) | DE102017129641A1 (de) |
WO (1) | WO2019115558A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022203647A1 (en) * | 2021-03-22 | 2022-09-29 | Hewlett-Packard Development Company, L.P. | Devices for controlling automatic doors |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE60117731T2 (de) * | 2000-12-26 | 2007-01-11 | Kabushiki Kaisha Topcon | optisches Entfernungsmesssystem |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008286565A (ja) * | 2007-05-16 | 2008-11-27 | Omron Corp | 物体検知装置 |
WO2011128408A1 (en) * | 2010-04-15 | 2011-10-20 | Iee International Electronics & Engineering S.A. | Configurable access control sensing device |
DE102010033818A1 (de) * | 2010-08-09 | 2012-02-09 | Dorma Gmbh + Co. Kg | Sensor |
EP2667218B1 (de) | 2010-11-15 | 2017-10-18 | Cedes AG | Energiespar-3-D-Sensor |
EP2469301A1 (de) * | 2010-12-23 | 2012-06-27 | André Borowski | Verfahren und Vorrichtungen zur Erzeugung einer Repräsentation einer 3D-Szene bei sehr hoher Geschwindigkeit |
US10132928B2 (en) | 2013-05-09 | 2018-11-20 | Quanergy Systems, Inc. | Solid state optical phased array lidar and method of using same |
US9635231B2 (en) * | 2014-12-22 | 2017-04-25 | Google Inc. | Time-of-flight camera system and method to improve measurement quality of weak field-of-view signal regions |
DE102015115101A1 (de) * | 2015-09-08 | 2017-03-09 | Huf Hülsbeck & Fürst Gmbh & Co. Kg | Sensorsystem einer Sensoreinrichtung eines Kraftfahrzeugs |
DE102015220798A1 (de) * | 2015-10-23 | 2017-04-27 | Designa Verkehrsleittechnik Gmbh | Zugangskontrollsystem für einen Lagerbereich sowie Verfahren zur Zugangskontrolle |
TWI570387B (zh) * | 2015-11-09 | 2017-02-11 | 財團法人工業技術研究院 | 影像測距系統、光源模組及影像感測模組 |
DE102015226771A1 (de) * | 2015-12-29 | 2017-06-29 | Robert Bosch Gmbh | Umlenkeinrichtung für einen Lidarsensor |
DE102016202181A1 (de) * | 2016-02-12 | 2017-08-17 | pmdtechnologies ag | Beleuchtung für eine 3D-Kamera |
US10754015B2 (en) * | 2016-02-18 | 2020-08-25 | Aeye, Inc. | Adaptive ladar receiver |
US11237251B2 (en) * | 2016-05-11 | 2022-02-01 | Texas Instruments Incorporated | Lidar scanning with expanded scan angle |
DE102016122712B3 (de) * | 2016-11-24 | 2017-11-23 | Sick Ag | Optoelektronischer Sensor und Verfahren zur Erfassung von Objektinformationen |
- 2017
  - 2017-12-12 DE DE102017129641.7A patent/DE102017129641A1/de not_active Withdrawn
- 2018
  - 2018-12-11 EP EP18826219.0A patent/EP3724674A1/de active Pending
  - 2018-12-11 WO PCT/EP2018/084421 patent/WO2019115558A1/de unknown
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE60117731T2 (de) * | 2000-12-26 | 2007-01-11 | Kabushiki Kaisha Topcon | optisches Entfernungsmesssystem |
Also Published As
Publication number | Publication date |
---|---|
DE102017129641A1 (de) | 2019-06-13 |
WO2019115558A1 (de) | 2019-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE112017001112T5 (de) | Verfahren und Vorrichtung für eine aktiv gepulste 4D-Kamera zur Bildaufnahme und -analyse | |
EP1515161B1 (de) | Optischer Entfernungsmesser und Verfahren zur Bestimmung der Entfernung zwischen einem Objekt und einem Referenzpunkt | |
EP3033251B1 (de) | Sensoranordnung zur erfassung von bediengesten an fahrzeugen | |
EP3014569B1 (de) | Inspektion der konturierten fläche des unterbodens eines kraftfahrzeugs | |
EP1159636A1 (de) | Ortsauflösendes abstandsmesssystem | |
EP1916545B1 (de) | Optoelektronischer Sensor und Verfahren zu dessen Betrieb | |
DE102013100521A1 (de) | Sensoranordnung zur Erfassung von Bediengesten an Fahrzeugen | |
EP2946226A1 (de) | Universelle sensoranordnung zur erfassung von bediengesten an fahrzeugen | |
WO2020229186A1 (de) | 3D SENSORSYSTEM, BETREIBBAR IN VERSCHIEDENEN BETRIEBSMODI IN ABHÄNGIGKEIT EINES BETRIEBSZUSTANDES EINES VERSCHLIEßKÖRPERS | |
WO2017041915A1 (de) | Sensorsystem einer sensoreinrichtung eines kraftfahrzeugs | |
DE102019003049A1 (de) | Vorrichtung und Verfahren zum Erfassen von Objekten | |
EP3724674A1 (de) | 3d sensorsystem mit einer freiformoptik | |
WO2019115561A1 (de) | Sensorsystem zum dreidimensionalen erfassen einer szene mit verschiedenen photonenakkumulationen | |
EP3724601B1 (de) | Abstandsermittlung basierend auf verschiedenen tiefenschärfebereichen bei unterschiedlichen fokuseinstellungen eines objektivs | |
DE102006010990B4 (de) | Sicherheitssystem | |
DE102017129654A1 (de) | 3D Szenenerfassung basierend auf Lichtlaufzeiten und einem charakteristischen Merkmal von erfassten elektromagnetischen Wellen | |
WO2019115559A1 (de) | 3d sensorsystem mit einer von einem raumwinkel abhängigen szenenbeleuchtung | |
WO2020229190A1 (de) | Identifizierung eines objekts basierend auf einer erkennung eines teils des objekts und auf beschreibungsdaten von einem referenzobjekt | |
EP1865755B1 (de) | Vorrichtung zur Beleuchtungssteuerung mit Anwesenheits-Erfassung | |
WO2020229189A1 (de) | Tof sensorsystem mit einer beleuchtungseinrichtung mit einem array von einzellichtquellen | |
DE102022001990B4 (de) | Verfahren zur Fusion von Sensordaten eines Lidars mit Sensordaten eines aktiven Terahertz-Sensors | |
DE202007008363U1 (de) | Vorrichtung zur Beleuchtungssteuerung | |
WO2023041706A1 (de) | Verfahren und vorrichtung zur messung von tiefeninformationen einer szene anhand von mittels zumindest einer parallelstrahlungsquelle generiertem strukturierten licht | |
DE102019217780A1 (de) | Verfahren zum Anpassen von Schwellwerten für eine Erkennung von Objekten in einem Projektionsvolumen einer scannenden Projektionsvorrichtung und scannende Projektionsvorrichtung | |
DE102011086026A1 (de) | System und Verfahren zur Objekterkennung |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200604 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: HUNZIKER, URS Inventor name: SEILER, CHRISTIAN Inventor name: WYSS, BEAT Inventor name: ECKSTEIN, JOHANNES |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210629 |