WO2007105205A2 - Three-dimensional sensing using speckle patterns - Google Patents
- Publication number
- WO2007105205A2 (PCT/IL2007/000306)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- speckle pattern
- light source
- image
- diffuser
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/18—Diffraction gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/557—Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
Definitions
- the present invention relates generally to methods and systems for mapping of three- dimensional (3D) objects, and specifically to 3D optical imaging using speckle patterns.
- When a coherent beam of light passes through a diffuser and is projected onto a surface, a primary speckle pattern can be observed on the surface.
- the primary speckle is caused by interference among different components of the diffused beam.
- the term "primary speckle” is used in this sense in the present patent application and in the claims, in distinction to secondary speckle, which is caused by diffuse reflection of coherent light from the rough surface of an object
- Hart describes the use of a speckle pattern in a high-speed 3D imaging system, in Taiwanese Patent TW 527528 B and in U.S. Patent Application 09/616,606, whose disclosures are incorporated herein by reference.
- the system includes a single-lens camera subsystem with an active imaging element and CCD element, and a correlation processing subsystem.
- the active imaging element can be a rotating aperture which allows adjustable non-equilateral spacing between defocused images to achieve greater depth of field and higher sub-pixel displacement accuracy.
- a speckle pattern is projected onto an object, and images of the resulting pattern are acquired from multiple angles. The images are locally cross-correlated using an image correlation technique, and the surface is resolved by using relative camera position information to calculate the three-dimensional coordinates of each locally-correlated region.
- Another speckle-based 3D imaging technique is described by Hunter et al., in U.S. Patent 6,101,269, whose disclosure is incorporated herein by reference.
- a random speckle pattern is projected upon a 3D surface and is imaged by a plurality of cameras to obtain a plurality of two-dimensional digital images.
- the two-dimensional images are processed to obtain a three-dimensional characterization of the surface.
- Embodiments of the present invention perform accurate, real-time mapping of 3D objects using primary speckle patterns.
- the methods and systems that are described in the above-mentioned PCT patent application, as well as further embodiments described hereinbelow, are capable of performing such 3D mapping using a single coherent light source and a single image sensor, which is held stationary at a fixed angle relative to the light source.
- a reference image of the speckle pattern is captured initially on a reference surface of known profile.
- the 3D profile of an object is then determined by capturing an image of the speckle pattern projected on the object, and comparing the image to the reference image.
- successive images of the speckle pattern on the object are captured as the object moves.
- Each image is compared with one or more of its predecessors in order to track the motion of the object in three dimensions.
- the light source and image sensor are held in a linear alignment that permits rapid, accurate motion tracking by computing one-dimensional correlation coefficients between successive images.
- novel illumination and image processing schemes are used to enhance the accuracy, depth of field, and computational speed of the 3D mapping system.
- apparatus for 3D mapping of an object including: an illumination assembly, including a coherent light source and a diffuser, which are arranged to project a primary speckle pattern on the object; a single image capture assembly, which is arranged to capture images of the primary speckle pattern on the object from a single, fixed location and angle relative to the illumination assembly; and a processor, which is coupled to process the images of the primary speckle pattern captured at the single, fixed angle so as to derive a 3D map of the object.
- the apparatus includes a mount, which is attached to the illumination assembly and the image capture assembly so as to hold the image capture assembly in a fixed spatial relation to the illumination assembly.
- the image capture assembly includes an array of detector elements arranged in a rectilinear pattern defining first and second, mutually-perpendicular axes, and objective optics, which have an entrance pupil and are arranged to focus the image onto the array, wherein the illumination assembly and image capture assembly are aligned by the mount so as to define a device axis that is parallel to the first axis and passes through the entrance pupil and through a spot at which a beam emitted by the coherent light source passes through the diffuser.
- the processor is arranged to derive the 3D map by finding an offset along only the first axis between the primary speckle pattern captured in one or more of the images and a reference image of the primary speckle pattern.
- the processor is arranged to derive the 3D map by finding respective offsets between the primary speckle pattern on multiple areas of the object captured in one or more of the images and a reference image of the primary speckle pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly.
- the image capture assembly is located at a predetermined spacing from the illumination assembly, and the respective offsets are proportional to the respective distances in a ratio that is determined by the spacing.
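As a rough, non-authoritative illustration of this offset-to-distance relationship, the sketch below assumes a common pinhole-camera approximation in which the X-offset of a speckle window relative to a reference image captured at a known distance varies with the inverse of the object distance. The function name and all numbers (baseline, focal length in pixels, reference distance) are illustrative assumptions, not values taken from the disclosure.

```python
def offset_to_depth(offset_px: float, baseline_m: float,
                    focal_px: float, z_ref_m: float) -> float:
    """Estimate object distance from the X-offset of a speckle window.

    Under a pinhole-camera approximation, a window at depth z appears shifted
    relative to the reference image (captured at z_ref_m) by roughly
        offset_px ~= focal_px * baseline_m * (1/z - 1/z_ref_m),
    so the depth is recovered by inverting that relation.
    """
    inv_z = offset_px / (focal_px * baseline_m) + 1.0 / z_ref_m
    return 1.0 / inv_z


# Example: a 2 px offset with a 5 cm baseline, 600 px focal length and a
# reference plane at 1 m indicates a point slightly closer than the reference.
print(offset_to_depth(2.0, 0.05, 600.0, 1.0))  # ~0.94 m
```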
- the primary speckle pattern projected by the illumination assembly includes speckles having a characteristic size, and the size of the speckles in the images varies across the image by a tolerance that depends on the spacing, wherein the spacing is selected so as to maintain the tolerance within a predefined bound.
- the processor is arranged to relate the respective offsets to respective coordinates in the 3D map using a parametric model of distortion in the image capture assembly. Further additionally or alternatively, the processor is arranged to find the respective offsets by finding an initial match between the primary speckle pattern in a first area of the object and a corresponding area of the reference image at a first offset relative to the first area, and to apply a region growing procedure, based on the first offset, to find the respective offsets of pixels adjacent to the first area.
- the processor is arranged to process a succession of the images captured while the object is moving so as to map a 3D movement of the object, wherein the object is a part of a human body, and wherein the 3D movement includes a gesture made by the part of the human body, and wherein the processor is coupled to provide an input to a computer application responsively to the gesture.
- the illumination assembly includes a beam former, which is arranged to reduce a variation of a contrast of the speckle pattern created by the diffuser over a sensing volume of the apparatus.
- the beam former includes a diffractive optical element (DOE) and a lens arranged to define a Fourier plane of the diffuser, wherein the DOE is located in the Fourier plane.
- the beam former may be arranged to reduce a divergence of light emitted from the diffuser or to equalize an intensity of light emitted from the diffuser across a plane transverse to an optical axis of the illumination assembly.
- the processor includes an optical correlator
- the optical correlator includes a diffractive optical element (DOE) containing a reference speckle pattern
- the image capture assembly includes a lenslet array, which is arranged to project multiple sub- images of the object onto the DOE so as to generate respective correlation peaks that are indicative of 3D coordinates of the object.
- the coherent light source has a coherence length that is less than 1 cm.
- the primary speckle pattern includes speckles having a characteristic size, and the illumination assembly is configured so as to permit the characteristic size of the speckles to be adjusted by varying a distance between the coherent light source and the diffuser.
- a method for 3D mapping of an object including: illuminating an object with a beam of diffused coherent light from a light source so as to project a primary speckle pattern on the object; capturing images of the primary speckle pattern on the object from a single, fixed location and angle relative to the light source; and processing the images of the primary speckle pattern captured at the single, fixed angle so as to derive a 3D map of the object.
- apparatus for 3D mapping of an object including: an illumination assembly, including a coherent light source, having a coherence length less than 1 cm, and a diffuser, which are arranged to project a primary speckle pattern on the object; an image capture assembly, which is arranged to capture images of the primary speckle pattern on the object; and a processor, which is coupled to process the images of the primary speckle pattern so as to derive a 3D map of the object.
- the coherence length of the coherent light source is less than 0.5 mm. Additionally or alternatively, the coherent light source has a divergence greater than 5°.
- Fig. 1 is a schematic, pictorial illustration of a system for 3D mapping, in accordance with an embodiment of the present invention
- Fig. 2 is a schematic top view of a speckle imaging device, in accordance with an embodiment of the present invention.
- Fig. 3 is a flow chart that schematically illustrates a method for 3D mapping, in accordance with an embodiment of the present invention
- Fig. 4 is a schematic side view of an illumination assembly used in a system for 3D mapping, in accordance with another embodiment of the present invention
- Fig. 5 is a schematic side view of a beam former, in accordance with an embodiment of the present invention
- Fig. 6 is a schematic side view of a beam former, in accordance with yet another embodiment of the present invention.
- Fig. 7 is a schematic side view of an optical correlator used in a system for 3D mapping, in accordance with a further embodiment of the present invention.
- Fig. 1 is a schematic, pictorial illustration of a system 20 for 3D mapping, in accordance with an embodiment of the present invention.
- System 20 comprises a speckle imaging device 22, which generates and projects a primary speckle pattern onto an object 28 and captures an image of the primary speckle pattern appearing on the object. Details of the design and operation of device 22 are shown in the figures that follow and are described hereinbelow with reference thereto.
- An image processor 24 processes image data generated by device 22 in order to derive a 3D map of object 28.
- Image processor 24, which performs such reconstruction, may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow.
- the software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media.
- processor 24 may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP).
- although processor 24 is shown in Fig. 1, by way of example, as a separate unit from imaging device 22, some or all of the processing functions of processor 24 may be performed by suitable dedicated circuitry within the housing of the imaging device or otherwise associated with the imaging device.
- the 3D map that is generated by processor 24 may be used for a wide range of different purposes.
- the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object.
- object 28 comprises all or a part (such as a hand) of the body of a subject.
- system 20 may be used to provide a gesture-based user interface, in which user movements detected by means of device 22 control an interactive computer application, such as a game, in place of tactile interface elements such as a mouse, joystick or other accessory.
- system 20 may be used to create 3D maps of objects of other types, for substantially any application in which 3D coordinate profiles are needed.
- An illumination assembly 30 comprises a coherent light source 32, typically a laser, and a diffuser 33.
- the term "light” in the context of the present patent application refers to any sort of optical radiation, including infrared and ultraviolet, as well as visible light.
- the beam of light emitted by source 32 passes through diffuser 33 at a spot 34 of radius w_0, and thus generates a diverging beam 36.
- the primary speckle patterns created by diffuser 33 at distances Z_obj1 and Z_obj2 are, to a good approximation, linearly-scaled versions of one another.
- An image capture assembly 38 captures an image of the speckle pattern that is projected onto object 28.
- Assembly 38 comprises objective optics 39, which focus the image onto an image sensor 40.
- sensor 40 comprises a rectilinear array of detector elements 41, such as a CCD or CMOS-based image sensor array.
- Optics 39 have an entrance pupil 42, which together with the dimensions of the image sensor defines a field of view 44 of the image capture assembly.
- the sensing volume of device 22 comprises an overlap area 46 between beam 36 and field of view 44.
- the characteristic transverse speckle size projected by illumination assembly 30 (as defined by the second-order statistics of the speckle pattern) at a distance Z_obj is ΔX ≈ λZ_obj/w_0, wherein λ is the wavelength of the light emitted by source 32.
- each speckle imaged onto sensor 40 by optics 39 should span between one and ten detector elements 41 in the horizontal direction.
- a speckle size between two and three pixels gives good results. It can be seen from the formula above for ΔX that the speckle size may be adjusted by varying the distance between light source 32 and diffuser 33, since the radius w_0 of spot 34 increases with distance from the light source.
- the speckle parameters of illumination assembly 30 can be controlled simply by laterally shifting the light source, without the use of lenses or other optics. Illumination assembly 30 can be adjusted in this manner to work with image sensors of different size and resolution and with objective optics of varying magnification.
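To make these scaling relationships concrete, here is a minimal sketch that estimates how many detector elements a speckle spans, assuming the far-field relation ΔX ≈ λZ_obj/w_0 given above and illustrative values for magnification and pixel pitch; none of the numbers come from the disclosure.

```python
def speckle_size_m(wavelength_m: float, z_obj_m: float, spot_radius_m: float) -> float:
    """Characteristic transverse speckle size: wavelength * Z_obj / w_0."""
    return wavelength_m * z_obj_m / spot_radius_m


def speckle_span_px(wavelength_m: float, z_obj_m: float, spot_radius_m: float,
                    magnification: float, pixel_pitch_m: float) -> float:
    """How many detector elements one speckle spans after imaging."""
    return (speckle_size_m(wavelength_m, z_obj_m, spot_radius_m)
            * magnification / pixel_pitch_m)


# Example: 850 nm laser, object at 1 m, 0.6 mm spot radius on the diffuser,
# 0.01x optical magnification, 5 um pixels -> roughly 2.8 px per speckle,
# within the two-to-three pixel range mentioned in the text.
print(round(speckle_span_px(850e-9, 1.0, 0.6e-3, 0.01, 5e-6), 1))
```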
- an inexpensive light source such as a laser diode, with high divergence (5° or greater) and short coherence length (less than 1 cm, and in some cases even shorter than 0.5 mm) may be used in system 20 with good effect.
- Illumination assembly 30 and image capture assembly 38 are held in a fixed spatial relation by a mount 43.
- the mount comprises a housing that holds the assemblies.
- any other suitable sort of mechanical mount may be used to maintain the desired spatial relation between the illumination and image capture assemblies.
- the configuration of device 22 and the processing techniques described hereinbelow make it possible to perform 3D mapping using the single image capture assembly, without relative movement between the illumination and image capture assemblies and without moving parts.
- Image capture assembly 38 thus captures images at a single, fixed angle relative to illumination assembly 30.
- mount 43 holds assemblies 30 and 38 so that the axis passing through the centers of entrance pupil 42 and spot 34 is parallel to one of the axes of sensor 40.
- the axis passing through pupil 42 and spot 34 should be parallel to one of the array axes, which is taken for convenience to be the X-axis.
- shifts in the object distance Z_obj will cause distortions of the speckle pattern in images of the object captured by image capture assembly 38. Specifically, by triangulation, it can be seen in Fig. 2 that a Z-direction shift of a point on the object, δZ, will engender a concomitant transverse shift δX in the speckle pattern observed in the image.
- Z-coordinates of points on the object may thus be determined by measuring shifts in the X-coordinates of the speckles in the image captured by assembly 38 relative to a reference image taken at a known distance Z.
- the group of speckles in each area of the captured image are compared to the reference image to find the most closely-matching group of speckles in the reference image.
- the relative shift between the matching groups of speckles in the image gives the Z-direction shift of the area of the captured image relative to the reference image.
- the shift in the speckle pattern may be measured using image correlation or other image matching computation methods that are known in the art.
- the shift of the speckle pattern with δZ will be strictly in the X-direction, with no Y-component of the shift (as long as distortion due to optics 39 is negligible). Therefore, the image matching computation is simplified and need only seek the closest matching group of speckles subject to an X-shift.
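A minimal digital sketch of such X-only matching is shown below; it is not the patent's implementation. The window size, search range, and sum-of-squared-differences cost are arbitrary illustrative choices, and the window center is assumed to lie far enough from the image borders that the shifted reference windows stay inside the image.

```python
import numpy as np


def match_x_offset(image: np.ndarray, reference: np.ndarray,
                   y: int, x: int, win: int = 15, max_shift: int = 40) -> int:
    """Find the X-only shift that best aligns a window of `image` centered at
    (y, x) with the reference speckle image, by minimizing the sum of squared
    differences. Assumes (y, x) lies at least win//2 + max_shift pixels from
    the image borders."""
    half = win // 2
    patch = image[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_shift, best_cost = 0, np.inf
    for dx in range(-max_shift, max_shift + 1):
        xr = x + dx
        ref_patch = reference[y - half:y + half + 1,
                              xr - half:xr + half + 1].astype(np.float32)
        if ref_patch.shape != patch.shape:
            continue  # shifted window falls outside the reference image
        cost = float(np.sum((patch - ref_patch) ** 2))
        if cost < best_cost:
            best_cost, best_shift = cost, dx
    return best_shift
```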
- if the optics introduce a known deviation from this ideal behavior, processor 24 may use a parametric model to compensate for the deviation.
- the known deviation may be measured or otherwise modeled, and the processor may then check copies of areas of the current image that are shifted by appropriate (X, Y) shifts relative to the reference image, according to the parametric model of the deviation, in order to find the actual 3D coordinates of the object surface.
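One possible, purely hypothetical form of such a parametric model is sketched below: the small Y-component of the shift is predicted as a radial quadratic function of pixel position, scaled by the candidate X-disparity. The coefficient k and the image center are illustrative; a real device would substitute whatever model its calibration produced.

```python
def predicted_shift(x: float, y: float, dx: float,
                    cx: float = 320.0, cy: float = 240.0, k: float = -2e-7):
    """Return the (X, Y) shift to test against the reference image at pixel (x, y).

    The X component is just the candidate disparity dx. The Y component is the
    small deviation given by a radial (quadratic) model with a single calibrated
    coefficient k -- a hypothetical stand-in for the device's actual parametric
    distortion model."""
    r_squared = (x - cx) ** 2 + (y - cy) ** 2
    return dx, k * r_squared * dx
```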
- typically, the operating parameters of system 20 are chosen so that the spacing between the illumination and image capture assemblies is much smaller than Z_obj.
- the scaling tolerance of the speckle pattern should be maintained within some predetermined bound, which depends on the matching window size as well as on the characteristic speckle size.
- typically, the tolerance should be limited so that the scaling of a characteristic window varies by no more than about 30% of a single speckle size over the diagonal angle of field of view 44.
- the Z-direction shifts of the object in successive image frames captured by assembly 38 can generally be computed without explicitly taking speckle scaling variations into account.
- Fig. 3 is a flow chart that schematically illustrates a method for 3D mapping using system 20, in accordance with an embodiment of the present invention. This method is based, inter alia, on the realization that the speckle pattern that is projected by illumination assembly 30 does not change substantially over time. Therefore, an individual image of the speckle pattern that is projected onto an object, captured by image capture assembly 38 at a fixed location and angle relative to assembly 30, may be used to accurately compute a 3D map of the object.
- device 22 is calibrated by projecting the speckle pattern from assembly 30 onto an object of known spatial profile at a known distance from the device, at a calibration step 50.
- Image capture assembly 38 captures a reference image of the object, which is stored in a memory of processor 24. This calibration step may be carried out at the time of manufacture, and the reference image stored in the memory will then be usable in the field as long as there is no uncontrolled relative motion among the different components of device 22.
- the reference image may be saved in a data-reduced form, such as a threshold-based binary image, that is appropriate for the matching algorithm that is to be used.
- When system 20 is ready for use, it is actuated to capture an image of the object of interest (object 28 in this example) using device 22, at an initial image capture step 52.
- Processor 24 compares this image to the speckle pattern in the stored calibration image, at a map computation step 54. Dark areas of the image, in which the pixel values are below some threshold value (or otherwise not containing relevant speckle information), are typically classified as shadow areas, from which depth (Z) information cannot be derived.
- the remainder of the image may be binarized, possibly using an adaptive threshold, as is known in the art, or otherwise data-reduced for efficient matching to the reference image.
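The following sketch shows one way such data reduction might look in software: dark pixels below a fixed shadow level are flagged invalid, and the remaining pixels are binarized against a block-wise local mean. The block size and shadow level are illustrative assumptions, not parameters from the disclosure.

```python
import numpy as np


def binarize_speckle(image: np.ndarray, block: int = 16, shadow_level: int = 10):
    """Data-reduce a speckle image for matching: flag dark (shadow) pixels as
    invalid and binarize the rest against a block-wise local mean threshold.

    Returns (binary, valid) boolean arrays."""
    img = image.astype(np.float32)
    h, w = img.shape
    local_mean = np.empty((h, w), dtype=np.float32)
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            tile = img[y0:y0 + block, x0:x0 + block]
            local_mean[y0:y0 + block, x0:x0 + block] = tile.mean()
    valid = img > shadow_level        # pixels carrying usable speckle signal
    binary = img > local_mean         # adaptive, locally-normalized threshold
    return binary & valid, valid
```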
- Processor 24 selects a certain window within the non-shadow part of the image, and compares the sub-image within the window to parts of the reference image until the part of the reference image that best matches the sub-image is found.
- because assemblies 30 and 38 are aligned along the X-axis, as described above and shown in Fig. 2, it is sufficient for processor 24 to compare the sub-image to parts of the reference image that are displaced in the X-direction relative to the sub-image (subject to scaling of the speckle pattern within the tolerance noted above).
- the processor uses the transverse offset of the sub-image relative to the matching part of the reference image to determine the Z-coordinate of the area of the surface of object 28 within the sub-image, based on the principle of triangulation explained above.
- Processor 24 may optionally analyze the speckle distortion in order to estimate the slant angle, and thus improve the accuracy of 3D mapping.
- Processor 24 may use the map coordinates of this first window as a start point for determining the coordinates of neighboring areas of the image. Specifically, once the processor has found a high correlation between a certain area in the image and a corresponding area in the reference image, the offset of this area relative to the reference image can serve as a good predictor of the offsets of neighboring pixels in the image. The processor attempts to match these neighboring pixels to the reference image with an offset equal to or within a small range of the initially-matched area. In this manner, the processor grows the region of the matched area until it reaches the edges of the region. The processor thus proceeds to determine the Z-coordinates of all non-shadow areas of the image, until it has completed the 3D profile of object 28. This approach has the advantage of providing fast, robust matching even using small windows and images with poor signal/noise ratio. Details of computational methods that may be used for this purpose are described in the above-mentioned PCT patent application.
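A compact sketch of this region-growing idea follows. It assumes a windowed matcher `match_fn` (a hypothetical helper of the kind sketched earlier) that searches only a narrow disparity range around a predicted offset and returns None where no plausible match is found; the grid step, search radius, and breadth-first traversal are illustrative choices rather than the patent's procedure.

```python
from collections import deque

import numpy as np


def grow_offsets(image: np.ndarray, reference: np.ndarray,
                 seed: tuple, seed_offset: int, match_fn,
                 search: int = 2, step: int = 8) -> dict:
    """Propagate disparity offsets outward from a confidently-matched seed window.

    `match_fn(image, reference, y, x, center, radius)` is assumed to return the
    best X-offset for the window at (y, x), searching only within center +/- radius,
    or None if no acceptable match exists (e.g. in shadow areas). Neighbors are
    visited on a coarse grid, each using its predecessor's offset as predictor."""
    h, w = image.shape
    offsets = {seed: seed_offset}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - step, x), (y + step, x), (y, x - step), (y, x + step)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in offsets:
                off = match_fn(image, reference, ny, nx,
                               center=offsets[(y, x)], radius=search)
                if off is not None:
                    offsets[(ny, nx)] = off
                    queue.append((ny, nx))
    return offsets
```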
- processor 24 will have computed a complete 3D map of the part of the object surface that is visible in the initial image.
- the method may readily be extended, however, to capture and analyze successive images in order to track 3D motion of the object, at a next image step 56.
- Device 22 captures the successive images at some predetermined frame rate, and processor 24 updates the 3D map based on each successive image.
- the 3D maps may be computed with respect to the stored, calibrated reference image if desired. Alternatively, since the object will generally not move too much from one image frame to the next, it is frequently more efficient to use each successive image as a reference image for the next frame.
- processor 24 may compare each successive image to the preceding image in order to compute the X-direction shift of the speckles in each sub-image relative to the same speckles in the preceding image, at a shift computation step 58.
- the shift is no more than a few pixels, so that the computation can be performed rapidly and efficiently.
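The per-frame update might then look like the sketch below, which re-matches each tracked window against the preceding frame within a few pixels of zero shift and accumulates the result into the total offset relative to the reference; `match_fn` is again a hypothetical windowed matcher, not code from the disclosure.

```python
def update_offsets(curr, prev, offsets: dict, match_fn, max_delta: int = 3) -> dict:
    """Track window offsets across successive frames.

    `offsets` maps (y, x) window centers to the total X-offsets (relative to the
    stored reference) found for the previous frame. Each window of the current
    frame is matched only against the preceding frame, so the per-frame shift is
    at most a few pixels and the search is limited to +/- max_delta."""
    new_offsets = {}
    for (y, x), total in offsets.items():
        delta = match_fn(curr, prev, y, x, center=0, radius=max_delta)
        if delta is not None:
            new_offsets[(y, x)] = total + delta  # accumulate relative to reference
    return new_offsets
```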
- processor 24 outputs the updated 3D map, at a new map output step 60. This process of image capture and update may thus proceed indefinitely.
- system 20 is capable of operating and outputting map coordinates at real-time video rates, on the order of 30 frames/sec or even faster, while using simple, low-cost imaging and processing hardware.
- efficient image matching computation and region growing, as described above, may enable system 20 to operate at video speed even when local shifts cannot be computed from preceding images.
- a computer (which may comprise processor 24 or may receive the 3D maps output by the processor) identifies a certain volume or volumes in the 3D maps that correspond to parts of the user's body, such as the arm, hand, and/or fingers, and possibly the head, torso, and other extremities, as well.
- the computer is programmed to identify gestures corresponding to certain movements of these body parts and to control computer applications in response to these gestures. Examples of such gestures and applications include:
- Fig. 4 is a schematic side view of an illumination assembly 70 that may be used in system 20 in order to enhance the useful depth range of the system, in accordance with an embodiment of the present invention.
- Assembly 70 comprises source 32 and diffuser 33, together with a beam former 72.
- the beam former is designed to create beam 74 having reduced divergence over an intermediate region 76, while still preserving the linear scaling of the speckle pattern with axial distance Z within this region.
- high speckle contrast is maintained in images of object 28 throughout region 76, so that the range of depths covered by the 3D mapping system is increased.
- a number of optical designs that may be used to achieve this enhanced performance in region 76 are described below.
- Fig. 5 is a schematic side view of beam former 72, in accordance with an embodiment of the present invention.
- the beam former comprises a diffractive optical element (DOE) 80 and an axicon 82.
- DOE 80 may be butted against diffuser 33, or even incorporated as an etched or deposited layer on the surface of the diffuser itself.
- Various diffractive designs may be used to reduce the beam divergence in region 76.
- DOE 80 may comprise a pattern of concentric rings, centered on the optical axis of source 32, with a random distribution of ring radii.
- Axicon 82 has a conical profile centered on the optical axis, i.e., it is a sort of rotationally-symmetrical prism.
- Both DOE 80 and axicon 82 have the effect of creating long regions of focus along the optical axis, so that either of these elements could be used alone to create a region of reduced beam divergence.
- the reduction of divergence can be further enhanced by using the two elements together.
- Fig. 6 is a schematic side view of a beam former 90, in accordance with another embodiment of the present invention.
- Beam former 90 comprises a DOE 92 and lenses 94 and 96, having focal length F.
- the lenses are separated from diffuser 33 and from DOE 92 by distances equal to their focal lengths, so that the DOE is located in the Fourier plane of the diffuser.
- the Fourier transform of the diffuser is multiplied by the transmission function of the DOE.
- the speckle pattern is multiplied by the Fourier transform of the pattern on the DOE.
- the DOE pattern may be chosen so that its Fourier transform provides reduced divergence, as shown above in Fig. 4, and/or more uniform illumination across the illumination beam.
- for example, more uniform illumination may be achieved by designing element 92 with lower transmittance in its central region than in the periphery (opposite to the angular intensity distribution of the beam from diffuser 33, which tends to be brighter in the center and to fall off with increasing angle from the optical axis).
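A rough numerical sketch of this Fourier-plane arrangement is given below: a synthetic random-phase diffuser field is Fourier-transformed, multiplied by a hypothetical DOE transmission mask that passes less light near the optical axis than at the periphery, and the result is taken as the reshaped angular (far-field) intensity. Everything here, including the mask profile, is synthetic and illustrative rather than a model of the actual optics.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256

# Synthetic diffuser: unit amplitude with a random phase at each cell.
diffuser = np.exp(1j * 2 * np.pi * rng.random((n, n)))

# With the DOE in the Fourier plane (a 4f arrangement), the angular spectrum of
# the diffused beam is the Fourier transform of the diffuser field multiplied
# by the DOE transmission.
spectrum = np.fft.fftshift(np.fft.fft2(diffuser))

# Hypothetical DOE transmission: lower in the center, higher at the periphery,
# to flatten a beam that would otherwise be brightest on axis.
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(xx, yy) / (n / 2)
doe_transmission = np.clip(0.3 + 0.7 * r, 0.0, 1.0)

shaped_spectrum = spectrum * doe_transmission
far_field_intensity = np.abs(shaped_spectrum) ** 2  # reshaped angular intensity
print(far_field_intensity.mean(), far_field_intensity.std())
```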
- Fig. 7 is a schematic side view of an optical correlator 110 that may be used in system 20 to determine Z-coordinates of areas of object 28, in accordance with an embodiment of the present invention.
- Correlator 110 uses optical techniques to carry out some of the functions of processor 24 that were described above.
- the correlator is capable of determining coordinates of multiple areas of the object in parallel at very high speed, almost instantaneously. It is therefore useful particularly in applications that are characterized by rapid object motion.
- a lenslet array 116 forms multiple sub-images of object 28 under speckle illumination by assembly 30.
- An array 118 of apertures limits the fields of view of the lenslets in array 116, so that each sub-image contains light from only a narrow angular range.
- a second lenslet array 120 projects the sub-images onto a DOE 122.
- Array 120 is separated from the plane of the sub-images by a distance equal to the focal length of the lenslets in the array, and is separated from the plane of DOE 122 by an equal distance.
- a rear lenslet array 124 is located between DOE 122 and sensor 40, separated from each by a distance equal to the focal length of the lenslets.
- DOE 122 contains a reference diffraction pattern that is the spatial Fourier transform of the reference speckle pattern to which the speckle image of object 28 is to be compared.
- the reference diffraction pattern may be the Fourier transform of the calibration speckle image formed at step 50 (Fig. 3), using a flat surface at a known distance from the illumination source.
- the reference diffraction pattern may be deposited or etched on the surface of the DOE.
- DOE 122 may comprise a spatial light modulator (SLM), which is driven to project the reference diffraction pattern dynamically.
- correlator 110 multiplies the sub-images of the object (formed by the lenslets in array 116) by the reference speckle pattern in Fourier space.
- the intensity distribution projected onto sensor 40 by lenslet array 124 corresponds to the cross- correlation of each sub-image with the reference speckle pattern.
- the intensity distribution on the sensor will comprise multiple correlation peaks, each peak corresponding to one of the sub-images.
- the transverse offset of each peak relative to the axis of the corresponding sub-image is proportional to the transverse displacement of the speckle pattern on the corresponding area of object 28.
- This displacement is proportional to the Z-direction displacement of the area relative to the plane of the reference speckle pattern, as explained above.
- the output of sensor 40 may be processed to determine the Z-coordinate of the area of each sub- image, and thus to compute a 3D map of the object.
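For comparison, a digital analogue of what each lenslet channel of correlator 110 does optically can be sketched as follows: multiply the Fourier transform of a sub-image by the conjugate Fourier transform of the reference speckle pattern, transform back, and read the transverse offset of the correlation peak. This is a numerical stand-in, not a description of the optical hardware, and it assumes the sub-image and reference patch have equal dimensions.

```python
import numpy as np


def correlate_subimage(sub_image: np.ndarray, reference: np.ndarray):
    """Digital counterpart of one correlator channel: cross-correlate a sub-image
    with the reference speckle pattern via the FFT and return the (x, y) offset
    of the correlation peak from the center. Both inputs must be the same size."""
    sub = sub_image.astype(np.float32) - sub_image.mean()
    ref = reference.astype(np.float32) - reference.mean()
    corr = np.fft.ifft2(np.fft.fft2(sub) * np.conj(np.fft.fft2(ref)))
    corr = np.fft.fftshift(np.abs(corr))
    py, px = np.unravel_index(np.argmax(corr), corr.shape)
    cy, cx = np.array(corr.shape) // 2
    return px - cx, py - cy  # transverse offset of the correlation peak
```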
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Diffracting Gratings Or Hologram Optical Elements (AREA)
- Image Processing (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2007800166255A CN101496033B (en) | 2006-03-14 | 2007-03-08 | Depth-varying light fields for three dimensional sensing |
US12/282,517 US8390821B2 (en) | 2005-10-11 | 2007-03-08 | Three-dimensional sensing using speckle patterns |
JP2008558981A JP5174684B2 (en) | 2006-03-14 | 2007-03-08 | 3D detection using speckle patterns |
KR1020087025030A KR101331543B1 (en) | 2006-03-14 | 2007-03-08 | Three-dimensional sensing using speckle patterns |
US13/541,775 US9330324B2 (en) | 2005-10-11 | 2012-07-05 | Error compensation in three-dimensional mapping |
US13/748,617 US9063283B2 (en) | 2005-10-11 | 2013-01-24 | Pattern generation using a diffraction pattern that is a spatial fourier transform of a random pattern |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ILPCT/IL2006/000335 | 2006-03-14 | ||
PCT/IL2006/000335 WO2007043036A1 (en) | 2005-10-11 | 2006-03-14 | Method and system for object reconstruction |
US78518706P | 2006-03-24 | 2006-03-24 | |
US60/785,187 | 2006-03-24 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2006/000335 Continuation-In-Part WO2007043036A1 (en) | 2005-03-30 | 2006-03-14 | Method and system for object reconstruction |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/282,517 A-371-Of-International US8390821B2 (en) | 2005-10-11 | 2007-03-08 | Three-dimensional sensing using speckle patterns |
US12/605,340 Continuation-In-Part US20110096182A1 (en) | 2005-10-11 | 2009-10-25 | Error Compensation in Three-Dimensional Mapping |
US13/748,617 Continuation US9063283B2 (en) | 2005-10-11 | 2013-01-24 | Pattern generation using a diffraction pattern that is a spatial fourier transform of a random pattern |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2007105205A2 true WO2007105205A2 (en) | 2007-09-20 |
WO2007105205A3 WO2007105205A3 (en) | 2009-04-23 |
Family
ID=38509871
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2007/000306 WO2007105205A2 (en) | 2005-10-11 | 2007-03-08 | Three-dimensional sensing using speckle patterns |
Country Status (5)
Country | Link |
---|---|
US (2) | US8390821B2 (en) |
JP (1) | JP5174684B2 (en) |
KR (1) | KR101331543B1 (en) |
CN (1) | CN101496033B (en) |
WO (1) | WO2007105205A2 (en) |
Cited By (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2921719A1 (en) * | 2007-09-28 | 2009-04-03 | Noomeo Soc Par Actions Simplif | Physical object e.g. apple, three-dimensional surface's synthesis image creating method for e.g. industrial farm, involves calculating depth coordinate measured at axis for each point of dusty seeds from projection of point on surface |
FR2940423A1 (en) * | 2008-12-22 | 2010-06-25 | Noomeo | DENSE RECONSTRUCTION THREE-DIMENSIONAL SCANNING DEVICE |
US20110096182A1 (en) * | 2009-10-25 | 2011-04-28 | Prime Sense Ltd | Error Compensation in Three-Dimensional Mapping |
EP2342892A1 (en) * | 2008-09-26 | 2011-07-13 | Cybula Ltd | Image recognition |
EP2363686A1 (en) | 2010-02-02 | 2011-09-07 | Primesense Ltd. | Optical apparatus, an imaging system and a method for producing a photonics module |
US8050461B2 (en) | 2005-10-11 | 2011-11-01 | Primesense Ltd. | Depth-varying light fields for three dimensional sensing |
US8150142B2 (en) | 2007-04-02 | 2012-04-03 | Prime Sense Ltd. | Depth mapping using projected patterns |
US8166421B2 (en) | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
EP2466560A1 (en) | 2010-12-20 | 2012-06-20 | Axis AB | Method and system for monitoring the accessibility of an emergency exit |
US8249334B2 (en) | 2006-05-11 | 2012-08-21 | Primesense Ltd. | Modeling of humanoid forms from depth maps |
US20120281240A1 (en) * | 2005-10-11 | 2012-11-08 | Primesense Ltd. | Error Compensation in Three-Dimensional Mapping |
EP2530442A1 (en) | 2011-05-30 | 2012-12-05 | Axis AB | Methods and apparatus for thermographic measurements. |
US8350847B2 (en) | 2007-01-21 | 2013-01-08 | Primesense Ltd | Depth mapping using multi-beam illumination |
US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
US8390821B2 (en) | 2005-10-11 | 2013-03-05 | Primesense Ltd. | Three-dimensional sensing using speckle patterns |
US8400494B2 (en) | 2005-10-11 | 2013-03-19 | Primesense Ltd. | Method and system for object reconstruction |
WO2013038089A1 (en) | 2011-09-16 | 2013-03-21 | Prynel | Method and system for acquiring and processing images for the detection of motion |
US8456517B2 (en) | 2008-07-09 | 2013-06-04 | Primesense Ltd. | Integrated processor for 3D mapping |
US8462207B2 (en) | 2009-02-12 | 2013-06-11 | Primesense Ltd. | Depth ranging with Moiré patterns |
EP2611171A1 (en) | 2011-12-27 | 2013-07-03 | Thomson Licensing | Device for acquiring stereoscopic images |
US8492696B2 (en) | 2009-11-15 | 2013-07-23 | Primesense Ltd. | Optical projector with beam monitor including mapping apparatus capturing image of pattern projected onto an object |
US8494252B2 (en) | 2007-06-19 | 2013-07-23 | Primesense Ltd. | Depth mapping using optical elements having non-uniform focal characteristics |
US8493496B2 (en) | 2007-04-02 | 2013-07-23 | Primesense Ltd. | Depth mapping using projected patterns |
US8565479B2 (en) | 2009-08-13 | 2013-10-22 | Primesense Ltd. | Extraction of skeletons from 3D maps |
US8582867B2 (en) | 2010-09-16 | 2013-11-12 | Primesense Ltd | Learning-based pose estimation from depth maps |
US8594425B2 (en) | 2010-05-31 | 2013-11-26 | Primesense Ltd. | Analysis of three-dimensional scenes |
US8599484B2 (en) | 2010-08-10 | 2013-12-03 | Asahi Glass Company, Limited | Diffractive optical element and measuring device |
US8630039B2 (en) | 2008-01-21 | 2014-01-14 | Primesense Ltd. | Optical designs for zero order reduction |
US20140043230A1 (en) * | 2008-01-14 | 2014-02-13 | Primesense Ltd. | Three-Dimensional User Interface Session Control |
US8717417B2 (en) | 2009-04-16 | 2014-05-06 | Primesense Ltd. | Three-dimensional mapping and imaging |
US8717488B2 (en) | 2011-01-18 | 2014-05-06 | Primesense Ltd. | Objective optics with interference filter |
US8749796B2 (en) | 2011-08-09 | 2014-06-10 | Primesense Ltd. | Projectors of structured light |
US8787663B2 (en) | 2010-03-01 | 2014-07-22 | Primesense Ltd. | Tracking body parts by combined color image and depth processing |
US8786757B2 (en) | 2010-02-23 | 2014-07-22 | Primesense Ltd. | Wideband ambient light rejection |
US8786682B2 (en) | 2009-03-05 | 2014-07-22 | Primesense Ltd. | Reference image techniques for three-dimensional sensing |
US8830227B2 (en) | 2009-12-06 | 2014-09-09 | Primesense Ltd. | Depth-based gain control |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US8908277B2 (en) | 2011-08-09 | 2014-12-09 | Apple Inc | Lens array projector |
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
US8959013B2 (en) | 2010-09-27 | 2015-02-17 | Apple Inc. | Virtual keyboard for a non-tactile three dimensional user interface |
US8971572B1 (en) | 2011-08-12 | 2015-03-03 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US8995057B2 (en) | 2010-11-02 | 2015-03-31 | Asahi Glass Company, Limited | Diffractive optical element and measurement instrument |
US9002099B2 (en) | 2011-09-11 | 2015-04-07 | Apple Inc. | Learning-based estimation of hand and finger pose |
US9019267B2 (en) | 2012-10-30 | 2015-04-28 | Apple Inc. | Depth mapping with enhanced resolution |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9030529B2 (en) | 2011-04-14 | 2015-05-12 | Industrial Technology Research Institute | Depth image acquiring device, system and method |
US9036158B2 (en) | 2010-08-11 | 2015-05-19 | Apple Inc. | Pattern projector |
US9047507B2 (en) | 2012-05-02 | 2015-06-02 | Apple Inc. | Upper-body skeleton extraction from depth maps |
US9052512B2 (en) | 2011-03-03 | 2015-06-09 | Asahi Glass Company, Limited | Diffractive optical element and measuring apparatus |
US9066087B2 (en) | 2010-11-19 | 2015-06-23 | Apple Inc. | Depth mapping using time-coded illumination |
US9098931B2 (en) | 2010-08-11 | 2015-08-04 | Apple Inc. | Scanning projectors and image capture modules for 3D mapping |
US9122311B2 (en) | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
US9131136B2 (en) | 2010-12-06 | 2015-09-08 | Apple Inc. | Lens arrays for pattern projection and imaging |
US9152234B2 (en) | 2012-12-02 | 2015-10-06 | Apple Inc. | Detecting user intent to remove a pluggable peripheral device |
US9201237B2 (en) | 2012-03-22 | 2015-12-01 | Apple Inc. | Diffraction-based sensing of mirror position |
US9218063B2 (en) | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
US9348111B2 (en) | 2010-08-24 | 2016-05-24 | Apple Inc. | Automatic detection of lens deviations |
EP3048581A1 (en) * | 2015-01-20 | 2016-07-27 | Ricoh Company, Ltd. | Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium |
US9477018B2 (en) | 2010-08-06 | 2016-10-25 | Asahi Glass Company, Limited | Diffractive optical element and measurement device |
US9514378B2 (en) | 2014-03-25 | 2016-12-06 | Massachusetts Institute Of Technology | Space-time modulated active 3D imager |
US9525863B2 (en) | 2015-04-29 | 2016-12-20 | Apple Inc. | Time-of-flight depth mapping with flexible scan pattern |
US9528906B1 (en) | 2013-12-19 | 2016-12-27 | Apple Inc. | Monitoring DOE performance using total internal reflection |
US9582889B2 (en) | 2009-07-30 | 2017-02-28 | Apple Inc. | Depth mapping based on pattern matching and stereoscopic information |
US9595156B2 (en) | 2012-01-23 | 2017-03-14 | Novomatic Ag | Prize wheel with gesture-based control |
US9817159B2 (en) | 2015-01-31 | 2017-11-14 | Microsoft Technology Licensing, Llc | Structured light pattern generation |
US9825425B2 (en) | 2013-06-19 | 2017-11-21 | Apple Inc. | Integrated structured-light projector comprising light-emitting elements on a substrate |
CN107430773A (en) * | 2015-03-20 | 2017-12-01 | 高通股份有限公司 | Strengthen the system and method for the depth map retrieval of mobile object using active detection technology |
US10012831B2 (en) | 2015-08-03 | 2018-07-03 | Apple Inc. | Optical monitoring of scan parameters |
US10043279B1 (en) | 2015-12-07 | 2018-08-07 | Apple Inc. | Robust detection and classification of body parts in a depth map |
US10073004B2 (en) | 2016-09-19 | 2018-09-11 | Apple Inc. | DOE defect monitoring utilizing total internal reflection |
CN109541875A (en) * | 2018-11-24 | 2019-03-29 | 深圳阜时科技有限公司 | A kind of light-source structure, optical projection mould group, sensing device and equipment |
US10310281B1 (en) | 2017-12-05 | 2019-06-04 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element |
US10317684B1 (en) | 2018-01-24 | 2019-06-11 | K Laser Technology, Inc. | Optical projector with on axis hologram and multiple beam splitter |
US10349037B2 (en) | 2014-04-03 | 2019-07-09 | Ams Sensors Singapore Pte. Ltd. | Structured-stereo imaging assembly including separate imagers for different wavelengths |
US10366278B2 (en) | 2016-09-20 | 2019-07-30 | Apple Inc. | Curvature-based face detector |
EP3527121A1 (en) | 2011-02-09 | 2019-08-21 | Apple Inc. | Gesture detection in a 3d mapping environment |
JP2019144261A (en) * | 2013-06-06 | 2019-08-29 | ヘプタゴン・マイクロ・オプティクス・プライベート・リミテッドHeptagon Micro Optics Pte. Ltd. | Imaging system and method for making it operate |
US10509128B1 (en) | 2019-04-12 | 2019-12-17 | K Laser Technology, Inc. | Programmable pattern optical projector for depth detection |
US10545457B2 (en) | 2017-12-05 | 2020-01-28 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element and conjugate images |
EP3666067A1 (en) | 2013-01-31 | 2020-06-17 | Lely Patent N.V. | Camera system, animal related system therewith, and method to create 3d camera images |
GB2589121A (en) * | 2019-11-21 | 2021-05-26 | Bae Systems Plc | Imaging apparatus |
US11262233B2 (en) | 2014-12-27 | 2022-03-01 | Guardian Optical Technologies, Ltd. | System and method for detecting surface vibrations |
US11422292B1 (en) | 2018-06-10 | 2022-08-23 | Apple Inc. | Super-blazed diffractive optical elements with sub-wavelength structures |
US11506762B1 (en) | 2019-09-24 | 2022-11-22 | Apple Inc. | Optical module comprising an optical waveguide with reference light path |
US11681019B2 (en) | 2019-09-18 | 2023-06-20 | Apple Inc. | Optical module with stray light baffle |
US11754767B1 (en) | 2020-03-05 | 2023-09-12 | Apple Inc. | Display with overlaid waveguide |
US12111421B2 (en) | 2021-03-17 | 2024-10-08 | Apple Inc. | Waveguide-based transmitters with adjustable lighting |
Families Citing this family (118)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5592070B2 (en) * | 2006-03-14 | 2014-09-17 | プライム センス リミティド | Light field that changes depth for 3D detection |
US8265793B2 (en) | 2007-03-20 | 2012-09-11 | Irobot Corporation | Mobile robot for telecommunication |
DE102007058590B4 (en) * | 2007-12-04 | 2010-09-16 | Sirona Dental Systems Gmbh | Recording method for an image of a recording object and recording device |
DK2442720T3 (en) | 2009-06-17 | 2016-12-19 | 3Shape As | Focus scan devices |
CN102022979A (en) | 2009-09-21 | 2011-04-20 | 鸿富锦精密工业(深圳)有限公司 | Three-dimensional optical sensing system |
US8867820B2 (en) * | 2009-10-07 | 2014-10-21 | Microsoft Corporation | Systems and methods for removing a background of an image |
US8963829B2 (en) * | 2009-10-07 | 2015-02-24 | Microsoft Corporation | Methods and systems for determining and tracking extremities of a target |
US7961910B2 (en) | 2009-10-07 | 2011-06-14 | Microsoft Corporation | Systems and methods for tracking a model |
US8564534B2 (en) | 2009-10-07 | 2013-10-22 | Microsoft Corporation | Human tracking system |
JP4783456B2 (en) * | 2009-12-22 | 2011-09-28 | 株式会社東芝 | Video playback apparatus and video playback method |
US20110188054A1 (en) * | 2010-02-02 | 2011-08-04 | Primesense Ltd | Integrated photonics module for optical projection |
US8982182B2 (en) * | 2010-03-01 | 2015-03-17 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
CN107256094A (en) * | 2010-04-13 | 2017-10-17 | 诺基亚技术有限公司 | Device, method, computer program and user interface |
US9400503B2 (en) | 2010-05-20 | 2016-07-26 | Irobot Corporation | Mobile human interface robot |
US8935005B2 (en) | 2010-05-20 | 2015-01-13 | Irobot Corporation | Operating a mobile robot |
US9014848B2 (en) | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
WO2011146259A2 (en) | 2010-05-20 | 2011-11-24 | Irobot Corporation | Mobile human interface robot |
US8918213B2 (en) | 2010-05-20 | 2014-12-23 | Irobot Corporation | Mobile human interface robot |
US8670029B2 (en) * | 2010-06-16 | 2014-03-11 | Microsoft Corporation | Depth camera illuminator with superluminescent light-emitting diode |
JP5791131B2 (en) | 2010-07-20 | 2015-10-07 | アップル インコーポレイテッド | Interactive reality extension for natural interactions |
US9201501B2 (en) | 2010-07-20 | 2015-12-01 | Apple Inc. | Adaptive projector |
IL208568B (en) * | 2010-10-07 | 2018-06-28 | Elbit Systems Ltd | Mapping, detecting and tracking objects in an arbitrary outdoor scene using active vision |
KR20120046973A (en) * | 2010-11-03 | 2012-05-11 | 삼성전자주식회사 | Method and apparatus for generating motion information |
GB2502213A (en) | 2010-12-30 | 2013-11-20 | Irobot Corp | Mobile Human Interface Robot |
US8930019B2 (en) | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
JP5948948B2 (en) * | 2011-03-03 | 2016-07-06 | 旭硝子株式会社 | Diffractive optical element and measuring device |
JP5948949B2 (en) * | 2011-06-28 | 2016-07-06 | 旭硝子株式会社 | Diffractive optical element and measuring device |
US9030528B2 (en) | 2011-04-04 | 2015-05-12 | Apple Inc. | Multi-zone imaging sensor and lens array |
WO2012147495A1 (en) * | 2011-04-28 | 2012-11-01 | 三洋電機株式会社 | Information acquisition device and object detection device |
US9024872B2 (en) | 2011-04-28 | 2015-05-05 | Sharp Kabushiki Kaisha | Head-mounted display |
JP5926500B2 (en) * | 2011-06-07 | 2016-05-25 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5298161B2 (en) * | 2011-06-13 | 2013-09-25 | シャープ株式会社 | Operating device and image forming apparatus |
US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
US8869073B2 (en) * | 2011-07-28 | 2014-10-21 | Hewlett-Packard Development Company, L.P. | Hand pose interaction |
US9462210B2 (en) | 2011-11-04 | 2016-10-04 | Remote TelePointer, LLC | Method and system for user interface for interactive devices using a mobile device |
DE102011121696A1 (en) * | 2011-12-16 | 2013-06-20 | Friedrich-Schiller-Universität Jena | Method for 3D measurement of depth-limited objects |
US9651417B2 (en) | 2012-02-15 | 2017-05-16 | Apple Inc. | Scanning depth engine |
US10937239B2 (en) | 2012-02-23 | 2021-03-02 | Charles D. Huston | System and method for creating an environment and for sharing an event |
WO2013126784A2 (en) | 2012-02-23 | 2013-08-29 | Huston Charles D | System and method for creating an environment and for sharing a location based experience in an environment |
US10600235B2 (en) | 2012-02-23 | 2020-03-24 | Charles D. Huston | System and method for capturing and sharing a location based experience |
US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
KR101898490B1 (en) * | 2012-02-29 | 2018-09-13 | 엘지전자 주식회사 | Holographic display device and method for generating hologram using redundancy of 3-D video |
US8958911B2 (en) | 2012-02-29 | 2015-02-17 | Irobot Corporation | Mobile robot |
US11169611B2 (en) | 2012-03-26 | 2021-11-09 | Apple Inc. | Enhanced virtual touchpad |
CN103424077A (en) * | 2012-05-23 | 2013-12-04 | 联想(北京)有限公司 | Motion detection device, detection method and electronic equipment |
CN102681183B (en) * | 2012-05-25 | 2015-01-07 | 合肥鼎臣光电科技有限责任公司 | Two-way three-dimensional imaging and naked-eye three-dimensional display system based on lens array |
US10048779B2 (en) * | 2012-06-30 | 2018-08-14 | Hewlett-Packard Development Company, L.P. | Virtual hand based on combined data |
US8896594B2 (en) * | 2012-06-30 | 2014-11-25 | Microsoft Corporation | Depth sensing with depth-adaptive illumination |
CN104602608B (en) | 2012-08-27 | 2019-01-01 | 皇家飞利浦有限公司 | It is adjusted based on optics 3D scene detection and the patient-specific and automatic x-ray system of explanation |
DE102012110460A1 (en) * | 2012-10-31 | 2014-04-30 | Audi Ag | A method for entering a control command for a component of a motor vehicle |
US9661304B2 (en) * | 2012-10-31 | 2017-05-23 | Ricoh Company, Ltd. | Pre-calculation of sine waves for pixel values |
CN105027030B (en) | 2012-11-01 | 2018-10-23 | 艾卡姆有限公司 | The wireless wrist calculating connected for three-dimensional imaging, mapping, networking and interface and control device and method |
US9217665B2 (en) | 2013-01-31 | 2015-12-22 | Hewlett Packard Enterprise Development Lp | Viewing-angle imaging using lenslet array |
JP6044403B2 (en) * | 2013-03-18 | 2016-12-14 | 富士通株式会社 | Imaging apparatus, imaging method, and imaging program |
US10268885B2 (en) | 2013-04-15 | 2019-04-23 | Microsoft Technology Licensing, Llc | Extracting true color from a color and infrared sensor |
CN103268608B (en) * | 2013-05-17 | 2015-12-02 | 清华大学 | Based on depth estimation method and the device of near-infrared laser speckle |
CN105358063B (en) | 2013-06-19 | 2018-11-30 | 皇家飞利浦有限公司 | The calibration of imager with dynamic beam reshaper |
US9208566B2 (en) | 2013-08-09 | 2015-12-08 | Microsoft Technology Licensing, Llc | Speckle sensing for motion tracking |
WO2015030127A1 (en) | 2013-09-02 | 2015-03-05 | 旭硝子株式会社 | Diffraction optical element, projection device, and measurement device |
TWI485361B (en) * | 2013-09-11 | 2015-05-21 | Univ Nat Taiwan | Measuring apparatus for three-dimensional profilometry and method thereof |
KR102159996B1 (en) * | 2013-12-16 | 2020-09-25 | 삼성전자주식회사 | Event filtering device and motion recognition device thereof |
EP2894546B1 (en) * | 2014-01-13 | 2018-07-18 | Facebook Inc. | Sub-resolution optical detection |
WO2015118120A1 (en) | 2014-02-07 | 2015-08-13 | 3Shape A/S | Detecting tooth shade |
US10455212B1 (en) * | 2014-08-25 | 2019-10-22 | X Development Llc | Projected pattern motion/vibration for depth sensing |
USD733141S1 (en) | 2014-09-10 | 2015-06-30 | Faro Technologies, Inc. | Laser scanner |
US9881235B1 (en) | 2014-11-21 | 2018-01-30 | Mahmoud Narimanzadeh | System, apparatus, and method for determining physical dimensions in digital images |
US9841496B2 (en) | 2014-11-21 | 2017-12-12 | Microsoft Technology Licensing, Llc | Multiple pattern illumination optics for time of flight system |
TWI564754B (en) * | 2014-11-24 | 2017-01-01 | 圓剛科技股份有限公司 | Spatial motion sensing device and spatial motion sensing method |
CN107209960B (en) * | 2014-12-18 | 2021-01-01 | 脸谱科技有限责任公司 | System, apparatus and method for providing a user interface for a virtual reality environment |
FI126498B (en) * | 2014-12-29 | 2017-01-13 | Helmee Imaging Oy | Optical measuring system |
US9958758B2 (en) * | 2015-01-21 | 2018-05-01 | Microsoft Technology Licensing, Llc | Multiple exposure structured light pattern |
US10509147B2 (en) | 2015-01-29 | 2019-12-17 | ams Sensors Singapore Pte. Ltd | Apparatus for producing patterned illumination using arrays of light sources and lenses |
JP6575795B2 (en) | 2015-03-11 | 2019-09-18 | パナソニックIpマネジメント株式会社 | Human detection system |
US10001583B2 (en) | 2015-04-06 | 2018-06-19 | Heptagon Micro Optics Pte. Ltd. | Structured light projection using a compound patterned mask |
KR101892168B1 (en) * | 2015-05-13 | 2018-08-27 | 페이스북, 인크. | Enhancement of depth map representation using reflectivity map representation |
WO2016195684A1 (en) * | 2015-06-04 | 2016-12-08 | Siemens Healthcare Gmbh | Apparatus and methods for a projection display device on x-ray imaging devices |
JP6566768B2 (en) * | 2015-07-30 | 2019-08-28 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
US11057608B2 (en) | 2016-01-04 | 2021-07-06 | Qualcomm Incorporated | Depth map generation in structured light system |
JP6668764B2 (en) | 2016-01-13 | 2020-03-18 | セイコーエプソン株式会社 | Image recognition device, image recognition method, and image recognition unit |
JP6668763B2 (en) | 2016-01-13 | 2020-03-18 | セイコーエプソン株式会社 | Image recognition device, image recognition method, and image recognition unit |
JP6631261B2 (en) | 2016-01-14 | 2020-01-15 | セイコーエプソン株式会社 | Image recognition device, image recognition method, and image recognition unit |
US10154234B2 (en) * | 2016-03-16 | 2018-12-11 | Omnivision Technologies, Inc. | Image sensor with peripheral 3A-control sensors and associated imaging system |
KR101745651B1 (en) * | 2016-03-29 | 2017-06-09 | 전자부품연구원 | System and method for recognizing hand gesture |
US10489924B2 (en) | 2016-03-30 | 2019-11-26 | Samsung Electronics Co., Ltd. | Structured light generator and object recognition apparatus including the same |
JP6607121B2 (en) | 2016-03-30 | 2019-11-20 | セイコーエプソン株式会社 | Image recognition apparatus, image recognition method, and image recognition unit |
US10474297B2 (en) | 2016-07-20 | 2019-11-12 | Ams Sensors Singapore Pte. Ltd. | Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same |
US10241244B2 (en) | 2016-07-29 | 2019-03-26 | Lumentum Operations Llc | Thin film total internal reflection diffraction grating for single polarization or dual polarization |
US10481740B2 (en) | 2016-08-01 | 2019-11-19 | Ams Sensors Singapore Pte. Ltd. | Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same |
US10775508B1 (en) * | 2016-08-19 | 2020-09-15 | Apple Inc. | Remote sensing device |
TWI587206B (en) * | 2016-11-24 | 2017-06-11 | 財團法人工業技術研究院 | Interactive display device and system thereof |
US10499039B2 (en) | 2016-12-15 | 2019-12-03 | Egismos Technology Corporation | Path detection system and path detection method generating laser pattern by diffractive optical element |
US10158845B2 (en) | 2017-01-18 | 2018-12-18 | Facebook Technologies, Llc | Tileable structured light projection for wide field-of-view depth sensing |
US10620447B2 (en) * | 2017-01-19 | 2020-04-14 | Cognex Corporation | System and method for reduced-speckle laser line generation |
EP3615967B1 (en) * | 2017-04-24 | 2023-03-01 | Magic Leap, Inc. | Tracking optical flow of backscattered laser speckle patterns |
WO2018216575A1 (en) | 2017-05-26 | 2018-11-29 | Agc株式会社 | Diffraction optical element, projection device, and measuring device |
US11494897B2 (en) | 2017-07-07 | 2022-11-08 | William F. WILEY | Application to determine reading/working distance |
US10527711B2 (en) | 2017-07-10 | 2020-01-07 | Aurora Flight Sciences Corporation | Laser speckle system and method for an aircraft |
CN111095018B (en) | 2017-08-31 | 2022-03-29 | 深圳市大疆创新科技有限公司 | Solid state light detection and ranging (LIDAR) systems, systems and methods for improving solid state light detection and ranging (LIDAR) resolution |
JP6856784B2 (en) * | 2017-08-31 | 2021-04-14 | SZ DJI Technology Co., Ltd. | Solid-state photodetection and range-finding (LIDAR) systems, systems and methods for improving solid-state light detection and range-finding (LIDAR) resolution |
WO2019079790A1 (en) | 2017-10-21 | 2019-04-25 | Eyecam, Inc | Adaptive graphic user interfacing system |
JP6970376B2 (en) | 2017-12-01 | 2021-11-24 | オムロン株式会社 | Image processing system and image processing method |
CN110161786B (en) | 2018-02-12 | 2021-08-31 | 深圳富泰宏精密工业有限公司 | Light projection module, three-dimensional image sensing device and sensing method thereof |
CN108663800B (en) * | 2018-04-16 | 2021-03-19 | 华东交通大学 | Optical encryption and decryption method, device and system |
WO2019240010A1 (en) | 2018-06-11 | 2019-12-19 | Agc株式会社 | Diffraction optical element, projection device, and measurement device |
CN110619996B (en) * | 2018-06-20 | 2022-07-08 | 株式会社村田制作所 | Inductor and method for manufacturing the same |
US11675114B2 (en) | 2018-07-23 | 2023-06-13 | Ii-Vi Delaware, Inc. | Monolithic structured light projector |
WO2020080169A1 (en) | 2018-10-15 | 2020-04-23 | Agc株式会社 | Diffractive optical element and illumination optical system |
DE102018129143B4 (en) * | 2018-11-20 | 2021-06-17 | Carl Zeiss Industrielle Messtechnik Gmbh | Variable measurement object-dependent camera structure and calibration thereof |
WO2020136658A1 (en) * | 2018-12-28 | 2020-07-02 | Guardian Optical Technologies Ltd | Systems, devices and methods for vehicle post-crash support |
WO2020171749A1 (en) * | 2019-02-18 | 2020-08-27 | Fingerprint Cards Ab | Optical biometric imaging device and method of operating an optical biometric imaging device |
US11029408B2 (en) * | 2019-04-03 | 2021-06-08 | Varjo Technologies Oy | Distance-imaging system and method of distance imaging |
CN111650759A (en) * | 2019-12-31 | 2020-09-11 | 北京大学 | Multi-focal-length micro-lens array remote sensing light field imaging system for near-infrared light spot projection |
US20220338747A1 (en) * | 2020-01-17 | 2022-10-27 | Antishock Technologies, Ltd. | System and method for monitoring fluid management to a patient |
US11888289B2 (en) * | 2020-03-30 | 2024-01-30 | Namuga, Co., Ltd. | Light source module allowing differential control according to distance to subject and method for controlling the same |
CA3188141A1 (en) * | 2020-06-30 | 2022-01-06 | Kneedly Ab | Solution for determination of supraphysiological body joint movements |
EP3993385A1 (en) | 2020-10-29 | 2022-05-04 | Universitat de València | A multiperspective photography camera device |
CN114255233B (en) * | 2022-03-01 | 2022-05-31 | 合肥的卢深视科技有限公司 | Speckle pattern quality evaluation method and device, electronic device and storage medium |
Family Cites Families (165)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2951207A1 (en) | 1978-12-26 | 1980-07-10 | Canon Kk | METHOD FOR THE OPTICAL PRODUCTION OF A SPREADING PLATE |
US4542376A (en) | 1983-11-03 | 1985-09-17 | Burroughs Corporation | System for electronically displaying portions of several different images on a CRT screen through respective prioritized viewports |
JPS6079108U (en) * | 1983-11-08 | 1985-06-01 | オムロン株式会社 | speckle rangefinder |
JPH0762869B2 (en) | 1986-03-07 | 1995-07-05 | 日本電信電話株式会社 | Position and shape measurement method by pattern projection |
US4843568A (en) | 1986-04-11 | 1989-06-27 | Krueger Myron W | Real time perception of and response to the actions of an unencumbered participant/user |
JPH0615968B2 (en) | 1986-08-11 | 1994-03-02 | 伍良 松本 | Three-dimensional shape measuring device |
JP2714152B2 (en) * | 1989-06-28 | 1998-02-16 | 古野電気株式会社 | Object shape measurement method |
US5075562A (en) | 1990-09-20 | 1991-12-24 | Eastman Kodak Company | Method and apparatus for absolute Moire distance measurements using a grating printed on or attached to a surface |
GB9116151D0 (en) | 1991-07-26 | 1991-09-11 | Isis Innovation | Three-dimensional vision system |
US5483261A (en) | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
DE69226512T2 (en) | 1992-03-12 | 1999-04-22 | International Business Machines Corp., Armonk, N.Y. | Image processing method |
US5636025A (en) | 1992-04-23 | 1997-06-03 | Medar, Inc. | System for optically measuring the surface contour of a part using more fringe techniques |
JP3353365B2 (en) * | 1993-03-18 | 2002-12-03 | 静岡大学長 | Displacement and displacement velocity measuring device |
US5856871A (en) | 1993-08-18 | 1999-01-05 | Applied Spectral Imaging Ltd. | Film thickness mapping using interferometric spectral imaging |
WO1996007939A1 (en) * | 1994-09-05 | 1996-03-14 | Mikoh Technology Limited | Diffraction surfaces and methods for the manufacture thereof |
US6041140A (en) | 1994-10-04 | 2000-03-21 | Synthonics, Incorporated | Apparatus for interactive image correlation for three dimensional image production |
JPH08186845A (en) | 1994-12-27 | 1996-07-16 | Nobuaki Yanagisawa | Focal distance controlling stereoscopic-vision television receiver |
US5630043A (en) | 1995-05-11 | 1997-05-13 | Cirrus Logic, Inc. | Animated texture map apparatus and method for 3-D image displays |
IL114278A (en) | 1995-06-22 | 2010-06-16 | Microsoft Internat Holdings B | Camera and method |
AU728407B2 (en) | 1995-07-18 | 2001-01-11 | Budd Company, The | Moire interferometry system and method with extended imaging depth |
JPH09261535A (en) | 1996-03-25 | 1997-10-03 | Sharp Corp | Image pickup device |
DE19638727A1 (en) | 1996-09-12 | 1998-03-19 | Ruedger Dipl Ing Rubbert | Method for increasing the significance of the three-dimensional measurement of objects |
JP3402138B2 (en) | 1996-09-27 | 2003-04-28 | 株式会社日立製作所 | Liquid crystal display |
IL119341A (en) | 1996-10-02 | 1999-09-22 | Univ Ramot | Phase-only filter for generating an arbitrary illumination pattern |
IL119831A (en) | 1996-12-15 | 2002-12-01 | Cognitens Ltd | Apparatus and method for 3d surface geometry reconstruction |
CA2275411A1 (en) * | 1996-12-20 | 1998-07-02 | Lifef/X Networks, Inc. | Apparatus and method for rapid 3d image parametrization |
US5838428A (en) | 1997-02-28 | 1998-11-17 | United States Of America As Represented By The Secretary Of The Navy | System and method for high resolution range imaging with split light source and pattern mask |
JPH10327433A (en) | 1997-05-23 | 1998-12-08 | Minolta Co Ltd | Display device for composted image |
US6008813A (en) | 1997-08-01 | 1999-12-28 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | Real-time PC based volume rendering system |
DE19736169A1 (en) | 1997-08-20 | 1999-04-15 | Fhu Hochschule Fuer Technik | Method to measure deformation or vibration using electronic speckle pattern interferometry |
US6101269A (en) | 1997-12-19 | 2000-08-08 | Lifef/X Networks, Inc. | Apparatus and method for rapid 3D image parametrization |
US6438272B1 (en) | 1997-12-31 | 2002-08-20 | The Research Foundation Of State University Of Ny | Method and apparatus for three dimensional surface contouring using a digital video projection system |
DE19815201A1 (en) | 1998-04-04 | 1999-10-07 | Link Johann & Ernst Gmbh & Co | Measuring arrangement for detecting dimensions of test specimens, preferably of hollow bodies, in particular of bores in workpieces, and methods for measuring such dimensions |
US6731391B1 (en) | 1998-05-13 | 2004-05-04 | The Research Foundation Of State University Of New York | Shadow moire surface measurement using Talbot effect |
DE19821611A1 (en) | 1998-05-14 | 1999-11-18 | Syrinx Med Tech Gmbh | Recording method for spatial structure of three-dimensional surface, e.g. for person recognition |
GB2352901A (en) | 1999-05-12 | 2001-02-07 | Tricorder Technology Plc | Rendering three dimensional representations utilising projected light patterns |
US6377700B1 (en) | 1998-06-30 | 2002-04-23 | Intel Corporation | Method and apparatus for capturing stereoscopic images using image sensors |
JP3678022B2 (en) | 1998-10-23 | 2005-08-03 | コニカミノルタセンシング株式会社 | 3D input device |
US6084712A (en) | 1998-11-03 | 2000-07-04 | Dynamic Measurement And Inspection,Llc | Three dimensional imaging using a refractive optic design |
US8965898B2 (en) | 1998-11-20 | 2015-02-24 | Intheplay, Inc. | Optimizations for live event, real-time, 3D object tracking |
US6759646B1 (en) | 1998-11-24 | 2004-07-06 | Intel Corporation | Color interpolation for a four color mosaic pattern |
JP2001166810A (en) | 1999-02-19 | 2001-06-22 | Sanyo Electric Co Ltd | Device and method for providing solid model |
CN2364507Y (en) * | 1999-03-18 | 2000-02-16 | 香港生产力促进局 | Small non-contact symmetric imput type three-D profile scanning head |
US6259561B1 (en) | 1999-03-26 | 2001-07-10 | The University Of Rochester | Optical system for diffusing light |
CA2373284A1 (en) * | 1999-05-14 | 2000-11-23 | 3D Metrics, Incorporated | Color structured light 3d-imaging system |
US6751344B1 (en) | 1999-05-28 | 2004-06-15 | Champion Orthotic Investments, Inc. | Enhanced projector system for machine vision |
US6512385B1 (en) | 1999-07-26 | 2003-01-28 | Paul Pfaff | Method for testing a device under test including the interference of two beams |
US6268923B1 (en) | 1999-10-07 | 2001-07-31 | Integral Vision, Inc. | Optical method and system for measuring three-dimensional surface topography of an object having a surface contour |
JP2001141430A (en) | 1999-11-16 | 2001-05-25 | Fuji Photo Film Co Ltd | Image pickup device and image processing device |
US6301059B1 (en) | 2000-01-07 | 2001-10-09 | Lucent Technologies Inc. | Astigmatic compensation for an anamorphic optical system |
US6937348B2 (en) | 2000-01-28 | 2005-08-30 | Genex Technologies, Inc. | Method and apparatus for generating structural pattern illumination |
US6700669B1 (en) | 2000-01-28 | 2004-03-02 | Zheng J. Geng | Method and system for three-dimensional imaging using light pattern having multiple sub-patterns |
JP4560869B2 (en) | 2000-02-07 | 2010-10-13 | ソニー株式会社 | Glasses-free display system and backlight system |
JP4265076B2 (en) | 2000-03-31 | 2009-05-20 | 沖電気工業株式会社 | Multi-angle camera and automatic photographing device |
KR100355718B1 (en) | 2000-06-10 | 2002-10-11 | 주식회사 메디슨 | System and method for 3-d ultrasound imaging using an steerable probe |
US6810135B1 (en) | 2000-06-29 | 2004-10-26 | Trw Inc. | Optimized human presence detection through elimination of background interference |
TW527518B (en) | 2000-07-14 | 2003-04-11 | Massachusetts Inst Technology | Method and system for high resolution, ultra fast, 3-D imaging |
US7227526B2 (en) | 2000-07-24 | 2007-06-05 | Gesturetek, Inc. | Video-based image control system |
US6686921B1 (en) | 2000-08-01 | 2004-02-03 | International Business Machines Corporation | Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object |
US6754370B1 (en) | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
US6639684B1 (en) | 2000-09-13 | 2003-10-28 | Nextengine, Inc. | Digitizer using intensity gradient to image features of three-dimensional objects |
US6813440B1 (en) | 2000-10-10 | 2004-11-02 | The Hong Kong Polytechnic University | Body scanner |
JP3689720B2 (en) | 2000-10-16 | 2005-08-31 | 住友大阪セメント株式会社 | 3D shape measuring device |
JP2002152776A (en) | 2000-11-09 | 2002-05-24 | Nippon Telegr & Teleph Corp <Ntt> | Method and device for encoding and decoding distance image |
JP2002191058A (en) | 2000-12-20 | 2002-07-05 | Olympus Optical Co Ltd | Three-dimensional image acquisition device and three- dimensional image acquisition method |
JP2002213931A (en) | 2001-01-17 | 2002-07-31 | Fuji Xerox Co Ltd | Instrument and method for measuring three-dimensional shape |
US6841780B2 (en) | 2001-01-19 | 2005-01-11 | Honeywell International Inc. | Method and apparatus for detecting objects |
JP2002365023A (en) | 2001-06-08 | 2002-12-18 | Koji Okamoto | Apparatus and method for measurement of liquid level |
AU2002354681A1 (en) | 2001-07-13 | 2003-01-29 | Mems Optical, Inc. | Autosteroscopic display with rotated microlens-array and method of displaying multidimensional images, especially color images |
US6741251B2 (en) | 2001-08-16 | 2004-05-25 | Hewlett-Packard Development Company, L.P. | Method and apparatus for varying focus in a scene |
US7340077B2 (en) | 2002-02-15 | 2008-03-04 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
US7369685B2 (en) | 2002-04-05 | 2008-05-06 | Identix Corporation | Vision-based operating method and system |
US7811825B2 (en) | 2002-04-19 | 2010-10-12 | University Of Washington | System and method for processing specimens and images for optical tomography |
US7385708B2 (en) | 2002-06-07 | 2008-06-10 | The University Of North Carolina At Chapel Hill | Methods and systems for laser based real-time structured light depth extraction |
US7006709B2 (en) | 2002-06-15 | 2006-02-28 | Microsoft Corporation | System and method deghosting mosaics using multiperspective plane sweep |
US20040001145A1 (en) | 2002-06-27 | 2004-01-01 | Abbate Jeffrey A. | Method and apparatus for multifield image generation and processing |
US6859326B2 (en) | 2002-09-20 | 2005-02-22 | Corning Incorporated | Random microlens array for optical beam shaping and homogenization |
KR100624405B1 (en) | 2002-10-01 | 2006-09-18 | 삼성전자주식회사 | Substrate for mounting optical component and method for producing the same |
US7194105B2 (en) | 2002-10-16 | 2007-03-20 | Hersch Roger D | Authentication of documents and articles by moiré patterns |
TWI291040B (en) | 2002-11-21 | 2007-12-11 | Solvision Inc | Fast 3D height measurement method and system |
US7103212B2 (en) | 2002-11-22 | 2006-09-05 | Strider Labs, Inc. | Acquisition of three-dimensional images by an active stereo technique using locally unique patterns |
US20040174770A1 (en) | 2002-11-27 | 2004-09-09 | Rees Frank L. | Gauss-Rees parametric ultrawideband system |
US7639419B2 (en) | 2003-02-21 | 2009-12-29 | Kla-Tencor Technologies, Inc. | Inspection system using small catadioptric objective |
US20040213463A1 (en) | 2003-04-22 | 2004-10-28 | Morrison Rick Lee | Multiplexed, spatially encoded illumination system for determining imaging and range estimation |
US7539340B2 (en) | 2003-04-25 | 2009-05-26 | Topcon Corporation | Apparatus and method for three-dimensional coordinate measurement |
ES2313036T3 (en) | 2003-07-24 | 2009-03-01 | Cognitens Ltd. | PROCEDURE AND SYSTEM FOR THE RECONSTRUCTION OF THE THREE-DIMENSIONAL SURFACE OF AN OBJECT. |
CA2435935A1 (en) | 2003-07-24 | 2005-01-24 | Guylain Lemelin | Optical 3d digitizer with enlarged non-ambiguity zone |
US20050111705A1 (en) | 2003-08-26 | 2005-05-26 | Roman Waupotitsch | Passive stereo sensing for 3D facial shape biometrics |
US6934018B2 (en) * | 2003-09-10 | 2005-08-23 | Shearographics, Llc | Tire inspection apparatus and method |
US7187437B2 (en) * | 2003-09-10 | 2007-03-06 | Shearographics, Llc | Plurality of light sources for inspection apparatus and method |
US7874917B2 (en) | 2003-09-15 | 2011-01-25 | Sony Computer Entertainment Inc. | Methods and systems for enabling depth and direction detection when interfacing with a computer program |
US7112774B2 (en) | 2003-10-09 | 2006-09-26 | Avago Technologies Sensor Ip (Singapore) Pte. Ltd | CMOS stereo imaging system and method |
US20050135555A1 (en) | 2003-12-23 | 2005-06-23 | Claus Bernhard Erich H. | Method and system for simultaneously viewing rendered volumes |
US7250949B2 (en) | 2003-12-23 | 2007-07-31 | General Electric Company | Method and system for visualizing three-dimensional data |
US8134637B2 (en) | 2004-01-28 | 2012-03-13 | Microsoft Corporation | Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing |
US7961909B2 (en) | 2006-03-08 | 2011-06-14 | Electronic Scripting Products, Inc. | Computer interface employing a manipulated object with absolute pose detection component and a display |
WO2005076198A1 (en) | 2004-02-09 | 2005-08-18 | Cheol-Gwon Kang | Device for measuring 3d shape using irregular pattern and method for the same |
US7427981B2 (en) | 2004-04-15 | 2008-09-23 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Optical device that measures distance between the device and a surface |
US7308112B2 (en) | 2004-05-14 | 2007-12-11 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
WO2006008637A1 (en) | 2004-07-23 | 2006-01-26 | Ge Healthcare Niagara, Inc. | Method and apparatus for fluorescent confocal microscopy |
US20060017656A1 (en) | 2004-07-26 | 2006-01-26 | Visteon Global Technologies, Inc. | Image intensity control in overland night vision systems |
US8114172B2 (en) | 2004-07-30 | 2012-02-14 | Extreme Reality Ltd. | System and method for 3D space-dimension based image processing |
US7120228B2 (en) | 2004-09-21 | 2006-10-10 | Jordan Valley Applied Radiation Ltd. | Combined X-ray reflectometer and diffractometer |
JP2006128818A (en) | 2004-10-26 | 2006-05-18 | Victor Co Of Japan Ltd | Recording program and reproducing program corresponding to stereoscopic video and 3d audio, recording apparatus, reproducing apparatus and recording medium |
IL165212A (en) | 2004-11-15 | 2012-05-31 | Elbit Systems Electro Optics Elop Ltd | Device for scanning light |
US7076024B2 (en) | 2004-12-01 | 2006-07-11 | Jordan Valley Applied Radiation, Ltd. | X-ray apparatus with dual monochromators |
US20060156756A1 (en) | 2005-01-20 | 2006-07-20 | Becke Paul E | Phase change and insulating properties container and method of use |
US20060221218A1 (en) | 2005-04-05 | 2006-10-05 | Doron Adler | Image sensor with improved color filter |
WO2006107955A1 (en) | 2005-04-06 | 2006-10-12 | Dimensional Photonics International, Inc. | Multiple channel interferometric surface contour measurement system |
US7560679B1 (en) | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
US7609875B2 (en) | 2005-05-27 | 2009-10-27 | Orametrix, Inc. | Scanner system and method for mapping surface of three-dimensional object |
EP1934945A4 (en) | 2005-10-11 | 2016-01-20 | Apple Inc | Method and system for object reconstruction |
US20110096182A1 (en) | 2009-10-25 | 2011-04-28 | Prime Sense Ltd | Error Compensation in Three-Dimensional Mapping |
US8018579B1 (en) | 2005-10-21 | 2011-09-13 | Apple Inc. | Three-dimensional imaging and display system |
CA2628611A1 (en) | 2005-11-04 | 2007-05-18 | Clean Earth Technologies, Llc | Tracking using an elastic cluster of trackers |
US7856125B2 (en) | 2006-01-31 | 2010-12-21 | University Of Southern California | 3D face reconstruction from 2D images |
WO2007096893A2 (en) | 2006-02-27 | 2007-08-30 | Prime Sense Ltd. | Range mapping using speckle decorrelation |
JP5592070B2 (en) | 2006-03-14 | 2014-09-17 | プライム センス リミティド | Light field that changes depth for 3D detection |
KR101331543B1 (en) | 2006-03-14 | 2013-11-20 | 프라임센스 엘티디. | Three-dimensional sensing using speckle patterns |
CN101957994B (en) | 2006-03-14 | 2014-03-19 | 普莱姆传感有限公司 | Depth-varying light fields for three dimensional sensing |
US7869649B2 (en) | 2006-05-08 | 2011-01-11 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
US8488895B2 (en) | 2006-05-31 | 2013-07-16 | Indiana University Research And Technology Corp. | Laser scanning digital camera with pupil periphery illumination and potential for multiply scattered light imaging |
US8139142B2 (en) | 2006-06-01 | 2012-03-20 | Microsoft Corporation | Video manipulation of red, green, blue, distance (RGB-Z) data including segmentation, up-sampling, and background substitution techniques |
US8411149B2 (en) | 2006-08-03 | 2013-04-02 | Alterface S.A. | Method and device for identifying and extracting images of multiple users, and for recognizing user gestures |
US7737394B2 (en) | 2006-08-31 | 2010-06-15 | Micron Technology, Inc. | Ambient infrared detection in solid state sensors |
CN101512601B (en) | 2006-09-04 | 2013-07-31 | 皇家飞利浦电子股份有限公司 | Method for determining a depth map from images, device for determining a depth map |
US7256899B1 (en) | 2006-10-04 | 2007-08-14 | Ivan Faul | Wireless methods and systems for three-dimensional non-contact shape sensing |
WO2008061259A2 (en) | 2006-11-17 | 2008-05-22 | Celloptic, Inc. | System, apparatus and method for extracting three-dimensional information of an object from received electromagnetic radiation |
US8090194B2 (en) | 2006-11-21 | 2012-01-03 | Mantis Vision Ltd. | 3D geometric modeling and motion capture using both single and dual imaging |
US7990545B2 (en) | 2006-12-27 | 2011-08-02 | Cambridge Research & Instrumentation, Inc. | Surface measurement of in-vivo subjects using spot projector |
US7840031B2 (en) | 2007-01-12 | 2010-11-23 | International Business Machines Corporation | Tracking a range of body movement based on 3D captured image streams of a user |
WO2008087652A2 (en) | 2007-01-21 | 2008-07-24 | Prime Sense Ltd. | Depth mapping using multi-beam illumination |
US7894078B2 (en) | 2007-04-23 | 2011-02-22 | California Institute Of Technology | Single-lens 3-D imaging device using a polarization-coded aperture masks combined with a polarization-sensitive sensor |
US20080212835A1 (en) | 2007-03-01 | 2008-09-04 | Amon Tavor | Object Tracking by 3-Dimensional Modeling |
US8493496B2 (en) | 2007-04-02 | 2013-07-23 | Primesense Ltd. | Depth mapping using projected patterns |
US8150142B2 (en) | 2007-04-02 | 2012-04-03 | Prime Sense Ltd. | Depth mapping using projected patterns |
US8488868B2 (en) | 2007-04-03 | 2013-07-16 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry, Through The Communications Research Centre Canada | Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images |
US7835561B2 (en) | 2007-05-18 | 2010-11-16 | Visiongate, Inc. | Method for image processing and reconstruction of images for optical tomography |
WO2008155770A2 (en) | 2007-06-19 | 2008-12-24 | Prime Sense Ltd. | Distance-varying illumination and imaging techniques for depth mapping |
WO2009008864A1 (en) | 2007-07-12 | 2009-01-15 | Thomson Licensing | System and method for three-dimensional object reconstruction from two-dimensional images |
JP4412362B2 (en) | 2007-07-18 | 2010-02-10 | 船井電機株式会社 | Compound eye imaging device |
US20090060307A1 (en) | 2007-08-27 | 2009-03-05 | Siemens Medical Solutions Usa, Inc. | Tensor Voting System and Method |
DE102007045332B4 (en) | 2007-09-17 | 2019-01-17 | Seereal Technologies S.A. | Holographic display for reconstructing a scene |
KR100858034B1 (en) | 2007-10-18 | 2008-09-10 | (주)실리콘화일 | One chip image sensor for measuring vitality of subject |
US8166421B2 (en) | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
US8176497B2 (en) | 2008-01-16 | 2012-05-08 | Dell Products, Lp | Method to dynamically provision additional computer resources to handle peak database workloads |
US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
EP2235584B1 (en) | 2008-01-21 | 2020-09-16 | Apple Inc. | Optical designs for zero order reduction |
DE102008011350A1 (en) | 2008-02-27 | 2009-09-03 | Loeffler Technology Gmbh | Apparatus and method for real-time detection of electromagnetic THz radiation |
US8121351B2 (en) | 2008-03-09 | 2012-02-21 | Microsoft International Holdings B.V. | Identification of objects in a 3D video using non/over reflective clothing |
US8035806B2 (en) | 2008-05-13 | 2011-10-11 | Samsung Electronics Co., Ltd. | Distance measuring sensor including double transfer gate and three dimensional color image sensor including the distance measuring sensor |
US8456517B2 (en) | 2008-07-09 | 2013-06-04 | Primesense Ltd. | Integrated processor for 3D mapping |
US8462207B2 (en) | 2009-02-12 | 2013-06-11 | Primesense Ltd. | Depth ranging with Moiré patterns |
US8786682B2 (en) | 2009-03-05 | 2014-07-22 | Primesense Ltd. | Reference image techniques for three-dimensional sensing |
US8717417B2 (en) | 2009-04-16 | 2014-05-06 | Primesense Ltd. | Three-dimensional mapping and imaging |
US8503720B2 (en) | 2009-05-01 | 2013-08-06 | Microsoft Corporation | Human body pose estimation |
US8744121B2 (en) | 2009-05-29 | 2014-06-03 | Microsoft Corporation | Device for identifying and tracking multiple humans over time |
EP2275990B1 (en) | 2009-07-06 | 2012-09-26 | Sick Ag | 3D sensor |
US9582889B2 (en) | 2009-07-30 | 2017-02-28 | Apple Inc. | Depth mapping based on pattern matching and stereoscopic information |
US8773514B2 (en) | 2009-08-27 | 2014-07-08 | California Institute Of Technology | Accurate 3D object reconstruction using a handheld device with a projected light pattern |
US8830227B2 (en) | 2009-12-06 | 2014-09-09 | Primesense Ltd. | Depth-based gain control |
US8320621B2 (en) | 2009-12-21 | 2012-11-27 | Microsoft Corporation | Depth projector system with integrated VCSEL array |
US8982182B2 (en) | 2010-03-01 | 2015-03-17 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
US8330804B2 (en) | 2010-05-12 | 2012-12-11 | Microsoft Corporation | Scanned-beam depth mapping to 2D image |
US8654152B2 (en) | 2010-06-21 | 2014-02-18 | Microsoft Corporation | Compartmentalizing focus area within field of view |
2007
- 2007-03-08 KR KR1020087025030A patent/KR101331543B1/en active IP Right Grant
- 2007-03-08 JP JP2008558981A patent/JP5174684B2/en active Active
- 2007-03-08 US US12/282,517 patent/US8390821B2/en active Active
- 2007-03-08 WO PCT/IL2007/000306 patent/WO2007105205A2/en active Application Filing
- 2007-03-08 CN CN2007800166255A patent/CN101496033B/en active Active
2013
- 2013-01-24 US US13/748,617 patent/US9063283B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050200925A1 (en) * | 1999-12-10 | 2005-09-15 | Xyz Imaging, Inc. | Holographic printer |
US20040228519A1 (en) * | 2003-03-10 | 2004-11-18 | Cranial Technologies, Inc. | Automatic selection of cranial remodeling device trim lines |
Cited By (111)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8050461B2 (en) | 2005-10-11 | 2011-11-01 | Primesense Ltd. | Depth-varying light fields for three dimensional sensing |
US9330324B2 (en) * | 2005-10-11 | 2016-05-03 | Apple Inc. | Error compensation in three-dimensional mapping |
US8400494B2 (en) | 2005-10-11 | 2013-03-19 | Primesense Ltd. | Method and system for object reconstruction |
US8390821B2 (en) | 2005-10-11 | 2013-03-05 | Primesense Ltd. | Three-dimensional sensing using speckle patterns |
US20120281240A1 (en) * | 2005-10-11 | 2012-11-08 | Primesense Ltd. | Error Compensation in Three-Dimensional Mapping |
US8249334B2 (en) | 2006-05-11 | 2012-08-21 | Primesense Ltd. | Modeling of humanoid forms from depth maps |
US8350847B2 (en) | 2007-01-21 | 2013-01-08 | Primesense Ltd | Depth mapping using multi-beam illumination |
US8150142B2 (en) | 2007-04-02 | 2012-04-03 | Prime Sense Ltd. | Depth mapping using projected patterns |
US8493496B2 (en) | 2007-04-02 | 2013-07-23 | Primesense Ltd. | Depth mapping using projected patterns |
US8494252B2 (en) | 2007-06-19 | 2013-07-23 | Primesense Ltd. | Depth mapping using optical elements having non-uniform focal characteristics |
WO2009074751A3 (en) * | 2007-09-28 | 2009-08-06 | Noomeo | Method for three-dimensional digitisation |
WO2009074751A2 (en) * | 2007-09-28 | 2009-06-18 | Noomeo | Method for three-dimensional digitisation |
US8483477B2 (en) | 2007-09-28 | 2013-07-09 | Noomeo | Method of constructing a digital image of a three-dimensional (3D) surface using a mask |
FR2921719A1 (en) * | 2007-09-28 | 2009-04-03 | Noomeo Soc Par Actions Simplif | Physical object e.g. apple, three-dimensional surface's synthesis image creating method for e.g. industrial farm, involves calculating depth coordinate measured at axis for each point of dusty seeds from projection of point on surface |
US20140043230A1 (en) * | 2008-01-14 | 2014-02-13 | Primesense Ltd. | Three-Dimensional User Interface Session Control |
US20160349853A1 (en) * | 2008-01-14 | 2016-12-01 | Apple Inc. | Three dimensional user interface session control using depth sensors |
US9035876B2 (en) * | 2008-01-14 | 2015-05-19 | Apple Inc. | Three-dimensional user interface session control |
US8166421B2 (en) | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
US9829988B2 (en) * | 2008-01-14 | 2017-11-28 | Apple Inc. | Three dimensional user interface session control using depth sensors |
US8630039B2 (en) | 2008-01-21 | 2014-01-14 | Primesense Ltd. | Optical designs for zero order reduction |
US20130120841A1 (en) * | 2008-01-21 | 2013-05-16 | Primesense Ltd. | Optical pattern projection |
US20170116757A1 (en) * | 2008-01-21 | 2017-04-27 | Apple Inc. | Optical pattern projection |
US9239467B2 (en) | 2008-01-21 | 2016-01-19 | Apple Inc. | Optical pattern projection |
US10148941B2 (en) * | 2008-01-21 | 2018-12-04 | Apple Inc. | Optical pattern projection |
US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
US8456517B2 (en) | 2008-07-09 | 2013-06-04 | Primesense Ltd. | Integrated processor for 3D mapping |
EP2342892B1 (en) * | 2008-09-26 | 2022-05-25 | The Face Recognition Company Limited | Image recognition |
EP2342892A1 (en) * | 2008-09-26 | 2011-07-13 | Cybula Ltd | Image recognition |
WO2010072912A1 (en) | 2008-12-22 | 2010-07-01 | Noomeo | Device for three-dimensional scanning with dense reconstruction |
FR2940423A1 (en) * | 2008-12-22 | 2010-06-25 | Noomeo | DENSE RECONSTRUCTION THREE-DIMENSIONAL SCANNING DEVICE |
US8462207B2 (en) | 2009-02-12 | 2013-06-11 | Primesense Ltd. | Depth ranging with Moiré patterns |
US8786682B2 (en) | 2009-03-05 | 2014-07-22 | Primesense Ltd. | Reference image techniques for three-dimensional sensing |
US8717417B2 (en) | 2009-04-16 | 2014-05-06 | Primesense Ltd. | Three-dimensional mapping and imaging |
US9582889B2 (en) | 2009-07-30 | 2017-02-28 | Apple Inc. | Depth mapping based on pattern matching and stereoscopic information |
US8565479B2 (en) | 2009-08-13 | 2013-10-22 | Primesense Ltd. | Extraction of skeletons from 3D maps |
US20110096182A1 (en) * | 2009-10-25 | 2011-04-28 | Prime Sense Ltd | Error Compensation in Three-Dimensional Mapping |
US8492696B2 (en) | 2009-11-15 | 2013-07-23 | Primesense Ltd. | Optical projector with beam monitor including mapping apparatus capturing image of pattern projected onto an object |
US8829406B2 (en) | 2009-11-15 | 2014-09-09 | Primesense Ltd. | Optical projector with beam monitor including sensing intensity of beam pattern not projected toward an object |
US8830227B2 (en) | 2009-12-06 | 2014-09-09 | Primesense Ltd. | Depth-based gain control |
US9736459B2 (en) | 2010-02-02 | 2017-08-15 | Apple Inc. | Generation of patterned radiation |
EP2363686A1 (en) | 2010-02-02 | 2011-09-07 | Primesense Ltd. | Optical apparatus, an imaging system and a method for producing a photonics module |
US8786757B2 (en) | 2010-02-23 | 2014-07-22 | Primesense Ltd. | Wideband ambient light rejection |
US8787663B2 (en) | 2010-03-01 | 2014-07-22 | Primesense Ltd. | Tracking body parts by combined color image and depth processing |
US8824737B2 (en) | 2010-05-31 | 2014-09-02 | Primesense Ltd. | Identifying components of a humanoid form in three-dimensional scenes |
US8781217B2 (en) | 2010-05-31 | 2014-07-15 | Primesense Ltd. | Analysis of three-dimensional scenes with a surface model |
US8594425B2 (en) | 2010-05-31 | 2013-11-26 | Primesense Ltd. | Analysis of three-dimensional scenes |
US9477018B2 (en) | 2010-08-06 | 2016-10-25 | Asahi Glass Company, Limited | Diffractive optical element and measurement device |
US8599484B2 (en) | 2010-08-10 | 2013-12-03 | Asahi Glass Company, Limited | Diffractive optical element and measuring device |
US9036158B2 (en) | 2010-08-11 | 2015-05-19 | Apple Inc. | Pattern projector |
US9098931B2 (en) | 2010-08-11 | 2015-08-04 | Apple Inc. | Scanning projectors and image capture modules for 3D mapping |
US9348111B2 (en) | 2010-08-24 | 2016-05-24 | Apple Inc. | Automatic detection of lens deviations |
US8582867B2 (en) | 2010-09-16 | 2013-11-12 | Primesense Ltd | Learning-based pose estimation from depth maps |
US8959013B2 (en) | 2010-09-27 | 2015-02-17 | Apple Inc. | Virtual keyboard for a non-tactile three dimensional user interface |
US8995057B2 (en) | 2010-11-02 | 2015-03-31 | Asahi Glass Company, Limited | Diffractive optical element and measurement instrument |
US9066087B2 (en) | 2010-11-19 | 2015-06-23 | Apple Inc. | Depth mapping using time-coded illumination |
US9167138B2 (en) | 2010-12-06 | 2015-10-20 | Apple Inc. | Pattern projection and imaging using lens arrays |
US9131136B2 (en) | 2010-12-06 | 2015-09-08 | Apple Inc. | Lens arrays for pattern projection and imaging |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
EP2466560A1 (en) | 2010-12-20 | 2012-06-20 | Axis AB | Method and system for monitoring the accessibility of an emergency exit |
US8717488B2 (en) | 2011-01-18 | 2014-05-06 | Primesense Ltd. | Objective optics with interference filter |
EP3527121A1 (en) | 2011-02-09 | 2019-08-21 | Apple Inc. | Gesture detection in a 3d mapping environment |
US9052512B2 (en) | 2011-03-03 | 2015-06-09 | Asahi Glass Company, Limited | Diffractive optical element and measuring apparatus |
US9030529B2 (en) | 2011-04-14 | 2015-05-12 | Industrial Technology Research Institute | Depth image acquiring device, system and method |
EP2530442A1 (en) | 2011-05-30 | 2012-12-05 | Axis AB | Methods and apparatus for thermographic measurements. |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US8908277B2 (en) | 2011-08-09 | 2014-12-09 | Apple Inc | Lens array projector |
US8749796B2 (en) | 2011-08-09 | 2014-06-10 | Primesense Ltd. | Projectors of structured light |
US9372546B2 (en) | 2011-08-12 | 2016-06-21 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US8971572B1 (en) | 2011-08-12 | 2015-03-03 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9218063B2 (en) | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
US9122311B2 (en) | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
US9002099B2 (en) | 2011-09-11 | 2015-04-07 | Apple Inc. | Learning-based estimation of hand and finger pose |
WO2013038089A1 (en) | 2011-09-16 | 2013-03-21 | Prynel | Method and system for acquiring and processing images for the detection of motion |
EP2611169A1 (en) | 2011-12-27 | 2013-07-03 | Thomson Licensing | Device for the acquisition of stereoscopic images |
EP2611171A1 (en) | 2011-12-27 | 2013-07-03 | Thomson Licensing | Device for acquiring stereoscopic images |
US9595156B2 (en) | 2012-01-23 | 2017-03-14 | Novomatic Ag | Prize wheel with gesture-based control |
US9201237B2 (en) | 2012-03-22 | 2015-12-01 | Apple Inc. | Diffraction-based sensing of mirror position |
US9047507B2 (en) | 2012-05-02 | 2015-06-02 | Apple Inc. | Upper-body skeleton extraction from depth maps |
US9019267B2 (en) | 2012-10-30 | 2015-04-28 | Apple Inc. | Depth mapping with enhanced resolution |
US9152234B2 (en) | 2012-12-02 | 2015-10-06 | Apple Inc. | Detecting user intent to remove a pluggable peripheral device |
EP3666067A1 (en) | 2013-01-31 | 2020-06-17 | Lely Patent N.V. | Camera system, animal related system therewith, and method to create 3d camera images |
JP2019144261A (en) * | 2013-06-06 | 2019-08-29 | Heptagon Micro Optics Pte. Ltd. | Imaging system and method for making it operate |
US9825425B2 (en) | 2013-06-19 | 2017-11-21 | Apple Inc. | Integrated structured-light projector comprising light-emitting elements on a substrate |
US9528906B1 (en) | 2013-12-19 | 2016-12-27 | Apple Inc. | Monitoring DOE performance using total internal reflection |
US9514378B2 (en) | 2014-03-25 | 2016-12-06 | Massachusetts Institute Of Technology | Space-time modulated active 3D imager |
US10349037B2 (en) | 2014-04-03 | 2019-07-09 | Ams Sensors Singapore Pte. Ltd. | Structured-stereo imaging assembly including separate imagers for different wavelengths |
US11262233B2 (en) | 2014-12-27 | 2022-03-01 | Guardian Optical Technologies, Ltd. | System and method for detecting surface vibrations |
US10186034B2 (en) | 2015-01-20 | 2019-01-22 | Ricoh Company, Ltd. | Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium |
US10621694B2 (en) | 2015-01-20 | 2020-04-14 | Ricoh Company, Ltd. | Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium |
EP3048581A1 (en) * | 2015-01-20 | 2016-07-27 | Ricoh Company, Ltd. | Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium |
US9817159B2 (en) | 2015-01-31 | 2017-11-14 | Microsoft Technology Licensing, Llc | Structured light pattern generation |
CN107430773A (en) * | 2015-03-20 | 2017-12-01 | 高通股份有限公司 | Strengthen the system and method for the depth map retrieval of mobile object using active detection technology |
US9525863B2 (en) | 2015-04-29 | 2016-12-20 | Apple Inc. | Time-of-flight depth mapping with flexible scan pattern |
US10012831B2 (en) | 2015-08-03 | 2018-07-03 | Apple Inc. | Optical monitoring of scan parameters |
US10043279B1 (en) | 2015-12-07 | 2018-08-07 | Apple Inc. | Robust detection and classification of body parts in a depth map |
US10073004B2 (en) | 2016-09-19 | 2018-09-11 | Apple Inc. | DOE defect monitoring utilizing total internal reflection |
US10366278B2 (en) | 2016-09-20 | 2019-07-30 | Apple Inc. | Curvature-based face detector |
US10545457B2 (en) | 2017-12-05 | 2020-01-28 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element and conjugate images |
US10310281B1 (en) | 2017-12-05 | 2019-06-04 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element |
US10317684B1 (en) | 2018-01-24 | 2019-06-11 | K Laser Technology, Inc. | Optical projector with on axis hologram and multiple beam splitter |
US11422292B1 (en) | 2018-06-10 | 2022-08-23 | Apple Inc. | Super-blazed diffractive optical elements with sub-wavelength structures |
CN109541875A (en) * | 2018-11-24 | 2019-03-29 | 深圳阜时科技有限公司 | A kind of light-source structure, optical projection mould group, sensing device and equipment |
CN109541875B (en) * | 2018-11-24 | 2024-02-13 | 深圳阜时科技有限公司 | Light source structure, optical projection module, sensing device and equipment |
US10509128B1 (en) | 2019-04-12 | 2019-12-17 | K Laser Technology, Inc. | Programmable pattern optical projector for depth detection |
US11681019B2 (en) | 2019-09-18 | 2023-06-20 | Apple Inc. | Optical module with stray light baffle |
US11506762B1 (en) | 2019-09-24 | 2022-11-22 | Apple Inc. | Optical module comprising an optical waveguide with reference light path |
GB2589121A (en) * | 2019-11-21 | 2021-05-26 | Bae Systems Plc | Imaging apparatus |
US11754767B1 (en) | 2020-03-05 | 2023-09-12 | Apple Inc. | Display with overlaid waveguide |
US12111421B2 (en) | 2021-03-17 | 2024-10-08 | Apple Inc. | Waveguide-based transmitters with adjustable lighting |
Also Published As
Publication number | Publication date |
---|---|
CN101496033A (en) | 2009-07-29 |
JP5174684B2 (en) | 2013-04-03 |
US8390821B2 (en) | 2013-03-05 |
CN101496033B (en) | 2012-03-21 |
US20090096783A1 (en) | 2009-04-16 |
KR101331543B1 (en) | 2013-11-20 |
US20130136305A1 (en) | 2013-05-30 |
JP2009531655A (en) | 2009-09-03 |
KR20080111474A (en) | 2008-12-23 |
US9063283B2 (en) | 2015-06-23 |
WO2007105205A3 (en) | 2009-04-23 |
Similar Documents
Publication | Title |
---|---|
US8390821B2 (en) | Three-dimensional sensing using speckle patterns |
JP2009531655A5 (en) | |
US20210297651A1 (en) | Three dimensional depth mapping using dynamic structured light |
CN101496032B (en) | Range mapping using speckle decorrelation |
US8050461B2 (en) | Depth-varying light fields for three dimensional sensing |
US8374397B2 (en) | Depth-varying light fields for three dimensional sensing |
TWI585436B (en) | Method and apparatus for measuring depth information |
US9778751B2 (en) | Gesture based control using three-dimensional information extracted over an extended depth of field |
US7675020B2 (en) | Input apparatus and methods having diffuse and specular tracking modes |
EP3403131A1 (en) | Depth mapping using structured light and time of flight |
KR102102291B1 (en) | Optical tracking system and optical tracking method |
WO2009124181A2 (en) | Gesture based control using three-dimensional information extracted over an extended depth of field |
JP2002318344A (en) | Method and device for autofocusing for optical equipment |
CN117128892A (en) | Three-dimensional information measuring device, measuring method and electronic equipment |
CN109669553B (en) | Device and method for detecting movement of a pointer in three dimensions |
Hashimoto et al. | Mirror Based Framework for Human Body Measurement |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 200780016625.5; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07713326; Country of ref document: EP; Kind code of ref document: A2 |
WWE | Wipo information: entry into national phase | Ref document number: 2008558981; Country of ref document: JP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 1020087025030; Country of ref document: KR |
WWE | Wipo information: entry into national phase | Ref document number: 12282517; Country of ref document: US |
122 | Ep: pct application non-entry in european phase | Ref document number: 07713326; Country of ref document: EP; Kind code of ref document: A2 |