WO1999060525A1 - Method and apparatus for 3d representation - Google Patents
Method and apparatus for 3D representation
- Publication number
- WO1999060525A1 (PCT/GB1999/001556)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- representation
- images
- camera
- correspondences
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/221—Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0085—Motion estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0092—Image segmentation from stereoscopic image signals
Definitions
- the present invention relates to a method and apparatus for deriving a three-dimensional representation from two or more two-dimensional images.
- such a 3D representation can only be generated if the features (eg points) of the images are correlated. Respective features of two overlapping images are correlated if they are derived from (ie conjugate with) the same feature (eg a point) of the object. If the positions and orientations of the camera at which the overlapping images are acquired are known, then the 3D coordinates of the object in the region of overlap can be determined, assuming that the camera geometry (in particular its focal length) is known.
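By way of illustration only (this sketch is not taken from the patent), the following Python fragment determines such a 3D object point from one pair of conjugate features once the two camera poses are assumed known, by finding the point closest to both principal rays; the function and variable names are placeholders.

```python
import numpy as np

def triangulate_point(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two camera rays.

    o1, o2 : ray origins (camera optical centres), shape (3,)
    d1, d2 : unit ray directions through the conjugate image features, shape (3,)
    Returns the estimated 3D object point (rays must not be parallel).
    """
    # Solve for t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|^2
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```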
- the information about the position and orientation of the camera required to reconstruct the object in 3D can be obtained either by direct measurement or by various sophisticated mathematical techniques involving processing all the correlated pairs of features. (Stereoscopic camera arrangements in which the cameras are fixed are ignored for the purposes of the present discussion).
- EP-A-782,100 discloses a method and apparatus in the first category, namely a photographic 3D acquisition arrangement in which a camera displacement signal and a lens position signal are used to enable a 3D representation of an object to be built up from 2D images acquired by a digital camera. Both the position and the orientation of the camera are monitored. The hardware required to achieve this is expected to be expensive.
- a disadvantage of the above mathematical methods is the intensive computation required to process all the correlated points. Much of this computation involves the determination of the camera positions and orientations from multiple points and is effectively wasted because the positions and orientations are grossly over-determined by the large number of processed points.
- An object of the present invention is to overcome or alleviate at least some disadvantages of the known methods and apparatus, particularly when the resolution of the images from which the 3D reconstruction is generated is limited.
- the invention provides a method of deriving a 3D representation of at least part of an object from correlated overlapping 2D images of the object acquired from different spaced apart viewpoints relative to the object, the separation between the viewpoints not being precisely known, the method comprising the step of digitally processing the 2D images to form a 3D representation which extends in a simulated 3D space in dependence upon both the mutual offset between correspondences of the respective 2D images and a scaling variable, the scaling variable being representative of the separation between the viewpoints at which the 2D images were acquired.
- the invention also provides image processing apparatus for deriving a 3D representation of at least part of an object from correlated overlapping 2D images of the object acquired from different spaced apart viewpoints relative to the object, the apparatus comprising image processing means which is arranged to digitally process the 2D images to form a 3D representation which extends in a simulated 3D space in dependence upon both the mutual offset between correspondences of the respective 2D images and a scaling variable, the scaling variable being representative of the separation between the viewpoints at which the 2D images were acquired.
- the scaling variable is preferably entered by a user.
- the scaling factor could be a/a' to magnify the partial 3D reconstruction by a factor of a/a' and thereby generate a partial 3D reconstruction having the same size as the object.
- a partial 3D reconstruction will be able to be fitted to other life-size partial 3D reconstructions generated similarly from other pairs of images.
- any value of scaling factor which will enable the partial 3D reconstructions to be fitted together will be satisfactory, and can for example be applied by the user during a process of fitting together the partial 3D reconstructions on-screen.
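As a hedged illustration of the role of the scaling variable (not the patent's software; the names and the parallel-viewpoint assumption are ours), the sketch below reconstructs correlated points up to scale: each depth varies in proportion to an assumed baseline divided by the measured offset (disparity) between correspondences, so changing the user-entered variable simply rescales the partial reconstruction.

```python
import numpy as np

def reconstruct_up_to_scale(pts1, pts2, focal_px, baseline_scale=1.0):
    """Partial 3D reconstruction from correlated image points of two views.

    pts1, pts2     : (N, 2) arrays of correlated pixel coordinates; the two views are
                     assumed to have the same orientation, with the camera translated
                     roughly parallel to the image plane (nonzero horizontal offsets)
    focal_px       : focal length in pixels
    baseline_scale : user-entered scaling variable standing in for the unknown
                     separation between the viewpoints
    """
    disparity = pts1[:, 0] - pts2[:, 0]            # mutual offset between correspondences
    depth = focal_px * baseline_scale / disparity  # depth scales with the scaling variable
    x = pts1[:, 0] * depth / focal_px              # back-project through the first viewpoint
    y = pts1[:, 1] * depth / focal_px
    return np.column_stack([x, y, depth])
```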
- the camera orientation (in the reference frame of the object) at each of the two viewpoints is preferably the same or nearly the same (eg ±10 degrees) and the or each partial 3D reconstruction is preferably generated by virtual projectors having the same orientation as the camera(s) and having optical centres on the line joining the optical centres of the camera(s).
- the method further comprises the step of acquiring the overlapping 2D images from a camera which is moved relative to the object between the different viewpoints, the net movement of the camera between the viewpoints not being fully constrained.
- the camera could be mounted on a fixed slide so as to move transversely to its optical axis (so that its orientation and its movement along two axes are constrained but its movement along the third axis is not) or it could be mounted on a tripod (so that its movement in the vertical direction and rotation about a horizontal axis are constrained).
- the camera is hand-held.
- the camera orientation can be measured with an inertial sensor eg a vibratory gyroscope and appropriate filtering and integrating circuitry as disclosed in our UK patent GB 2,292,605B which is hereby incorporated by reference.
- a Kalman filter is presently preferred for filtering the inertial sensor output signals.
- the orientation of the camera is varied after acquiring a first image from one viewpoint and before acquiring a second image from the other viewpoint so as to maintain its orientation relative to the reference frame of the object when acquiring the second image.
- One purely optical way of ensuring that the orientation of the camera is unchanged relative to the orientation at the first viewpoint is to vary the orientation until the projections in the image plane of the camera of the correlated points of the first image and the corresponding points of the second image (which is preferably instantaneously displayed on screen) converge at a common point.
- the orientation at the second viewpoint can be adjusted until the above projections are parallel.
- the present invention also relates to a method and apparatus for deriving a representation of the three-dimensional (3D) shape of an object from an image (referred to herein as an object image) of the projection of structured optical radiation onto the object surface.
- structured optical radiation is a generalisation of the term “structured light” and is intended to cover not only structured light but also structured electromagnetic radiation of other wavelengths which obeys the laws of optics.
- the 3D shape of part of an object surface can be obtained by projecting structured light, eg a grid pattern onto a surface of the object, acquiring an image of the illuminated region of the object surface and identifying the elements of the structured light (eg the crossed lines of the grid pattern) which correspond to the respective features (eg crossed lines) of the image, assuming that the spatial distribution of the structured light is known.
- the invention provides a method of deriving a 3D representation of at least part of an object from a 2D image thereof, comprising the steps of illuminating the object with structured projected optical radiation, acquiring a 2D image of the illuminated object, correlating the 2D image with rays of the structured optical radiation, and digitally processing the 2D image to form a 3D representation which extends in a simulated 3D space in dependence upon both the correlation and a scaling variable, the scaling variable being representative of the separation between a location from which the structured optical radiation is projected and the viewpoint at which the 2D image is acquired.
- the invention also provides image processing apparatus for deriving a 3D representation of at least part of an object from a 2D image of the illuminated object, the object being illuminated with structured optical radiation projected from a location spaced apart from the viewpoint at which the 2D image is acquired, the 2D image being correlated with the structured radiation, the apparatus comprising digital processing means arranged to form a 3D representation which extends in a simulated 3D space in dependence upon both the correlation and a scaling variable, the scaling variable being representative of the separation between the location from which the structured optical radiation is projected and the viewpoint at which the 2D image is acquired.
- This aspect is related to the first aspect in that the separation of perspective centres of the cameras does not need to be known in the apparatus and method of the first aspect and the separation of the perspective centres of the projector and camera does not need to be known in the apparatus and method of the second aspect.
- in both the camera-camera arrangement employed in the first aspect and the camera-projector arrangement employed in the second aspect, a 3D representation can be derived from the intersection of two projections, in the one case representing the respective pencils of camera rays and in the other case representing the respective pencils of projector rays and camera rays.
- the calibration image can for example be of the projection of the structured optical radiation onto a calibration surface or can for example be a further object image obtained after moving the object relative to the camera used to acquire the initial object image and the projector means used to project the structured optical radiation.
- the first and second projections are from a baseline linking an origin of the structured optical radiation and a perspective centre associated with the image (eg the optical centre of the camera lens used to acquire the image), the reconstruction processing means being arranged to derive said baseline from two or more pairs of correlated features. This feature is illustrated in Figures 3 and 13 discussed in detail below.
- the image processing means is arranged to generate correspondences between two or more calibration images and to determine the spacing between origins of the first and second projections in dependence upon both the correspondences of the two or more calibration images and input or stored metric information associated with the calibration images. This feature is illustrated in Figure 15, discussed in detail below.
- the reconstruction processing means is arranged to vary the spacing between the origins of the first and second projections in dependence upon a scaling variable enterable by a user.
- a further calibration image is not required.
- the apparatus includes means for displaying the 3D representation with a relative scaling dependent upon the value of the scaling variable.
- the invention provides a method of generating a 3D representation of an object from an object image of the projection of structured optical radiation onto the object surface and from at least one calibration image of the projection of the structured optical radiation onto a surface displaced from the object surface, the method comprising the steps of:
- the invention provides image processing apparatus for generating a 3D representation of at least part of an object from an object image of the projection of structured optical radiation onto the object surface and from at least one calibration image of the projection of the structured optical radiation onto a surface displaced from the object surface, the apparatus comprising image processing means arranged to generate correspondences between at least one calibration image and the object image and optionally a further calibration image, and reconstruction processing means arranged to simulate a first projection of the
- the invention provides image processing apparatus for deriving a 3D representation of at least part of an object from a 2D image thereof, the object being illuminated with structured optical radiation projected from a location spaced apart from the viewpoint at which the 2D image is acquired, the 2D image being correlated with rays of the structured radiation, the apparatus comprising digital processing means arranged to form a 3D reconstruction which extends in a simulated 3D space in dependence upon both the correlation and a scaling variable, the scaling variable being representative of the separation between the location from which the structured optical radiation is projected and the viewpoint at which the 2D image is acquired.
- This aspect of the invention is illustrated in Figure 13. Following a simple calibration procedure requiring no knowledge of the position of the camera or the projector relative to the object, it enables a 3D representation to be generated. This can optionally be displayed and scaled, or it can be distorted, eg for special effects in graphics and animation.
- the apparatus is arranged to derive a further 3D representation from a further 2D image acquired from a different viewpoint relative to the object, the combining means being arranged to combine the first-mentioned 3D representation and the further 3D representation by manipulations in a simulated 3D space involving one or more of rotation and translation, the apparatus further comprising scaling means arranged to reduce or eliminate any remaining discrepancies between the 3D reconstructions by scaling one 3D reconstruction relative to the other along at least one axis.
- the apparatus is arranged to display both 3D representations simultaneously and to manipulate them in simulated 3D space in response to commands entered by a user.
- the apparatus is arranged to perform the manipulations of the 3D reconstructions under the control of a computer pointing device.
- the apparatus includes means for combining two or more 3D representations and means for adjusting the relative scaling of the representations to enable them to fit each other.
- the scaling variable will correct for one or more distortions of the partial 3D reconstruction, either laterally or in the depth direction (curvature of field), which, as shown below in connection with Figures 8 to 11, can arise from incorrect positioning of the virtual projectors, eg a misalignment relative to the camera viewpoints.
- the partial 3D reconstruction will be distorted by deliberately misaligning one or both of the virtual projectors relative to the camera viewpoints or the camera and projector viewpoints.
- Such a feature is useful in the fields of design, graphics and animation.
- the angle subtended by a pair of correlated features at the corresponding feature of the object is 90 degrees ±30 degrees (more preferably ±10 degrees).
- the subtended angle is exactly 90 degrees. This feature enables any distortion resulting from a slight change in orientation to be corrected more accurately.
- a view of the reconstruction in the simulated 3D space is displayed eg on a screen and the variable is varied by the user in response to the view displayed on screen.
- the image processing means is arranged to generate said correspondences of said images by comparing local radiometric distributions of said images.
- the invention provides a method of deriving a 3D representation of at least part of an object from a 2D image thereof, comprising the steps of illuminating the object with structured projected optical radiation, acquiring a 2D image of the illuminated object, correlating the 2D image with rays of the structured optical radiation, and digitally processing the 2D image to form a 3D reconstruction which extends in a simulated 3D space in dependence upon both the correlation and a scaling variable, the scaling variable being representative of the separation between a location from which the structured optical radiation is projected and the viewpoint at which the 2D image is acquired.
- Suitable algorithms for correlating (generating correspondences between) overlapping images are already known - eg Gruen's algorithm (see Gruen, A W, "Adaptive least squares correlation: a powerful image matching technique", S Afr J of Photogrammetry, Remote Sensing and Cartography, Vol 14, No 3 (1985) and Gruen, A W and Baltsavias, E P, "High precision image matching for digital terrain model generation", Int Arch Photogrammetry, Vol 25, No 3 (1986), p 254) and particularly the "region-growing" modification thereto which is described in Otto and Chau, "Region-growing algorithm for matching terrain images", Image and Vision Computing, Vol 7, No 2, May 1989, p 83, all of which are incorporated herein by reference.
- Gruen's algorithm is an adaptive least squares correlation algorithm in which two image patches of typically 15 x 15 to 30 x 30 pixels are correlated (ie selected from larger left and right images in such a manner as to give the most consistent match between patches) by allowing an affine geometric distortion between coordinates in the images (ie stretching or compression in which originally parallel lines remain parallel in the transformation) and allowing an additive radiometric distortion between the grey levels of the pixels in the image patches, generating an over-constrained set of linear equations representing the discrepancies between the correlated pixels and finding a least squares solution which minimises the discrepancies.
- the Gruen algorithm is essentially an iterative algorithm and requires a reasonable approximation for the correlation to be fed in before it will converge to the correct solution.
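The following is a deliberately simplified sketch of adaptive least-squares patch matching in the spirit of Gruen's algorithm; it estimates only a translation and an additive radiometric offset rather than the full affine plus radiometric model, and the use of NumPy/SciPy and all names are assumptions rather than anything specified in the patent.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def lsq_match(left, right, centre_l, centre_r, half=7, iters=10):
    """Refine an approximate match between a patch of `left` centred at centre_l
    (row, col) and the conjugate patch of `right` near centre_r (row, col),
    by iterative linearised least squares. Returns the refined (row, col)."""
    rows, cols = np.mgrid[-half:half + 1, -half:half + 1]
    f = map_coordinates(left, [rows + centre_l[0], cols + centre_l[1]], order=1)
    r0, c0 = float(centre_r[0]), float(centre_r[1])
    offset = 0.0                                     # additive radiometric term
    for _ in range(iters):
        g = map_coordinates(right, [rows + r0, cols + c0], order=1)
        gy, gx = np.gradient(g)                      # grey-level gradients of the moving patch
        A = np.column_stack([gy.ravel(), gx.ravel(), np.ones(g.size)])
        b = (f - g - offset).ravel()                 # discrepancies between correlated pixels
        (dr, dc, doff), *_ = np.linalg.lstsq(A, b, rcond=None)
        r0, c0, offset = r0 + dr, c0 + dc, offset + doff
        if abs(dr) < 1e-3 and abs(dc) < 1e-3:        # converged
            break
    return r0, c0
```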
- the Otto and Chau region-growing algorithm begins with an approximate match between a point in one image and a point in the other, utilises Gruen's algorithm to produce a more accurate match and to generate the geometric and radiometric distortion parameters, and uses the distortion parameters to predict approximate matches for points in the region of the neighbourhood of the initial matching point.
- the neighbouring points are selected by choosing the adjacent points on a grid having a grid spacing of eg 5 or 10 pixels in order to avoid running
- if a candidate matched point moves by more than a certain amount (eg 3 pixels) per iteration then it is not a valid matched point and should be rejected;
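A minimal sketch of the region-growing propagation (again an assumption-laden illustration, not the patent's code): the predicted match for each grid neighbour is taken from the neighbouring match rather than from the estimated affine distortion parameters, and it relies on the lsq_match() function sketched above.

```python
import numpy as np

def grow_matches(left, right, seed_l, seed_r, grid=5, max_jump=3.0):
    """Propagate matches outwards from a seed match on a grid of spacing `grid`
    pixels; a candidate is rejected if the refined position jumps by more than
    max_jump pixels from its prediction."""
    matches = {tuple(seed_l): np.asarray(seed_r, dtype=float)}
    frontier = [tuple(seed_l)]
    h, w = left.shape
    while frontier:
        pl = frontier.pop()
        pr = matches[pl]
        for dr, dc in ((grid, 0), (-grid, 0), (0, grid), (0, -grid)):
            nl = (pl[0] + dr, pl[1] + dc)
            if nl in matches or not (0 <= nl[0] < h and 0 <= nl[1] < w):
                continue
            guess = pr + np.array([dr, dc])              # predicted conjugate position
            refined = np.array(lsq_match(left, right, nl, guess))
            if np.linalg.norm(refined - guess) > max_jump:
                continue                                 # implausible move: reject
            matches[nl] = refined
            frontier.append(nl)
    return matches
```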
- the 3D configuration of the object is obtainable by projecting each image from (eg a virtual) projector having the same focal length and viewpoint (position and orientation) as the camera which acquired that image.
- the principal rays from corresponding features of the respective images will intersect in (virtual) 3D space at the location of the object feature.
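To make the virtual-projector idea concrete, the sketch below (illustrative only; a pinhole model and a camera-to-world rotation R are assumptions) converts an image feature into the ray cast by a virtual projector placed at the camera's viewpoint; feeding the two rays of a correlated pair into the triangulate_point() sketch given earlier then yields the object feature in the simulated 3D space.

```python
import numpy as np

def pixel_ray(pix, focal_px, principal_point, R, centre):
    """Ray cast into the simulated 3D space by a virtual projector mimicking the
    camera which acquired the image (same focal length, orientation and centre).

    pix             : (col, row) pixel coordinates of the image feature
    focal_px        : focal length in pixels
    principal_point : (col, row) of the principal point
    R               : 3x3 camera-to-world rotation matrix
    centre          : optical centre of the virtual projector in world coordinates
    Returns (origin, unit direction) of the ray through the feature.
    """
    d_cam = np.array([pix[0] - principal_point[0],
                      pix[1] - principal_point[1],
                      focal_px], dtype=float)        # direction in camera coordinates
    d_world = R @ d_cam
    return np.asarray(centre, float), d_world / np.linalg.norm(d_world)
```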
- the invention provides a method of generating a 3D reconstruction of an object comprising the steps of projecting images of the object acquired by mutually aligned cameras into simulated 3D space from aligned virtual projectors, the separation of the virtual projectors being variable by the user.
- the invention also provides apparatus for generating a 3D reconstruction of an object comprising two aligned virtual projector means arranged to project images of the object acquired by mutually aligned cameras into simulated 3D space, the separation of the virtual projectors being variable by the user.
- the difference in alignment of the virtual projectors is less than 45 degrees, more preferably less than 30 degrees, and is most preferably less than 20 degrees, eg less than 10 degrees. Desirably this angle is less than 5 degrees, eg less than one degree.
- This feature enables overhanging features of the object to be captured from both camera viewpoints and also facilitates the determination of the line connecting the optical centres of the camera(s) at the different viewpoints as well as the correlation of features between the images.
- the origin of each projection is located on a line in simulated 3D space connecting the corresponding optical centres of the camera at the two viewpoints. As explained below with reference to Figure 3 , this will result in a scaled partial 3D reconstruction of the object.
- a distortion parameter is entered by the user and applied to the 3D reconstruction.
- the initial 3D reconstruction can be rotated whilst constraining the features of the initial 3D reconstruction which are generated from the intersecting projections of correlated features of the projected images to lie on the projections of those features from one of the 2D images, thereby forming a further 3D reconstruction.
- this can be used to generate a reconstruction which is parallel to, and therefore a scaled replica of, the actual object surface.
- the viewpoints are calculated from at least two (desirably at least three) pairs of correlated features and the 3D reconstruction of the object is generated in dependence upon said calculation of the viewpoints, the calculation of the viewpoints being performed on fewer than all derivable pairs of correlated features.
- Figure 3 shows the derivation of the line connecting the viewpoints from the pencil of projections from three pairs of correlated features. Since the third projection from the third pair of correlated features P3 and P3' intersects the point VP already defined by the other two projections' intersection (and thus merely confirms the unchanged orientation between the viewpoints) it is not strictly necessary to find the straight line joining the viewpoints. Hence this line can be found from just two pairs of correlated features if there is no change in camera orientation. However greater accuracy will be obtained if more than two pairs of correlated features are processed.
- Preferably said calculation is performed on fewer than one thousand pairs of correlated features, more preferably fewer than one hundred pairs of correlated features, desirably fewer than fifty pairs of correlated features eg eight or fewer pairs.
- the calculation can be performed on four, three or two pairs of correlated points.
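A possible implementation of this small-sample calculation, assuming unchanged camera orientation and a pinhole model (illustrative only, not the patent's software): the "vanishing point" VP is found as the least-squares intersection of the lines joining each correlated pair in the superimposed image planes, and the movement direction is the ray from the optical centre through VP.

```python
import numpy as np

def movement_direction(pts1, pts2, focal_px, principal_point):
    """Estimate the direction of camera movement from a few correlated pairs.

    pts1, pts2 : (N, 2) pixel coordinates of correlated points (N can be small, eg 2-10)
    Returns a unit vector in the camera coordinate frame (degenerate if the
    joining lines are parallel, ie no movement along the optical axis).
    """
    rows, rhs = [], []
    for a, b in zip(np.asarray(pts1, float), np.asarray(pts2, float)):
        u = b - a
        n = np.array([-u[1], u[0]]) / np.linalg.norm(u)   # unit normal to the joining line
        rows.append(n)
        rhs.append(n @ a)
    vp, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    # Ray from the optical centre through VP gives the translation direction.
    d = np.array([vp[0] - principal_point[0],
                  vp[1] - principal_point[1],
                  focal_px])
    return d / np.linalg.norm(d)
```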
- Figure 1 is a diagrammatic view of one apparatus in accordance with the two- camera aspects of the invention.
- Figure 2 is a flow diagram of one method in accordance with the two-camera aspects of the invention.
- Figure 3 is a ray diagram showing the relationship between the object, camera and projector viewpoints, virtual projector viewpoints and partial 3D reconstruction in one embodiment of the invention
- Figure 4 is a ray diagram in 3D showing a derivation in accordance with one aspect of the invention of the direction of movement of the camera from the acquired images in the method of Figure 2 and apparatus of Figure 1 ;
- Figure 5 is a ray diagram in 3D showing a derivation of the direction of movement of the camera from the acquired images in the method of Figure 2 and apparatus of Figure 1 in the special case in which the camera does not move relative to the object in the Z direction;
- Figure 6 is a diagram showing the movement of the image of the object in the image plane I of Figure 5;
- Figure 7 is a flow diagram summarising the image processing steps utilised in the method of Figure 2 and the apparatus of Figure 1 ;
- Figure 8 is a 2D ray diagram illustrating the curvature of field resulting from misalignment of one virtual projector relative to the other in the apparatus of Figure 1 and method of Figure 2;
- Figure 9 is a 2D ray diagram illustrating correction of distortion of the partial 3D reconstruction in an embodiment of the invention.
- Figure 10A is a 2D ray diagram illustrating the curvature of field resulting from a misalignment of a virtual projector by 5 degrees;
- Figure 10B is a ray diagram illustrating the curvature of field resulting from a misalignment of a virtual projector by 10 degrees;
- Figure 10C is a ray diagram illustrating the curvature of field resulting from a misalignment of a virtual projector by 15 degrees;
- Figure 11 is a plot of curvature of field against misalignment in the arrangements of Figures 10A to 10C;
- Figure 12 is a schematic representation of one apparatus in accordance with projector-camera aspects of the invention.
- Figure 13 is a sketch perspective ray diagram showing one optical arrangement of the apparatus of Figure 12;
- Figure 14 shows an object image and a calibration image acquired by the apparatus of Figures 12 and 13 and the correlation of their features
- Figure 15 is a sketch perspective ray diagram showing a variant of Figure 13 in which two reference surfaces are used to locate the camera and projector of the apparatus of Figure 12 on the baseline connecting their respective perspective centres;
- Figure 16 is a flow diagram illustrating a method of operation of the apparatus of Figures 12 and 13 in accordance with a projector-camera aspect of the invention
- Figure 17 is a screenshot illustrating the fitting together of two 3D surface portions of the object using the apparatus of Figures 1 and 2 or the apparatus of Figure 12;
- Figure 18 is a further screenshot showing the scaling of the resulting composite 3D surface portion along vertical and horizontal axes;
- Figure 19 is a further screenshot showing the scaling of intersecting 3D surface portions to fit each other.
- Figure 20 is a screenshot showing a user interface provided by the apparatus of Figures 1 and 2 or the apparatus of Figure 12 for manipulating the images and 3D surface portions.
- the apparatus comprises a personal computer 4 (eg a Pentium® PC) having conventional CPU, ROM, RAM and a hard drive and a connection at an input port to a digital camera 1 as well as a video output port connected to a screen 5 and conventional input ports connected to a keyboard and a mouse 6 or other pointing device.
- the hard drive is loaded with a conventional operating system such as Windows®95 and with software:
- the software to carry out function a) can be any suitable graphics program and the software to carry out function b) can be based on the algorithms disclosed in Hu et al, "Matching Point Features with Ordered Geometric, Rigidity and Disparity Constraints", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol 16, No 10, 1994, pp 1041-1049 (and references cited therein).
- One suitable algorithm is the Gruen algorithm.
- the camera CM, a Pulnix M-9701 progressive scan digital camera, may be hand-held, carrying a 3-axis vibratory gyroscope G with associated filtering circuitry (a Kalman filter is presently preferred) and integrating circuitry to generate and display on screen 3-axis orientation signals, or (as shown at 1') may be mounted on a tripod T or other support. The camera defines a first set of axes x, y, z in its initial viewpoint (ie position and orientation) and is used to acquire an image of the object 3 and to display the image on screen 5.
- the origin of the x y z coordinate system is taken to be the optical centre of the camera lens.
- the camera is then moved an arbitrary distance to a new position CM' and a second image of object 3 is acquired with the orientation of the camera (relative to the x y z coordinate system) maintained unchanged.
- This unchanged orientation can be achieved with the aid of a suitable support if necessary.
- a method of checking for changes in orientation, based on the convergence of the projections from corresponding points of the two images in the image plane I of the camera, will subsequently be described with reference to Figure 4. Only a few (eg 3, 4, 5 or up to eg 10) pairs of correlated points need to be found for this purpose and can be derived visually by the user from the images displayed on screen or by the software in computer 4.
- the next step is the determination of the camera movement, ie the line in the x y z coordinate system joining the optical centre of the camera lens in its two positions.
- the direction of camera movement can be estimated by the user.
- the remaining corresponding points in the two images are then correlated by the computer 4 (preferably taking advantage of the information on camera movement obtained in the previous step, eg by searching along the epipolar line in one image corresponding to a point of interest in the other image) and a partial 3D reconstruction of the object 3 in simulated 3D space is generated from the correlated points as will be described below with reference to Figure 3.
- a parameter representing this distance is entered by the user, either from the keyboard or from the pointing device 6 for example.
- the computer 4 is programmed to display the resulting 3D reconstruction on screen and the user can vary the parameter interactively in order to arrive at a partial 3D reconstruction in the x y z coordinate system which bears a desired relationship to the actual object 3. Typically this will be stretched or compressed in the direction of movement between positions CM and CM', relative to the actual object.
- a further partial 3D reconstruction is then generated by moving the camera CM to a new viewpoint CMA such that the object lies in a region of overlap ie such that at least one point P in the camera's field of view at viewpoints CM and CM' remains in the camera's field of view at viewpoint CMA.
- This movement can be represented by a rotation φ about the y axis (resulting in new axes x1, y1 and z1), followed by a rotation ω about the x1 axis (resulting in new axes x', y' and z'), followed by a rotation κ about the z' axis, followed by translations ΔX, ΔY and ΔZ along the resulting axes.
- The rotation κ about the z' axis will in many cases be zero, as shown in Figure 1.
- After acquiring an image at the viewpoint CMA, the camera is moved to a new position 1A' and a further image is acquired and displayed (the orientation of the camera being adjusted to remain the same as at viewpoint CMA).
- 3D reconstruction of object 3 is then performed by the computer 4 in a manner analogous to that described above in connection with viewpoints CM and CM'.
- the terms in the 3x3 matrix can be shown to be the cosines of the angles between the axes of the XYZ and xyz coordinate systems (see eg Boas, Mathematical Methods in the Physical Sciences, John Wiley & Sons, 2nd Edn, pp 437 and 438).
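For reference, the standard direction-cosine form of that matrix (a textbook result, not specific to the patent) is:

```latex
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix}
=
\begin{pmatrix}
\cos(X,x) & \cos(X,y) & \cos(X,z) \\
\cos(Y,x) & \cos(Y,y) & \cos(Y,z) \\
\cos(Z,x) & \cos(Z,y) & \cos(Z,z)
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
```

where cos(A, b) denotes the cosine of the angle between axis A of the first system and axis b of the second.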
- the partial 3D reconstructions can be transformed to a common coordinate system and elongated/compressed along all three axes to minimise discrepancies between them in their region of overlap and to build up a more complete reconstruction of the object 3.
- this process is carried out under the control of a user by displaying the partial 3D reconstructions on screen and varying their relative elongation/compression along three axes, as will be described in more detail with reference to Figure 19.
- the partial 3D reconstructions are combined without user intervention using the Iterative Closest Point algorithm.
- This algorithm is publicly available and therefore it is not necessary to describe it in detail. Briefly however, it registers two surface edges by searching for a group of (say) the ten closest pairs of points on the respective edges. The surfaces are then repositioned to minimise the aggregate distance between these pairs of points and then a new group of closest pairs of points is found. The repositioning step is then repeated.
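A compact sketch of the basic point-to-point form of the algorithm (using all points rather than only the ten closest pairs mentioned above; the library choice and names are assumptions, not the patent's implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(moving, fixed):
    """One iteration: pair each moving point with its closest fixed point, then
    find the rigid motion minimising the summed squared distances (SVD solution).
    moving, fixed : (N, 3) and (M, 3) arrays of surface points."""
    idx = cKDTree(fixed).query(moving)[1]
    target = fixed[idx]
    mu_m, mu_t = moving.mean(axis=0), target.mean(axis=0)
    H = (moving - mu_m).T @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_t - R @ mu_m
    return moving @ R.T + t                  # repositioned surface points

def icp(moving, fixed, iters=20):
    """Repeat the pairing/repositioning step, as described above."""
    for _ in range(iters):
        moving = icp_step(moving, fixed)
    return moving
```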
- Other methods of correlating 3D surface regions are disclosed in our GB 2,292,605B.
- a scaling factor or other variable is generated either by the user or iteratively under software control in order to adjust the relative sizes of the partial 3D reconstructions to ensure they can be fitted together into a self-consistent overall surface description of the object.
- Overlapping images are captured (step S10), pairs of points (or larger features eg lines or areas) are correlated (step S20) and at least the approximate camera movement between the viewpoints is determined (step S30). In the preferred embodiment this is determined with a high degree of accuracy by processing the two images. Partial 3D reconstructions of the object surface are then generated in a simulated 3D space using the computer 4 to process the correlated pairs of points and camera movement (step S40) and these are combined, preferably interactively on screen by the user to give a consistent but possibly distorted (eg compressed or elongated) 3D representation (step S50). Optionally, this is distorted or undistorted by applying appropriate compression or elongation (step S60).
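Tying the earlier sketches together, a hypothetical driver for steps S20 to S40 might look as follows (every function used here is one of the illustrative sketches above, not the patent's software, and a sideways camera movement with unchanged orientation is assumed):

```python
import numpy as np

def partial_reconstruction(img1, img2, seed1, seed2, focal_px, pp, baseline_scale):
    """End-to-end sketch of steps S20-S40 using grow_matches, movement_direction,
    pixel_ray and triangulate_point defined in the earlier sketches.
    seed1, seed2 : (row, col) seed match; pp : (col, row) principal point."""
    # S20: correlate pairs of points by region growing from a seed match.
    matches = grow_matches(img1, img2, seed1, seed2)
    pts1 = np.array([list(p) for p in matches])[:, ::-1]           # (row, col) -> (col, row)
    pts2 = np.array([list(p) for p in matches.values()])[:, ::-1]
    # S30: approximate camera movement between the viewpoints.
    v = movement_direction(pts1, pts2, focal_px, pp)
    # S40: place two virtual projectors on the movement line, separated by the
    # user-entered scaling variable, and intersect the conjugate rays.
    identity = np.eye(3)
    o1, o2 = np.zeros(3), baseline_scale * v
    cloud = []
    for p1, p2 in zip(pts1, pts2):
        r1 = pixel_ray(p1, focal_px, pp, identity, o1)
        r2 = pixel_ray(p2, focal_px, pp, identity, o2)
        cloud.append(triangulate_point(*r1, *r2))
    return np.array(cloud)
```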
- steps S30 and S40 can be combined in a matrix processing method, eg using the commercially available INTERSECT program produced by 3D Construction Inc.
- Figure 3 shows the object 3 in the field of view of the camera CM at positions CM and CM'.
- Principal ray lines from points Qa, Qb and Qc on the surface of the object pass through optical centres Oc and Oc' of the camera in the respective positions CM and CM' and are each imaged on the image plane.
- the pair of points thus formed on the image plane from each of points Qa, Qb and Qc are corresponding points and can be correlated by known algorithms.
- although the projectors are shown in Figure 3 with the same orientation in the reference frame of the object (corresponding to the common camera orientation), this is merely a preferred feature which enables the vector V to be determined more easily by the image processing method disclosed in Figure 4.
- the above analysis is also applicable to virtual projectors of different orientations, corresponding to respective, different camera orientations.
- the camera orientations can be determined by an inertial sensor and integrating circuitry similar to that disclosed in our above-mentioned UK patent GB 2,292,605B.
- any ray line from Op' parallel to a ray line from Op will lie in the same plane as a ray line from Oc intersecting the ray line from Op and will therefore intersect that ray line from Oc.
- line Op'Qa' lies in the same plane as triangle OcOc'Qa and will therefore intersect line OcQa, in this case at Qa'.
- a scaled representation 3/3' can be generated by any pair of virtual projectors on vector V, if V is parallel to the line joining the optical centres of the camera lens at the two positions at which the projected images are acquired and the projectors have the same orientation(s) as the camera.
- This last condition can be satisfied even if the correct orientation is initially unknown, namely by adjusting the orientation about any axis perpendicular to vector V until the respective ray lines from any pair of correlated points intersect.
- This procedure can be carried out either manually by the user with the aid of a suitable display of the images and ray lines on screen or automatically in software.
- Figure 3 will be referred to again in connection with a very similar method applicable to the virtual projectors associated with the projector- camera aspects of the invention.
- Figure 4 illustrates one derivation of vector V (step S40 in Figure 2).
- relative to the camera, which has a lens with optical centre O, the object 3 is considered to move to position 3' along line M.
- Points P1, P2 and P3 are imaged as points p1, p2 and p3 when the object is in position 3 and as points p1', p2' and p3' when the object is in position 3'.
- the line joining VP to the optical centre O is the vector V which is the locus of the optical centre of the camera (or desired locus of the virtual projector) relative to the object.
- the lines L1, L2, L3...Ln connecting the correlated points will not meet at a common point, owing to a change in orientation of the camera between the two positions.
- the orientation of the camera can be varied about the X, Y and Z axes by the user at the second position whilst displaying the lines L1, L2, L3...Ln on screen, and the second image is captured only when these lines converge on or near a common point, as determined either visually or by a software routine. Indeed the image can be captured, under control of the software routine, only when the necessary convergence is achieved. If gyroscope G is used then its 3-axis orientation signals can be used to maintain the orientation of the camera between the two viewpoints.
- the movement of the camera from its original position is derived by superimposing the image plane and associated image of the camera in its new position on the image plane (and image) of the camera in its first position, projecting the lines L1, L2, L3...Ln in the image plane of the camera in its first position and connecting the resulting point of intersection VP to the optical centre of the camera in its first position.
- the resulting vector V is the movement of the object in the coordinate frame of the camera at its first position.
- a moving image or a rapid succession of still images is acquired by the camera as it moves and the (assumed) rectilinear movement of the camera between each still image or between sequential frames of the moving image is derived by projecting the lines L1, L2, L3...Ln in the image plane of the camera to derive the point VP for each incremental movement of the camera.
- the resulting vector V for each incremental movement of the camera will change direction as the direction of movement of the camera changes, and the segments can be integrated to determine the overall movement (including any change in orientation) of the camera.
- In Figure 6, which shows the object 3/3' (not its image) as seen by the camera, various possible faces ABC, A1B1C1 and A2B2C2 are shown. There will be a continuous range of possible sizes for face ABC; for the sake of clarity only the above three are shown. However the possible faces all have a common centroid P.
- The overall method of determining the direction of movement of the camera is illustrated in Figure 7.
- a first image is captured (step S11) and the camera is moved to a new position with the object still in its field of view and the first image is displayed on screen 5 (Figure 1), superimposed on the instantaneous image seen by the camera at the new position (step S12).
- Corresponding points are correlated, either by eye (ie by the user) or by a suitable software routine (step S13). Only a small number of points need to be correlated at this stage, eg 100 or fewer, eg 10, depending on the processing speed and the accuracy required.
- in step S14 the orientation of the camera is adjusted about the X, Y and Z axes until the correlated points converge to a common "vanishing point" VP (Figure 4).
- the second image is captured and stored in memory.
- in step S15 a line is projected from the point VP through the optical centre of the camera lens to find the locus V (Figure 4) and, with the aid of this information, further correlation of the images is performed and a partial 3D reconstruction is generated by the method illustrated in Figure 3 (step S17).
- step S15 also involves or is preceded by the selection of a camera model (eg from a list displayed on screen) by the user.
- the computer 4 is programmed to store a list of camera models M1, M2...Mn in association with the parameters of each model needed for the calculation of the required projection and other processing.
- the following parameters may be stored:
- If it appears to the user (eg as a result of a failure to find a reasonably close "vanishing point" VP) that there has not been significant movement in the Z direction, then the movement of the object is assumed to be parallel to the image plane and the line PQ is found from the images of a group of eg three or more points (not necessarily corners), preferably lying in or close to a plane parallel to the image plane (step S16), before proceeding to step S17.
- Figure 8 is a ray diagram orthogonal to the image plane showing the imaging of a line of points P1, P2 and P3 on the surface of the object 3 by a camera at positions CM and CM'.
- the lens L is shown only at the first position CM. It will be noted that the orientation of the camera is the same at the two positions.
- the camera positions are so chosen that the angle subtended by the ray lines from a pair of correlated points, at their intersection at the corresponding point on the object surface, is substantially 90 degrees, eg 90 degrees ±30 degrees, preferably 90 degrees ±20 degrees, most preferably 90 degrees ±10 degrees.
- This minimises the correction required due to curvature of field.
- There remains the lateral distortion of the reconstruction, ie the discrepancy between the ratios P1P2/P2P3 and P1'P2'/P2'P3' (Figure 8).
- distortion parallel to the image plane can be corrected for (or deliberately applied) by rotating the partial 3D reconstruction about an axis perpendicular to the image plane of a projector used to generate that partial reconstruction whilst constraining the points in the partial 3D reconstruction to lie on their ray lines from that projector.
- Figures 10A to 10C illustrate the curvature of field resulting from angular misalignment of one projector pr2' from its correct orientation pr2.
- the correct reconstruction 30 defined by prl and pr2 is planar and the centre of curvature CN of the actual reconstruction 30' is shown (and was derived geometrically).
- the misalignment is generated by rotation of the projector pr2 about its perspective centre by 5 degrees, 10 degrees and 15 degrees respectively.
- the radius of curvature R of reconstruction 30' is inversely proportional to the misalignment as shown in Figure 11. It should be noted that in Figures 10A to 10C, the angle subtended by the ray lines at their intersection at their corresponding point at (the reconstruction 30 of) the object surface is very much less than the optimum angle of 90 degrees and hence the degree of curvature is very much greater than would normally be obtained in practice.
- the above method of the invention allows a considerable latitude in the orientation of the projectors which implies that a considerable uncertainty in the relative orientation of the camera at the two viewpoints is permissible.
- the corrections noted above can be applied at any stage during the generation or fitting together of the partial 3D reconstructions.
- a projector-camera embodiment closely analogous to the camera-camera embodiment of Figure 1 will now be described with reference to Figure 12.
- the apparatus comprises a personal computer 4 (eg a Pentium® PC) having conventional CPU, ROM, RAM and a hard drive and a frame grabber connection at an input port to a digital camera CM as well as a video output port connected to a screen 5 and conventional input ports connected to a keyboard and a mouse 6 or other pointing device.
- the hard drive is loaded with conventional operating system such as Windows®95 and software:
- the software is preferably arranged to correct the images for distortion due eg to curvature of field of the camera and projector optics before they are processed as described above, either during an initial calibration procedure or as part of a ray bundle adjustment process during the processing of the object and calibration image(s). Suitable correction and calibration procedures are described by Tsai in "An Efficient and Accurate Camera Calibration Technique for 3D Machine Vision" Proc IEEE CVPR (1986) pp 364-374 (supra) and will not be described further.
- the camera, a Pulnix M-9701 progressive scan digital camera, is shown mounted at one end of a support frame F on eg a ball and socket mounting, and a slide projector PR is shown securely mounted on the other end of the frame.
- Slide projector PR is provided with a speckle pattern slide S and is arranged to project the resulting speckle pattern onto the surface of region R of an object 3 which is in the field of view of camera CM.
- the intrinsic camera parameters are initially determined by acquiring images of a reference plate (not shown) in known positions.
- the reference plate is planar and has an array of printed blobs of uniform and known spacing. The following parameters are determined and are therefore assumed to be known in the subsequent description:
- the pixel size (determined by the camera manufacturer) is assumed to be known.
- the following extrinsic camera parameters are determined:
- the camera location and orientation can be taken to define the coordinate system relative to which the object surface coordinates are determined.
- the camera CM is shown with its perspective centre Oc located on a baseline vector V and viewing (initially) a target surface T and (subsequently) object 3.
- the (virtual) origin or perspective centre Op of projector PR also lies on baseline vector V and is defined by the optical system of the projector comprising field lenses OL and condenser lenses CL.
- a point light source LS such as a filament bulb illuminates slide S and directs a speckle pattern onto (initially) target surface T and (subsequently) the surface of object 3.
- the baseline vector V is found by the following procedure:
- an image I1 (Figure 14) of the region of the surface of object 3 illuminated by the projected speckle pattern is acquired and stored in the memory of computer 4 and an arbitrary group of at least two spaced apart points Q1 and Q2 of this region are selected as points q1 and q2 in the image formed on the photodetector plane PD of the camera.
- the group of points q1 and q2 is stored.
- the object 3 is substituted by target surface T and an image I2 (Figure 3) of the illuminated region of the target surface is acquired by camera CM.
- the position and orientation of the target T relative to the camera are found by acquiring an image of the target in the absence of any illumination from the projector, utilising a known pattern of blobs BL formed on the periphery of the target.
- the image I2 is stored and a patch of the first image I1, defined by its central point (eg Qn, Figure 3), is correlated with the corresponding point Pn of the second image I2 by selecting a surrounding region R of initially 3 x 3 pixels and, by comparing local radiometric intensity distributions by means of the above-described modified Gruen algorithm, searching for the corresponding region R' in image I2, which is allowed to be distorted with an affine geometric distortion (eg, in the simple case illustrated in Figure 14, horizontally elongated).
- the correlated patch is expanded (up to a maximum of 19 x 19 pixels) and the process is repeated. In this manner the corresponding point Pn is found.
- correspondences are treated for the sake of simplicity as correlated pairs of points but it should be noted that this does not imply anything about their topography - in particular it does not imply that they lie at corners or edges of the object, for example.
- the origin Op (perspective centre) of the projector PR will lie at the intersection of P1Q1 and P2Q2.
- the 3D locations of these four points are not known, only the ray lines from the camera on which they lie, namely p1P1, q1Q1, p2P2 and q2Q2.
- the line P1Q1 will lie in the plane OcP1Q1, ie plane Ocp1q1, which is available from the calibration process and the two images I1 and I2.
- the line P2Q2 will lie in the plane OcP2Q2, ie plane Ocp2q2, which is similarly available from the calibration process and the two images I1 and I2.
- These planes define a baseline vector V by their intersection, which passes through Oc and the perspective centre Op of the projector.
- a particularly simple way of finding the baseline vector V is to project p1q1 and p2q2, which will meet at a point X in the plane of photodetector PD.
- the projection from point X through the perspective centre Oc is the baseline vector V as shown.
- the baseline vector V can be determined, though not the position of the projector origin Op along this baseline.
- the groups of points P and Q will each comprise more than two pairs and hence overdetermine the baseline vector V.
- the computer 4 is preferably arranged to derive a bundle of such vectors as determined by the sets of points PQ, to eliminate "outliers", ie those vectors which deviate by more than a given threshold from the mean, and to perform a least squares estimate of vector V on the remainder of the vectors, in accordance with known statistical methods.
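A hedged sketch of that outlier rejection and averaging step (the threshold value and all names are assumptions, not the patent's software):

```python
import numpy as np

def robust_baseline(vectors, threshold_deg=5.0):
    """Combine a bundle of candidate baseline vectors (one per pair of point
    groups P, Q): discard those deviating from the mean direction by more than
    a threshold, then average the remainder as a least-squares direction estimate."""
    v = np.asarray(vectors, float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)   # unit direction of each estimate
    mean = v.mean(axis=0)
    mean /= np.linalg.norm(mean)
    angles = np.degrees(np.arccos(np.clip(v @ mean, -1.0, 1.0)))
    keep = v[angles <= threshold_deg]                  # reject outliers
    best = keep.mean(axis=0)
    return best / np.linalg.norm(best)
```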
- the derivation of a three-dimensional representation of the object 3 is shown in the ray diagram of Figure 3.
- the camera CM and projector PR are shown located on baseline vector V.
- a first virtual projector prl is implemented by the image processing software in computer 4 and has the same optical characteristics as the camera (as determined in the initial calibration procedure).
- Image I1 (Figure 14) is projected from this virtual projector in a 3D space simulated by the image processing software.
- a second virtual projector pr2 is similarly implemented by the image processing software and preferably has the same optical characteristics as the projector PR (which is also represented in Figure 3).
- This virtual projector projects a set of ray lines in the simulated 3D space corresponding to the respective physical projector rays PQ and the ray lines are each labelled with the respective correlated pixels of the image I1 as found in the image correlation process described with reference to Figure 14. It will be appreciated that the image I2 and target T define, and can be equated with, a set of rays originating from the perspective centre Op of the projector.
- the software in computer 4 is arranged to scale such acquired 3D representations to enable them to be fitted together to form a self-consistent overall 3D representation of the object. This aspect is described below with reference to Figures 17 to 20.
- Figure 15 shows two planar calibration targets T1 and T2 (having peripheral blobs or discs BL similar to target T of Figure 13) whose orientations (and preferably positions) relative to the camera axis system are known, eg as a result of a photogrammetric determination involving separately acquiring images of them in the absence of any illumination from the projector, and processing the images in a procedure similar to that described above in connection with Figure 2.
- the perspective centres Oc and Op of the camera and projector are also shown.
- target T1 is illuminated by the structured light from the projector and an image is acquired by the camera CM.
- Figure 15 illustrates three points p1, p2 and p3 at which the structured light impinges on target T1. These (and many other points, not shown) will be imaged by the camera CM.
- target T1 is removed and target T2 is illuminated by the structured light from the projector.
- An image is acquired by the camera CM.
- the three points P1, P2 and P3 corresponding to points p1, p2 and p3 are found by correlating the newly acquired image of the projection of the structured radiation on target T2 with the previously acquired image of the corresponding projection on T1 by the procedure described above with reference to Figure 13.
- Figure 15 illustrates further the relationship between the positions of two calibration targets T1 and T2 and the perspective centre Op of the projector PR (the camera CM being assumed fixed on the baseline vector V).
- a pair of points P1 and P2 on target T1 form image points p1 and p2 respectively on the photodetector array PD of camera CM and (in a subsequent step following the removal of target T1) a pair of points P3 and P4 on target T2 which are correlated with P1 and P2 respectively form image points p3 and p4 respectively on photodetector array PD.
- the pencil of rays formed by corresponding points on targets T1 and T2 (eg P1 and P3; P2 and P4) is constructed to find the position of the perspective centre Op of the projector.
- the rays will not intersect at a point but a best estimate can be found from a least squares algorithm.
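One way to compute such a best estimate (a standard least-squares construction, offered here as an illustration rather than the patent's method) is to minimise the summed squared perpendicular distances from a single point to all of the rays:

```python
import numpy as np

def least_squares_ray_intersection(origins, directions):
    """Best-fit intersection point of a pencil of rays that do not quite meet
    (eg the rays P1P3, P2P4, ... used to locate the projector's perspective centre Op).

    origins    : (N, 3) points on the rays (eg the points on target T1)
    directions : (N, 3) ray directions (eg towards the correlated points on T2)
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)
```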
- the camera could be calibrated by the Tsai method (Roger Y Tsai, IEEE Journal of Robotics and Automation RA-3, No 4, August 1987 p 323 - see also references cited therein).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Remote Sensing (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0027703A GB2353659A (en) | 1998-05-15 | 1999-05-17 | Method and apparatus for 3D representation |
AU40505/99A AU4050599A (en) | 1998-05-15 | 1999-05-17 | Method and apparatus for 3d representation |
JP2000550066A JP2002516443A (en) | 1998-05-15 | 1999-05-17 | Method and apparatus for three-dimensional display |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB9810553.9A GB9810553D0 (en) | 1998-05-15 | 1998-05-15 | Method and apparatus for 3D representation |
GB9810553.9 | 1999-05-12 | ||
GB9910960.5 | 1999-05-12 | ||
GB9910960A GB2352901A (en) | 1999-05-12 | 1999-05-12 | Rendering three dimensional representations utilising projected light patterns |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1999060525A1 (en) | 1999-11-25 |
Family
ID=26313698
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB1999/001556 WO1999060525A1 (en) | 1998-05-15 | 1999-05-17 | Method and apparatus for 3d representation |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP2002516443A (en) |
AU (1) | AU4050599A (en) |
GB (1) | GB2353659A (en) |
WO (1) | WO1999060525A1 (en) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2368117A (en) * | 2000-05-08 | 2002-04-24 | Neutral Ltd | Optical viewing arrangement |
WO2002075656A1 (en) * | 2001-03-20 | 2002-09-26 | Scannova Gmbh | Method and system for recording and representing three-dimensional objects |
GB2377576A (en) * | 2001-07-12 | 2003-01-15 | Vision Works Ltd D | Modelling of three dimensional shapes |
EP1982292A2 (en) * | 2006-01-31 | 2008-10-22 | University Of Southern California | 3d face reconstruction from 2d images |
WO2010053809A1 (en) * | 2008-10-28 | 2010-05-14 | The Boeing Company | Hand-held positioning interface for spatial query |
WO2012097020A1 (en) * | 2011-01-14 | 2012-07-19 | Eastman Kodak Company | Determining a stereo image from video |
WO2012168904A3 (en) * | 2011-06-07 | 2013-02-21 | Creaform Inc. | Sensor positioning for 3d scanning |
US9123159B2 (en) | 2007-11-30 | 2015-09-01 | Microsoft Technology Licensing, Llc | Interactive geo-positioning of imagery |
US9134339B2 (en) | 2013-09-24 | 2015-09-15 | Faro Technologies, Inc. | Directed registration of three-dimensional scan measurements using a sensor unit |
CN105066962A (en) * | 2015-07-21 | 2015-11-18 | 中国航空工业集团公司北京长城航空测控技术研究所 | Multiresolution large visual field angle high precision photogrammetry apparatus |
US9258550B1 (en) | 2012-04-08 | 2016-02-09 | Sr2 Group, Llc | System and method for adaptively conformed imaging of work pieces having disparate configuration |
US9816809B2 (en) | 2012-07-04 | 2017-11-14 | Creaform Inc. | 3-D scanning and positioning system |
US10342431B2 (en) | 2000-07-26 | 2019-07-09 | Melanoscan Llc | Method for total immersion photography |
US10401142B2 (en) | 2012-07-18 | 2019-09-03 | Creaform Inc. | 3-D scanning and positioning interface |
US11185697B2 (en) | 2016-08-08 | 2021-11-30 | Deep Brain Stimulation Technologies Pty. Ltd. | Systems and methods for monitoring neural activity |
US11298070B2 (en) | 2017-05-22 | 2022-04-12 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6377700B1 (en) * | 1998-06-30 | 2002-04-23 | Intel Corporation | Method and apparatus for capturing stereoscopic images using image sensors |
GB2358308B (en) * | 1999-11-25 | 2004-03-24 | Canon Kk | Image processing apparatus |
US6970591B1 (en) | 1999-11-25 | 2005-11-29 | Canon Kabushiki Kaisha | Image processing apparatus |
CN106228527B (en) * | 2010-11-15 | 2020-08-04 | 斯加勒宝展示技术有限公司 | System and method for calibrating display system using manual and semi-automatic techniques |
CN107255458B (en) * | 2017-06-19 | 2020-02-07 | 昆明理工大学 | Resolving method of vertical projection grating measurement simulation system |
- 1999
- 1999-05-17 GB GB0027703A patent/GB2353659A/en not_active Withdrawn
- 1999-05-17 AU AU40505/99A patent/AU4050599A/en not_active Abandoned
- 1999-05-17 JP JP2000550066A patent/JP2002516443A/en not_active Withdrawn
- 1999-05-17 WO PCT/GB1999/001556 patent/WO1999060525A1/en active Application Filing
Non-Patent Citations (5)
Title |
---|
ALBRECHT P ET AL: "IMPROVEMENT OF THE SPATIAL RESOLUTION OF AN OPTICAL 3-D MEASUREMENT PROCEDURE", INSTRUMENTATION/MEASUREMENT TECHNOLOGY CONF. (IMTC), OTTAWA, MAY 19 - 21, 1997, vol. 1, no. 14, IEEE,, pages 124 - 129, XP000725373, ISBN: 0-7803-3748-4 * |
BAKER H H ET AL: "GENERALIZING EPIPOLAR-PLANE IMAGE ANALYSIS FOR NON-ORTHOGONAL AND VARYING VIEW DIRECTIONS", IMAGE UNDERSTANDING WORKSHOP. PROCEEDINGS, vol. 2, 1 January 1987 (1987-01-01), pages 843 - 848, XP000572538 * |
HARTLEY R ET AL: "STEREO FROM UNCALIBRATED CAMERAS", PROC. COMP. SOC. CONF. ON COMPUTER VISION AND PATTERN RECOGNITION, CHAMPAIGN, IL, JUNE 15 - 18, 1992, 15 June 1992 (1992-06-15), IEEE,, pages 761 - 764, XP000357427, ISBN: 0-8186-2855-3 * |
- HARTLEY R I: "IN DEFENCE OF THE 8-POINT ALGORITHM", PROC. FIFTH INTERNAT. CONF. ON COMPUTER VISION, CAMBRIDGE, MA., JUNE 20 - 23, 1995, no. 5, IEEE, pages 1064 - 1070, XP000557481, ISBN: 0-7803-2925-2 * |
OTTO G P ET AL: "'Region-growing' algorithm for matching of terrain images", IMAGE AND VISION COMPUTING, MAY 1989, UK, vol. 7, no. 2, pages 83 - 94, XP002115154, ISSN: 0262-8856 * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2368117A (en) * | 2000-05-08 | 2002-04-24 | Neutral Ltd | Optical viewing arrangement |
US10342431B2 (en) | 2000-07-26 | 2019-07-09 | Melanoscan Llc | Method for total immersion photography |
WO2002075656A1 (en) * | 2001-03-20 | 2002-09-26 | Scannova Gmbh | Method and system for recording and representing three-dimensional objects |
GB2377576A (en) * | 2001-07-12 | 2003-01-15 | Vision Works Ltd D | Modelling of three dimensional shapes |
GB2377576B (en) * | 2001-07-12 | 2005-06-01 | Vision Works Ltd D | Modelling of three dimensional shapes |
EP1982292A4 (en) * | 2006-01-31 | 2013-05-29 | Univ Southern California | 3d face reconstruction from 2d images |
EP1982292A2 (en) * | 2006-01-31 | 2008-10-22 | University Of Southern California | 3d face reconstruction from 2d images |
US9123159B2 (en) | 2007-11-30 | 2015-09-01 | Microsoft Technology Licensing, Llc | Interactive geo-positioning of imagery |
US8138938B2 (en) | 2008-10-28 | 2012-03-20 | The Boeing Company | Hand-held positioning interface for spatial query |
WO2010053809A1 (en) * | 2008-10-28 | 2010-05-14 | The Boeing Company | Hand-held positioning interface for spatial query |
WO2012097020A1 (en) * | 2011-01-14 | 2012-07-19 | Eastman Kodak Company | Determining a stereo image from video |
CN103649680A (en) * | 2011-06-07 | 2014-03-19 | 形创有限公司 | Sensor positioning for 3D scanning |
WO2012168904A3 (en) * | 2011-06-07 | 2013-02-21 | Creaform Inc. | Sensor positioning for 3d scanning |
US9325974B2 (en) | 2011-06-07 | 2016-04-26 | Creaform Inc. | Sensor positioning for 3D scanning |
US9258550B1 (en) | 2012-04-08 | 2016-02-09 | Sr2 Group, Llc | System and method for adaptively conformed imaging of work pieces having disparate configuration |
US10235588B1 (en) | 2012-04-08 | 2019-03-19 | Reality Analytics, Inc. | System and method for adaptively conformed imaging of work pieces having disparate configuration |
US9816809B2 (en) | 2012-07-04 | 2017-11-14 | Creaform Inc. | 3-D scanning and positioning system |
US10928183B2 (en) | 2012-07-18 | 2021-02-23 | Creaform Inc. | 3-D scanning and positioning interface |
US10401142B2 (en) | 2012-07-18 | 2019-09-03 | Creaform Inc. | 3-D scanning and positioning interface |
US9134339B2 (en) | 2013-09-24 | 2015-09-15 | Faro Technologies, Inc. | Directed registration of three-dimensional scan measurements using a sensor unit |
CN105066962A (en) * | 2015-07-21 | 2015-11-18 | 中国航空工业集团公司北京长城航空测控技术研究所 | Multiresolution large visual field angle high precision photogrammetry apparatus |
US11185697B2 (en) | 2016-08-08 | 2021-11-30 | Deep Brain Stimulation Technologies Pty. Ltd. | Systems and methods for monitoring neural activity |
US11278726B2 (en) | 2016-08-08 | 2022-03-22 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US11890478B2 (en) | 2016-08-08 | 2024-02-06 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
US11298070B2 (en) | 2017-05-22 | 2022-04-12 | Deep Brain Stimulation Technologies Pty Ltd | Systems and methods for monitoring neural activity |
Also Published As
Publication number | Publication date |
---|---|
GB2353659A (en) | 2001-02-28 |
JP2002516443A (en) | 2002-06-04 |
GB0027703D0 (en) | 2000-12-27 |
AU4050599A (en) | 1999-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1999060525A1 (en) | Method and apparatus for 3d representation | |
CN111750806B (en) | Multi-view three-dimensional measurement system and method | |
KR100450469B1 (en) | Image Combination Method and System Using Parallax-based Technology | |
US7342669B2 (en) | Three-dimensional shape measuring method and its device | |
GB2352901A (en) | Rendering three dimensional representations utilising projected light patterns | |
US6930685B1 (en) | Image processing method and apparatus | |
- JP7486740B2 | System and method for efficient 3D reconstruction of an object using a telecentric line scan camera | |
US6195455B1 (en) | Imaging device orientation information through analysis of test images | |
RU2204149C2 (en) | Method and facility for cartography of radiation sources | |
WO2007015059A1 (en) | Method and system for three-dimensional data capture | |
Gorevoy et al. | Optimal calibration of a prism-based videoendoscopic system for precise 3D measurements | |
Mahdy et al. | Projector calibration using passive stereo and triangulation | |
CN107135336B (en) | A kind of video camera array | |
CN107977998B (en) | Light field correction splicing device and method based on multi-view sampling | |
Takatsuka et al. | Low-cost interactive active monocular range finder | |
Pedersini et al. | 3D area matching with arbitrary multiview geometry | |
GB2337390A (en) | Deriving a 3D representation from two or more 2D images | |
Pachidis et al. | Pseudo-stereo vision system: a detailed study | |
JPH09329440A (en) | Coordinating method for measuring points on plural images | |
Schillebeeckx et al. | Single image camera calibration with lenticular arrays for augmented reality | |
Sagawa et al. | Accurate calibration of intrinsic camera parameters by observing parallel light pairs | |
Cui et al. | Epipolar geometry for prism-based single-lens stereovision | |
Schneider et al. | Development and application of an extended geometric model for high resolution panoramic cameras | |
Pedersini et al. | A multi-view trinocular system for automatic 3D object modeling and rendering | |
Xiao et al. | A single-lens trinocular stereovision system using a 3F filter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
ENP | Entry into the national phase |
Ref country code: GB Ref document number: 200027703 Kind code of ref document: A Format of ref document f/p: F |
|
NENP | Non-entry into the national phase |
Ref country code: KR |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09700465 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |