EP3076369B1 - Procede et dispositif destines a la representation d'un objet - Google Patents
Procede et dispositif destines a la representation d'un objet
- Publication number
- EP3076369B1 EP3076369B1 EP16161420.1A EP16161420A EP3076369B1 EP 3076369 B1 EP3076369 B1 EP 3076369B1 EP 16161420 A EP16161420 A EP 16161420A EP 3076369 B1 EP3076369 B1 EP 3076369B1
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- subarea
- coordinates
- determining
- spatial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
- G06T2207/10136—3D ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- The invention relates to a method for displaying an object, in particular a biological tissue, according to claim 1, and to an apparatus for displaying an object according to claim 14.
- Ultrasound and endoscopy devices that allow surgical interventions under ultrasound control or visual monitoring are known from medical technology.
- Furthermore, US 6 019 724 A discloses a corresponding ultrasonic device.
- The problem underlying the invention is to provide a method that allows the best possible monitoring of an intervention taking place on an object. Furthermore, a corresponding device is to be provided.
- the first and the second device are designed in particular for intraoperative imaging.
- The first and the second image are generated with the first and the second device, respectively, for example by imaging (projecting) at least a portion of the object into an image plane or by a 3D representation of the object.
- The first image is, for example, an endoscopy image or a stereo endoscopy image, and the first device is correspondingly an endoscopy device. It is, of course, also possible for the first device to be a non-endoscopic device.
- The second image is, e.g., an ultrasound image, an MRI image, an X-ray image, a nuclear radiological image and/or a single photon emission computed tomography (SPECT) image, and the second device is correspondingly an ultrasound device, an MRI device, an X-ray device, a nuclear radiological device and/or a SPECT device.
- The determination of the second coordinates of the pixels of the second image in the second coordinate system allows, in particular, a perspectively correct (positionally correct) integration of at least a part of the second image into the first image.
- The first device comprises an optical system (e.g., a camera); the second coordinate system is, e.g., associated with an image plane of the optical system.
- the optical system of the first device is, for example, a video camera (for example a CCD camera) with which a real-time image of the object to be examined can be generated.
- a real-time image can also be generated by means of the second device, so that a combined real-time image of the object can be realized.
- The object to be displayed is, for example, a biological tissue (e.g., a soft tissue); the method of the invention can then be used to monitor an operation on the tissue (e.g., the removal of a tumor, such as a fibroid), i.e., intraoperatively.
- The method according to the invention can also be used in non-medical applications, e.g., to monitor a repair carried out on a component.
- a surgeon is provided with a depth representation of the relevant surgical area in addition to the endoscopy image.
- the surgeon can assess a depth extent of a tumor that can not be recognized in the pure endoscopy image.
- the additional depth information enables the surgeon, for example, to update his planning of the operation or the finding during the course of the operation.
- The second coordinate system, into which the coordinates of the pixels of the second image are mapped, is, for example, a two-dimensional coordinate system that is assigned, e.g., to an image plane of a camera of the first device.
- The image plane of the camera is, e.g., as already mentioned above, a CCD chip; the origin of the second coordinate system is located, e.g., in a plane corresponding to the plane along which the CCD chip extends, i.e., the coordinate axes of the second coordinate system extend in a plane whose position corresponds to the image plane of the camera of the first device.
- the first device is a stereo endoscopy device that generates two perspectively different images by means of two optical systems (cameras).
- In this case, the first coordinates of the second image are mapped both into the (second) coordinate system assigned to the image plane of the first camera and into the (further second) coordinate system assigned to the image plane of the second camera, in order to be able to integrate the second image in the correct position into both images of the stereo endoscopy device.
- the mapping of the first coordinates into the second coordinate system takes place in particular by means of a mapping (in particular by means of a transformation), which is determined by calibrating the first device (for example in the form of an endoscopy device).
- This mapping is based in particular on a model of the first device (in particular of an optical system of the first device), parameters of the optical system being determined by the calibration.
- a calibration pattern is imaged with the first device, wherein the spatial coordinates of the calibration pattern (eg relative to the first device) are determined by means of a position determination device and assigned to the coordinates of the image of the calibration pattern in the image plane of the first device.
- a suitable calibration method in particular for a first device in the form of an endoscopy device, which can also be used to calibrate a distance determining device of an endoscopy device, is disclosed in the German patent application DE 10 2010 042 540.0 of 15.10.2010, to which reference is expressly made.
- In particular, a distortion caused by an optical system of the first device is also detected, i.e., distortion parameters of the mentioned model of the optical system of the first device are also determined.
- The mapping of the first coordinates of the pixels of the second image into the second coordinate system then takes place using these distortion parameters, i.e., using a distortion function.
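To make the mapping concrete, the following minimal sketch (Python/NumPy) projects 3D world-space points, such as the spatial coordinates assigned to ultrasound pixels, into a camera image plane using a pinhole model with two radial distortion coefficients. All function and parameter names are illustrative; the actual model and distortion function are the ones determined by the calibration described above.

```python
import numpy as np

def project_to_image_plane(points_world, R, t, fx, fy, cx, cy, k1=0.0, k2=0.0):
    """Illustrative sketch (not the patented method itself): map 3D points
    into the 2D image plane of a calibrated camera using a pinhole model
    with radial distortion. R, t are the extrinsics (world -> camera),
    fx, fy, cx, cy the intrinsics, k1, k2 the distortion parameters."""
    p = np.asarray(points_world, dtype=float)        # (N, 3) first coordinates
    p_cam = p @ R.T + t                              # into the camera frame
    x = p_cam[:, 0] / p_cam[:, 2]                    # perspective division
    y = p_cam[:, 1] / p_cam[:, 2]
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2                 # radial distortion factor
    u = fx * x * d + cx                              # (second) pixel coordinates
    v = fy * y * d + cy
    return np.stack([u, v], axis=1)                  # (N, 2)
```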
- Both the calibration of the first device and a calibration of the second device can be carried out using calibration bodies, wherein the calibration bodies are (e.g., integrally) interconnected (i.e., form an assembly), the spatial position of the calibration bodies being determined in particular by a position-determining device.
- Determining the second coordinates of the pixels of the second image comprises determining the spatial position of the second device (e.g., an ultrasound head of an ultrasound device) by means of a position-determining device and mapping the first coordinates of the pixels to spatial coordinates in a spatial coordinate system assigned to the position-determining device ("world coordinate system").
- The position-determining device is, for example, a clinical navigation system or a component of a clinical navigation system that, using marking elements (which are arranged in particular on the second device, e.g., an ultrasound head) and a measuring device (in particular a measuring camera or a magnetic field sensor), allows the determination of the spatial position (i.e., the position and orientation) of an article such as the ultrasound head.
- the assignment of the image data of the second device to spatial coordinates takes place in particular after a corresponding calibration of the second device.
- As marking elements, active (self-luminous) elements such as LEDs, passive (non-luminous) elements such as reflective spheres, foils or special patterns (flat targets, laser engravings or natural patterns such as corners and edges), or magnetic field sources (e.g., in the form of current-carrying coils) can be used.
- A measuring camera of a position-determining device is based in particular on the principle of a stereo camera, i.e., it has, for example, two mutually spaced sensor elements (in particular CCD chips) that receive light from an object (in particular from a marking element of the position-determining device) at different angles, so that the spatial position of the object can be reconstructed from the data of the sensor elements.
- The spatial positions of the marking elements, i.e., their spatial coordinates, are determined in a given coordinate system assigned to the position-determining device. If the relative position of the marking elements with respect to the system to which they are attached is known, the position of the system can be deduced from the spatial positions of the marking elements. Such position-determining devices are, however, known per se, so they need not be discussed further here.
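As an aside, one common way to compute such a pose from measured marker coordinates is a least-squares rigid registration (Kabsch method) between the known marker layout of a device and the marker positions measured by the stereo camera. The sketch below is a generic illustration with hypothetical names; the navigation system's actual algorithm is not specified in the text.

```python
import numpy as np

def estimate_rigid_pose(markers_device, markers_world):
    """Generic sketch: least-squares rigid transform (R, t) aligning the
    known marker layout of a device (device frame) with the marker
    positions measured by the camera (world frame), via the Kabsch method."""
    a = np.asarray(markers_device, dtype=float)    # (N, 3)
    b = np.asarray(markers_world, dtype=float)     # (N, 3)
    ca, cb = a.mean(axis=0), b.mean(axis=0)        # centroids
    H = (a - ca).T @ (b - cb)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    s = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    D = np.diag([1.0, 1.0, s])
    R = Vt.T @ D @ U.T                             # rotation device -> world
    t = cb - R @ ca                                # translation
    return R, t
```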
- By means of the position-determining device, which is, e.g., also used for determining the spatial position of the second device (e.g., the ultrasound head of an ultrasound device), the spatial position of the first device (e.g., in the form of an endoscopy device, and in particular of its endoscope shaft) can be determined, the determination of the second coordinates of the pixels of the second image being carried out using the determined spatial position of the first device.
- generating the combined image comprises determining coordinates of at least some of the pixels of the first image in the second coordinate system, wherein the pixels of the first image are replaced or overlaid with pixels of the second image with the corresponding second coordinates.
- Overlaying (blending) the images can be achieved, e.g., by adding the intensities of the pixels of the first image and the second image. It is also conceivable that a user can specify the degree of superposition (crossfade degree).
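A minimal sketch of such a superposition by weighted addition of intensities, with a user-adjustable crossfade degree alpha (function and parameter names are hypothetical; the second image is assumed to be already reprojected into the first image's pixel grid):

```python
import numpy as np

def blend_images(first_img, second_img, mask, alpha=0.5):
    """Sketch: crossfade the (reprojected) second image into the first one.
    alpha = 0 keeps the first image, alpha = 1 fully replaces it inside the
    mask; intermediate values add weighted pixel intensities."""
    out = first_img.astype(float).copy()
    m = np.asarray(mask, dtype=bool)
    out[m] = (1.0 - alpha) * out[m] + alpha * second_img.astype(float)[m]
    return np.clip(out, 0, 255).astype(np.uint8)
```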
- The first image and/or the second image may be processed prior to the combination, e.g., using one or more filters (e.g., to introduce a masking factor).
- The second image may also be a 3D image (e.g., a 3D ultrasound image, e.g., a 3D reconstruction), which can be generated by detecting a plurality of image planes (for example, by moving the ultrasound head of an ultrasound device).
- In this case, the spatial coordinates of the second image are mapped to (second) coordinates associated with the image plane of the first device; a 3D view of the object is thus integrated into the first image.
- the device used to implement the method according to the invention is designed such that a user can switch between the 3D overlay and the overlay of the 2D image currently generated by the second device or completely switch off the integration of the second image.
- A subset of the pixels of the first image is determined by a selection process, and the determined pixels are replaced by, or superimposed with, pixels of the second image having the corresponding second coordinates. For example, to select the pixels to be replaced, a partial area is marked in the first image.
- a partial area (partial section) of the object is selected (for example marked) and the combined image is created by replacing or superimposing pixels of the first image within the partial area of the object.
- the combination of the first and the second image takes place exclusively in the marked subregion.
- a partial section of the partial area of the object represented by the first image is selected as the partial area, or a partial area of the first image is determined which corresponds to the selected partial area (partial section) of the object.
- The partial area of the object is selected by, as mentioned above, selecting a partial area of the first image, e.g., by marking it in the first image.
- The marking (segmentation) of the subarea can, for example, serve to delineate a tissue structure to be operated on (for example, a tumor), so that the information of the second image that is relevant to the surgeon is displayed mainly or exclusively in this subarea, in order to provide the desired depth information on the one hand without, on the other hand, burdening the surgeon with unnecessary image data.
- According to one embodiment, the marked subregion follows a movement of the object. This is made possible, in particular, by arranging marking elements on the object as well (for example, on a patient to be operated on) and tracking their position by means of the mentioned position-determining device, so that the object position can be taken into account in the generation of the combined image. Alternatively, this can also be achieved without marking elements, by recognizing the same features in successive images and thus, for example, tracking a deformation of the object ("non-rigid image registration"); see below.
- the partial area of the object is selected on the basis of a reconstruction of a three-dimensional area of the object, wherein the reconstruction is generated using a plurality of images generated by the second device.
- the reconstruction of the three-dimensional region of the object comprises an interpolation between two temporally successive images of the second device as a function of their respective spatial positions determined by means of the position-determining device.
- The interpolation of the two temporally successive images of the second device and/or the storage of data generated by the interpolation can take place on a graphics card, the interpolation being, e.g., calculated in parallel on the graphics card.
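As a sketch of the interpolation step, assuming for simplicity that two successive tracked slices share the same pixel grid and that a linear blend of intensities and slice origins suffices (names hypothetical):

```python
import numpy as np

def interpolate_slices(slice_a, slice_b, origin_a, origin_b, s):
    """Sketch: linear interpolation between two chronologically successive,
    tracked ultrasound slices at parameter s in [0, 1]; returns the
    interpolated image and slice origin."""
    img = (1.0 - s) * slice_a.astype(float) + s * slice_b.astype(float)
    origin = (1.0 - s) * np.asarray(origin_a) + s * np.asarray(origin_b)
    return img, origin

# Filling the gap between two tracked planes of a 3D reconstruction:
# [interpolate_slices(a, b, oa, ob, s) for s in np.linspace(0.0, 1.0, 5)]
# Each s (and each pixel) is independent, which is why the computation
# parallelizes well on a graphics card.
```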
- For marking, e.g., a plurality of light spots (support points) is generated, and the light spots can be generated simultaneously or sequentially. If the light spots are generated one after another, e.g., for each light spot a first image showing the light spot is stored, the corresponding first images being evaluated during the marking process (in particular in real time) or after completion of the marking operation, and the subarea composed of the successively generated light spots being determined, e.g., in the form of a 2D polygon surface or a volume.
- For example, at a first point in time, initial coordinates of the subarea (in particular in the world coordinate system) are determined, and at a second point in time, updated coordinates of the subarea are determined as a function of the current position and/or shape of the object.
- Tracking of the position and/or the shape of the object takes place, for example, relative to the world coordinate system, and thus to the coordinate system of the first device, for example by means of marking elements which are arranged relative to the object and on which rigid and non-rigid transformations are based.
- The marking can also take place in that a light beam generated by the light source is continuously guided over the object and a sequence of images recorded with the first device is evaluated.
- the position determination device is in particular a clinical navigation system, as already explained above.
- The distance-determining device, with which in particular a distance between a section of the first device facing the object to be imaged and the object can be determined, comprises, e.g., means (in particular in the form of a laser) for projecting a light structure onto the object to be imaged, which interact, e.g., with an imaging optics and a camera of the first device.
- The means for projecting are, in particular, arranged relative to an imaging optics of the first device (which serves to image the object into the image plane of a camera of the first device) such that there is an angle (e.g., 30°) between the light beam emitted by the means for projecting and a light beam reflected at the object that falls from the object into the imaging optics of the first device, or a distance between the optical axis of the light beam and the optical axis of the imaging optics, so that the position of the image of the light pattern generated on the object in the image plane of the camera depends on the distance of the object from the first device.
- In particular, a position vector between a portion of the first device and a portion of the light structure projected by the distance-determining device is determined, i.e., not only the distance but also the orientation of a connecting line between these portions is determined.
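The underlying distance measurement can be illustrated by a simple active-triangulation model in the plane spanned by the laser beam and the optical axis; the baseline, tilt angle and camera intrinsics below are hypothetical calibration values, not values from the patent:

```python
import numpy as np

def spot_distance(u_px, fx, cx, baseline, laser_tilt_rad):
    """Sketch of active laser triangulation: the camera looks along its
    optical axis, the laser is offset sideways by `baseline` (metres) and
    tilted towards the axis by `laser_tilt_rad`; the image column u_px of
    the projected spot therefore encodes the object distance z."""
    view_angle = np.arctan((u_px - cx) / fx)   # ray angle of the spot image
    # Intersect camera ray x = z*tan(view_angle) with laser ray
    # x = baseline - z*tan(laser_tilt_rad):
    z = baseline / (np.tan(laser_tilt_rad) + np.tan(view_angle))
    return z
```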
- A suitable first device in the form of an endoscopy device having such a distance-determining device is described in the already mentioned German patent application DE 10 2010 042 540.0, which is also expressly incorporated herein by reference.
- In this way, the region of interest of the object can be marked using a light source.
- It is also conceivable that the partial area is drawn into the first image (in digital form, for example using a computer mouse or a drawing pen), or that the partial area is determined by means of a separate positioning device (for example, a navigated instrument).
- the partial area is displayed using a ray-casting method in the first and / or second image to convey a depth impression.
- In the ray-casting method, e.g., a maximum intensity projection, an average intensity projection, a freely definable transfer function and/or a surface projection is used.
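A minimal sketch of a single ray cast through a reconstructed volume, reducing the sampled intensities either by maximum or average intensity projection (nearest-neighbour sampling for brevity; a freely definable transfer function or a surface projection would replace the final reduction):

```python
import numpy as np

def cast_ray(volume, origin, direction, n_steps=256, step=1.0, mode="mip"):
    """Sketch: march a ray through a 3D intensity volume (e.g., a 3D
    ultrasound reconstruction) and reduce the samples to one value,
    either maximum ('mip') or average ('mean') intensity projection."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    samples = []
    for i in range(n_steps):
        p = np.asarray(origin, dtype=float) + i * step * d
        idx = np.round(p).astype(int)              # nearest-neighbour sample
        if np.any(idx < 0) or np.any(idx >= volume.shape):
            break                                  # ray has left the volume
        samples.append(volume[tuple(idx)])
    if not samples:
        return 0.0
    return float(np.max(samples) if mode == "mip" else np.mean(samples))
```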
- It is conceivable that the distance-determining device is connected to the first device (e.g., is a component of the first device), or that the distance-determining device is a device (a separate instrument) separate from the first device, i.e., in particular arranged spaced apart from the first device.
- the selection of the subarea of interest is determined, for example, by applying a Boolean predicate function (eg, a threshold function) to pixels of the second image and / or an image analysis method (eg, a contour recognition method) to the second image.
- the partial area can be determined by specifying a threshold value and selecting a plurality of pixels of the second image (eg in the form of an ultrasound image) as a function of the threshold value.
- The predetermined threshold value relates, for example, to the intensity of the pixels; for example, only those areas of the object are selected to which pixels of the second image are assigned whose intensity exceeds (or falls below) the threshold value.
- This automatic selection could be used, for example, to select a tumor to be removed; to select the tumor, it would then only have to be scanned ultrasonically in order to obtain a corresponding 3D reconstruction of the tumor.
- A light source (in particular a laser) of the first device could serve to set a starting point for the 3D reconstruction.
- the threshold-dependent selection may in particular be realized in the form of a filter which masks the background of the second image on the basis of the threshold value.
- This type of selection (segmentation) relates to a partial area of the second image, i.e., of the object to be displayed.
- the threshold may also be customizable by a user.
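A sketch of such a threshold filter; the resulting Boolean mask could, for example, drive the blending sketch shown earlier (names hypothetical):

```python
import numpy as np

def threshold_mask(second_img, threshold, above=True):
    """Sketch: select pixels of the second image whose intensity exceeds
    (or falls below) a threshold; everything else is treated as background
    and masked out of the combined image."""
    img = np.asarray(second_img, dtype=float)
    return img > threshold if above else img < threshold
```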
- the marked spatial subregion of the object is extrapolated to a closed subregion and a combined image is only generated if the region of the object detected with the second device intersects the closed subregion.
- the marked portion is extrapolated to a closed cylinder.
- The closed subregion is determined, e.g., using the above-described 3D reconstruction (e.g., of a tumor).
- Information regarding the spatial position of the second device can be superimposed into the first image (e.g., in the form of an endoscopy image), for example to assist in orienting the second device. It is also conceivable for the second image (for example, an ultrasound image) to be displayed on a separate display, with information relating to the position of the first device superimposed in this representation.
- the spatial position of at least one tool can be determined by means of the position-determining device and the tool can be superimposed into the second image, taking into account its spatial position and the spatial position of the second device.
- the second image with the displayed tool according to step e) is combined with the first image.
- A time delay between the image acquisition of the first device, the image acquisition of the second device and the position determination of the position-determining unit is determined and is made available to a central processing unit for temporal synchronization when displaying the combined image.
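One way such delay-corrected timing could be realized is sketched below; the per-device delays and all names are assumptions for illustration, not details given in the patent:

```python
import bisect

def nearest_index(timestamps, t):
    """Index of the sample whose timestamp is closest to t (sorted input)."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))

def synchronize(endo_ts, us_ts, nav_ts, endo_delay, us_delay, nav_delay):
    """Sketch: pair each endoscopy frame with the ultrasound frame and
    navigation sample closest in delay-corrected acquisition time; the
    delays are the measured latencies between acquisition and arrival."""
    us_corr = [t - us_delay for t in us_ts]       # true acquisition times
    nav_corr = [t - nav_delay for t in nav_ts]
    pairs = []
    for i, t in enumerate(endo_ts):
        t_true = t - endo_delay
        pairs.append((i, nearest_index(us_corr, t_true),
                      nearest_index(nav_corr, t_true)))
    return pairs
```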
- the first and the second coordinate determination device are designed in particular in the form of software or a correspondingly programmed electronic device (computer, microchip, etc.).
- the imaging device may be in the form of software or suitably programmed hardware (e.g., a graphics card).
- the device also comprises marking elements which are arranged on the first device (for example an endoscopy device) and / or the second device (for example an ultrasound head of an ultrasound device), wherein by means of a position-determining device, a spatial position of the marking elements and thus of the first device and / or the second device can be determined.
- the position determining device is, in particular, a clinical navigation system that includes a position determining device, such as a measuring camera.
- The device 1 comprises a first device in the form of an endoscopy device 2 (e.g., in the form of a hysteroscopy device) and a second device in the form of an ultrasound device 3. Both the endoscopy device 2 and the ultrasound device 3 serve to generate an image of an object in the form of a tissue 4.
- The endoscopy device 2 is a rigid endoscope which has a rigid endoscope shaft 21 in which an endoscope optics for imaging an object into an image plane of a camera 22 of the endoscopy device 2 is located.
- A possible embodiment of the endoscopy device 2 is described in the already mentioned German patent application DE 10 2010 042 540.0.
- the camera 22 and the ultrasound device 3 are connected to a data processing system in the form of a computer 100 for processing (eg digitizing and filtering) and displaying the respectively generated image data.
- Marking elements in the form of marking spheres 51-53 are arranged on the endoscopy device 2.
- On the ultrasound head 31 of the ultrasound device 3, marking spheres 54-56 are arranged, the marking spheres 51-56 allowing a determination of the position, i.e., both of the location of the respective device and of its orientation.
- the device 1 according to the invention has a position-determining device in the form of a clinical navigation system 6, which comprises a measuring camera in the form of a stereo camera 61.
- By means of the stereo camera 61, the position of the marking spheres 51-53 attached to the endoscopy device 2 and of the marking spheres 54-56 arranged on the ultrasound head 31 of the ultrasound device 3, and thus the position of the endoscopy device 2 and of the ultrasound device 3, can be determined.
- Thus, the position of the image plane of the endoscopy camera 22 and the position of the region (e.g., a plane) of the tissue 4 detected by the ultrasound head 31 are also known, and are, e.g., updated in real time during a (manual) movement of the endoscopy device 2 and/or the ultrasound head 31.
- It is therefore possible to combine the ultrasound image generated with the ultrasound device 3 in a perspectively correct manner with the endoscopy image generated by the endoscopy device 2.
- For this purpose, (first) coordinates of pixels of the ultrasound image are assigned spatial coordinates in a spatial coordinate system assigned to the position-determining device 6 (the measuring camera 61), the spatial coordinates assigned to the ultrasound pixels corresponding to points of the region of the tissue 4 detected with the ultrasound device 3.
- These spatial coordinates associated with the pixels of the ultrasound image are mapped, using a (simulated) imaging determined by calibrating the endoscopy device 2, into the (virtual) image plane of the camera 22, i.e., into a plane whose position corresponds to the image plane of the camera 22, and (second) coordinates of the ultrasound pixels with respect to the image plane of the camera 22 are determined.
- FIG. 3 shows an example of such a perspectively correctly combined image; FIG. 3 is based on an endoscopy image taken with the endoscopy device 2, which is shown in FIG. 2.
- In FIG. 3, a portion of the endoscopy image 7 is replaced by an ultrasound image 8, the coordinates of the points of the original ultrasound image having been transformed according to the method described above.
- The positionally correct integration of the ultrasound image into the endoscopy image made possible in this way provides depth information about the tissue in addition to the optical information of the endoscopy image.
- The depth of the ultrasound image, i.e., the position of the plane that is detected by the ultrasound head and displayed in the ultrasound image 8, can be changed, e.g., by a manual movement of the ultrasound head 31 of the ultrasound device 3. It is also conceivable that, instead of the 2D ultrasound image shown, a 3D reconstruction of the tissue is integrated.
- It is also possible that a partial region of the tissue 4 is selected and the endoscopy image 7 is replaced by, or superimposed with, the ultrasound image 8 only in a section corresponding to the selected subregion of the tissue 4.
- the selection of a subregion of interest may, for example, be carried out with the aid of a distance determining device of the endoscopy device, it being possible for a laser of the distance determining device to generate light spots on the tissue 4 which mark the subregion.
- the position of the light spot is marked with a cross.
- the marking is carried out with the aid of the laser in that the laser scans and registers several points of the tissue, as already explained above.
- In the combined image, additional information can be displayed. Information regarding the orientation of the ultrasound head 31 can be provided, e.g., by superimposing a structure 9 that marks the position of the sound cone detected by the ultrasound head 31.
- The structure 9 is an "augmented reality" overlay. Other such displays are conceivable, for example ones that indicate a direction along which the ultrasound head must be moved in order to detect a marked portion of the tissue.
- FIGS. 4 and 5 each show a combined image with a preoperative CT image 8 'displayed in a first image 7 (eg taken by an endoscope or another camera).
- In addition, a navigated instrument 150 is shown, i.e., an instrument whose spatial position was determined by the navigation system (see above), the instrument 150 having, in particular, corresponding navigation marking elements.
- the position of the instrument (tool) 150 is indicated in the combined image in the correct position with respect to the CT image 8 'by a linear marking 101.
- an extension of the axis of the instrument 150 is marked (cross).
Claims (15)
- Method for selecting image points in the representation of an object in intraoperative imaging, in particular of a biological tissue, comprising the steps of: a) generating a first image (7) of at least a subregion of the object (4) by means of an endoscopy device as the first device (2); b) generating a second image (8, 8') of at least the subregion of the object (4) of step a) by means of a second device (3); c) determining first coordinates of at least some image points of the second image in a first coordinate system; d) determining second coordinates of the image points of the second image by mapping the first coordinates into a second coordinate system which is different from the first coordinate system and which is assigned to the first device (2); and e) generating a combined image of the object (4) from the first image and, for selected image points, from the second image, using the determined second coordinates of the image points of the second image, characterized in that f) the selection of the image points to be combined is determined by selecting a subregion of the object (4), wherein • the first device (2) comprises a light source and the subregion of the object (4) to be selected is marked by generating at least one light spot on the object (4) by means of the light source, or • the subregion of the object (4) is marked by means of a separate navigated instrument, g) so that the generation of the combined image in intraoperative imaging takes place by replacing or superimposing image points of the first image (7) in the subregion of the object (4) with the selected image points of the second image.
- Method according to claim 1, in which the marking of a subregion of the object (4) by means of light spots is characterized by: marking a spatial subregion of the object (4) by means of a plurality of light spots; determining the spatial position of the first device (2) using a position-determining device (6); determining the distance between the first device (2) and the generated light spots by means of a distance-determining device; wherein the selection of the subregion takes place by determining the spatial position of the subregion, by determining the spatial coordinates of the light spots in the coordinate system of the position-determining device (6) using the determined spatial position of the first device (2) and the determined distance.
- Method according to one of claims 1 or 2, characterized in that the marked spatial subregion is extrapolated to a closed subregion, and a combined image is generated only if the region of the object (4) detected by the second device (3) intersects the closed subregion.
- Method according to any one of the preceding claims, characterized in that the extrapolated closed subregion, the spatial coordinates of the light spots, the subregions of claim 1 and/or at least some of the determined distances are displayed in the first and/or the second image.
- Method according to any one of claims 2 to 4, characterized in that the distance-determining device is connected to the first device (2).
- Method according to one of the preceding claims, characterized in that the selection of the image points to be combined can additionally be restricted by applying a Boolean predicate function to image points of the second image and/or an image analysis method to the second image.
- Method according to claim 1, characterized in that the subregion of the object (4) is selected using a reconstruction of a three-dimensional region of the object (4), wherein the reconstruction is generated using a plurality of images generated with the second device (3).
- Method according to claim 7, characterized in that the reconstruction of the three-dimensional region of the object (4) comprises an interpolation between two or more chronologically successive images of the second device (3) as a function of their respective spatial positions determined by means of the position-determining device (6).
- Method according to any one of the preceding claims, characterized in that the subregion of the object (4) is represented using a ray-casting method in the first and/or the second image (7) in order to convey an impression of depth.
- Method according to claim 9, characterized in that the ray-casting method uses a maximum intensity projection, an average intensity projection, a freely definable transfer function and/or a surface projection.
- Method according to any one of the preceding claims, characterized in that, at a first point in time, initial coordinates of the subregion and, at a second point in time, updated coordinates of the subregion are determined as a function of the current position and/or shape of the object (4).
- Method according to any one of the preceding claims, characterized in that the first image (7) is an image generated with projective optics onto a 2D plane, in particular an endoscopy image or a stereo endoscopy image.
- Method according to any one of the preceding claims, characterized in that the second image is an intraoperative image, in particular an ultrasound image, or a preoperative image, in particular an MRI image, an X-ray image, a nuclear radiological image, a single photon emission tomography image and/or a fusion of these images, and the second device (3) is correspondingly an ultrasound device, an MRI device, an X-ray device, a nuclear radiological device, a single photon emission tomography device and/or an image fusion device.
- Apparatus for the representation of an object for carrying out a method in intraoperative imaging according to any one of the preceding claims, comprising: an endoscopy device as the first device (2) for generating a first image (7) of at least a subregion of an object (4); a second device (3) for generating a second image of at least the subregion of the object (4) of step a); a first coordinate-determining device for determining first coordinates of at least some image points of the second image in a first coordinate system; a second coordinate-determining device for determining second coordinates of the image points of the second image by mapping the first coordinates into a second coordinate system which is different from the first coordinate system and which is assigned to the second device (3); and an image-generating device for generating a combined image of the object in intraoperative imaging from the first and the second image using the determined second coordinates of the image points of the second image, wherein the first device (2) comprises a light source for generating at least one light spot on the object (4), or a separate navigated instrument for selecting the subregion of the object (4).
- Apparatus according to claim 14, characterized by marking elements (51-56) arranged on the first and/or the second device (2, 3), as well as a position-determining device (6) for determining the spatial position of the marking elements (51-56) and thus the spatial position of the first and/or the second device (2, 3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17203950.5A EP3330922B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102011078212.5A DE102011078212B4 (de) | 2011-06-28 | 2011-06-28 | Verfahren und Vorrichtung zum Darstellen eines Objektes |
EP12745647.3A EP2727082A2 (fr) | 2011-06-28 | 2012-06-28 | Procédé et dispositif de représentation d'un objet |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12745647.3A Division EP2727082A2 (fr) | 2011-06-28 | 2012-06-28 | Procédé et dispositif de représentation d'un objet |
EP12745647.3A Previously-Filed-Application EP2727082A2 (fr) | 2011-06-28 | 2012-06-28 | Procédé et dispositif de représentation d'un objet |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17203950.5A Division EP3330922B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
EP17203950.5A Previously-Filed-Application EP3330922B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3076369A1 EP3076369A1 (fr) | 2016-10-05 |
EP3076369B1 true EP3076369B1 (fr) | 2017-11-29 |
Family
ID=46640647
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12745647.3A Withdrawn EP2727082A2 (fr) | 2011-06-28 | 2012-06-28 | Procédé et dispositif de représentation d'un objet |
EP16161420.1A Active EP3076369B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
EP17203950.5A Active EP3330922B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12745647.3A Withdrawn EP2727082A2 (fr) | 2011-06-28 | 2012-06-28 | Procédé et dispositif de représentation d'un objet |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17203950.5A Active EP3330922B1 (fr) | 2011-06-28 | 2012-06-28 | Procede et dispositif destines a la representation d'un objet |
Country Status (4)
Country | Link |
---|---|
US (2) | US9792721B2 (fr) |
EP (3) | EP2727082A2 (fr) |
DE (1) | DE102011078212B4 (fr) |
WO (1) | WO2013001031A2 (fr) |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011078212B4 (de) * | 2011-06-28 | 2017-06-29 | Scopis Gmbh | Verfahren und Vorrichtung zum Darstellen eines Objektes |
US20140275994A1 (en) * | 2013-03-15 | 2014-09-18 | Covidien Lp | Real time image guidance system |
US10154239B2 (en) | 2014-12-30 | 2018-12-11 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
CN106162065A (zh) * | 2015-04-15 | 2016-11-23 | 戴向东 | 双路视频输入的手术记录系统及方法 |
WO2017057330A1 (fr) * | 2015-09-28 | 2017-04-06 | オリンパス株式会社 | Système d'endoscope et procédé de traitement d'image |
EP3184071A1 (fr) * | 2015-12-22 | 2017-06-28 | SpineMind AG | Dispositif pour la navigation guidee par image intra-operatoire lors d'interventions chirurgicales dans la region de la colonne vertebrale et dans zone adjacente de la tete ou du bassin, du thorax |
WO2017160651A1 (fr) | 2016-03-12 | 2017-09-21 | Lang Philipp K | Dispositifs et méthodes pour chirurgie |
EP3568070B1 (fr) | 2017-01-16 | 2024-01-03 | Philipp K. Lang | Guidage optique pour procédures chirurgicales, médicales et dentaires |
CN106890025B (zh) * | 2017-03-03 | 2020-02-28 | 浙江大学 | 一种微创手术导航系统和导航方法 |
US20190117459A1 (en) | 2017-06-16 | 2019-04-25 | Michael S. Berlin | Methods and Systems for OCT Guided Glaucoma Surgery |
US20180360655A1 (en) | 2017-06-16 | 2018-12-20 | Michael S. Berlin | Methods and systems for oct guided glaucoma surgery |
AU2018214021A1 (en) * | 2017-08-10 | 2019-02-28 | Biosense Webster (Israel) Ltd. | Method and apparatus for performing facial registration |
US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
BR112020019714A2 (pt) * | 2018-04-04 | 2021-02-09 | S.I.T.-Sordina Iort Technologies S.P.A. | sistema de radioterapia |
IT201800004953A1 (it) * | 2018-04-27 | 2019-10-27 | Procedimento e sistema diagnostico | |
US11062527B2 (en) * | 2018-09-28 | 2021-07-13 | General Electric Company | Overlay and manipulation of medical images in a virtual environment |
CN111325077B (zh) * | 2018-12-17 | 2024-04-12 | 同方威视技术股份有限公司 | 一种图像显示方法、装置、设备及计算机存储介质 |
US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
US12053247B1 (en) | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
DE102021104219A1 (de) | 2021-02-23 | 2022-08-25 | Nano4Imaging Gmbh | Erkennung einer in biologisches Gewebe eingeführten Vorrichtung mit medizinischer Bildgebung |
US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
EP4337125A1 (fr) * | 2021-05-10 | 2024-03-20 | Excera Inc. | Suivi et affichage d'échographie à échelles multiples |
US20230013884A1 (en) * | 2021-07-14 | 2023-01-19 | Cilag Gmbh International | Endoscope with synthetic aperture multispectral camera array |
DE102022205662B3 (de) * | 2022-06-02 | 2023-07-06 | Siemens Healthcare Gmbh | System zum Positionieren eines medizinischen Objekts in einer Solltiefe und Verfahren zum Aussenden einer Lichtverteilung |
Family Cites Families (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6019724A (en) | 1995-02-22 | 2000-02-01 | Gronningsaeter; Aage | Method for ultrasound guidance during clinical procedures |
US6167296A (en) | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
US6466815B1 (en) | 1999-03-30 | 2002-10-15 | Olympus Optical Co., Ltd. | Navigation apparatus and surgical operation image acquisition/display apparatus using the same |
US6190395B1 (en) | 1999-04-22 | 2001-02-20 | Surgical Navigation Technologies, Inc. | Image guided universal instrument adapter and method for use with computer-assisted image guided surgery |
US6301495B1 (en) | 1999-04-27 | 2001-10-09 | International Business Machines Corporation | System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan |
DE10015826A1 (de) * | 2000-03-30 | 2001-10-11 | Siemens Ag | System und Verfahren zur Erzeugung eines Bildes |
US6517478B2 (en) * | 2000-03-30 | 2003-02-11 | Cbyon, Inc. | Apparatus and method for calibrating an endoscope |
CA2314794A1 (fr) * | 2000-08-01 | 2002-02-01 | Dimitre Hristov | Appareil de localisation de lesions ou d'organes |
JP2004513684A (ja) | 2000-09-23 | 2004-05-13 | ザ ボード オブ トラスティーズ オブ ザ リーランド スタンフォード ジュニア ユニバーシティ | Endoscopic targeting method and system |
US7605826B2 (en) | 2001-03-27 | 2009-10-20 | Siemens Corporate Research, Inc. | Augmented reality guided instrument positioning with depth determining graphics |
US7379077B2 (en) | 2001-08-23 | 2008-05-27 | Siemens Corporate Research, Inc. | Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment |
US20030210812A1 (en) * | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
US7993353B2 (en) | 2002-06-04 | 2011-08-09 | Brainlab Ag | Medical tracking system with universal interface |
FR2855292B1 (fr) | 2003-05-22 | 2005-12-09 | Inst Nat Rech Inf Automat | Device and method for real-time registration of patterns on images, in particular for guidance by localization |
US20070208252A1 (en) | 2004-04-21 | 2007-09-06 | Acclarent, Inc. | Systems and methods for performing image guided procedures within the ear, nose, throat and paranasal sinuses |
DE102005012295B4 (de) * | 2005-03-17 | 2007-01-18 | Konen, Wolfgang, Dr. | Method for endoscopic navigation and for calibrating endoscope systems, and corresponding system |
ATE499894T1 (de) * | 2006-05-04 | 2011-03-15 | Navab Nassir | Interactive virtual mirror device for visualizing virtual objects in endoscopic applications |
US8248413B2 (en) | 2006-09-18 | 2012-08-21 | Stryker Corporation | Visual navigation system for endoscopic surgery |
US8126239B2 (en) * | 2006-10-20 | 2012-02-28 | Siemens Aktiengesellschaft | Registering 2D and 3D data using 3D ultrasound data |
JP2008108246A (ja) | 2006-10-23 | 2008-05-08 | Internatl Business Mach Corp <Ibm> | Method, system, and computer program for generating a virtual image according to the position of the viewer |
US8267853B2 (en) | 2008-06-23 | 2012-09-18 | Southwest Research Institute | System and method for overlaying ultrasound imagery on a laparoscopic camera display |
US8690776B2 (en) * | 2009-02-17 | 2014-04-08 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US8641621B2 (en) * | 2009-02-17 | 2014-02-04 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US8504139B2 (en) | 2009-03-10 | 2013-08-06 | Medtronic Xomed, Inc. | Navigating a surgical instrument |
EP2236104B1 (fr) | 2009-03-31 | 2013-06-19 | BrainLAB AG | Medical navigation image output with virtual primary images and real secondary images |
JP2011069965A (ja) * | 2009-09-25 | 2011-04-07 | Japan Atomic Energy Agency | Imaging apparatus, image display method, and recording medium on which an image display program is recorded |
EP2501320A4 (fr) * | 2009-11-19 | 2014-03-26 | Univ Johns Hopkins | Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors |
WO2011127379A2 (fr) | 2010-04-09 | 2011-10-13 | University Of Florida Research Foundation Inc. | Interactive mixed-reality system and uses thereof |
DE102010042540B4 (de) | 2010-10-15 | 2014-09-04 | Scopis Gmbh | Method and device for calibrating a distance-determination device of an optical system |
CN103237518A (zh) | 2010-10-28 | 2013-08-07 | 菲亚戈股份有限公司 | Navigation attachment for optical instruments in medicine, and method |
US10674968B2 (en) | 2011-02-10 | 2020-06-09 | Karl Storz Imaging, Inc. | Adjustable overlay patterns for medical display |
DE102011078212B4 (de) * | 2011-06-28 | 2017-06-29 | Scopis Gmbh | Method and device for representing an object |
US9974503B2 (en) | 2011-07-21 | 2018-05-22 | Carestream Dental Technology Topco Limited | System for paranasal sinus and nasal cavity analysis |
US9510771B1 (en) * | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
WO2013126659A1 (fr) * | 2012-02-22 | 2013-08-29 | Veran Medical Technologies, Inc. | Systems, methods, and devices for four-dimensional navigation in soft tissue |
US9314188B2 (en) * | 2012-04-12 | 2016-04-19 | Intellijoint Surgical Inc. | Computer-assisted joint replacement surgery and navigation systems |
US20140193056A1 (en) | 2013-01-10 | 2014-07-10 | Siemens Medical Solutions Usa, Inc. | Systems and Methods for Patient Anatomical Image Volume Data Visualization Using A Portable Processing Device |
CN105979900B (zh) | 2014-02-04 | 2020-06-26 | 皇家飞利浦有限公司 | Visualization of the depth and position of blood vessels and robot-guided visualization of blood vessel cross-sections |
US10567660B2 (en) | 2014-03-14 | 2020-02-18 | Brainlab Ag | Overlay of anatomical information in a microscope image |
US10463242B2 (en) | 2014-07-09 | 2019-11-05 | Acclarent, Inc. | Guidewire navigation for sinuplasty |
US9547940B1 (en) | 2014-09-12 | 2017-01-17 | University Of South Florida | Systems and methods for providing augmented reality in minimally invasive surgery |
- 2011
  - 2011-06-28 DE DE102011078212.5A patent/DE102011078212B4/de active Active
- 2012
  - 2012-06-28 EP EP12745647.3A patent/EP2727082A2/fr not_active Withdrawn
  - 2012-06-28 US US14/128,950 patent/US9792721B2/en active Active
  - 2012-06-28 EP EP16161420.1A patent/EP3076369B1/fr active Active
  - 2012-06-28 EP EP17203950.5A patent/EP3330922B1/fr active Active
  - 2012-06-28 WO PCT/EP2012/062623 patent/WO2013001031A2/fr active Application Filing
- 2017
  - 2017-10-16 US US15/784,303 patent/US10706610B2/en active Active
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
EP3330922A1 (fr) | 2018-06-06 |
DE102011078212B4 (de) | 2017-06-29 |
EP3076369A1 (fr) | 2016-10-05 |
US20180053335A1 (en) | 2018-02-22 |
EP3330922B1 (fr) | 2019-05-15 |
US20140218366A1 (en) | 2014-08-07 |
DE102011078212A1 (de) | 2013-01-03 |
US9792721B2 (en) | 2017-10-17 |
EP2727082A2 (fr) | 2014-05-07 |
WO2013001031A2 (fr) | 2013-01-03 |
US10706610B2 (en) | 2020-07-07 |
WO2013001031A3 (fr) | 2013-03-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3076369B1 (fr) | Method and device for representing an object | |
DE69431875T2 (de) | Arrangement for determining the mutual position of bodies | |
DE102014218558B4 (de) | User interface and method for the automated positioning of an examination table relative to a medical imaging system | |
WO2012107041A1 (fr) | Endoscopic image-processing system with means for generating geometric measurement information in the field of view of an optical digital camera | |
EP2632382B2 (fr) | Navigation attachment for optical devices in medicine, and associated method | |
DE4417944A1 (de) | Method for correlating different coordinate systems in computer-assisted stereotactic surgery | |
DE112010004349T5 (de) | Systems and methods for planning and performing percutaneous needle procedures | |
DE102007050607A1 (de) | Camera calibration for an endoscope navigation system | |
WO1994019758A1 (fr) | Method for planning and monitoring the course of a surgical intervention | |
EP3598948B1 (fr) | Imaging system and method for generating a stereoscopic representation, computer program and data memory | |
DE102012008812A1 (de) | X-ray source with module and detector for optical radiation | |
EP4213755B1 (fr) | Surgical assistance system | |
DE212012000054U1 (de) | Devices, assemblies, circuits and systems for assessing, estimating and/or determining relative positions, alignments, orientations and angles of rotation of a portion of a bone and between two or more portions of one or more bones | |
DE102011078405B4 (de) | Method for endoscopy with a magnetically guided endoscope capsule, and device therefor | |
EP3626176B1 (fr) | Method for assisting a user, computer program product, data medium and imaging system | |
DE102010015060A1 (de) | Device for supporting, scanning and tomographically imaging a patient and performing an intervention, and method for determining the spatial relation between optical images and tomographic representations | |
WO2018007091A1 (fr) | Imaging device in an operating room | |
EP3499461B1 (fr) | Representation of markers in medical imaging | |
DE102010018291B4 (de) | Navigation system and X-ray system | |
DE102020215559B4 (de) | Method for operating a visualization system in a surgical application, and visualization system for a surgical application | |
DE102011114146A1 (de) | Method and device for representing an object | |
WO2022073784A1 (fr) | Surgical assistance system and visualization method | |
EP4228543B1 (fr) | Surgical navigation system with improved instrument tracking, and navigation method | |
DE102009042712B4 (de) | Display system and method for displaying a surgical environment | |
DE102010064320B4 (de) | Optical pointer for a surgery assistance system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2727082 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170405 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 502012011745 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06T0007000000 Ipc: G06T0007330000 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06T 7/33 20170101AFI20170703BHEP |
Ipc: G06T 11/60 20060101ALI20170703BHEP |
|
INTG | Intention to grant announced |
Effective date: 20170725 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2727082 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 951042 Country of ref document: AT Kind code of ref document: T Effective date: 20171215 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D Free format text: LANGUAGE OF EP DOCUMENT: GERMAN |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 502012011745 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180228 |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180301 |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 502012011745 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20180830 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20180630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180628 |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180630 |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180630 |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180628 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180630 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MM01 Ref document number: 951042 Country of ref document: AT Kind code of ref document: T Effective date: 20180628 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180628 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20171129 |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20120628 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20171129 |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180329 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R081 Ref document number: 502012011745 Country of ref document: DE Owner name: STRYKER EUROPEAN OPERATIONS LIMITED, CARRIGTWO, IE Free format text: FORMER OWNER: SCOPIS GMBH, 10179 BERLIN, DE |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: 732E Free format text: REGISTERED BETWEEN 20211014 AND 20211020 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: PD Owner name: STRYKER EUROPEAN OPERATIONS LIMITED; IE Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), ASSIGNMENT; FORMER OWNER NAME: SCOPIS GMBH Effective date: 20211116 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230522 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20240515 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240509 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240502 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20240509 Year of fee payment: 13 |