WO2013158636A1 - Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures - Google Patents
- Publication number
- WO2013158636A1 (PCT/US2013/036773)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fluorescent
- visual
- images
- light source
- image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0071—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/00234—Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Master-slave robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
- A61B5/0036—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room including treatment, e.g., using an implantable medical device, ablating, ventilating
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0075—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/061—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7455—Details of notification to user or communication with user or patient ; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M31/00—Devices for introducing or retaining media, e.g. remedies, in cavities of the body
- A61M31/005—Devices for introducing or retaining media, e.g. remedies, in cavities of the body for contrast media
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00115—Electrical control of surgical instruments with audible or visual output
- A61B2017/00119—Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3941—Photoluminescent markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/395—Visible markers with marking agent for marking skin or other tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M5/00—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
- A61M5/007—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests for contrast media
Definitions
- the present embodiments relate generally to apparatuses and methods for tracking and control in surgery and interventional medical procedures.
- the present embodiments address at least this problem by introducing a robust tracking technique which requires minimal changes to the current robot-assisted surgical workflow and closing the loop with an effector function.
- Figure 1 shows the overall structure of the proposed embodiment of the invention in semi-autonomous mode where the surgical tasks are partially automated by visual servoing
- Figure 2 shows the embodiment of the system in the manual or master-slave robot- assisted mode
- Figure 3 represents an embodiment of the system with supervised autonomy
- Figure 4 shows the spectral ranges of the excitation and emission light, which clearly distinguishes the spectral ranges associated with the main components involved: i.e., hemoglobin (oxygenated and deoxygenated), water, and the fluorescent dye.
- Fluorescent dyes with different spectral ranges for excitation and emission can be synthesized (e.g. Cyanine dyes);
- Figure 5 illustrates an example of markers placed around a phantom cut
- Figure 6 illustrates images captured using a near infrared camera with two example fluorescent agents
- Figure 7 illustrates stereo image formation and triangulation to extract three dimensional (3D) coordinates of NIR markers according to one embodiment
- Figure 8 illustrates a flow diagram for an exemplary robotic operation algorithm
- Figure 9 illustrates a flow diagram for another exemplary robotic operation algorithm
- Figure 10 illustrates a flow diagram for a method according to one embodiment
- Figure 11 illustrates a block diagram of a computing device according to one embodiment.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
- the system includes a device configured to deploy fluorescent material on at least one of an organ under surgery and a surgical tool, a visual light source, a fluorescent light source corresponding to an excitation wavelength of the fluorescent material, an image acquisition and control element configured to control the visual light source and the fluorescent light source, and configured to capture and digitize at least one of resulting visual images and fluorescent images, and an image-based tracking module configured to apply image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
- there is further included in the system a surgical robot, and a visual servoing control module configured to receive tracking information from the image-based tracking module and to control the surgical robot, based on the tracking information, to perform a surgical operation.
- a manual control module configured to enable manual control of the surgical robot in place of control by the visual servoing control module.
- the visual servoing control module is further configured to receive manual input and to control the surgical robot, based on the manual input, to perform a surgical operation.
- there is further included in the system a surgical robot, and a manual control module configured to receive manual input and execute master-slave control of the surgical robot.
- the image-based tracking module further identifies the organ or the surgical tool based on the detected fluorescent markers.
- the image acquisition and control element further includes a dynamic tunable filter configured to alternately pass visual light and light emitted by the fluorescent material, and a charge coupled device configured to capture at least one of visual images and fluorescent images.
- the display is stereoscopic or monoscopic.
- the image acquisition and control element generates stereoscopic or monoscopic images.
- the stereoscopic display is further configured to display visual images and a color-coded overlay of fluorescent images.
- the stereoscopic display is further configured to display an augmented reality image by overlaying target points detected by the image-based tracking module.
- the system is configured to provide at least one of visual, audio, and haptic feedback to a system operator, based on information provided by the image-based tracking module.
- the system is configured to operate in each of a manual mode, a semi-autonomous mode, and an autonomous mode.
- the image-based tracking module identifies virtual boundaries based on the detected fluorescent markers to designate critical structures.
- the system further includes a detection device configured to determine whether a surgical tool has passed a boundary and to provide constraints on motion or provide alarms when the boundary has been crossed in order to protect the critical structures.
- the fluorescent light source is a near-infrared (NIR) light source.
- the image acquisition and control element includes two charge coupled devices (CCDs), one assigned to a visual spectrum and one assigned to a NIR spectrum.
- light generated by the visual light source and the fluorescent light source is split by either a beam-splitting or a dichromatic prism.
- light generated by the visual light source and the fluorescent light source is provided separate light paths to the two CCDs.
- the method includes the steps of deploying fluorescent material on at least one of an organ under surgery and a surgical tool, illuminating the organ, the surgical tool, or both, with a visual light source and a fluorescent light source, the fluorescent light source corresponding to an excitation wavelength of the fluorescent material, capturing and digitizing images resulting from the illumination by the visual light source and the fluorescent light source, and applying image processing to the digitized images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
- there is further included in the method the step of generating tracking information by tracking the organ, the surgical tool, or both, based on the detected fluorescent markers.
- the step of controlling a surgical robot, based on the tracking information, to perform a surgical operation is further included in the method.
- there is further included in the method the step of providing a stereoscopic or monoscopic display of the digitized images.
- the step of capturing and digitizing images further includes the step of generating stereoscopic or monoscopic images.
- the step of displaying visual images and a color coded overlay of fluorescent images is further included in the method.
- the step of displaying an augmented reality image by overlaying target points detected by the image-based tracking module is further included in the method.
- there is further included in the method the step of identifying the organ or the surgical tool based on the detected fluorescent markers.
- there is further included in the method the step of performing a surgical procedure based on the detected fluorescent markers.
- there is further included in the method the step of designating critical structures by identifying virtual boundaries based on the detected fluorescent markers.
- the system includes means for deploying fluorescent material on at least one of an organ under surgery and a surgical tool, a visual light source, a fluorescent light source corresponding to an excitation wavelength of the fluorescent material, means for controlling the visual light source and the fluorescent light source, means for capturing and digitizing at least one of resulting visual images and fluorescent images, and means for applying image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
- the disclosed embodiments may be applied in the field of automated anastomosis, where tubular structures (vessels, bile ducts, urinary tract, etc.) are connected and sealed.
- Anastomosis is one of the four major steps in every surgery: 1) Access through incision; 2) Exposure and dissection; 3) Resection and removal of pathology; and 4) Reconstruction and closure (Anastomosis).
- Anastomosis is currently performed by suturing or applying clips or glue to the anastomosis site.
- the anastomosis procedure may be performed manually or by using robots through master-slave control; both techniques are very time consuming and cumbersome.
- the present embodiments make it possible for the surgeon to mark the anastomosis site by applying fluorescent markers (in the form of miniature clips, spray, paint, tapes, etc.) which can be detected and tracked using the dual-spectrum imaging technology.
- a robotic system can be controlled through visual servoing using this tracking information, in order to apply sutures/clips/glue or weld at specified positions.
- Automation of other steps of surgery: automating all parts of surgery, including exposure and dissection, and resection and removal of pathology.
- Automated tumor resection/ablation: a tumor will be painted using a fluorescent dye, and the robotic system will be guided/controlled to resect or ablate the tumor. This can be applied in procedures such as partial nephrectomy, hepatectomy, etc.
- Positional marker for motion tracking/memory during endoscopic procedures. Some variants of embodiments of the technology are listed below. The technology can be used with multiple dyes with excitation/emission at different wavelengths, which provides inherently different markers for tracking multiple objects. In one embodiment, fluorescent dyes A and B are used to mark the two sides of a tubular structure prior to automated anastomosis.
- the markers can be applied to the targets both internally and externally.
- the fluorescent dye can be attached to the target by clips, staples, glue or can be applied by painting or spraying.
- the dye can also be injected into the tissue to mark specific points, or can be injected through the bloodstream.
- the dye can be selected in order to bind with specific types of cells to mark specific structures (such as tumors).
- Providing "no-fly zones” or “virtual fixtures” to prevent the surgical tools from approaching critical structures the surgeon marks the critical structures prior to the task and the marked borders will be tracked using the dual-mode imaging technology.
- the coordinates will be used to enforce constraints on the motion of the surgical tools during the automated or semi-automated task; they can also be used to provide alarms (visual, audio, or haptic) in manual tasks, as in the sketch below.
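To make the virtual-fixture idea concrete, the following is a minimal sketch of such a boundary check, assuming the tracked marker coordinates outlining a critical structure are available as 2D image points; the function names, the safety margin, and the alarm response are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np
import cv2

def in_no_fly_zone(boundary_pts, tool_px, margin_px=10.0):
    """Signed-distance test of a tracked tool tip against a marked boundary.

    boundary_pts: Nx2 tracked fluorescent-marker image coordinates
    outlining the critical structure (hypothetical input format).
    tool_px: (x, y) image coordinate of the tracked tool tip.
    """
    contour = np.asarray(boundary_pts, dtype=np.float32).reshape(-1, 1, 2)
    # Positive distance means the point lies inside the polygon.
    d = cv2.pointPolygonTest(contour, (float(tool_px[0]), float(tool_px[1])), True)
    return d > -margin_px, d  # violation if inside or within the margin

# Example: raise an alarm (or constrain robot motion) on violation.
violated, dist = in_no_fly_zone([(100, 100), (200, 90), (220, 180), (110, 200)],
                                (150, 140))
if violated:
    print(f"ALERT: tool within no-fly zone (signed distance {dist:.1f} px)")
```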
- the imaging system can be monoscopic and provide two-dimensional location of the tracked points which can potentially be used for image-based visual servoing.
- the imaging system can be stereoscopic and provide three-dimensional location of the tracked structures and therefore be used for image-based or position-based visual servoing.
- the embodiments of the technology can be applied in automated or semi-automated applications; they can also provide guidance for manual operations through visual, audio, or haptic feedback.
- the present embodiments address these limitations by using a dual-spectrum imaging device which can image in the visual spectrum as well as in near-infrared (NIR) spectrum.
- the surgeon places fluorescent markers on the locations which should be tracked (e.g., tools and tissue);
- the excitation light generated by the imaging device causes the fluorophores to emit NIR light which will be detected by the imaging device.
- the system has a high signal to noise ratio (SNR) because of (a) limited autofluorescence of the tissue compared to the fluorescent dyes, and (b) lack of other NIR sources in the patient's body. This high SNR makes any tracking algorithm more robust and reliable.
- NIR light has a good penetration in the tissue as opposed to the visible light; this makes it possible to track an object even if it is occluded by another organ, flipped over, covered by blood, etc.
- a combination of visual and NIR images can be used to make image-based tracking algorithms even more robust.
- One embodiment describes a system for automation of surgical tasks. It is based on deploying fluorescent markers on the organ under surgery and/or on the surgical tool, tracking the markers in real-time, and controlling the surgical tool via visual servoing.
- Figures 1, 2 and 3 represent different modes of the operation for the system.
- Fluorescent markers are deployed on the organ (e.g., two sides of a bile duct to be anastomosed) and/or on the surgical tool.
- the markers can also be generated by techniques such as mixing a fluorescent dye, e.g., indocyanine green (ICG), with a biocompatible glue (e.g., a cyanoacrylate-ICG mix), delivered by pipette or spray.
- the markers can also be generated by any element which provides sufficient fluorescence.
- Figure 4 shows spectral characteristics of a fluorescent dye.
- Fluorescent dye can be chosen to have its emitted wavelength beyond the visible light range in order to achieve a high signal to noise ratio in the near-infrared images.
- Also, having the fluorescent emission 400 and excitation 401 wavelengths away from the peak absorption wavelengths of water 402 and hemoglobin 403 provides a stronger signal and makes it easier to track fluorescent markers in the presence of soft tissue (with high water content) and blood.
- multiple different markers are used to help track multiple structures, organs, and tools. Using different markers reduces the error rate for tracking, since the number of similar markers is reduced. Differentiation of markers can be achieved by having different marker sizes or volumes and/or shapes, and/or by using dyes with excitation/emission at different wavelengths. In one embodiment, markers with 3-microliter and 6-microliter volumes are used to mark the two sides of a tubular structure, respectively, prior to automated anastomosis. In another embodiment, a fluorescent dye emitting at 790 nm corresponds to the no-fly zone while a different wavelength, 830 nm, corresponds to an edge of a structure.
- for each structure (i.e., organ, stream segment), each marker is automatically assigned a unique identification number and is automatically labeled with the identification number of the structure to which it is attached.
- the label of each marker is used to determine which structure it belongs to and its overlay color. This tracking may be performed using tables or databases implemented by a computer processor and corresponding software instructions, as in the sketch below.
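As one way to picture this bookkeeping, here is a minimal registry sketch, assuming each detected marker has been matched to a structure when it is first registered; all identifiers and the BGR overlay palette are hypothetical.

```python
import itertools

class MarkerRegistry:
    """Maps auto-assigned marker IDs to structure labels and overlay colors."""
    _ids = itertools.count(1)

    def __init__(self):
        self._markers = {}  # marker_id -> structure_id
        self._colors = {}   # structure_id -> BGR overlay color

    def register_structure(self, structure_id, color):
        self._colors[structure_id] = color

    def add_marker(self, structure_id):
        marker_id = next(self._ids)        # unique identification number
        self._markers[marker_id] = structure_id
        return marker_id

    def lookup(self, marker_id):
        """Return the structure a marker belongs to and its overlay color."""
        structure_id = self._markers[marker_id]
        return structure_id, self._colors[structure_id]

registry = MarkerRegistry()
registry.register_structure("duct_side_A", color=(0, 255, 255))  # yellow
registry.register_structure("duct_side_B", color=(0, 255, 0))    # green
m1 = registry.add_marker("duct_side_A")
print(registry.lookup(m1))  # ('duct_side_A', (0, 255, 255))
```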
- Figure 5 illustrates markers placed around a phantom cut.
- a first set of markers 451 on the top side of the cut are labeled with a first color (e.g. yellow), and a second set of markers 452 on the bottom side of a cut are labeled with a second color (e.g. green).
- Figures 1-3 illustrate two light sources 102 and 104 that illuminate the scene.
- One light source 104 is a visual light source that makes it possible to acquire normal images of the organs.
- the other light source 102 is a narrow-band source of light (e.g. in the near infrared range) that is chosen according to the excitation wavelength of the fluorescent material.
- a "dynamic tunable filter" 103 changes the filter's characteristics in real-time to pass the visual light and the light emitted by the fluorescent material alternatively. At each moment the filter 103 only passes one type of light and suppresses the other.
- a wide-band CCD 105 captures images of the received light from either source.
- the light sources 102 and 104, the tunable filter 103 and the image capturing in the CCD 105 are controlled and synchronized by the image acquisition and control module 106.
- the image acquisition system runs at a high frame rate (e.g., 60 Hz to 120 Hz) and therefore acts like two imaging systems operating at different wavelengths, as in the demultiplexing sketch below.
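The time-multiplexed design can be pictured as demultiplexing one synchronized stream into two virtual cameras. A minimal sketch, assuming a strict even/odd trigger pattern (visible light on even frames, NIR on odd) that the real control module 106 would enforce in hardware:

```python
def demux_frames(frame_stream):
    """Split one synchronized 120 Hz stream into two 60 Hz streams.

    Assumes the tunable filter passes visible light on even frame
    indices and fluorescent (NIR) light on odd ones; the actual
    trigger pattern is an assumption about the hardware sync.
    """
    visual, nir = [], []
    for i, frame in enumerate(frame_stream):
        (visual if i % 2 == 0 else nir).append(frame)
    return visual, nir
```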
- NIR and visual light are split by using either a beam-splitting or a dichromatic prism, with two CCDs capturing images, one for the visual spectrum and one for the NIR spectrum.
- the stereoscopic display 109 provides the acquired visual images; it can also display fluorescent images as a color-coded overlay, or display an augmented reality image by overlaying the target points detected by the image-based tracking module 107.
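A color-coded overlay of this kind can be sketched in a few lines, assuming an 8-bit BGR visible image and a binary NIR marker mask that are already spatially registered to each other; the color and blend factor are arbitrary illustrative choices.

```python
import cv2

def overlay_nir(bgr, nir_mask, color=(0, 255, 0), alpha=0.5):
    """Blend a binary NIR marker mask over the visible-light image.

    bgr: HxWx3 uint8 visible image; nir_mask: HxW binary mask,
    assumed registered to the visible image.
    """
    tinted = bgr.copy()
    tinted[nir_mask > 0] = color                 # paint marker pixels
    return cv2.addWeighted(tinted, alpha, bgr, 1.0 - alpha, 0.0)
```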
- the image-based tracking module 107 applies image processing algorithms to detect the fluorescent markers in order to track the tools and the organ. Visual features can also be used for tracking.
- the image-based tracking module 107 also includes a tracking module that performs pre-processing of the NIR image and visual tracking based on the processed image information.
- the pre-processing algorithm involves image processing algorithms such as image smoothing to mitigate the effect of sensor noise, image histogram equalization to enhance the pixel intensity values, and image segmentation based on pixel intensity values to extract templates for the NIR markers.
- the visual trackers are initialized first. The initialization of the visual trackers starts by detection and segmentation of the NIR marker. Segmentation is based on applying an adaptive intensity threshold on the enhanced NIR image to obtain a binary template for the NIR markers.
- a two dimensional (2D) median filter and additional morphology-based binary operators may be applied on the binary template to remove segmentation noise.
- the binary template may be used as a starting base for visual tracking of NIR markers using visual tracking algorithms. After pre-processing and segmentation, the NIR template is a white blob on a darker background, which represents the rest of the surgical field in the NIR image.
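A minimal sketch of this pre-processing chain, assuming an 8-bit single-channel NIR frame; the kernel sizes and threshold parameters are illustrative assumptions rather than values from the disclosure:

```python
import cv2

def segment_nir_markers(nir_u8):
    """Smoothing -> histogram equalization -> adaptive threshold ->
    median filter -> morphological opening, yielding a binary template
    in which NIR markers appear as white blobs on a dark background."""
    smoothed = cv2.GaussianBlur(nir_u8, (5, 5), 0)   # mitigate sensor noise
    enhanced = cv2.equalizeHist(smoothed)            # enhance pixel intensities
    binary = cv2.adaptiveThreshold(enhanced, 255,
                                   cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 31, -10)
    binary = cv2.medianBlur(binary, 5)               # remove segmentation noise
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
```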
- the system receives intraoperative commands from the surgeon 100 and sends appropriate commands to the robot in real-time in order to control the surgical robot 101 and the surgical tool(s) 110 to obtain a predetermined goal (e.g., anastomosis).
- the surgeon 100 can be provided with visual, audio, or haptic feedback 110 while he/she is looking at the stereoscopic display.
- the surgeon controls the surgical tool manually (as in conventional laparoscopic surgery) or through master-slave control (201) of a robot arm.
- the surgeon receives visual feedback through the stereoscopic display (109) and may also be provided with other visual, audio or haptic feedback but the control loop is solely closed through the surgeon.
- the tracked visual markers are used to guide the motion of the robot.
- Each visual marker is represented by a representative vector of numbers, which is typically called a visual feature.
- Examples of visual features are coordinates of the centers of NIR markers extracted from the binary image, and/or their higher-order image moments (such as their area in terms of number of pixels).
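For illustration, here is a sketch of extracting such feature vectors from the binary template with OpenCV, using the zeroth- and first-order image moments (area and centroid); the exact feature set used by the system is not specified beyond what is described above.

```python
import cv2

def marker_features(binary):
    """Return one (cx, cy, area) feature vector per detected marker blob."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    features = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] == 0:          # skip degenerate blobs
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid
        features.append((cx, cy, m["m00"]))                # m00 = blob area
    return features
```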
- Figure 6 illustrates images captured using a NIR camera with two example fluorescent agents.
- Image 601 illustrates a binary image after image processing.
- Image 602 illustrates data that can be used as visual tracking information.
- Robot motion is performed by transforming the sensor measurements into global Cartesian coordinate form for the robot.
- the NIR and tool markers are tracked in the stereo images to compute the 3D coordinates of the marker or tool with respect to the surgical field, as shown in Figure 7.
- Figure 7 illustrates stereo image formation and triangulation to extract three dimensional (3D) coordinates of the NIR Markers. These 3D coordinates are used by the robot motion control algorithm in open-loop or closed-loop architecture. The error between the tool position and the marker position is calculated and used to generate the desired tool displacement.
- these 3D coordinates can be used for position-based visual servoing (PBVS); alternatively, image-based visual servoing (IBVS) can operate directly on the image features.
- the NIR-based robot motion control is a core technology which has not been developed in the past. Previous methods and apparatuses for NIR-based imaging (without robot control; Frangioni 2012, US 8,229,548 B2) and NIR-based display (Mohr and Mohr, US 2011/0082369) fail to consider robot motion control or any control whatsoever. With a stereo imaging system consisting of two NIR cameras with appropriate filters, a properly excited NIR agent can be seen in both stereo images. Image processing and visual tracking algorithms, such as the algorithms described above as being implemented by the image-based tracking module 107, are utilized to visually track each NIR marker in the image.
- the 3D estimate of a marker position is found by triangulation of the NIR marker image as seen in both left 701 and right 703 NIR stereo image pairs.
- the 3D estimate of the NIR marker can then be re-projected as an overlay in the RGB image 702.
- the tool position is also found from the stereo image pair.
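A minimal triangulation sketch, assuming the stereo rig is calibrated so that the 3x4 projection matrices of the left and right NIR cameras are known, and that the marker has already been matched across the pair:

```python
import numpy as np
import cv2

def triangulate_marker(P_left, P_right, uv_left, uv_right):
    """Recover the 3D position of one matched NIR marker.

    P_left, P_right: 3x4 projection matrices from stereo calibration
    (an assumed prerequisite). uv_left, uv_right: (x, y) pixel
    coordinates of the same marker in the left/right NIR images.
    """
    pl = np.asarray(uv_left, dtype=np.float64).reshape(2, 1)
    pr = np.asarray(uv_right, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pl, pr)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()                     # Euclidean 3D point

# The same routine applied to tool markers yields the tool position; the
# difference X_marker - X_tool then gives the desired tool displacement.
```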
- the stereo NIR system can be replaced by a 3D sensing camera capable of NIR observation.
- the embodiments described herein are also very useful in non-stereo applications.
- the system can be implemented for mono camera applications.
- mono camera images are sufficient.
- in semi-autonomous mode, depth of the target points is important for the robot to perform positioning tasks.
- Stereo imaging can provide depth information.
- there are other depth sensors available that do not require a second camera such as time of flight, conoscope, laser, and other depth cameras.
- This invention would also work with single cameras for manual and master-slave mode.
- the present embodiments would also work with a single camera and an additional depth sensor.
- Figures 8 and 9 illustrate two flow charts of exemplary robotic operation algorithms implemented by the system.
- Figure 8 illustrates an algorithm for robotic knot tying
- Figure 9 illustrates an algorithm for robotic suturing.
- the marker positions are used to estimate the knot 3D position (Figure 8) and the suture 3D position (Figure 9).
- the flow charts describe the robotic motions that follow position estimation.
- the robotic operation algorithm begins in step S801 with an estimation of the knot position.
- step S802 the knot offset is determined and communicated to the robot.
- step S803 the robot moves to hover above the suture placement.
- step S804 the approach process is performed.
- the robot takes into account the position information obtained based on the detected markers.
- the robot uses visual servoing to guide the needle toward the NIR marker.
- step S805 the needle is triggered. This trigger could be met when the robot has come within a predetermined distance of the knot.
- step S806 the robot lifts the tool to pull enough thread.
- step S807 the robot further lifts the tool until a sufficient tension F is measured in the thread. This process is repeated for the number of desired loops in the knot.
- FIG. 9 is an example of a robotic suturing process.
- the suture 3D position is estimated.
- the suture offset is determined.
- the robot moves to hover above the suture placement.
- the robot uses visual servoing to drive the needle toward the placement indicated by the NIR marker.
- the suture is triggered.
- an estimation of the length of thread is calculated. Using this estimation, in step S907, the robot lifts the needle to complete the suture.
- in steps S908 and S909, the robot lifts the needle until a tension of F is measured in the thread; the system exits if the tension is greater than F (see the sketch below).
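The tension-terminated lift common to both flow charts (steps S806-S807 and S906-S909) can be sketched as a simple loop; the robot and force-sensor interfaces (`robot.move_up`, `read_tension`) and the numeric values are hypothetical, not part of the disclosure.

```python
F_TARGET = 2.0   # required thread tension F (units depend on the sensor)
STEP_MM = 0.5    # incremental lift per control cycle

def lift_until_tension(robot, read_tension, max_steps=200):
    """Lift the tool/needle until the measured thread tension reaches F."""
    for _ in range(max_steps):
        if read_tension() >= F_TARGET:
            return True            # tension reached: exit the loop
        robot.move_up(STEP_MM)     # keep lifting (hypothetical robot API)
    return False                   # safety stop if tension is never reached
```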
- FIG. 10 illustrates an overall process according to one embodiment.
- fluorescent dye markers are deployed to a surgical field.
- the dye markers can be deployed, for example, by spraying, painting, attachment, tissue injection, intravenous injection etc.
- the surgical field is illuminated with fluorescent and visible light sources.
- light is captured with a camera. The light captured by the camera is both in the visible and IR range.
- the resulting images are processed by the image processing algorithms described previously in order to identify markers in the image.
- in step S1005, based on the detected markers, the tool or organ which is marked by the markers is tracked.
- this tracking is described in detail previously and includes determining the location of tools, organs, or other marked portions of the subject within the surgical field based on the markers associated with the respective elements.
- in step S1006 a stereo display is provided based on the tracking.
- in step S1008 visual, audio, and haptic feedback is provided to the surgeon.
- in step S1009 a robot is controlled based on the tracking.
- the computer processor can be implemented as discrete logic gates, as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Complex Programmable Logic Device (CPLD).
- An FPGA or CPLD implementation may be coded in VHDL, Verilog or any other hardware description language and the code may be stored in an electronic memory directly within the FPGA or CPLD, or as a separate electronic memory.
- the electronic memory may be non-volatile, such as ROM, EPROM, EEPROM or FLASH memory.
- the electronic memory may also be volatile, such as static or dynamic RAM, and a processor, such as a microcontroller or microprocessor, may be provided to manage the electronic memory as well as the interaction between the FPGA or CPLD and the electronic memory.
- the computer processor may execute a computer program including a set of computer-readable instructions that perform the functions described herein, the program being stored in any of the above-described non-transitory electronic memories and/or a hard disk drive, CD, DVD, FLASH drive or any other known storage media.
- the computer-readable instructions may be provided as a utility application, background daemon, or component of an operating system, or combination thereof, executing in conjunction with a processor, such as a Xeon processor from Intel of America or an Opteron processor from AMD of America, and an operating system, such as Microsoft VISTA, UNIX, Solaris, LINUX, Apple MAC-OSX, and other operating systems known to those skilled in the art.
- the computer 1000 includes a bus B or other communication mechanism for communicating information, and a processor/CPU 1004 coupled with the bus B for processing the information.
- the computer 1000 also includes a main memory /memory unit 1003, such as a random access memory (RAM) or other dynamic storage device (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)), coupled to the bus B for storing information and instructions to be executed by processor/CPU 1004.
- the memory unit 1003 may be used for storing temporary variables or other intermediate information during the execution of instructions by the CPU 1004.
- the computer 1000 may also further include a read only memory (ROM) or other static storage device (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)) coupled to the bus B for storing static information and instructions for the CPU 1004.
- the computer 1000 may also include a disk controller coupled to the bus B to control one or more storage devices for storing information and instructions, such as mass storage 1002, and drive device 1006 (e.g., floppy disk drive, read-only compact disc drive, read/write compact disc drive, compact disc jukebox, tape drive, and removable magneto-optical drive).
- the storage devices may be added to the computer 1000 using an appropriate device interface (e.g., small computer system interface (SCSI), integrated device electronics (IDE), enhanced- IDE (E-IDE), direct memory access (DMA), or ultra-DMA).
- the computer 1000 may also include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
- the computer 1000 may also include a display controller coupled to the bus B to control a display, such as a cathode ray tube (CRT), for displaying information to a computer user.
- the computer system includes input devices, such as a keyboard and a pointing device, for interacting with a computer user and providing information to the processor.
- the pointing device for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor and for controlling cursor movement on the display.
- a printer may provide printed listings of data stored and/or generated by the computer system.
- the computer 1000 performs at least a portion of the processing steps of the invention in response to the CPU 1004 executing one or more sequences of one or more instructions contained in a memory, such as the memory unit 1003. Such instructions may be read into the memory unit from another computer readable medium, such as the mass storage 1002 or a removable media 1001.
- processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in memory unit 1003.
- hard- wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
- the computer 1000 includes at least one computer readable medium 1001 or memory for holding instructions programmed according to the teachings of the invention and for containing data structures, tables, records, or other data described herein.
- Examples of computer readable media are compact discs (e.g., CD-ROM), hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic medium, or any other medium from which a computer can read.
- the present invention includes software for controlling the main processing unit 1004, for driving a device or devices for implementing the invention, and for enabling the main processing unit 1004 to interact with a human user.
- Such software may include, but is not limited to, device drivers, operating systems, development tools, and applications software.
- Such computer readable media further includes the computer program product of the present invention for performing all or a portion (if processing is distributed) of the processing performed in implementing the invention.
- the computer code elements on the medium of the present invention may be any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes, and complete executable programs. Moreover, parts of the processing of the present invention may be distributed for better performance, reliability, and/or cost.
- Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks, such as the mass storage 1002 or the removable media 1001.
- Volatile media includes dynamic memory, such as the memory unit 1003.
- Various forms of computer readable media may be involved in carrying out one or more sequences of one or more instructions to the CPU 1004 for execution.
- the instructions may initially be carried on a magnetic disk of a remote computer.
- An input coupled to the bus B can receive the data and place the data on the bus B.
- the bus B carries the data to the memory unit 1003, from which the CPU 1004 retrieves and executes the instructions.
- the instructions received by the memory unit 1003 may optionally be stored on mass storage 1002 either before or after execution by the CPU 1004.
- the computer 1000 also includes a communication interface 1005 coupled to the bus B.
- the communication interface 1005 provides a two-way data communication coupling to a network that is connected to, for example, a local area network (LAN), or to another communications network such as the Internet.
- the communication interface 1005 may be a network interface card to attach to any packet switched LAN.
- the communication interface 1005 may be an asymmetrical digital subscriber line (ADSL) card, an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of communications line. Wireless links may also be implemented.
- the communication interface 1005 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
- the network typically provides data communication through one or more networks to other data devices.
- the network may provide a connection to another computer through a local network (e.g., a LAN) or through equipment operated by a service provider, which provides communication services through a communications network.
- the local network and the communications network use, for example, electrical, electromagnetic, or optical signals that carry digital data streams, and the associated physical layer (e.g., CAT 5 cable, coaxial cable, optical fiber, etc).
- the network may provide a connection to a mobile device such as a personal digital assistant (PDA), laptop computer, or cellular telephone.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Robotics (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Hematology (AREA)
- Anesthesiology (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Human Computer Interaction (AREA)
- Gynecology & Obstetrics (AREA)
- Endoscopes (AREA)
- Manipulator (AREA)
- Image Processing (AREA)
- Vascular Medicine (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020147028899A KR102214789B1 (en) | 2012-04-16 | 2013-04-16 | Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures |
JP2015507109A JP2015523102A (en) | 2012-04-16 | 2013-04-16 | Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures |
EP13778281.9A EP2838463B1 (en) | 2012-04-16 | 2013-04-16 | Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures |
ES13778281.9T ES2653924T3 (en) | 2012-04-16 | 2013-04-16 | Dual mode stereo imaging system for monitoring and control in surgical and interventional procedures |
CN201380025626.1A CN104582622B (en) | 2012-04-16 | 2013-04-16 | For the tracking in surgery and intervention medical procedure and the bimodulus stereo imaging system of control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261624665P | 2012-04-16 | 2012-04-16 | |
US61/624,665 | 2012-04-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013158636A1 true WO2013158636A1 (en) | 2013-10-24 |
Family
ID=49325701
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/036773 WO2013158636A1 (en) | 2012-04-16 | 2013-04-16 | Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures |
Country Status (7)
Country | Link |
---|---|
US (1) | US20130274596A1 (en) |
EP (1) | EP2838463B1 (en) |
JP (1) | JP2015523102A (en) |
KR (1) | KR102214789B1 (en) |
CN (1) | CN104582622B (en) |
ES (1) | ES2653924T3 (en) |
WO (1) | WO2013158636A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3165153A1 (en) | 2015-11-05 | 2017-05-10 | Deutsches Krebsforschungszentrum Stiftung des Öffentlichen Rechts | System for fluorescence aided surgery |
Families Citing this family (128)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103501416B (en) | 2008-05-20 | 2017-04-12 | 派力肯成像公司 | Imaging system |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
WO2011063347A2 (en) | 2009-11-20 | 2011-05-26 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
CN103004180A (en) | 2010-05-12 | 2013-03-27 | 派力肯影像公司 | Architectures for imager arrays and array cameras |
US8878950B2 (en) | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
CN103765864B (en) | 2011-05-11 | 2017-07-04 | 派力肯影像公司 | For transmitting the system and method with receiving array camera image data |
WO2013036233A1 (en) * | 2011-09-08 | 2013-03-14 | Intel Corporation | Augmented reality based on imaged object characteristics |
WO2013043761A1 (en) | 2011-09-19 | 2013-03-28 | Pelican Imaging Corporation | Determining depth from multiple views of a scene that include aliasing using hypothesized fusion |
US9129183B2 (en) | 2011-09-28 | 2015-09-08 | Pelican Imaging Corporation | Systems and methods for encoding light field image files |
WO2013126578A1 (en) | 2012-02-21 | 2013-08-29 | Pelican Imaging Corporation | Systems and methods for the manipulation of captured light field image data |
US10383765B2 (en) * | 2012-04-24 | 2019-08-20 | Auris Health, Inc. | Apparatus and method for a global coordinate system for use in robotic surgery |
EP2873028A4 (en) | 2012-06-28 | 2016-05-25 | Pelican Imaging Corp | Systems and methods for detecting defective camera arrays, optic arrays, and sensors |
US20140002674A1 (en) | 2012-06-30 | 2014-01-02 | Pelican Imaging Corporation | Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors |
AU2013305770A1 (en) | 2012-08-21 | 2015-02-26 | Pelican Imaging Corporation | Systems and methods for parallax detection and correction in images captured using array cameras |
EP2888698A4 (en) | 2012-08-23 | 2016-06-29 | Pelican Imaging Corp | Feature based high resolution motion estimation from low resolution images captured using an array source |
US9477307B2 (en) * | 2013-01-24 | 2016-10-25 | The University Of Washington | Methods and systems for six degree-of-freedom haptic interaction with streaming point data |
US9462164B2 (en) | 2013-02-21 | 2016-10-04 | Pelican Imaging Corporation | Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information |
US9774789B2 (en) | 2013-03-08 | 2017-09-26 | Fotonation Cayman Limited | Systems and methods for high dynamic range imaging using array cameras |
US8866912B2 (en) | 2013-03-10 | 2014-10-21 | Pelican Imaging Corporation | System and methods for calibration of an array camera using a single captured image |
WO2014164909A1 (en) | 2013-03-13 | 2014-10-09 | Pelican Imaging Corporation | Array camera architecture implementing quantum film sensors |
US9124831B2 (en) | 2013-03-13 | 2015-09-01 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
WO2014153098A1 (en) | 2013-03-14 | 2014-09-25 | Pelican Imaging Corporation | Photmetric normalization in array cameras |
US9578259B2 (en) | 2013-03-14 | 2017-02-21 | Fotonation Cayman Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US9497429B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Extended color processing on pelican array cameras |
EP2973476A4 (en) | 2013-03-15 | 2017-01-18 | Pelican Imaging Corporation | Systems and methods for stereo imaging with camera arrays |
US9633442B2 (en) * | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US9445003B1 (en) | 2013-03-15 | 2016-09-13 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
WO2015048694A2 (en) | 2013-09-27 | 2015-04-02 | Pelican Imaging Corporation | Systems and methods for depth-assisted perspective distortion correction |
US10119808B2 (en) | 2013-11-18 | 2018-11-06 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US9456134B2 (en) | 2013-11-26 | 2016-09-27 | Pelican Imaging Corporation | Array camera configurations incorporating constituent array cameras and constituent cameras |
JP6487455B2 (en) * | 2014-01-29 | 2019-03-20 | ベクトン・ディキンソン・アンド・カンパニーBecton, Dickinson And Company | Wearable electronic device for improved visualization during insertion of an invasive device |
EP3114677B1 (en) | 2014-03-03 | 2020-08-05 | University of Washington | Haptic virtual fixture tools |
US10089740B2 (en) | 2014-03-07 | 2018-10-02 | Fotonation Limited | System and methods for depth regularization and semiautomatic interactive matting using RGB-D images |
KR20150128049A (en) * | 2014-05-08 | 2015-11-18 | 삼성전자주식회사 | Surgical robot and control method thereof |
US20160106516A1 (en) * | 2014-05-30 | 2016-04-21 | Sameh Mesallum | Systems for automated biomechanical computerized surgery |
US20150356737A1 (en) * | 2014-06-09 | 2015-12-10 | Technical Illusions, Inc. | System and method for multiple sensor fiducial tracking |
WO2016023846A2 (en) | 2014-08-15 | 2016-02-18 | Sanofi-Aventis Deutschland Gmbh | An injection device and a supplemental device configured for attachment thereto |
US10250871B2 (en) | 2014-09-29 | 2019-04-02 | Fotonation Limited | Systems and methods for dynamic calibration of array cameras |
US9486128B1 (en) | 2014-10-03 | 2016-11-08 | Verily Life Sciences Llc | Sensing and avoiding surgical equipment |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
WO2016126914A1 (en) * | 2015-02-05 | 2016-08-11 | Intuitive Surgical Operations, Inc. | System and method for anatomical markers |
KR101667152B1 (en) * | 2015-05-22 | 2016-10-24 | Korea University Research and Business Foundation | Smart glasses system for supplying surgery assist image and method for supplying surgery assist image using smart glasses |
WO2016190607A1 (en) * | 2015-05-22 | 2016-12-01 | Korea University Research and Business Foundation | Smart glasses system for providing surgery assisting image and method for providing surgery assisting image by using smart glasses |
KR102371053B1 (en) * | 2015-06-04 | 2022-03-10 | Curexo, Inc. | Surgical robot system |
KR102378632B1 (en) * | 2015-07-28 | 2022-03-25 | Korea Electronics Technology Institute | Apparatus for detecting chest compression position and electrode pad attachment location |
EP3310286A1 (en) * | 2015-08-13 | 2018-04-25 | Siemens Healthcare GmbH | Device and method for controlling a system comprising an imaging modality |
US11202680B2 (en) | 2015-08-14 | 2021-12-21 | Intuitive Surgical Operations, Inc. | Systems and methods of registration for image-guided surgery |
CN108024698B (en) | 2015-08-14 | 2020-09-15 | Intuitive Surgical Operations, Inc. | Registration system and method for image guided surgery |
US10046459B2 (en) * | 2015-11-16 | 2018-08-14 | Abb Schweiz Ag | Three-dimensional visual servoing for robot positioning |
US11172895B2 (en) | 2015-12-07 | 2021-11-16 | Covidien Lp | Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated |
JP6493885B2 (en) | 2016-03-15 | 2019-04-03 | Fujifilm Corporation | Image alignment apparatus, method of operating image alignment apparatus, and image alignment program |
EP3432776B1 (en) * | 2016-03-23 | 2023-03-29 | The Procter & Gamble Company | Imaging method for determining stray fibers |
CN105856259B (en) * | 2016-06-19 | 2017-12-01 | Fuzhou Huanya Zhongzhi Computer Co., Ltd. | Intelligent infusion robot based on internet of things |
US10695134B2 (en) | 2016-08-25 | 2020-06-30 | Verily Life Sciences Llc | Motion execution of a robotic system |
CN110192390A (en) | 2016-11-24 | 2019-08-30 | University of Washington | Light field capture and rendering for head-mounted displays |
WO2018112424A1 (en) * | 2016-12-16 | 2018-06-21 | Intuitive Surgical Operations, Inc. | Systems and methods for teleoperated control of an imaging instrument |
US10537394B2 (en) * | 2016-12-19 | 2020-01-21 | Ethicon Llc | Hot device indication of video display |
CN108937849A (en) * | 2017-05-29 | 2018-12-07 | Wang Hu | Indication system for tumor nano-targeted fluorescent probe imaging and surgical navigation |
US10552978B2 (en) | 2017-06-27 | 2020-02-04 | International Business Machines Corporation | Dynamic image and image marker tracking |
EP3424458B1 (en) | 2017-07-07 | 2020-11-11 | Leica Instruments (Singapore) Pte. Ltd. | Apparatus and method for tracking a movable target |
US10482618B2 (en) | 2017-08-21 | 2019-11-19 | Fotonation Limited | Systems and methods for hybrid depth regularization |
WO2019091875A1 (en) * | 2017-11-07 | 2019-05-16 | Koninklijke Philips N.V. | Augmented reality triggering of devices |
KR101852403B1 (en) | 2017-11-17 | 2018-04-27 | Pukyong National University Industry-University Cooperation Foundation | Parathyroid real-time sensing system |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
US20200015900A1 (en) | 2018-07-16 | 2020-01-16 | Ethicon Llc | Controlling an emitter assembly pulse sequence |
EP3824621A4 (en) | 2018-07-19 | 2022-04-27 | Activ Surgical, Inc. | Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots |
US11344374B2 (en) | 2018-08-13 | 2022-05-31 | Verily Life Sciences Llc | Detection of unintentional movement of a user interface device |
CN108833883A (en) * | 2018-08-24 | 2018-11-16 | Shanghai Zhunshi Biotechnology Co., Ltd. | System and method for generating and displaying 2D/3D images and video in real time |
JP7278387B2 (en) | 2018-09-14 | 2023-05-19 | Neuralink Corp. | Device implantation using a cartridge |
CA3112749A1 (en) * | 2018-09-14 | 2020-03-19 | Neuralink Corp. | Computer vision techniques |
US10623660B1 (en) | 2018-09-27 | 2020-04-14 | Eloupes, Inc. | Camera array for a mediated-reality system |
WO2020081651A1 (en) | 2018-10-16 | 2020-04-23 | Activ Surgical, Inc. | Autonomous methods and systems for tying surgical knots |
US11278360B2 (en) * | 2018-11-16 | 2022-03-22 | Globus Medical, Inc. | End-effectors for surgical robotic systems having sealed optical components |
CN109754007A (en) * | 2018-12-27 | 2019-05-14 | Wuhan Tangji Technology Co., Ltd. | Intelligent prostatic capsule detection and early-warning method and system for prostate surgery |
CN109662695A (en) * | 2019-01-16 | 2019-04-23 | Beijing Digital Precision Medical Technology Co., Ltd. | Fluorescent molecular imaging system, device, method and storage medium |
WO2020154351A1 (en) * | 2019-01-25 | 2020-07-30 | Intuitive Surgical Operations, Inc. | Augmented medical vision systems and methods |
WO2020210168A1 (en) | 2019-04-08 | 2020-10-15 | Activ Surgical, Inc. | Systems and methods for medical imaging |
KR20220002372A (en) * | 2019-04-12 | 2022-01-06 | Rocket Innovations, Inc. | Recording surface boundary markers for computer vision |
US11516388B2 (en) | 2019-06-20 | 2022-11-29 | Cilag Gmbh International | Pulsed illumination in a fluorescence imaging system |
US11895397B2 (en) | 2019-06-20 | 2024-02-06 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system |
US11892403B2 (en) | 2019-06-20 | 2024-02-06 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system |
US11516387B2 (en) | 2019-06-20 | 2022-11-29 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US20200397246A1 (en) | 2019-06-20 | 2020-12-24 | Ethicon Llc | Minimizing image sensor input/output in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US20200400566A1 (en) * | 2019-06-20 | 2020-12-24 | Ethicon Llc | Image synchronization without input clock and data transmission clock in a pulsed laser mapping imaging system |
US11986160B2 (en) | 2019-06-20 | 2024-05-21 | Cilag GmbH International | Image synchronization without input clock and data transmission clock in a pulsed hyperspectral imaging system |
JP7286815B2 (en) * | 2019-06-20 | 2023-06-05 | Gentex Corporation | Illumination system and method for object tracking |
CN114599263A (en) | 2019-08-21 | 2022-06-07 | Activ Surgical, Inc. | System and method for medical imaging |
CN114600165A (en) | 2019-09-17 | 2022-06-07 | Boston Polarimetrics, Inc. | System and method for surface modeling using polarization cues |
DE112020004813B4 (en) | 2019-10-07 | 2023-02-09 | Boston Polarimetrics, Inc. | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
JP7329143B2 (en) | 2019-11-30 | 2023-08-17 | Boston Polarimetrics, Inc. | Systems and methods for segmentation of transparent objects using polarization cues |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11896442B2 (en) | 2019-12-30 | 2024-02-13 | Cilag Gmbh International | Surgical systems for proposing and corroborating organ portion removals |
US11832996B2 (en) | 2019-12-30 | 2023-12-05 | Cilag Gmbh International | Analyzing surgical trends by a surgical system |
US12002571B2 (en) | 2019-12-30 | 2024-06-04 | Cilag Gmbh International | Dynamic surgical visualization systems |
US11776144B2 (en) | 2019-12-30 | 2023-10-03 | Cilag Gmbh International | System and method for determining, adjusting, and managing resection margin about a subject tissue |
US11284963B2 (en) | 2019-12-30 | 2022-03-29 | Cilag Gmbh International | Method of using imaging devices in surgery |
US11219501B2 (en) | 2019-12-30 | 2022-01-11 | Cilag Gmbh International | Visualization systems using structured light |
US11759283B2 (en) | 2019-12-30 | 2023-09-19 | Cilag Gmbh International | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
US12053223B2 (en) | 2019-12-30 | 2024-08-06 | Cilag Gmbh International | Adaptive surgical system control according to surgical smoke particulate characteristics |
US11648060B2 (en) | 2019-12-30 | 2023-05-16 | Cilag Gmbh International | Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ |
US11744667B2 (en) | 2019-12-30 | 2023-09-05 | Cilag Gmbh International | Adaptive visualization by a surgical system |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11195303B2 (en) | 2020-01-29 | 2021-12-07 | Boston Polarimetrics, Inc. | Systems and methods for characterizing object pose detection and measurement systems |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
WO2021159061A1 (en) | 2020-02-07 | 2021-08-12 | Smith & Nephew, Inc. | Augmented reality ready optical tracking system |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
IT202000007252A1 (en) * | 2020-04-06 | 2021-10-06 | Artiness S R L | Method of tracking a medical device in real time from echocardiographic images for remote holographic supervision |
NL2025325B1 (en) * | 2020-04-09 | 2021-10-26 | Academisch Ziekenhuis Leiden | Tracking position and orientation of a surgical device through fluorescence imaging |
EP4138703A1 (en) | 2020-04-24 | 2023-03-01 | Smith&Nephew, Inc. | Optical tracking device with built-in structured light module |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US10949986B1 (en) | 2020-05-12 | 2021-03-16 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US20220047334A1 (en) * | 2020-08-17 | 2022-02-17 | Georgia Tech Research Corporation | Systems and methods for magnetic resonance imaging guided robotics |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
CA3200540A1 (en) | 2020-12-04 | 2022-06-09 | Bogdan MITREA | Systems and methods for providing surgical guidance |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US20220330799A1 (en) * | 2021-04-14 | 2022-10-20 | Arthrex, Inc | System and method for using detectable radiation in surgery |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5986271A (en) * | 1997-07-03 | 1999-11-16 | Lazarev; Victor | Fluorescence imaging system |
US7831292B2 (en) * | 2002-03-06 | 2010-11-09 | Mako Surgical Corp. | Guidance system and method for surgical procedures with improved feedback |
AU2003218010A1 (en) * | 2002-03-06 | 2003-09-22 | Z-Kat, Inc. | System and method for using a haptic device in combination with a computer-assisted surgery system |
JP4142326B2 (en) * | 2002-04-05 | 2008-09-03 | Hoya Corporation | Diagnostic system using autofluorescence |
DE10357184A1 (en) * | 2003-12-08 | 2005-07-07 | Siemens Ag | Combining different images of a body region under investigation to produce display images from an assembled three-dimensional fluorescence data image set |
US8073528B2 (en) * | 2007-09-30 | 2011-12-06 | Intuitive Surgical Operations, Inc. | Tool tracking systems, methods and computer products for image guided surgery |
JP4761899B2 (en) * | 2005-09-12 | 2011-08-31 | Hoya Corporation | Electronic endoscope system |
US9724165B2 (en) * | 2006-05-19 | 2017-08-08 | Mako Surgical Corp. | System and method for verifying calibration of a surgical device |
US8184880B2 (en) * | 2008-12-31 | 2012-05-22 | Intuitive Surgical Operations, Inc. | Robust sparse image matching for robotic surgery |
US8706184B2 (en) * | 2009-10-07 | 2014-04-22 | Intuitive Surgical Operations, Inc. | Methods and apparatus for displaying enhanced imaging data on a clinical image |
2013
- 2013-04-16 JP JP2015507109A patent/JP2015523102A/en active Pending
- 2013-04-16 ES ES13778281.9T patent/ES2653924T3/en active Active
- 2013-04-16 CN CN201380025626.1A patent/CN104582622B/en active Active
- 2013-04-16 KR KR1020147028899A patent/KR102214789B1/en active IP Right Grant
- 2013-04-16 EP EP13778281.9A patent/EP2838463B1/en active Active
- 2013-04-16 US US13/863,954 patent/US20130274596A1/en not_active Abandoned
- 2013-04-16 WO PCT/US2013/036773 patent/WO2013158636A1/en active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5928137A (en) * | 1996-05-03 | 1999-07-27 | Green; Philip S. | System and method for endoscopic imaging and endosurgery |
US20030192557A1 (en) * | 1998-05-14 | 2003-10-16 | David Krag | Systems and methods for locating and defining a target location within a human body |
US7008373B2 (en) * | 2001-11-08 | 2006-03-07 | The Johns Hopkins University | System and method for robot targeting under fluoroscopy based on image servoing |
US20050182321A1 (en) * | 2002-03-12 | 2005-08-18 | Beth Israel Deaconess Medical Center | Medical imaging systems |
US20080312561A1 (en) * | 2004-05-06 | 2008-12-18 | Nanyang Technological University | Mechanical Manipulator for Hifu Transducers |
US20070273610A1 (en) * | 2006-05-26 | 2007-11-29 | Itt Manufacturing Enterprises, Inc. | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
US8090194B2 (en) * | 2006-11-21 | 2012-01-03 | Mantis Vision Ltd. | 3D geometric modeling and motion capture using both single and dual imaging |
US20090248038A1 (en) * | 2008-03-31 | 2009-10-01 | Intuitive Surgical Inc., A Delaware Corporation | Force and torque sensing in a surgical robot setup arm |
US20090268010A1 (en) | 2008-04-26 | 2009-10-29 | Intuitive Surgical, Inc. | Augmented stereoscopic visualization for a surgical robot using a captured fluorescence image and captured stereoscopic visible images |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3165153A1 (en) | 2015-11-05 | 2017-05-10 | Deutsches Krebsforschungszentrum Stiftung des Öffentlichen Rechts | System for fluorescence aided surgery |
WO2017076571A1 (en) | 2015-11-05 | 2017-05-11 | Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts | System for fluorescence aided surgery |
Also Published As
Publication number | Publication date |
---|---|
EP2838463B1 (en) | 2017-11-08 |
CN104582622B (en) | 2017-10-13 |
ES2653924T3 (en) | 2018-02-09 |
JP2015523102A (en) | 2015-08-13 |
EP2838463A1 (en) | 2015-02-25 |
CN104582622A (en) | 2015-04-29 |
KR102214789B1 (en) | 2021-02-09 |
US20130274596A1 (en) | 2013-10-17 |
EP2838463A4 (en) | 2016-01-13 |
KR20150001756A (en) | 2015-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2838463B1 (en) | Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures |
US20190282307A1 (en) | Dual-mode imaging system for tracking and control during medical procedures | |
US10182704B2 (en) | Robotic control of an endoscope from blood vessel tree images | |
US20220095903A1 (en) | Augmented medical vision systems and methods | |
Leonard et al. | Smart tissue anastomosis robot (STAR): A vision-guided robotics system for laparoscopic suturing | |
KR20150047478A (en) | Automated surgical and interventional procedures | |
JP2017507708A (en) | Spatial visualization of the internal thoracic artery during minimally invasive bypass surgery | |
US20220218427A1 (en) | Medical tool control system, controller, and non-transitory computer readable storage | |
US11937799B2 (en) | Surgical sealing systems for instrument stabilization | |
Zhou et al. | Visual tracking of laparoscopic instruments | |
US20230233272A1 (en) | System and method for determining tool positioning, and fiducial marker therefore | |
US20230210603A1 (en) | Systems and methods for enhancing imaging during surgical procedures | |
US20230096406A1 (en) | Surgical devices, systems, and methods using multi-source imaging | |
JP2024536154A | Surgical system with devices for both intraluminal and extraluminal access |
JP2024536155A | Surgical system for independently ventilating two separate anatomical spaces |
WO2023052931A1 (en) | Surgical sealing systems for instrument stabilization | |
EP4216846A1 (en) | Surgical systems with port devices for instrument control | |
JP2024536169A | Surgical sealing device for natural body orifices |
WO2023052951A1 (en) | Surgical systems with intraluminal and extraluminal cooperative instruments | |
CN118159217A (en) | Surgical devices, systems, and methods using multi-source imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 13778281 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015507109 Country of ref document: JP Kind code of ref document: A |
Ref document number: 20147028899 Country of ref document: KR Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
REEP | Request for entry into the European phase |
Ref document number: 2013778281 Country of ref document: EP |
WWE | WIPO information: entry into national phase |
Ref document number: 2013778281 Country of ref document: EP |