US20240032772A1 - Medical device tracking systems and methods of using the same - Google Patents
Medical device tracking systems and methods of using the same Download PDFInfo
- Publication number
- US20240032772A1 US20240032772A1 US18/485,026 US202318485026A US2024032772A1 US 20240032772 A1 US20240032772 A1 US 20240032772A1 US 202318485026 A US202318485026 A US 202318485026A US 2024032772 A1 US2024032772 A1 US 2024032772A1
- Authority
- US
- United States
- Prior art keywords
- target site
- medical
- light
- location
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00097—Sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00098—Deflecting means for inserted tools
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/005—Flexible endoscopes
- A61B1/0051—Flexible endoscopes with controlled bending of insertion part
- A61B1/0057—Constructional details of force transmission elements, e.g. control wires
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/012—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
- A61B1/018—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor for receiving instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/063—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for monochromatic or narrow-band illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/012—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
- A61B1/0125—Endoscope within endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/11—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
- A61B90/13—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints guided by light, e.g. laser pointers
Definitions
- aspects of the disclosure relate generally to medical device tracking systems, devices, and related methods. More specifically, at least certain embodiments of the disclosure relate to systems, devices, and related methods for locating one or more target sites within a patient during an endoscopic procedure to facilitate the positioning of medical devices, among other aspects.
- aspects of the disclosure relate to, among other things, systems, devices, and methods for positioning a medical device at a target treatment site with a medical system including target identification logic, among other aspects.
- Each of the aspects disclosed herein may include one or more of the features described in connection with any of the other disclosed aspects.
- a medical system includes a medical device having an imaging device configured to capture images of a target site. A location of the target site is determined based on the images.
- the medical device further includes a light source configured to direct light onto the location of the target site, and a processor and non-transitory computer readable medium storing instructions that, when executed by the processor, cause the processor to move a sensor of a medical instrument toward the location of the target site based on the sensor detecting the light at the target site.
- the sensor is movable relative to the imaging device toward the location of the target site based on the sensor detecting the light at the target site.
- the instructions stored in the non-transitory computer readable medium cause the processor to detect a change in location of the imaging device relative to the target site, determine the location of the target site relative to the imaging device, and redirect the light to the location of the target site.
- the processor is configured to detect the change in location of the imaging device relative to the target site based on images periodically captured by the imaging device.
- the processor is configured to compare the location of the target site to an original location of the target site to determine a positional variance.
- the processor is configured to determine whether the positional variance exceeds a preprogrammed threshold.
- the light source includes a source to generate a laser beam.
- the imaging device includes a camera.
- the medical system may include a medical instrument, wherein the sensor includes at least one of a photodetector, a photodiode, and a charge-coupled device (CCD).
- the sensor is configured to generate a photodiode signal in response to detecting the light at the target site.
- the photodiode signal generated by the sensor has a greater intensity when the sensor is positioned at a first distance from the light and a smaller intensity when the sensor is positioned at a second distance from the light. The first distance is less than the second distance.
- the medical device includes a mirror configured to reflect the light generated by the light source toward the location of the target site.
- the mirror is configured to move to redirect the light toward the location of the target site in response to the processor detecting the change in location of the imaging device relative to the target site.
- the mirror includes a micro-mirror (MEMS mirror) configured to reflect the light along two axes.
- the mirror is positioned adjacent to the light source on the medical device.
- the processor is configured to generate a visual identifier along the images captured by the imaging device indicative of the location of the target site.
- a medical system includes a medical device including an imaging device configured to capture images of a target site, and a light source configured to direct light onto the target site.
- the medical system includes a medical instrument movable relative to the medical device.
- the medical instrument includes a sensor configured to detect the light on the target site.
- the medical instrument is movable toward the target site in response to the sensor detecting the light on the target site.
- the medical system may include a processor configured to detect movement of the medical device relative to the target site based on images captured by the imaging device.
- the light source is configured to redirect the light based on the detected movement of the medical device.
- the medical device is an endoscope or duodenoscope, and the medical instrument is a catheter.
- a method of moving a medical instrument toward a target site includes capturing images of the target site with an imaging device. A first location of the target site is determined based on the images. The method includes transmitting light to the first location by a light source, detecting the light incident at the first location by a sensor of the medical instrument, and moving the medical instrument toward the target site based on the sensor detecting the light incident at the first location.
- the method includes capturing images of the target site with the imaging device to determine a second location of the target site, redirecting the light from the light source to the second location, and moving the medical instrument toward the target site based on the sensor detecting the light at the second location.
- FIG. 1 is a schematic view of an exemplary medical system, according to aspects of this disclosure;
- FIG. 2 is a partial perspective view of a medical device of the medical system of FIG. 1, according to aspects of this disclosure;
- FIG. 3 is a partial perspective view of a medical instrument of the medical system of FIG. 1, according to aspects of this disclosure;
- FIG. 4 is a schematic view of the medical system of FIG. 1 positioned at a target site of a patient, according to aspects of this disclosure;
- FIG. 5A is an image including locating a target site of a patient, according to aspects of this disclosure;
- FIG. 5B is an image including marking the target site of FIG. 5A with a light beam, according to aspects of this disclosure;
- FIG. 5C is an image including marking a target site with a light beam upon movement of the medical system, according to aspects of this disclosure;
- FIG. 6 is a block diagram of an exemplary method of locating a target site with the medical system of FIG. 1, according to aspects of this disclosure.
- Embodiments of the disclosure include systems, devices, and methods for locating, tracking, and/or steering one or more tools or other medical devices at a target site within the body.
- distal refers to a portion farthest away from a user when introducing a device into a patient.
- proximal refers to a portion closest to the user when placing the device into the patient.
- the terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not necessarily include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- the term “exemplary” is used in the sense of “example,” rather than “ideal.”
- the terms “about,” “substantially,” and “approximately” indicate a range of values within +/−10% of a stated value.
- Embodiments of the disclosure may be used to locate a target site with a medical system, such as, for example, a medical system having target identification logic.
- some embodiments combine an imaging device and a light source with a medical device to locate a target site.
- the imaging device may capture images of the target site and the light source may direct light onto the target site in response to identifying a location of the target site based on the images.
- the target identification logic of the medical system may detect movements of the medical device and determine an adjusted location of the target site relative to the medical device in response, thereby redirecting the light from the light source toward the location of the target site.
- Embodiments of the disclosure may relate to devices and methods for performing various medical procedures and/or treating portions of the large intestine (colon), small intestine, cecum, esophagus, any other portion of the gastrointestinal tract, and/or any other suitable patient anatomy (collectively referred to herein as a “target treatment site”).
- Various embodiments described herein include single-use or disposable medical devices.
- FIG. 1 shows a schematic depiction of an exemplary medical system 100 in accordance with an embodiment of this disclosure.
- the medical system 100 may include an image processing device 102 , a medical device 110 , and a medical instrument 140 .
- the image processing device 102 may be communicatively coupled to the medical device 110 via a cable 118 . It should be understood that in other embodiments the image processing device 102 may be in wireless communication with the medical device 110 .
- the image processing device 102 is a computer system incorporating a plurality of hardware components that allow the image processing device 102 to receive and monitor data, accurately display images of one or more features (e.g., a target site), and/or process other information described herein.
- Illustrative hardware components of the image processing device 102 may include at least one processor 104 and at least one memory 106 .
- the processor 104 of the image processing device 102 may include any computing device capable of executing machine-readable instructions, which may be stored on a non-transitory computer-readable medium, such as, for example, the memory 106 of the image processing device 102 .
- the processor 104 may include a controller, an integrated circuit, a microchip, a computer, and/or any other computer processing unit operable to perform calculations and logic operations required to execute a program.
- the processor 104 is configured to perform one or more operations in accordance with the instructions stored on the memory 106 , such as, for example, a target identification logic 108 .
- the memory 106 of the image processing device 102 is a non-transitory computer readable medium that stores machine-readable instructions thereon, such as, for example, the target identification logic 108 .
- the target identification logic 108 may include executable instructions that allow the medical device 110 to track a location of a target site for the medical instrument 140 to lock onto and steer toward for the performance of one or more procedures on or near the target site. It should be understood that various programming algorithms and data that support an operation of the medical system 100 may reside in whole or in part in the memory 106 .
- the memory 106 may include any type of computer readable medium suitable for storing data and algorithms, such as, for example, random access memory (RAM), read only memory (ROM), a flash memory, a hard drive, and/or any device capable of storing machine-readable instructions.
- the memory 106 may include one or more data sets, including, but not limited to, image data 109 from one or more components of the medical system 100 (e.g., the medical device 110 , the medical instrument 140 , etc.).
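- As a rough illustration only, the following Python sketch shows one way the image processing device 102 (processor 104, memory 106, target identification logic 108, and stored image data 109) could be organized in software. All class, function, and attribute names here are hypothetical and are not taken from the patent; the detector callable simply stands in for whatever target identification logic is actually used.

```python
# Hypothetical organization of the image processing device 102: stored frames
# (image data 109) plus a method that runs target identification logic against
# the most recent frame. Names are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

import numpy as np


@dataclass
class TargetLocation:
    """Pixel coordinates of the target site in a camera frame."""
    x: int
    y: int


@dataclass
class ImageProcessingDevice:
    image_data: List[np.ndarray] = field(default_factory=list)  # stored frames
    last_location: Optional[TargetLocation] = None

    def store_frame(self, frame: np.ndarray) -> None:
        """Persist a frame captured by the imaging device 130."""
        self.image_data.append(frame)

    def run_target_identification(
        self, detector: Callable[[np.ndarray], Optional[Tuple[int, int]]]
    ) -> Optional[TargetLocation]:
        """Run the detector (a stand-in for target identification logic 108)
        on the newest frame and cache the result."""
        if not self.image_data:
            return None
        result = detector(self.image_data[-1])
        if result is not None:
            self.last_location = TargetLocation(*result)
        return self.last_location
```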
- the medical device 110 may be configured to facilitate positioning one or more components of the medical system 100 relative to a patient, such as, for example, the medical instrument 140 .
- the medical device 110 may be any type of endoscope and may include a handle 112 , an actuation mechanism 114 , at least one port 116 , and a shaft 120 .
- the handle 112 of the medical device 110 may have one or more lumens (not shown) that communicate with a lumen(s) of one or more other components of the medical system 100 .
- the handle 112 further includes the at least one port 116 that opens into the one or more lumens of the handle 112 .
- the at least one port 116 is sized and shaped to receive one or more instruments therethrough, such as, for example, the medical instrument 140 of the medical system 100 .
- the shaft 120 of the medical device 110 may include a tube that is sufficiently flexible such that the shaft 120 is configured to selectively bend, rotate, and/or twist when being inserted into and/or through a patient's tortuous anatomy to a target treatment site.
- the shaft 120 may have one or more lumens (not shown) extending therethrough that include, for example, a working lumen for receiving instruments (e.g., the medical instrument 140 ).
- the shaft 120 may include additional lumens such as a control wire lumen for receiving one or more control wires for actuating one or more distal parts/tools (including an articulation joint and an elevator, for example), a fluid lumen for delivering a fluid, an illumination lumen for receiving at least a portion of an illumination assembly (not shown), and/or an imaging lumen for receiving at least a portion of an imaging assembly (not shown).
- the medical device 110 may further include a tip 122 at a distal end of the shaft 120 .
- the tip 122 may be attached to the distal end of the shaft 120 , while in other embodiments the tip 122 may be integral with the shaft 120 .
- the tip 122 may include a cap configured to receive the distal end of the shaft 120 therein.
- the tip 122 may include one or more openings that are in communication with the one or more lumens of the shaft 120 .
- the tip 122 may include a working opening 124 through which the medical instrument 140 may exit from a working lumen of the shaft 120 .
- the tip 122 of the shaft 120 may include additional and/or fewer openings thereon, such as, for example, a fluid opening or nozzle through which fluid may be emitted from a fluid lumen of the shaft 120 , an illumination opening/window through which light may be emitted, and/or an imaging opening/window for receiving light used by an imaging device to generate an image.
- the actuation mechanism 114 of the medical device 110 is positioned on the handle 112 and may include one or more knobs, buttons, levers, switches, and/or other suitable actuators.
- the actuation mechanism 114 is configured to control at least one of deflection of the shaft 120 (through actuation of a control wire, for example), delivery of a fluid, emission of illumination, and/or various imaging functions.
- the medical device 110 includes one or more control wires for actuating an elevator 126 of the medical device 110 at the tip 122 (see FIGS. 2 - 3 ).
- a user of the medical device 110 may manipulate the actuation mechanism 114 to selectively exert at least one of a pulling force and a pushing force on the one or more control wires to control a position of the elevator 126 , and thereby control a position of an instrument adjacent to the elevator 126 (e.g., the medical instrument 140 ).
- the medical instrument 140 of the medical system 100 may include a catheter having a longitudinal body 142 between a proximal end of the longitudinal body 142 and a distal end 144 .
- a handle 141 is at the proximal end of the longitudinal body 142 .
- the longitudinal body 142 of the medical instrument is flexible such that the medical instrument 140 is configured to bend, rotate, and/or twist when being inserted into a working lumen of the medical device 110 .
- the handle 141 of the medical instrument 140 may be configured to move, rotate, and bend the longitudinal body 142 .
- the handle 141 may define one or more ports (not shown) sized to receive one or more tools through the longitudinal body 142 of the medical instrument 140 .
- the medical device 110 is configured to receive the medical instrument 140 via the at least one port 116 and through the shaft 120 to the working opening 124 at the tip 122 via a working lumen.
- the medical instrument 140 may extend distally out of the working opening 124 and into a surrounding environment of the tip 122 , such as, for example, at a target treatment site of a patient as described in further detail below.
- the distal end 144 of the medical instrument 140 may extend distally from the working opening 124 in response to a translation of the longitudinal body 142 through the working lumen of the shaft 120 .
- the medical instrument 140 may include various other devices than those shown and described herein, including, but not limited to, a guidewire, cutting or grasping forceps, a biopsy device, a snare loop, an injection needle, a cutting blade, scissors, a retractable basket, a retrieval device, an ablation and/or electrophysiology catheter, a stent placement device, a surgical stapling device, a balloon catheter, a laser-emitting device, an imaging device, and/or any other suitable instrument.
- the tip 122 of the shaft 120 is depicted with the medical instrument 140 omitted from the working opening 124 .
- the tip 122 includes the elevator 126 positioned adjacent to the working opening 124 and partially disposed within a working lumen of the shaft 120 .
- the elevator 126 is shown and described herein in an unactuated position and that actuation of the actuation mechanism 114 on the handle 112 may provide for an extension of the elevator 126 to an actuated position (see FIG. 3 ).
- the elevator 126 is configured to position an instrument received through a working lumen of the shaft 120 (e.g., the medical instrument 140 ) outward from the working opening 124 when in the actuated position.
- the tip 122 of the medical device 110 further includes a light source 128 , an imaging device 130 , and a laser 132 positioned adjacent to the working opening 124 of the shaft 120 .
- the light source 128 of the medical device 110 is configured and operable to direct light outwardly from the tip 122 of the shaft 120 to thereby illuminate a surrounding environment of the tip 122 , such as, for example, a target treatment site of a patient in which the medical device 110 may be located in (see FIGS. 5 A- 5 C ).
- the light source 128 may include a light emitter, such as, for example, a light-emitting diode (LED), or the like.
- the imaging device 130 of the medical device 110 is configured and operable to capture images of a surrounding environment of the tip 122 , such as, for example, the target treatment site of a patient (see FIGS. 5 A- 5 C ).
- the imaging device 130 may include a camera capable of high resolution imaging. It should be understood that in other embodiments the medical device 110 may omit the imaging device 130 on the tip 122 entirely such that a separate imaging device may be received by the medical device 110 through the shaft 120 .
- the laser 132 of the medical device 110 is configured and operable to generate a light/laser beam outwardly from the tip 122 of the shaft 120 .
- the laser 132 is further configured to selectively direct the light/laser beam to a predetermined location to thereby mark the predetermined location with the light/laser beam.
- the light/laser beam generated by the laser 132 may be independently steerable relative to the light emitted by the light source 128 and/or any other component of the medical system 100 .
- a target site within a patient may be marked with a light/laser beam from the laser 132 for tracking a location of said target site during use of the medical system 100 in a procedure (see FIGS. 5 A- 5 C ).
- the medical device 110 may further include a mirror positioned along and/or adjacent to the tip 122 of the shaft 120 .
- the mirror of the medical device 110 may be disposed adjacent to the laser 132 thereby forming a unitary structure such that the mirror is coincident with a beam of light emitted by the laser 132 .
- the mirror of the medical device 110 is configured and operable to selectively reflect the light/laser beam generated by the laser 132 toward a predetermined location of a target site.
- the mirror of the medical device 110 is configured to move, pivot, translate, and/or rotate relative to the laser 132 and/or the tip 122 of the shaft 120 to thereby redirect the light/laser beam to a predetermined location of the target site.
- the mirror includes a micro-mirror (MEMS mirror) configured to reflect the light/laser beam along two axes (e.g., x-y directions of a coordinate axis) and/or to optical scanning angles ranging up to approximately 32 degrees.
- a predetermined location of a target site may be determined based on images (e.g., the image data 109) captured by the imaging device 130 of the medical device 110.
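- As a hedged illustration of how a two-axis MEMS mirror might be aimed at a location identified in the camera image, the Python sketch below assumes a simple pinhole camera model, a laser roughly co-located with the imaging device 130 at the tip 122, and a symmetric mechanical range of about ±16 degrees per axis (roughly the 32-degree scan noted above). The field-of-view values, image resolution, and function names are placeholders, not figures from the patent.

```python
import math

# Assumed optics: pinhole camera with a known field of view, laser and camera
# roughly co-located at the tip so camera ray angles can be reused as mirror
# pointing angles. All numeric values are illustrative.
IMAGE_WIDTH, IMAGE_HEIGHT = 1280, 720
FOV_X_DEG, FOV_Y_DEG = 120.0, 90.0       # assumed camera field of view
MIRROR_LIMIT_DEG = 16.0                  # +/- per axis (~32 degree total scan)


def pixel_to_mirror_angles(px: float, py: float) -> tuple:
    """Convert a target pixel (px, py) into (pan, tilt) mirror angles in degrees."""
    # Normalize pixel coordinates to [-1, 1] about the image center.
    nx = (px - IMAGE_WIDTH / 2) / (IMAGE_WIDTH / 2)
    ny = (py - IMAGE_HEIGHT / 2) / (IMAGE_HEIGHT / 2)

    # Ray angle through the pixel under the pinhole model.
    pan = math.degrees(math.atan(nx * math.tan(math.radians(FOV_X_DEG / 2))))
    tilt = math.degrees(math.atan(ny * math.tan(math.radians(FOV_Y_DEG / 2))))

    # Clamp to the mirror's mechanical range before commanding it.
    pan = max(-MIRROR_LIMIT_DEG, min(MIRROR_LIMIT_DEG, pan))
    tilt = max(-MIRROR_LIMIT_DEG, min(MIRROR_LIMIT_DEG, tilt))
    return pan, tilt


# Example: aim the beam at a detected target location (hypothetical pixel).
pan_cmd, tilt_cmd = pixel_to_mirror_angles(812, 390)
```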
- the medical instrument 140 is depicted extending outwardly from the tip 122 of the shaft 120 with the elevator 126 engaged against the longitudinal body 142 of the medical instrument 140 .
- an anterior-facing surface of the elevator 126 engages the longitudinal body 142 of the medical instrument 140 to thereby deflect the distal end 144 laterally outward from the working opening 124 .
- the anterior-facing surface of the elevator 126 has a curvature that facilitates a deflection and/or bend of the longitudinal body 142 of the medical instrument 140 .
- the elevator 126 may include various other shapes, sizes, and/or configurations than those shown and described herein without departing from a scope of the disclosure.
- the medical instrument 140 further includes a sensor 146 positioned along the longitudinal body 142 adjacent to the distal end 144 .
- the sensor 146 may be located on a distally-facing surface and/or a distal-most surface of the medical instrument 140 .
- the sensor 146 of the medical instrument 140 is configured to detect one or more objects, properties, characteristics, and/or features present at and/or proximate to the distal end 144 of the medical instrument 140 .
- the sensor 146 may be configured to detect light, such as the light generated by the light source 128 of the medical device 110 .
- the sensor 146 may be configured to detect the light/laser beam generated by the laser 132 of the medical device 110, for example a point on a target site on which the light/laser beam 134 is incident.
- the sensor 146 may include at least one of a photodetector, a photodiode, a charged coupled device (CCD), and/or various other suitable detectors.
- the sensor 146 includes a four-quadrant photodiode configured to convert light into an electrical current. As described in greater detail herein, in embodiments the sensor 146 is configured and operable to identify a predetermined location of a target site in response to detecting a light/laser beam directed by the laser 132 onto that target site (see FIGS. 5A-5C). In some embodiments, the sensor 146 may be positioned along a proximal end of the longitudinal body 142 adjacent to the handle 141 of the medical instrument 140, with a fiber that is communicatively coupled to the sensor 146 positioned adjacent to the distal end 144. In this instance, the distal end 144 of the medical instrument 140 may have a relatively smaller profile.
- the medical instrument 140 may omit the sensor 146 on the distal end 144 entirely such that a separate sensing device may be received by the medical instrument 140 through the longitudinal body 142 , such as, for example, via one or more guidewires.
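- For illustration, a four-quadrant photodiode such as the one described above is commonly read out by comparing opposing quadrant pairs. The sketch below is a generic quad-cell calculation under an assumed quadrant layout; it is not the patent's stated signal processing.

```python
def quadrant_error(q_a: float, q_b: float, q_c: float, q_d: float):
    """Normalized (x, y) offset of a beam spot on a four-quadrant photodiode.

    Assumed quadrant layout (looking at the detector face):
        A | B
        -----
        C | D
    Each argument is one quadrant's photocurrent or digitized intensity.
    Returns (x_err, y_err, total): both errors lie in [-1, 1] and are near
    zero when the spot is centered, and `total` grows as the sensor nears the
    point where the beam is incident on tissue.
    """
    total = q_a + q_b + q_c + q_d
    if total <= 0.0:
        # No detectable beam: the caller should keep searching, not steer.
        return 0.0, 0.0, 0.0
    x_err = ((q_b + q_d) - (q_a + q_c)) / total   # right minus left
    y_err = ((q_a + q_b) - (q_c + q_d)) / total   # top minus bottom
    return x_err, y_err, total
```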
- FIGS. 4-5C, in conjunction with the flow diagram of FIG. 6, schematically depict an exemplary method 200 of using the medical system 100 to locate and access a target site.
- the depiction of FIGS. 4 - 6 and the accompanying description below is not meant to limit the subject matter described herein to a particular method.
- the medical device 110 of the medical system 100 may be inserted within a patient's body 10 .
- the shaft 120 of the medical device 110 is guided through a digestive tract of the patient's body 10 by inserting the tip 122 into a nose or mouth (or other suitable natural body orifice) of the patient's body 10.
- the medical device 110 is inserted through a gastrointestinal tract of the patient's body 10 , including an esophagus 12 , a stomach 16 , and into a small intestine 18 until reaching a target treatment site.
- a length of the shaft 120 may be sufficient so that a proximal end of medical device 110 (including the handle 112 ) is external of the patient's body 10 while the tip 122 of the medical device 110 is internal to the patient's body 10 . While this disclosure relates to the use of the medical system 100 in a digestive tract of the patient's body 10 , it should be understood that the features of this disclosure could be used in various other locations (e.g., other organs, tissue, etc.) within the patient's body 10 .
- the shaft 120 of the medical device 110 may extend into the patient's body 10 until it reaches a position in which tools disposed within the medical device 110 can access the target treatment site, such as the medical instrument 140 of the medical system 100 .
- this position may be, for example, the duodenum of the small intestine 18 .
- a target site may be the ampulla/papilla of Vater 22 located in a portion of the duodenum of the small intestine 18 .
- the ampulla/papilla of Vater 22 generally forms an opening where the pancreatic duct and the common bile duct 20 empty into the duodenum of the small intestine 18, with the hepatic ducts and the gallbladder emptying into the common bile duct 20.
- the medical instrument 140 of the medical system 100 may be slidably received within the medical device 110 to thereby position the distal end 144 proximate to the target site. Advancement of the medical instrument 140 into the port 116 and through the shaft 120 to the tip 122 may be provided in response to actuation of the handle 141. It should be understood that in other embodiments the medical instrument 140 may be received through the medical device 110 prior to an insertion of the shaft 120 through the patient's body 10 at step 202.
- rotation of the tip 122 near the target site may be desirable to facilitate positioning the working opening 124 toward a location of the target site. For example, it may be desired that the distal end 144 of the medical instrument 140 reach the ampulla/papilla of Vater 22 when deflected outwardly from the working opening 124 by the elevator 126 ( FIG. 3 ). In this instance, the tip 122 of the shaft 120 may be rotated until the working opening 124 , in which the medical instrument 140 may exit the medical device 110 , is facing the ampulla/papilla of Vater 22 .
- Rotation of the tip 122 and/or the shaft 120 may be provided in response to actuating the actuation mechanism 114 on the handle 112 and/or by rotating the handle 112 itself, and identification of a relative orientation and/or position of the tip 122 may be provided in response to actuating the imaging device 130 on the tip 122.
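- One conventional way to estimate how far the tip 122 has rotated between two frames from the imaging device 130 is feature matching followed by a partial affine (similarity) fit. The OpenCV-based sketch below is an assumption about how such an orientation estimate could be computed, not an implementation disclosed in the patent.

```python
import cv2
import numpy as np


def _gray(frame: np.ndarray) -> np.ndarray:
    """Ensure a single-channel image for feature detection."""
    return cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame


def estimate_tip_rotation_deg(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Estimate the in-plane rotation (degrees) of the camera between two frames."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(_gray(prev_frame), None)
    kp2, des2 = orb.detectAndCompute(_gray(curr_frame), None)
    if des1 is None or des2 is None:
        return 0.0

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]
    if len(matches) < 4:
        return 0.0

    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Similarity transform: rotation + translation + uniform scale.
    matrix, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if matrix is None:
        return 0.0
    return float(np.degrees(np.arctan2(matrix[1, 0], matrix[0, 0])))
```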
- a surrounding environment of the target site may be illuminated in response to actuating the light source 128 .
- the light source 128 may already be actuated to direct light outwardly from the tip 122 , such as, for example, prior to and/or as the medical device 110 is inserted into the patient body 10 at step 202 .
- the processor 104 of the image processing device 102 executes the target identification logic 108 to actuate the imaging device 130 of the medical device 110 . Accordingly, the imaging device 130 captures images of the target site. With the imaging device 130 facing the target site (e.g., the ampulla of Vater 22 ), images of a location of the target site may be obtained by the medical device 110 and communicated to the image processing device 102 for storing in the memory 106 as image data 109 .
- the processor 104 of the image processing device 102 executes the target identification logic 108 to determine a first location 52 A of the target site (e.g., the ampulla of Vater 22 within the small intestine 18 ) relative to the imaging device 130 on the tip 122 .
- the processor 104 analyzes the image data 109 captured by the imaging device 130 and determines a coordinate position of the target site relative to the tip 122 , pursuant to executing the machine-readable instructions of the target identification logic 108 .
- a user of the medical system 100 may manually identify the first location 52 A of the target site based on the image data 109 , such as, for example, via a touch-screen user interface display (not shown) that is communicatively coupled to the image processing device 102 .
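- The patent leaves the detection method open (both automated analysis and manual identification are described), so the sketch below substitutes plain normalized cross-correlation template matching as one possible stand-in for the target identification logic 108; the reference patch, score threshold, and function name are assumptions.

```python
import cv2
import numpy as np


def locate_target(frame: np.ndarray, template: np.ndarray, min_score: float = 0.6):
    """Return the (x, y) pixel center of the best template match, or None.

    `frame` and `template` are assumed to share the same format (e.g., both
    grayscale); `template` is a reference patch of the target site and
    `min_score` is an assumed confidence cutoff.
    """
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < min_score:
        return None  # target not confidently visible in this frame
    t_h, t_w = template.shape[:2]
    return max_loc[0] + t_w // 2, max_loc[1] + t_h // 2
```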
- the processor 104 when executing the target identification logic 108 , may generate a visual identifier at the first location 52 A (e.g., highlights, geometric figures, arrows, and the like) to thereby visually designate the first location 52 A of the target site for reference.
- the visual identifier of the first location 52 A may include a box and/or “X” superimposed on the images of the target site for purposes of visually designating the target site in the image data 109 .
- the visual identifier of the first location 52 A may be displayed on a user interface display (not shown) that is communicatively coupled to the image processing device 102 .
- a user of the medical system 100 may manually mark the first location 52 A of the target site with a visual identifier based on the image data 109 , such as, for example, via a touch-screen user interface display (not shown) that is communicatively coupled to the image processing device 102 .
- the processor 104 may analyze the image data 109 to determine the first location 52 A of the target site in accordance with the manual mark and/or identification by the user of the medical system 100 for continued tracking in subsequent images of the target site.
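- As a small illustration, the kind of box-and-"X" identifier described above could be drawn onto the displayed frame as follows; the marker style, color, and size are arbitrary choices rather than details from the patent.

```python
import cv2
import numpy as np


def draw_target_identifier(frame: np.ndarray, x: int, y: int, half: int = 20) -> np.ndarray:
    """Superimpose a box and an 'X' at the identified target location (x, y)."""
    annotated = frame.copy()
    cv2.rectangle(annotated, (x - half, y - half), (x + half, y + half),
                  color=(0, 255, 0), thickness=2)
    cv2.drawMarker(annotated, (x, y), color=(0, 255, 0),
                   markerType=cv2.MARKER_TILTED_CROSS,
                   markerSize=2 * half, thickness=2)
    return annotated
```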
- the processor 104 of the image processing device 102 executes the target identification logic 108 to mark the first location 52 A of the target site with a light/laser beam 134 by actuating the laser 132 of the medical device 110 .
- the processor 104 actuates the mirror of the medical device 110 to reflect the light/laser beam 134 generated by the laser 132 and redirect it toward the first location 52A of the target site, pursuant to executing the machine-readable instructions of the target identification logic 108.
- the medical instrument 140 may be moved toward the target site in response to the sensor 146 detecting the light/laser beam 134.
- the handle 141 of the medical instrument 140 may be actuated to automatically translate the longitudinal body 142 through a working lumen of the shaft 120 to position the distal end 144 adjacent to the target site.
- the medical device 110 tracks the first location 52A of the target site to allow the medical instrument 140 to lock onto the first location 52A with the sensor 146 and autonomously steer the distal end 144 toward the target site to perform one or more procedures thereon, such as, for example, cannulating the ampulla duct opening 22 of the common bile duct 20.
- the sensor 146 is configured to generate feedback in response to detecting the incidence of the light/laser beam 134 onto the target site, relative to the distal end 144.
- the sensor 146 includes a photodiode configured to convert the light/laser beam 134 into an electrical current such that the feedback generated by the sensor 146 includes a photodiode signal transmitted to a user of the medical instrument 140 .
- a strength of the photodiode signal generated by the sensor 146 may be indicative of a spatial (e.g., three-dimensional) proximity of the sensor 146 to the point of incidence of the light/laser beam 134 .
- a strength of the photodiode signal generated by the sensor 146 may increase as a distance between the distal end 144 of the medical instrument 140 and the target site decreases, as the sensor 146 detects the light/laser beam 134 at relatively close proximity.
- a strength (e.g., intensity variation) of the photodiode signal generated by the sensor 146 may decrease as the distance between the distal end 144 of the medical instrument 140 and the target site increases, as the sensor 146 detects the light/laser beam 134 from relatively farther away.
- while the sensor 146 in embodiments described herein includes a photodiode or CCD configured to generate feedback, in the form of a photodiode signal, in response to detecting the light/laser beam 134, it should be appreciated that various other suitable sensors and/or forms of feedback may be employed on the medical instrument 140 without departing from a scope of this disclosure.
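- A minimal sketch of how the photodiode feedback could gate advancement of the distal end 144 is shown below: the instrument advances while the detected intensity keeps rising and backs off when it falls. The step size, thresholds, and the read/advance/retract callables are hypothetical hardware hooks, not elements of the patent.

```python
def advance_toward_beam(read_signal, advance, retract, step_mm: float = 0.5,
                        max_steps: int = 200, min_signal: float = 0.05) -> float:
    """Advance the instrument while the photodiode signal keeps increasing.

    `read_signal()` returns the current photodiode intensity (arbitrary
    units); `advance(step_mm)` and `retract(step_mm)` move the distal end.
    All three are hypothetical hardware hooks. Returns the best signal seen.
    """
    best = read_signal()
    for _ in range(max_steps):
        if best < min_signal:
            break                      # beam not detected; stop and re-acquire
        advance(step_mm)
        current = read_signal()
        if current + 1e-6 < best:      # signal dropped: overshot or drifted off
            retract(step_mm)           # undo the last move and stop advancing
            break
        best = current
    return best
```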
- the medical instrument 140 may include a processor and memory similar to the processor 104 and the memory 106 of the image processing device 102 shown and described above.
- the processor of the medical instrument 140 when executing target identification logic stored on the memory of the medical instrument 140 , may provide for autonomous steering of the medical instrument 140 relative to the first location 52 A of the target site by tracking the light/laser beam 134 with the sensor 146 .
- the medical instrument 140 may be manually navigated to the first location 52 A of the target site by a user of the medical system 100 by visually tracking a position of the distal end 144 relative to the first location 52 A via a user interface display (not shown).
- a user may visually navigate the distal end 144 of the medical instrument 140 toward the visual identifier generated by the light/laser beam 134.
- the distal end 144 of the medical instrument 140 may be displayed on a user interface display by a visual identifier, such as, for example, cross-hairs superimposed on the user interface display that are indicative of a position of the distal end 144 .
- the feedback generated by the sensor 146 may be utilized in addition to and/or in lieu of the user interface display for manually steering the medical instrument 140 toward the first location 52 A of the target site.
- the medical device 110 of the medical system 100 may move, intentionally and/or inadvertently, relative to the target site during a procedure as the medical instrument 140 moves toward the target site at step 212 . Such movements may occur due to difficulties in maintaining the medical device 110 stable during a procedure.
- in this instance, a position of the target site (e.g., the ampulla of Vater 22) relative to the tip 122 of the shaft 120 and/or the distal end 144 of the medical instrument 140 may be modified and/or vary relative to an initial corresponding position between the target site and the medical device 110.
- the image data 109 initially obtained by the medical system 100 at step 206 may include inaccuracies and/or deficiencies in providing a current location of the target site (e.g., the ampulla of Vater 22 ).
- continued movement of the distal end 144 of the medical instrument 140 toward the first location 52A, as initially determined by the processor 104 of the image processing device 102 at step 208, may not allow a user of the medical system 100 to adequately access the target site.
- the processor 104 may execute the target identification logic 108 to actuate the imaging device 130 to obtain updated image data 109 of the target site.
- the processor 104 of the image processing device 102 when executing the target identification logic 108 , may be configured to determine whether the medical device 110 has moved relative to the target site by periodically capturing images with the imaging device 130 for comparison to the image data 109 stored in the memory 106 at step 206 .
- movement of the medical device 110 relative to the target site may be based on determining that a positional variance between the first location 52 A and a detected position of the target site is equal to or greater than a preprogrammed threshold (e.g., a millimeter(s), a micrometer(s), a nanometer(s), etc.).
- the processor 104 of the image processing device 102 repeats steps 206, 208, 210, and 212 of the method 200 described above.
- the processor 104 executes the target identification logic 108 to capture images (e.g., image data 109 ) of the target site at step 206 , determine a second location 52 B of the target site (e.g., the ampulla of Vater 22 ) at step 208 , and mark the second location 52 B with the light/laser beam 134 at step 210 .
- the method 200 performs these steps substantially similar to those shown and described above to facilitate locating the target site with the medical system 100 in accordance with the new, second location 52 B of the target site.
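- The re-detection loop implied by steps 206 through 212 might be organized as in the sketch below, where `capture`, `detect`, and `mark` stand in for the imaging device, the target identification logic, and the laser/mirror interface, and the pixel-based threshold plays the role of the preprogrammed positional variance threshold. All names and numbers are placeholders.

```python
import math
import time


def track_and_mark(capture, detect, mark, threshold_px: float = 15.0,
                   period_s: float = 0.2, iterations: int = 100) -> None:
    """Periodically re-detect the target and re-aim the beam if it drifts.

    `capture()` returns the latest frame, `detect(frame)` returns the target's
    (x, y) pixel location or None, and `mark(x, y)` points the laser/mirror at
    that pixel; `threshold_px` stands in for the preprogrammed positional
    variance threshold (expressed here in pixels for simplicity).
    """
    last_xy = None
    for _ in range(iterations):
        frame = capture()
        xy = detect(frame)
        if xy is None:
            time.sleep(period_s)
            continue                       # target temporarily out of view
        if last_xy is None or math.dist(xy, last_xy) >= threshold_px:
            mark(*xy)                      # re-mark the (possibly new) location
            last_xy = xy
        time.sleep(period_s)
```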
- the image processing device 102 of the medical system 100 may be communicatively coupled to a remote station (not shown) for purposes of dynamically updating the target identification logic 108 stored on the memory 106 .
- the image processing device 102 may be operable to receive neural network data from a remote station (e.g., a computer server), such as, for example, via a wired and/or wireless connection.
- the neural network data received by the image processing device 102 may include supplemental image data 109, similar to the image data 109 shown and described above, recorded from a plurality of prior procedures, devices, systems, etc.
- image data may be from a plurality of different patients, acquired over time, of the same or similar patient anatomy.
- the supplemental image data 109 may be stored in the memory 106 and utilized by the processor 104 of the image processing device 102 to artificially determine and/or identify common physical properties and/or characteristics of one or more target sites, such as, for example, the ampulla of Vater 22 within the small intestine 18 , the ampulla duct opening 22 of the common bile duct 20 , etc.
- the processor 104 of the image processing device 102 when executing the machine-readable instructions of the target identification logic 108 , may reference the supplemental image data 109 when analyzing the image data 109 captured by the imaging device 130 of the medical device 110 to determine the first location 52 A of the target site (e.g., the ampulla of Vater 22 within the small intestine 18 ). Accordingly, it should be appreciated that the supplemental image data 109 may facilitate determining a coordinate position of a target site relative to the medical device 110 during a procedure by providing the image processing device 102 additional data for artificial learning of a size, shape, and/or configuration of similar target sites.
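- One simple, assumed way to fold supplemental image data 109 from prior procedures into the detection step is to maintain a bank of reference patches and score each new frame against all of them; the sketch below reuses the template-matching stand-in from the earlier example rather than the neural network approach the patent mentions.

```python
import cv2
import numpy as np


class TemplateBank:
    """Reference patches of the target anatomy gathered from prior procedures."""

    def __init__(self) -> None:
        self.templates = []   # list of grayscale patches (np.ndarray)

    def add_supplemental_data(self, patches) -> None:
        """Ingest patches received from a remote station (supplemental data)."""
        self.templates.extend(patches)

    def best_match(self, frame: np.ndarray, min_score: float = 0.6):
        """Return the (x, y) center of the strongest match across all templates."""
        best = None
        for tpl in self.templates:
            res = cv2.matchTemplate(frame, tpl, cv2.TM_CCOEFF_NORMED)
            _, score, _, loc = cv2.minMaxLoc(res)
            if score >= min_score and (best is None or score > best[0]):
                t_h, t_w = tpl.shape[:2]
                best = (score, (loc[0] + t_w // 2, loc[1] + t_h // 2))
        return None if best is None else best[1]
```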
- Each of the aforementioned devices, assemblies, and methods may be used to detect, mark and track a location of a target site.
- a user may accurately interact with a patient's tissue using artificial intelligence software in an image processing device during a procedure, allowing the user to reduce overall procedure time, increase procedural efficiency, and avoid unnecessary harm to the patient's body caused by a lack of control over the motion and positioning of a medical device when accessing target tissue.
- the disclosed devices may include various suitable computer systems and/or computing units incorporating a plurality of hardware components, such as, for example, a processor and non-transitory computer-readable medium, that allow the devices to perform one or more operations during a procedure in accordance with those described herein.
- Other aspects of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the features disclosed herein. It is intended that the specification and examples be considered as exemplary only.
Abstract
A medical system includes a medical device having an imaging device configured to capture images of a target site. A location of the target site is determined based on the images. The medical device includes a light source configured to direct light onto the location of the target site, and a processor and non-transitory computer readable medium storing instructions that, when executed by the processor, cause the processor to move a sensor of a medical instrument toward the location of the target site based on the sensor detecting the light at the target site.
Description
- This application claims the benefit of priority from U.S. Provisional Application No. 62/942,959, filed Dec. 3, 2019, which is incorporated by reference herein in its entirety.
- Various aspects of the disclosure relate generally to medical device tracking systems, devices, and related methods. More specifically, at least certain embodiments of the disclosure relate to systems, devices, and related methods for locating one or more target sites within a patient during an endoscopic procedure to facilitate the positioning of medical devices, among other aspects.
- Technological developments have given users of medical systems, devices, and methods, the ability to conduct increasingly complex procedures on subjects. One challenge in the field of minimally invasive surgeries such as endoscopy, among other surgical procedures, is associated with the cannulation of target sites within a patient, such as an ampulla opening into the common bile duct. Placement of medical devices within a patient at precise locations of target sites may be difficult due to general lack of visualization at the target site and lack of control over a positioning of the medical device at a location of the target site. The limitations of medical devices in providing stability toward positioning an endoscope at a target treatment site of a patient may prolong the procedure, limit its effectiveness, and/or cause injury to the patient due to misalignment or instability of the medical device. There is a need for devices and methods that address one or more of these difficulties or other related problems.
- Aspects of the disclosure relate to, among other things, systems, devices, and methods for positioning a medical device at a target treatment site with a medical system including target identification logic, among other aspects. Each of the aspects disclosed herein may include one or more of the features described in connection with any of the other disclosed aspects.
- According to an example, a medical system includes a medical device having an imaging device configured to capture images of a target site. A location of the target site is determined based on the images. The medical device further includes a light source configured to direct light onto the location of the target site, and a processor and non-transitory computer readable medium storing instructions that, when executed by the processor, cause the processor to move a sensor of a medical instrument toward the location of the target site based on the sensor detecting the light at the target site.
- Any of the medical systems described herein may have any of the following features. The sensor is movable relative to the imaging device toward the location of the target site based on the sensor detecting the light at the target site. The instructions stored in the non-transitory computer readable medium cause the processor to detect a change in location of the imaging device relative to the target site, determine the location of the target site relative to the imaging device, and redirect the light to the location of the target site. The processor is configured to detect the change in location of the imaging device relative to the target site based on images periodically captured by the imaging device. The processor is configured to compare the location of the target site to an original location of the target site to determine a positional variance. The processor is configured to determine whether the positional variance exceeds a preprogrammed threshold. The light source includes a source to generate a laser beam. The imaging device includes a camera. The medical system may include a medical instrument, wherein the sensor includes at least one of a photodetector, a photodiode, and a charge-coupled device (CCD). The sensor is configured to generate a photodiode signal in response to detecting the light at the target site. A strength of the photodiode signal generated by the sensor includes a greater intensity when the sensor is positioned at a first distance from the light, and includes a smaller intensity when the sensor is positioned at a second distance from the light. The first distance is less than the second distance. The medical device includes a mirror configured to reflect the light generated by the light source toward the location of the target site. The mirror is configured to move to redirect the light toward the location of the target site in response to the processor detecting the change in location of the imaging device relative to the target site. The mirror includes a micro-mirror (MEMS mirror) configured to reflect the light along two axes. The mirror is positioned adjacent to the light source on the medical device. The processor is configured to generate a visual identifier along the images captured by the imaging device indicative of the location of the target site.
- According to another example, a medical system includes a medical device including an imaging device configured to capture images of a target site, and a light source configured to direct light onto the target site. The medical system includes a medical instrument movable relative to the medical device. The medical instrument includes a sensor configured to detect the light on the target site. The medical instrument is movable toward the target site in response to the sensor detecting the light on the target site.
- Any of the medical systems described herein may have any of the following features. The medical system may include a processor configured to detect movement of the medical device relative to the target site based on images captured by the imaging device. The light source is configured to redirect the light based on the detected movement of the medical device. The medical device is an endoscope or duodenoscope, and the medical instrument is a catheter.
- According to another example, a method of moving a medical instrument toward a target site includes capturing images of the target site with an imaging device. A first location of the target site is determined based on the images. The method includes transmitting light to the first location by a light source, detecting the light incident at the first location by a sensor of the medical instrument, and moving the medical instrument toward the target site based on the sensor detecting the light incident at the first location.
- Any of the methods of using the medical systems described herein may have any of the following steps and/or features. In response to detecting movement of the medical device relative to the target site, the method includes capturing images of the target site with the imaging device to determine a second location of the target site, redirecting the light from the light source to the second location, and moving the medical instrument toward the target site based on the sensor detecting the light at the second location.
- It may be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
- The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary aspects of the disclosure and together with the description, serve to explain the principles of the disclosure.
-
FIG. 1 is a schematic view of an exemplary medical system, according to aspects of this disclosure; -
FIG. 2 is a partial perspective view of a medical device of the medical system of FIG. 1, according to aspects of this disclosure; -
FIG. 3 is a partial perspective view of a medical instrument of the medical system of FIG. 1, according to aspects of this disclosure; -
FIG. 4 is a schematic view of the medical system of FIG. 1 positioned at a target site of a patient, according to aspects of this disclosure; -
FIG. 5A is an image including locating a target site of a patient, according to aspects of this disclosure; -
FIG. 5B is an image including marking the target site of FIG. 5A with a light beam, according to aspects of this disclosure; -
FIG. 5C is an image including marking a target site with a light beam upon movement of the medical system, according to aspects of this disclosure; and -
FIG. 6 is a block diagram of an exemplary method of locating a target site with the medical system of FIG. 1, according to aspects of this disclosure. - Embodiments of the disclosure include systems, devices, and methods for locating, tracking, and/or steering one or more tools or other medical devices at a target site within the body. Reference will now be made in detail to aspects of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same or similar reference numbers will be used throughout the drawings to refer to the same or like parts. The term “distal” refers to a portion farthest away from a user when introducing a device into a patient. By contrast, the term “proximal” refers to a portion closest to the user when placing the device into the patient. As used herein, the terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not necessarily include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. The term “exemplary” is used in the sense of “example,” rather than “ideal.” As used herein, the terms “about,” “substantially,” and “approximately,” indicate a range of values within +/−10% of a stated value.
- Embodiments of the disclosure may be used to locate a target site with a medical system, such as, for example, a medical system having target identification logic. For example, some embodiments combine an imaging device and a light source with a medical device to locate a target site. The imaging device may capture images of the target site and the light source may direct light onto the target site in response to identifying a location of the target site based on the images. The target identification logic of the medical system may detect movements of the medical device and determine an adjusted location of the target site relative to the medical device in response, thereby redirecting the light from the light source toward the location of the target site.
- Embodiments of the disclosure may relate to devices and methods for performing various medical procedures and/or treating portions of the large intestine (colon), small intestine, cecum, esophagus, any other portion of the gastrointestinal tract, and/or any other suitable patient anatomy (collectively referred to herein as a “target treatment site”). Various embodiments described herein include single-use or disposable medical devices. Reference will now be made in detail to examples of the disclosure described above and illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
-
FIG. 1 shows a schematic depiction of an exemplary medical system 100 in accordance with an embodiment of this disclosure. The medical system 100 may include an image processing device 102, a medical device 110, and a medical instrument 140. The image processing device 102 may be communicatively coupled to the medical device 110 via a cable 118. It should be understood that in other embodiments the image processing device 102 may be in wireless communication with the medical device 110. In embodiments, the image processing device 102 is a computer system incorporating a plurality of hardware components that allow the image processing device 102 to receive and monitor data, accurately display images of one or more features (e.g., a target site), and/or process other information described herein. Illustrative hardware components of the image processing device 102 may include at least one processor 104 and at least one memory 106.
- The processor 104 of the image processing device 102 may include any computing device capable of executing machine-readable instructions, which may be stored on a non-transitory computer-readable medium, such as, for example, the memory 106 of the image processing device 102. By way of example, the processor 104 may include a controller, an integrated circuit, a microchip, a computer, and/or any other computer processing unit operable to perform calculations and logic operations required to execute a program. As described in greater detail herein, the processor 104 is configured to perform one or more operations in accordance with the instructions stored on the memory 106, such as, for example, a target identification logic 108.
- The memory 106 of the image processing device 102 is a non-transitory computer readable medium that stores machine-readable instructions thereon, such as, for example, the target identification logic 108. As described in further detail below, the target identification logic 108 may include executable instructions that allow the medical device 110 to track a location of a target site for the medical instrument 140 to lock onto and steer toward for the performance of one or more procedures on or near the target site. It should be understood that various programming algorithms and data that support an operation of the medical system 100 may reside in whole or in part in the memory 106. The memory 106 may include any type of computer readable medium suitable for storing data and algorithms, such as, for example, random access memory (RAM), read only memory (ROM), a flash memory, a hard drive, and/or any device capable of storing machine-readable instructions. The memory 106 may include one or more data sets, including, but not limited to, image data 109 from one or more components of the medical system 100 (e.g., the medical device 110, the medical instrument 140, etc.).
- Still referring to FIG. 1, the medical device 110 may be configured to facilitate positioning one or more components of the medical system 100 relative to a patient, such as, for example, the medical instrument 140. In embodiments, the medical device 110 may be any type of endoscope and may include a handle 112, an actuation mechanism 114, at least one port 116, and a shaft 120. The handle 112 of the medical device 110 may have one or more lumens (not shown) that communicate with a lumen(s) of one or more other components of the medical system 100. The handle 112 further includes the at least one port 116 that opens into the one or more lumens of the handle 112. As described in further detail herein, the at least one port 116 is sized and shaped to receive one or more instruments therethrough, such as, for example, the medical instrument 140 of the medical system 100.
- The shaft 120 of the medical device 110 may include a tube that is sufficiently flexible such that the shaft 120 is configured to selectively bend, rotate, and/or twist when being inserted into and/or through a patient's tortuous anatomy to a target treatment site. The shaft 120 may have one or more lumens (not shown) extending therethrough that include, for example, a working lumen for receiving instruments (e.g., the medical instrument 140). In other embodiments, the shaft 120 may include additional lumens such as a control wire lumen for receiving one or more control wires for actuating one or more distal parts/tools (including an articulation joint and an elevator, for example), a fluid lumen for delivering a fluid, an illumination lumen for receiving at least a portion of an illumination assembly (not shown), and/or an imaging lumen for receiving at least a portion of an imaging assembly (not shown).
- Still referring to FIG. 1, the medical device 110 may further include a tip 122 at a distal end of the shaft 120. In some embodiments, the tip 122 may be attached to the distal end of the shaft 120, while in other embodiments the tip 122 may be integral with the shaft 120. For example, the tip 122 may include a cap configured to receive the distal end of the shaft 120 therein. The tip 122 may include one or more openings that are in communication with the one or more lumens of the shaft 120. For example, the tip 122 may include a working opening 124 through which the medical instrument 140 may exit from a working lumen of the shaft 120. In other embodiments, the tip 122 of the shaft 120 may include additional and/or fewer openings thereon, such as, for example, a fluid opening or nozzle through which fluid may be emitted from a fluid lumen of the shaft 120, an illumination opening/window through which light may be emitted, and/or an imaging opening/window for receiving light used by an imaging device to generate an image.
- The actuation mechanism 114 of the medical device 110 is positioned on the handle 112 and may include one or more knobs, buttons, levers, switches, and/or other suitable actuators. The actuation mechanism 114 is configured to control at least one of deflection of the shaft 120 (through actuation of a control wire, for example), delivery of a fluid, emission of illumination, and/or various imaging functions. As described in greater detail herein, in some embodiments the medical device 110 includes one or more control wires for actuating an elevator 126 of the medical device 110 at the tip 122 (see FIGS. 2-3). Accordingly, a user of the medical device 110 may manipulate the actuation mechanism 114 to selectively exert at least one of a pulling force and a pushing force on the one or more control wires to control a position of the elevator 126, and thereby control a position of an instrument adjacent to the elevator 126 (e.g., the medical instrument 140).
- Still referring to FIG. 1, the medical instrument 140 of the medical system 100 may include a catheter having a longitudinal body 142 between a proximal end of the longitudinal body 142 and a distal end 144. A handle 141 is at the proximal end of the longitudinal body 142. The longitudinal body 142 of the medical instrument 140 is flexible such that the medical instrument 140 is configured to bend, rotate, and/or twist when being inserted into a working lumen of the medical device 110. The handle 141 of the medical instrument 140 may be configured to move, rotate, and bend the longitudinal body 142. Further, the handle 141 may define one or more ports (not shown) sized to receive one or more tools through the longitudinal body 142 of the medical instrument 140. The medical device 110 is configured to receive the medical instrument 140 via the at least one port 116 and through the shaft 120 to the working opening 124 at the tip 122 via a working lumen. In this instance, the medical instrument 140 may extend distally out of the working opening 124 and into a surrounding environment of the tip 122, such as, for example, at a target treatment site of a patient as described in further detail below. The distal end 144 of the medical instrument 140 may extend distally from the working opening 124 in response to a translation of the longitudinal body 142 through the working lumen of the shaft 120. It should be understood that in other embodiments the medical instrument 140 may include various other devices than those shown and described herein, including but not limited to, a guidewire, cutting or grasping forceps, a biopsy device, a snare loop, an injection needle, a cutting blade, scissors, a retractable basket, a retrieval device, an ablation and/or electrophysiology catheter, a stent placement device, a surgical stapling device, a balloon catheter, a laser-emitting device, an imaging device, and/or any other suitable instrument.
- Referring now to FIG. 2, the tip 122 of the shaft 120 is depicted with the medical instrument 140 omitted from the working opening 124. The tip 122 includes the elevator 126 positioned adjacent to the working opening 124 and partially disposed within a working lumen of the shaft 120. It should be understood that the elevator 126 is shown and described herein in an unactuated position and that actuation of the actuation mechanism 114 on the handle 112 may provide for an extension of the elevator 126 to an actuated position (see FIG. 3). As described in further detail below, the elevator 126 is configured to position an instrument received through a working lumen of the shaft 120 (e.g., the medical instrument 140) outward from the working opening 124 when in the actuated position.
- The tip 122 of the medical device 110 further includes a light source 128, an imaging device 130, and a laser 132 positioned adjacent to the working opening 124 of the shaft 120. In embodiments, the light source 128 of the medical device 110 is configured and operable to direct light outwardly from the tip 122 of the shaft 120 to thereby illuminate a surrounding environment of the tip 122, such as, for example, a target treatment site of a patient in which the medical device 110 may be located (see FIGS. 5A-5C). The light source 128 may include a light emitter, such as, for example, a light-emitting diode (LED), or the like. The imaging device 130 of the medical device 110 is configured and operable to capture images of a surrounding environment of the tip 122, such as, for example, the target treatment site of a patient (see FIGS. 5A-5C). In some embodiments, the imaging device 130 may include a camera capable of high-resolution imaging. It should be understood that in other embodiments the medical device 110 may omit the imaging device 130 on the tip 122 entirely such that a separate imaging device may be received by the medical device 110 through the shaft 120.
- Still referring to FIG. 2, the laser 132 of the medical device 110 is configured and operable to generate a light/laser beam outwardly from the tip 122 of the shaft 120. In some embodiments, the laser 132 is further configured to selectively direct the light/laser beam to a predetermined location to thereby mark the predetermined location with the light/laser beam. It should be understood that the light/laser beam generated by the laser 132 may be independently steerable relative to the light emitted by the light source 128 and/or any other component of the medical system 100. As described further below, a target site within a patient may be marked with a light/laser beam from the laser 132 for tracking a location of said target site during use of the medical system 100 in a procedure (see FIGS. 5A-5C).
- In some embodiments, the medical device 110 may further include a mirror positioned along and/or adjacent to the tip 122 of the shaft 120. The mirror of the medical device 110 may be disposed adjacent to the laser 132, thereby forming a unitary structure such that the mirror is coincident with a beam of light emitted by the laser 132. In embodiments, the mirror of the medical device 110 is configured and operable to selectively reflect the light/laser beam generated by the laser 132 toward a predetermined location of a target site. The mirror of the medical device 110 is configured to move, pivot, translate, and/or rotate relative to the laser 132 and/or the tip 122 of the shaft 120 to thereby redirect the light/laser beam to a predetermined location of the target site. In embodiments, the mirror includes a micro-mirror (MEMS mirror) configured to reflect the light/laser beam along two axes (e.g., x-y directions of a coordinate axis) and/or to optical scanning angles ranging up to approximately 32 degrees. As described in further detail below, a predetermined location of a target site may be determined based on images (e.g., the image data 109) captured by the imaging device 130 of the medical device 110.
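- The patent does not specify how mirror deflections are commanded; the short Python sketch below only illustrates clamping a requested two-axis deflection to the approximately 32-degree optical scan range described above and normalizing it to a drive value. The function name, the normalized drive convention, and the constant are assumptions made for illustration.

```python
# Hypothetical two-axis MEMS mirror command helper (not taken from the patent).
MAX_OPTICAL_ANGLE_DEG = 32.0  # approximate optical scan limit described above


def mirror_command(theta_x_deg, theta_y_deg):
    """Clamp requested optical angles to the scan range and normalize to [-1, 1] drive values."""
    def clamp(angle):
        return max(-MAX_OPTICAL_ANGLE_DEG, min(MAX_OPTICAL_ANGLE_DEG, angle))

    return (clamp(theta_x_deg) / MAX_OPTICAL_ANGLE_DEG,
            clamp(theta_y_deg) / MAX_OPTICAL_ANGLE_DEG)
```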
- As shown in FIG. 3, the medical instrument 140 is depicted extending outwardly from the tip 122 of the shaft 120 with the elevator 126 engaged against the longitudinal body 142 of the medical instrument 140. With the elevator 126 in an actuated position, an anterior-facing surface of the elevator 126 engages the longitudinal body 142 of the medical instrument 140 to thereby deflect the distal end 144 laterally outward from the working opening 124. In some embodiments, the anterior-facing surface of the elevator 126 has a curvature that facilitates a deflection and/or bend of the longitudinal body 142 of the medical instrument 140. It should be appreciated that the elevator 126 may include various other shapes, sizes, and/or configurations than those shown and described herein without departing from a scope of the disclosure.
- The medical instrument 140 further includes a sensor 146 positioned along the longitudinal body 142 adjacent to the distal end 144. In embodiments, the sensor 146 may be located on a distally-facing surface and/or a distal-most surface of the medical instrument 140. The sensor 146 of the medical instrument 140 is configured to detect one or more objects, properties, characteristics, and/or features present at and/or proximate to the distal end 144 of the medical instrument 140. By way of example, in some embodiments the sensor 146 may be configured to detect light, such as the light generated by the light source 128 of the medical device 110. In other embodiments, the sensor 146 may be configured to detect the light/laser beam generated by the laser 132 of the medical device 110, for example, a point on a target site on which the light/laser beam 134 is incident. The sensor 146 may include at least one of a photodetector, a photodiode, a charge-coupled device (CCD), and/or various other suitable detectors.
- In embodiments, the sensor 146 includes a four-quadrant photodiode configured to convert light into an electrical current. As described in greater detail herein, in embodiments the sensor 146 is configured and operable to identify a predetermined location of a target site in response to detecting a light/laser beam directed by the laser 132 onto that target site (see FIGS. 5A-5C). In some embodiments, the sensor 146 may be positioned along a proximal end of the longitudinal body 142 adjacent to the handle 141 of the medical instrument 140, with a fiber that is communicatively coupled to the sensor 146 positioned adjacent to the distal end 144. In this instance, the distal end 144 of the medical instrument 140 may have a relatively smaller profile. It should be understood that in other embodiments the medical instrument 140 may omit the sensor 146 on the distal end 144 entirely such that a separate sensing device may be received by the medical instrument 140 through the longitudinal body 142, such as, for example, via one or more guidewires.
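- For illustration only, a four-quadrant photodiode reading could be reduced to an off-axis error signal as sketched below; the quadrant layout and the normalization are assumptions for the sketch, not details taken from the patent.

```python
def quadrant_error(q_a, q_b, q_c, q_d):
    """Reduce four quadrant photocurrents to normalized (x, y) off-axis errors plus total signal.

    Assumed layout:  A | B    x_error > 0 means the beam spot lies toward B/D (right of center),
                     --+--    y_error > 0 means it lies toward A/B (above center).
                     C | D
    """
    total = q_a + q_b + q_c + q_d
    if total <= 0.0:
        return 0.0, 0.0, 0.0  # no beam detected
    x_error = ((q_b + q_d) - (q_a + q_c)) / total
    y_error = ((q_a + q_b) - (q_c + q_d)) / total
    return x_error, y_error, total
```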
- Referring now to FIGS. 4-5C in conjunction with the flow diagram of FIG. 6, an exemplary method 200 of using the medical system 100 to locate and access a target site is schematically depicted. The depiction of FIGS. 4-6 and the accompanying description below is not meant to limit the subject matter described herein to a particular method.
- At step 202 and as shown in FIG. 4, the medical device 110 of the medical system 100 may be inserted within a patient's body 10. The shaft 120 of the medical device 110 is guided through a digestive tract of the patient's body 10 by inserting the tip 122 into a nose or mouth (or other suitable natural body orifice) of the patient's body 10. In embodiments, the medical device 110 is inserted through a gastrointestinal tract of the patient's body 10, including an esophagus 12, a stomach 16, and into a small intestine 18 until reaching a target treatment site. It should be appreciated that a length of the shaft 120 may be sufficient so that a proximal end of the medical device 110 (including the handle 112) is external to the patient's body 10 while the tip 122 of the medical device 110 is internal to the patient's body 10. While this disclosure relates to the use of the medical system 100 in a digestive tract of the patient's body 10, it should be understood that the features of this disclosure could be used in various other locations (e.g., other organs, tissue, etc.) within the patient's body 10.
- The shaft 120 of the medical device 110 may extend into the patient's body 10 until it reaches a position in which tools disposed within the medical device 110, such as the medical instrument 140 of the medical system 100, can access the target treatment site. In examples in which the medical device 110 is used to access and visualize aspects of the pancreatico-biliary system, this position may be, for example, the duodenum of the small intestine 18. In such examples, a target site may be the ampulla/papilla of Vater 22 located in a portion of the duodenum of the small intestine 18. It should be understood that the ampulla/papilla of Vater 22 generally forms an opening where the pancreatic duct and the common bile duct 20 empty into the duodenum of the small intestine 18, with the hepatic ducts and the gallbladder emptying into the common bile duct 20.
- Still referring to FIG. 4, with the tip 122 of the shaft 120 located proximate to the target site (e.g., the ampulla of Vater 22), the medical instrument 140 of the medical system 100 may be slidably received within the medical device 110 to thereby position the distal end 144 proximate to the target site. Advancement of the medical instrument 140 into the port 116 and through the shaft 120 to the tip 122 may be provided in response to actuation of the handle 141. It should be understood that in other embodiments the medical instrument 140 may be received through the medical device 110 prior to an insertion of the shaft 120 through the patient's body 10 at step 202.
- In some embodiments, rotation of the tip 122 near the target site may be desirable to facilitate positioning the working opening 124 toward a location of the target site. For example, it may be desired that the distal end 144 of the medical instrument 140 reach the ampulla/papilla of Vater 22 when deflected outwardly from the working opening 124 by the elevator 126 (FIG. 3). In this instance, the tip 122 of the shaft 120 may be rotated until the working opening 124, through which the medical instrument 140 may exit the medical device 110, is facing the ampulla/papilla of Vater 22. Rotation of the tip 122 and/or the shaft 120 may be provided in response to actuating the actuation mechanism 114 on the handle 112 and/or by rotating the entire handle 112, and identification of a relative orientation and/or position of the tip 122 may be provided in response to actuating the imaging device 130 on the tip 122.
- At step 204, with the working opening 124 on the tip 122 facing the target site, a surrounding environment of the target site may be illuminated in response to actuating the light source 128. It should be understood that in other embodiments the light source 128 may already be actuated to direct light outwardly from the tip 122, such as, for example, prior to and/or as the medical device 110 is inserted into the patient's body 10 at step 202.
- At step 206, with the target site illuminated by the light source 128, the processor 104 of the image processing device 102 executes the target identification logic 108 to actuate the imaging device 130 of the medical device 110. Accordingly, the imaging device 130 captures images of the target site. With the imaging device 130 facing the target site (e.g., the ampulla of Vater 22), images of a location of the target site may be obtained by the medical device 110 and communicated to the image processing device 102 for storage in the memory 106 as image data 109.
- At step 208 and referring to FIG. 5A, with the image data 109 received from the medical device 110 and stored within the memory 106, the processor 104 of the image processing device 102 executes the target identification logic 108 to determine a first location 52A of the target site (e.g., the ampulla of Vater 22 within the small intestine 18) relative to the imaging device 130 on the tip 122. The processor 104 analyzes the image data 109 captured by the imaging device 130 and determines a coordinate position of the target site relative to the tip 122, pursuant to executing the machine-readable instructions of the target identification logic 108. Alternatively, in other embodiments a user of the medical system 100 may manually identify the first location 52A of the target site based on the image data 109, such as, for example, via a touch-screen user interface display (not shown) that is communicatively coupled to the image processing device 102.
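- The patent describes the target identification logic only functionally and does not disclose an algorithm. As one simple stand-in, assuming a reference template of the target region is available, normalized template matching with OpenCV could return a coordinate position such as the first location 52A; the function name and the template-based approach are illustrative assumptions, not the patent's method.

```python
import cv2
import numpy as np


def locate_target(frame: np.ndarray, template: np.ndarray):
    """Return (x, y) pixel coordinates of the best template match, as a stand-in target location."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    h, w = template.shape[:2]
    # Report the center of the matched region as the target-site coordinate.
    return max_loc[0] + w // 2, max_loc[1] + h // 2
```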
- In some embodiments, the processor 104, when executing the target identification logic 108, may generate a visual identifier at the first location 52A (e.g., highlights, geometric figures, arrows, and the like) to thereby visually designate the first location 52A of the target site for reference. As seen in FIG. 5A, the visual identifier of the first location 52A may include a box and/or an “X” superimposed on the images of the target site for purposes of visually designating the target site in the image data 109. The visual identifier of the first location 52A may be displayed on a user interface display (not shown) that is communicatively coupled to the image processing device 102. Alternatively, in other embodiments a user of the medical system 100 may manually mark the first location 52A of the target site with a visual identifier based on the image data 109, such as, for example, via a touch-screen user interface display (not shown) that is communicatively coupled to the image processing device 102. In this instance, the processor 104 may analyze the image data 109 to determine the first location 52A of the target site in accordance with the manual mark and/or identification by the user of the medical system 100 for continued tracking in subsequent images of the target site.
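- A visual identifier such as the box and “X” described above could be overlaid on the captured frame as sketched here; the marker style, size, and color are arbitrary choices for the sketch, not details from the patent.

```python
import cv2


def draw_target_identifier(frame, target_xy, half_size=20):
    """Overlay a box and an X-shaped marker centered on the detected target location."""
    x, y = target_xy
    cv2.rectangle(frame, (x - half_size, y - half_size), (x + half_size, y + half_size),
                  color=(0, 255, 0), thickness=2)
    cv2.drawMarker(frame, (x, y), color=(0, 255, 0),
                   markerType=cv2.MARKER_TILTED_CROSS, markerSize=2 * half_size, thickness=2)
    return frame
```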
- At step 210 and referring to FIG. 5B, with the first location 52A of the target site determined relative to the tip 122, the processor 104 of the image processing device 102 executes the target identification logic 108 to mark the first location 52A of the target site with a light/laser beam 134 by actuating the laser 132 of the medical device 110. The processor 104 actuates the mirror of the medical device 110 to reflect the light/laser beam 134 generated by the laser 132 and redirect it toward the first location 52A of the target site, pursuant to executing the machine-readable instructions of the target identification logic 108.
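- Steering the beam to a pixel location requires a camera-to-mirror calibration that the patent does not detail; the sketch below assumes a simple linear degrees-per-pixel calibration purely for illustration, and its output could be passed to the hypothetical mirror_command helper sketched earlier to produce drive values.

```python
def pixel_to_mirror_angles(px, py, image_size=(1920, 1080), deg_per_pixel=(0.02, 0.02)):
    """Map a target pixel to approximate mirror deflection angles via a linear calibration.

    deg_per_pixel is a made-up calibration constant; a real system would calibrate the
    camera/mirror geometry for each device.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    theta_x = (px - cx) * deg_per_pixel[0]
    theta_y = (py - cy) * deg_per_pixel[1]
    return theta_x, theta_y
```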
- At step 212 and still referring to FIG. 5B, with the light/laser beam 134 of the laser 132 directed (e.g., by the mirror) to the first location 52A of the target site (e.g., the ampulla of Vater 22), the medical instrument 140 may be moved toward the target site in response to the sensor 146 detecting the light/laser beam 134. The handle 141 of the medical instrument 140 may be actuated to automatically translate the longitudinal body 142 through a working lumen of the shaft 120 to position the distal end 144 adjacent to the target site. Accordingly, the medical device 110 tracks the first location 52A of the target site to allow the medical instrument 140 to lock onto the first location 52A with the sensor 146 and autonomously steer the distal end 144 toward the target site to perform one or more procedures thereon, such as, for example, cannulating the ampulla duct opening 22 of the common bile duct 20.
- With the sensor 146 positioned along the distal end 144, the sensor 146 is configured to generate feedback in response to detecting the incidence of the light/laser beam 134 onto the target site, relative to the distal end 144. In some embodiments, the sensor 146 includes a photodiode configured to convert the light/laser beam 134 into an electrical current such that the feedback generated by the sensor 146 includes a photodiode signal transmitted to a user of the medical instrument 140. A strength of the photodiode signal generated by the sensor 146 may be indicative of a spatial (e.g., three-dimensional) proximity of the sensor 146 to the point of incidence of the light/laser beam 134. Accordingly, with the light/laser beam 134 directed to the first location 52A of the target site, it should be understood that a strength of the photodiode signal generated by the sensor 146 may increase as a distance between the distal end 144 of the medical instrument 140 and the target site decreases, as the sensor 146 detects the light/laser beam 134 in relatively close proximity.
- It should be further understood that a strength (e.g., intensity variation) of the photodiode signal generated by the sensor 146 may decrease as a distance between the distal end 144 of the medical instrument 140 and the target site increases, as the sensor 146 detects the light/laser beam 134 from a relatively greater distance. Although the sensor 146 in embodiments described herein includes a photodiode or CCD configured to generate feedback in the form of a photodiode signal in response to detecting the light/laser beam 134, it should be appreciated that various other suitable sensors and/or forms of feedback may be generated by a sensor on the medical instrument 140 without departing from a scope of this disclosure.
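- As a rough illustration of the intensity-versus-distance relationship described above, the sketch below inverts an assumed inverse-square falloff to estimate proximity from the photodiode signal; real tissue scattering and beam geometry would require empirical calibration, so the model and the reference values are assumptions.

```python
import math


def estimate_distance_mm(signal, reference_signal, reference_distance_mm):
    """Estimate sensor-to-spot distance from photodiode signal strength (inverse-square assumption)."""
    if signal <= 0.0:
        return math.inf  # no detectable light
    return reference_distance_mm * math.sqrt(reference_signal / signal)
```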
- In some embodiments, the medical instrument 140 may include a processor and memory similar to the processor 104 and the memory 106 of the image processing device 102 shown and described above. In this instance, the processor of the medical instrument 140, when executing target identification logic stored on the memory of the medical instrument 140, may provide for autonomous steering of the medical instrument 140 relative to the first location 52A of the target site by tracking the light/laser beam 134 with the sensor 146. In other embodiments, the medical instrument 140 may be manually navigated to the first location 52A of the target site by a user of the medical system 100 by visually tracking a position of the distal end 144 relative to the first location 52A via a user interface display (not shown). In this instance, a user may visually navigate the distal end 144 of the medical instrument 140 toward the visual identifier generated by the light/laser beam 134. By way of illustrative example only, the distal end 144 of the medical instrument 140 may be displayed on a user interface display by a visual identifier, such as, for example, cross-hairs superimposed on the user interface display that are indicative of a position of the distal end 144. Further, the feedback generated by the sensor 146 may be utilized in addition to and/or in lieu of the user interface display for manually steering the medical instrument 140 toward the first location 52A of the target site.
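- If the instrument were steered automatically from the quadrant-photodiode feedback sketched earlier, a minimal proportional rule could look like the following; the gain, sign convention, and deflection interface are assumptions for illustration rather than details from the patent.

```python
def steering_step(x_error, y_error, gain=0.5):
    """Return incremental tip deflection commands that drive the sensor toward the beam spot.

    Positive x_error/y_error (beam right of / above the sensor center) produce a deflection in
    the same direction, shrinking the error on the next sensor reading.
    """
    return gain * x_error, gain * y_error
```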
- In some instances, the medical device 110 of the medical system 100 may move, intentionally and/or inadvertently, relative to the target site during a procedure as the medical instrument 140 moves toward the target site at step 212. Such movements may occur due to difficulties in maintaining the medical device 110 stable during a procedure. In this instance, a position of the target site (e.g., the ampulla of Vater 22) relative to the tip 122 of the shaft 120 and/or the distal end 144 of the medical instrument 140 may be modified and/or vary relative to an initial corresponding position between the target site and the medical device 110. Accordingly, the image data 109 initially obtained by the medical system 100 at step 206 may include inaccuracies and/or deficiencies in providing a current location of the target site (e.g., the ampulla of Vater 22). As a result, continued movement of the distal end 144 of the medical instrument 140 toward the first location 52A, as initially determined by the processor 104 of the image processing device 102 at step 208, may not allow a user of the medical system 100 to adequately access the target site.
- At step 214 and referring to FIG. 5C, in response to the processor 104 of the image processing device 102 detecting a movement of the medical device 110 relative to the target site (e.g., the ampulla of Vater 22), the processor 104 may execute the target identification logic 108 to actuate the imaging device 130 to obtain updated image data 109 of the target site. In some embodiments, the processor 104 of the image processing device 102, when executing the target identification logic 108, may be configured to determine whether the medical device 110 has moved relative to the target site by periodically capturing images with the imaging device 130 for comparison to the image data 109 stored in the memory 106 at step 206. Accordingly, movement of the medical device 110 relative to the target site may be detected based on determining that a positional variance between the first location 52A and a detected position of the target site is equal to or greater than a preprogrammed threshold (e.g., a millimeter(s), a micrometer(s), a nanometer(s), etc.).
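- The comparison against the preprogrammed threshold could be expressed as below; the millimeters-per-pixel scale factor is a hypothetical calibration, since the patent states the threshold only in physical units.

```python
import math


def has_moved(first_location_px, detected_location_px, threshold_mm, mm_per_pixel):
    """Return True when the target appears displaced by at least the preprogrammed threshold."""
    dx_mm = (detected_location_px[0] - first_location_px[0]) * mm_per_pixel
    dy_mm = (detected_location_px[1] - first_location_px[1]) * mm_per_pixel
    return math.hypot(dx_mm, dy_mm) >= threshold_mm
```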
- In this instance, upon determining that a recorded position of the first location 52A varies relative to a detected position of the target site via the periodically captured images, the processor 104 of the image processing device 102 repeats steps 206, 208, and 210 of the method 200 described above. The processor 104 executes the target identification logic 108 to capture images (e.g., image data 109) of the target site at step 206, determine a second location 52B of the target site (e.g., the ampulla of Vater 22) at step 208, and mark the second location 52B with the light/laser beam 134 at step 210. It should be understood that the method 200 performs these steps substantially similar to those shown and described above to facilitate locating the target site with the medical system 100 in accordance with the new, second location 52B of the target site.
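- Tying the preceding sketches together, the re-tracking behavior of steps 206-214 might be organized as the loop below, reusing the helper functions sketched above. camera.capture() and mirror.aim(...) are placeholder interfaces invented for illustration; the patent does not define a programming API.

```python
def track_target(camera, mirror, template, threshold_mm, mm_per_pixel, iterations=100):
    """Illustrative re-tracking loop: re-locate the target and re-aim the beam when it drifts."""
    marked = locate_target(camera.capture(), template)        # steps 206/208: find location 52A
    mirror.aim(pixel_to_mirror_angles(*marked))               # step 210: mark it with the beam
    for _ in range(iterations):                               # periodic re-checks (step 214)
        current = locate_target(camera.capture(), template)
        if has_moved(marked, current, threshold_mm, mm_per_pixel):
            marked = current                                  # new location 52B
            mirror.aim(pixel_to_mirror_angles(*marked))       # re-mark the target
```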
- In other embodiments, the image processing device 102 of the medical system 100 may be communicatively coupled to a remote station (not shown) for purposes of dynamically updating the target identification logic 108 stored on the memory 106. By way of illustrative example, the image processing device 102 may be operable to receive neural network data from a remote station (e.g., a computer server), such as, for example, via a wired and/or wireless connection. The neural network data received by the image processing device 102 may include supplemental image data 109, similar to the image data 109 shown and described above, recorded from a plurality of prior procedures, devices, systems, etc. Such image data may be from a plurality of different patients, acquired over time, of the same or similar patient anatomy. The supplemental image data 109 may be stored in the memory 106 and utilized by the processor 104 of the image processing device 102 to artificially determine and/or identify common physical properties and/or characteristics of one or more target sites, such as, for example, the ampulla of Vater 22 within the small intestine 18, the ampulla duct opening 22 of the common bile duct 20, etc.
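- The patent leaves the learned component unspecified; as a stand-in, supplemental images from prior procedures could simply be kept as a template library and scored against the live frame, as sketched below. A real system might instead train a neural network on such data; this sketch only illustrates how prior-procedure data could inform localization.

```python
import cv2
import numpy as np


def locate_with_prior_examples(frame: np.ndarray, prior_templates):
    """Locate the target by scoring the frame against templates gathered from prior procedures."""
    best_score, best_center = -1.0, None
    for template in prior_templates:
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(result)
        if score > best_score:
            h, w = template.shape[:2]
            best_score, best_center = score, (loc[0] + w // 2, loc[1] + h // 2)
    return best_center, best_score
```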
- In this embodiment, the processor 104 of the image processing device 102, when executing the machine-readable instructions of the target identification logic 108, may reference the supplemental image data 109 when analyzing the image data 109 captured by the imaging device 130 of the medical device 110 to determine the first location 52A of the target site (e.g., the ampulla of Vater 22 within the small intestine 18). Accordingly, it should be appreciated that the supplemental image data 109 may facilitate determining a coordinate position of a target site relative to the medical device 110 during a procedure by providing the image processing device 102 additional data for artificial learning of a size, shape, and/or configuration of similar target sites. Each of the aforementioned devices, assemblies, and methods may be used to detect, mark, and track a location of a target site. By providing such a medical assembly, a user may accurately interact with a patient's tissue using artificial intelligence software in an image processing device during a procedure, allowing the user to reduce overall procedure time, increase efficiency of procedures, and avoid unnecessary harm to a patient's body caused by lack of control over the motion and positioning of a medical device when accessing target tissue of a patient.
- It will be apparent to those skilled in the art that various modifications and variations may be made in the disclosed devices and methods without departing from the scope of the disclosure. It should be appreciated that the disclosed devices may include various suitable computer systems and/or computing units incorporating a plurality of hardware components, such as, for example, a processor and non-transitory computer-readable medium, that allow the devices to perform one or more operations during a procedure in accordance with those described herein. Other aspects of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the features disclosed herein. It is intended that the specification and examples be considered as exemplary only.
Claims (21)
1-20. (canceled)
21. A medical system, comprising:
a medical device including:
an imaging device configured to capture images of a target site, wherein a location of the target site is determined based on the images; and
a light source configured to direct a light onto the location of the target site; and
a processor and non-transitory computer readable medium storing instructions, wherein the medical device is configured to receive the instructions, wherein the medical device is movable toward the target site in response to (i) a sensor receiving the instructions or (ii) the sensor detecting the light at the target site, wherein the sensor is configured to detect the light at the target site.
22. The medical system of claim 21, wherein the instructions stored in the non-transitory computer readable medium cause the processor to:
detect a change in location of the imaging device relative to the target site; and
determine the location of the target site relative to the imaging device,
wherein, in response to detecting a change in location of the imaging device relative to the target site and determining the location of the target site relative to the imaging device, the light is redirected to the location of the target site.
23. The medical system of claim 22, wherein the processor is configured to detect the change in location of the imaging device relative to the target site based on images periodically captured by the imaging device; and
wherein the processor is configured to compare the location of the target site to an original location of the target site to determine a positional variance.
24. The medical system of claim 23, wherein the processor is configured to determine whether the positional variance exceeds a preprogrammed threshold, wherein, when the positional variance exceeds the preprogrammed threshold, the processor at least obtains image data of the target site with the imaging device and analyzes the location of a target site.
25. The medical system of claim 24, wherein the light source includes a source to generate a laser beam.
26. The medical system of claim 21, wherein the imaging device includes a camera.
27. The medical system of claim 21, wherein the processor is configured to generate a visual identifier along the images captured by the imaging device indicative of the location of the target site.
28. A medical system comprising:
a medical device including:
an imaging device configured to capture images of a target site;
a light source configured to direct a light onto the target site; and
a mirror configured to reflect the light generated by the light source toward the target site; and
a medical instrument movably disposed within a working channel of the medical device, wherein the medical instrument is movable relative to the medical device, the medical instrument including a sensor configured to detect the light on the target site, wherein the medical instrument is movable toward the target site in response to the sensor detecting the light on the target site.
29. The medical system of claim 28, wherein the mirror includes a micro-mirror (MEMS mirror) configured to reflect the light along two axes.
30. The medical system of claim 28, wherein the light source is disposed on a distal end of the medical device, wherein the mirror is positioned adjacent to the light source on the medical device, wherein the light source includes a source to generate a laser beam.
31. The medical system of claim 28, further comprising a processor configured to detect movement of the medical device relative to the target site based on images captured by the imaging device; and wherein the light source is configured to redirect the light based on the detected movement of the medical device.
32. The medical system of claim 28, wherein the sensor includes at least one of a photodetector, a photodiode, and a charge-coupled device (CCD), wherein the sensor is configured to generate a photodiode signal in response to detecting the light at the target site.
33. The medical system of claim 32, wherein a strength of the photodiode signal generated by the sensor includes a greater intensity when the sensor is positioned at a first distance from the light, and includes a smaller intensity when the sensor is positioned at a second distance from the light; and
wherein the first distance is less than the second distance.
34. A method of moving a medical instrument toward a target site, the method comprising:
delivering a medical instrument having a sensor to a target site;
capturing images of the target site with an imaging device;
marking a location of the target site based on the images;
detecting light from the location by the sensor; and
automatically moving the medical instrument toward the target site based on the sensor detecting the light at the location.
35. The method of claim 34, wherein in response to detecting movement of the medical instrument toward the target site, the method further comprises:
capturing images of the target site with the imaging device to determine a second location of the target site;
redirecting the light from the light source to the second location; and
automatically moving the medical instrument toward the target site based on the sensor detecting the light at the second location.
36. The method of claim 34, wherein in response to moving the medical instrument toward the target site, the method further comprises:
locking onto the target site with the sensor and automatically steering a distal end of the medical instrument toward the target site.
37. The method of claim 34, wherein in response to capturing images of the target site with an imaging device, the method further comprises:
using target identification logic to mark a location of a target site, wherein the imaging device executes the target identification logic to automatically redirect the medical instrument when the medical device has moved relative to the target site by periodically capturing images for comparison to image data within the imaging device.
38. The method of claim 34, wherein the sensor includes at least one of a photodetector, a photodiode, and a charge-coupled device (CCD).
39. The method of claim 38, wherein the sensor is configured to generate a photodiode signal in response to detecting the light at the target site.
40. The method of claim 39, wherein a strength of the photodiode signal generated by the sensor includes a greater intensity when the medical device is positioned at a first distance from the light, and includes a smaller intensity when the medical device is positioned at a second distance from the light; and
wherein the first distance is less than the second distance.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/485,026 US20240032772A1 (en) | 2019-12-03 | 2023-10-11 | Medical device tracking systems and methods of using the same |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962942959P | 2019-12-03 | 2019-12-03 | |
US17/109,933 US11812926B2 (en) | 2019-12-03 | 2020-12-02 | Medical device tracking systems and methods of using the same |
US18/485,026 US20240032772A1 (en) | 2019-12-03 | 2023-10-11 | Medical device tracking systems and methods of using the same |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/109,933 Continuation US11812926B2 (en) | 2019-12-03 | 2020-12-02 | Medical device tracking systems and methods of using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240032772A1 true US20240032772A1 (en) | 2024-02-01 |
Family
ID=74003914
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/109,933 Active 2041-12-24 US11812926B2 (en) | 2019-12-03 | 2020-12-02 | Medical device tracking systems and methods of using the same |
US18/485,026 Pending US20240032772A1 (en) | 2019-12-03 | 2023-10-11 | Medical device tracking systems and methods of using the same |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/109,933 Active 2041-12-24 US11812926B2 (en) | 2019-12-03 | 2020-12-02 | Medical device tracking systems and methods of using the same |
Country Status (5)
Country | Link |
---|---|
US (2) | US11812926B2 (en) |
EP (1) | EP4030983A1 (en) |
JP (1) | JP2023504626A (en) |
CN (1) | CN114746002A (en) |
WO (1) | WO2021113375A1 (en) |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6239868B1 (en) * | 1996-01-02 | 2001-05-29 | Lj Laboratories, L.L.C. | Apparatus and method for measuring optical characteristics of an object |
US7800758B1 (en) | 1999-07-23 | 2010-09-21 | Faro Laser Trackers, Llc | Laser-based coordinate measuring device and laser-based method for measuring coordinates |
US8082020B2 (en) | 2006-08-07 | 2011-12-20 | Biosense Webster, Inc. | Distortion-immune position tracking using redundant magnetic field measurements |
US20080287783A1 (en) | 2007-05-16 | 2008-11-20 | General Electric Company | System and method of tracking delivery of an imaging probe |
US20090021818A1 (en) * | 2007-07-20 | 2009-01-22 | Ethicon Endo-Surgery, Inc. | Medical scanning assembly with variable image capture and display |
US9636031B2 (en) | 2007-11-26 | 2017-05-02 | C.R. Bard, Inc. | Stylets for use with apparatus for intravascular placement of a catheter |
DE102010050011A1 (en) * | 2010-11-02 | 2012-05-03 | Karl Storz Gmbh & Co. Kg | Endoscope with adjustable viewing direction |
US20120130171A1 (en) * | 2010-11-18 | 2012-05-24 | C2Cure Inc. | Endoscope guidance based on image matching |
US20130345510A1 (en) * | 2011-05-10 | 2013-12-26 | Ron Hadani | Method and endoscopic device for examining or imaging an interior surface of a corporeal cavity |
US9131850B2 (en) * | 2011-07-18 | 2015-09-15 | St. Jude Medical, Inc. | High spatial resolution optical coherence tomography rotation catheter |
JP5904750B2 (en) * | 2011-10-14 | 2016-04-20 | オリンパス株式会社 | Stereoscopic endoscope device |
US9125612B2 (en) | 2011-10-31 | 2015-09-08 | Volcano Corporation | Devices, systems, and methods for controlling field of view in imaging systems |
JP6027803B2 (en) * | 2012-07-17 | 2016-11-16 | Hoya株式会社 | Image processing apparatus and endoscope apparatus |
US9295372B2 (en) * | 2013-09-18 | 2016-03-29 | Cerner Innovation, Inc. | Marking and tracking an area of interest during endoscopy |
JP6305088B2 (en) * | 2014-02-07 | 2018-04-04 | オリンパス株式会社 | Surgical system and method of operating the surgical system |
US11020144B2 (en) * | 2015-07-21 | 2021-06-01 | 3Dintegrated Aps | Minimally invasive surgery system |
WO2018208994A1 (en) * | 2017-05-12 | 2018-11-15 | Auris Health, Inc. | Biopsy apparatus and system |
EP3641686A4 (en) * | 2017-06-23 | 2021-03-24 | Intuitive Surgical Operations, Inc. | Systems and methods for navigating to a target location during a medical procedure |
-
2020
- 2020-12-02 US US17/109,933 patent/US11812926B2/en active Active
- 2020-12-02 JP JP2022532757A patent/JP2023504626A/en active Pending
- 2020-12-02 CN CN202080084241.2A patent/CN114746002A/en active Pending
- 2020-12-02 WO PCT/US2020/062911 patent/WO2021113375A1/en unknown
- 2020-12-02 EP EP20829104.7A patent/EP4030983A1/en active Pending
-
2023
- 2023-10-11 US US18/485,026 patent/US20240032772A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11812926B2 (en) | 2023-11-14 |
WO2021113375A1 (en) | 2021-06-10 |
US20210161361A1 (en) | 2021-06-03 |
CN114746002A (en) | 2022-07-12 |
EP4030983A1 (en) | 2022-07-27 |
JP2023504626A (en) | 2023-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11850008B2 (en) | Image-based branch detection and mapping for navigation | |
US11759266B2 (en) | Robotic systems for determining a roll of a medical device in luminal networks | |
US11957446B2 (en) | System and method for medical instrument navigation and targeting | |
JP7235803B2 (en) | Robotic endoscopic probe with orientation reference marker | |
JP6835850B2 (en) | Systems, control units, and methods for controlling surgical robots | |
KR102334904B1 (en) | Systems and methods for robotic medical system integration with external imaging | |
KR20210114556A (en) | Systems And Methods For Interventional Procedure Planning | |
US11944422B2 (en) | Image reliability determination for instrument localization | |
US20220319031A1 (en) | Vision-based 6dof camera pose estimation in bronchoscopy | |
US20220202500A1 (en) | Intraluminal navigation using ghost instrument information | |
US11812926B2 (en) | Medical device tracking systems and methods of using the same | |
US20240315781A1 (en) | Deflectable sensing instrument systems and methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: BOSTON SCIENTIFIC SCIMED, INC., MINNESOTA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: FLANAGAN, AIDEN; CLARK, BRYAN; REEL/FRAME: 065549/0549; Effective date: 20201001 |