CN118317739A - Surgical anchoring system for endoluminal access
- Publication number: CN118317739A (application CN202280078170.4A)
- Authority: CN (China)
- Legal status: Pending
Abstract
A surgical anchoring system for use with an endoluminal access surgical instrument is provided. The surgical instrument includes an outer sleeve defining a working channel therethrough. The outer sleeve is configured to be disposed within a first natural body lumen. Channel arms are configured to extend through the working channel and to move independently of each other. Each channel arm has an anchoring member that is movable between an expanded state and an unexpanded state. The anchoring member is configured to be disposed within a second natural body lumen when in the expanded state. A control actuator extends along each channel arm and is operatively coupled to the anchoring member, and the control actuator is operatively coupled to a drive system configured to control movement of the channel arms to selectively manipulate organs associated with the first natural body lumen and the second natural body lumen.
Description
Cross Reference to Related Applications
The present application claims priority from U.S. Provisional Patent Application No. 63/249,980, filed on September 29, 2021, and entitled "Cooperative Access," the disclosure of which is incorporated herein by reference in its entirety.
Technical Field
The present invention relates generally to surgical systems for anchored collaborative endoscopic and laparoscopic access, tissue manipulation, and the like, and methods of using the same.
Background
Surgical systems often incorporate imaging systems that may allow a practitioner to view a surgical site and/or one or more portions thereof on one or more displays (e.g., monitors, computer tablet screens, etc.). The display may be local and/or remote to the operating room. The imaging system may include a scope having a camera that views the surgical site and transmits the view to one or more displays viewable by the practitioner.
Imaging systems may be limited by the information they can identify and/or communicate to a medical practitioner. For example, some imaging systems may not be able to intra-operatively identify certain hidden structures, physical contours, and/or dimensions within a three-dimensional space. For another example, some imaging systems may not be able to communicate and/or convey certain information to a medical practitioner intraoperatively.
Thus, there remains a need for improved surgical imaging.
Disclosure of Invention
A surgical anchor system is provided. In one exemplary embodiment, a surgical anchor system includes a surgical instrument having: an outer sleeve defining a working channel therethrough and configured to be at least partially disposed within the first natural body cavity; and at least one channel arm configured to extend through the working channel and configured to be independently movable relative to one another. The at least one channel arm has: at least one anchor member coupled to the at least one channel arm; and at least one control actuator extending along the at least one channel arm and operatively coupled to the at least one anchor member. The at least one anchoring member is configured to be movable between an expanded state and an unexpanded state, and when in the expanded state, the at least one anchoring member is configured to be disposed at least partially within a second natural body lumen that is in communication with the first natural body lumen. The at least one control actuator is operably coupled to a drive system configured to control movement of the at least one channel arm to selectively manipulate organs associated with the first natural body lumen and the second natural body lumen.
Surgical instruments can have a variety of configurations. In some embodiments, the surgical instrument can include an anchoring balloon disposed proximal of the distal end of the outer sleeve. In certain embodiments, the anchoring balloon may be configured to expand and at least partially contact an inner surface of the first natural body lumen.
The anchoring member may have a variety of configurations. In some embodiments, the at least one anchoring member may be configured to expand and at least partially contact an inner surface of the second natural body lumen. In certain embodiments, the at least one anchoring member may be configured to be axially movable along the length of the channel arm. In such embodiments, the at least one anchoring member may be configured to be selectively lockable at an axial position along the length of the at least one channel arm by a releasable locking mechanism.
The at least one channel arm may have a variety of configurations. In some embodiments, the at least one channel arm may be configured to apply a force to the second natural body lumen via at least one anchoring member to manipulate the second natural body lumen relative to the first natural body lumen. In certain embodiments, the at least one channel arm may include an optical sensor disposed at a distal end of the at least one channel arm.
In some embodiments, the surgical anchoring system may include a controller configured to coordinate movement of the at least one channel arm within the second natural body lumen and movement of the at least one instrument outside the second natural body lumen to prevent tearing of the second natural body lumen.
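For illustration only, the following Python sketch shows one way such a coordination check could be expressed: a commanded move is rejected if it would stretch the tissue spanning the endoluminal anchor and the extraluminal grasp point beyond a strain limit. The names, the linear strain model, and the 10% threshold are hypothetical assumptions, not details taken from this disclosure.

```python
from dataclasses import dataclass
import math

@dataclass
class Point:
    """A position in a common coordinate frame, in millimeters."""
    x: float
    y: float
    z: float

def motion_allowed(arm_target: Point, instrument_target: Point,
                   rest_separation_mm: float, max_strain: float = 0.10) -> bool:
    """Return False if moving the endoluminal channel arm and the extraluminal
    instrument to their targets would stretch the tissue between their
    engagement points beyond the allowed strain."""
    projected_mm = math.dist(
        (arm_target.x, arm_target.y, arm_target.z),
        (instrument_target.x, instrument_target.y, instrument_target.z))
    strain = (projected_mm - rest_separation_mm) / rest_separation_mm
    return strain <= max_strain
```

A controller built along these lines would evaluate the check before issuing each motion command and pause or scale back whichever device's motion would violate the limit.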
In another exemplary embodiment, a surgical anchor system includes a tubular member and an anchor assembly coupled to and extending distally from a distal portion of the tubular member. The tubular member is configured for intra-luminal access and has a central lumen therein configured to allow an endoscope to pass therethrough. The anchor assembly includes a first anchor member coupled to the tubular member and a second anchor member movable relative to and positioned distal to the first anchor member. The first anchor member is configured to engage a first anatomical location and fix the first anatomical location relative to the tubular member, and the second anchor member is configured to engage a second anatomical location movable relative to the first anatomical location, wherein movement of the second anchor member relative to the first anchor member is effective to selectively reposition the second anatomical location relative to the first anatomical location.
The first and second anchor members may have a variety of configurations. In some embodiments, the first anchoring member may include a first plurality of expandable anchoring elements and a first plurality of working channels extending through the first anchoring member. In such embodiments, the second anchoring member may include a second plurality of working channels through which the second plurality of expandable anchoring elements extend. In such embodiments, the tubular member may include a third plurality of working channels extending therethrough.
In some embodiments, the surgical anchor system may include a plurality of first actuators passing through a first working channel of the first plurality of working channels of the first anchor member and a first working channel of the third plurality of working channels of the tubular member. In such embodiments, the plurality of first actuators may be configured to be rotatable to expand the first plurality of expandable anchor elements.
In other embodiments, the surgical anchoring system may include a plurality of second actuators passing through a second working channel of the first plurality of working channels of the first anchoring member, a first working channel of the second plurality of working channels of the second anchoring member, and a second working channel of the third plurality of working channels of the tubular member. In such embodiments, the plurality of second actuators may be configured to be rotatable to expand the second plurality of expandable anchor elements.
In still other embodiments, the surgical anchor system may include a plurality of third actuators passing through a third working channel of the first plurality of working channels of the first anchor member and a third working channel of the third plurality of working channels of the tubular member and may terminate at a proximal surface of the second anchor member. In certain embodiments, the plurality of third actuators may be configured to be rotatable to axially displace the second anchor relative to the first anchor. In some embodiments, the plurality of third actuators can be configured to extend, retract, or bend to selectively reposition the second anatomical location relative to the first anatomical location.
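As a purely hypothetical illustration of how rotation of a threaded actuator maps to axial displacement of the second anchor member, the sketch below assumes a simple lead-screw relationship; the 0.5 mm pitch is an arbitrary placeholder, not a value from this disclosure.

```python
def axial_displacement_mm(rotations: float, thread_pitch_mm: float = 0.5) -> float:
    """A rotatable actuator engaging a threaded bore advances one thread
    pitch per full turn (a simple lead-screw model)."""
    return rotations * thread_pitch_mm

# e.g., ten turns of a 0.5 mm pitch actuator displace the distal anchor by 5 mm
assert axial_displacement_mm(10) == 5.0
```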
Drawings
The invention is described with reference to the following drawings:
FIG. 1 is a schematic view of one embodiment of a surgical visualization system;
FIG. 2 is a schematic illustration of triangulation between the surgical device, imaging device and critical structures of FIG. 1;
FIG. 3 is a schematic view of another embodiment of a surgical visualization system;
FIG. 4 is a schematic view of one embodiment of a control system of a surgical visualization system;
FIG. 5 is a schematic diagram of one embodiment of a control circuit of a control system of a surgical visualization system;
FIG. 6 is a schematic diagram of one embodiment of a combinational logic circuit of a surgical visualization system;
FIG. 7 is a schematic diagram of one embodiment of sequential logic circuitry of a surgical visualization system;
FIG. 8 is a schematic view of yet another embodiment of a surgical visualization system;
FIG. 9 is a schematic view of another embodiment of a control system of a surgical visualization system;
FIG. 10 is a graph showing wavelength versus absorption coefficient for various biological materials;
FIG. 11 is a schematic view of an embodiment of a spectral emitter to visualize a surgical site;
FIG. 12 is a graph depicting illustrative hyperspectral identification features for distinguishing a ureter from obscurants;
FIG. 13 is a graph depicting illustrative hyperspectral identification features for distinguishing an artery from obscurants;
FIG. 14 is a graph depicting illustrative hyperspectral identification features for distinguishing a nerve from obscurants;
FIG. 15 is a schematic diagram of one embodiment of a Near Infrared (NIR) time-of-flight measurement system utilized intraoperatively;
FIG. 16 shows a time-of-flight timing diagram of the system of FIG. 15;
FIG. 17 is a schematic diagram of another embodiment of a Near Infrared (NIR) time-of-flight measurement system utilized intraoperatively;
FIG. 18 is a schematic diagram of an embodiment of a computer-implemented interactive surgical system;
FIG. 19 is a schematic view of an embodiment of a surgical system for performing a surgical procedure in an operating room;
FIG. 20 is a schematic view of an embodiment of a surgical system including a smart surgical instrument and a surgical hub;
FIG. 21 is a flow chart illustrating a method of controlling the intelligent surgical instrument of FIG. 20;
FIG. 22 is a schematic view of one embodiment of a surgical anchor system having an outer sleeve and a channel arm extending through the outer sleeve, wherein the channel arm includes a corresponding anchor member, showing the surgical anchor system inserted through the throat and into the lung, wherein a portion of the outer sleeve passes through the throat and into the lung, and the channel arm extends through the outer sleeve and into a corresponding portion of the lung, wherein the corresponding anchor member is in an unexpanded state;
FIG. 23A is an enlarged view of a portion of one of the channel arms and a portion of a corresponding anchor member of the surgical anchor system of FIG. 22 with the lung removed;
FIG. 23B is the channel arm of FIG. 23A, showing the anchor member in an expanded state;
FIG. 24 is an enlarged view of the distal end of the channel arm of the surgical anchor system of FIG. 22 with the lung removed;
FIG. 25 is an enlarged view of a portion of the surgical anchor system of FIG. 22 with the lung removed;
FIG. 26 is a schematic view of the surgical anchor system of FIG. 22, showing the anchor members in an expanded state as the lung is manipulated from the endoluminal space and from the extraluminal space using laparoscopically inserted instruments;
FIG. 27 is a schematic view of another embodiment of a surgical anchor system, showing the surgical anchor system inserted through the throat and into the lung;
FIG. 28 is a schematic view of the colon;
FIG. 29 is a schematic view of a conventional surgical system inserted into an organ;
FIG. 30 is a schematic view of another embodiment of a surgical anchor system having a first anchor member and a second anchor member, showing the first anchor member and the second anchor member in an unexpanded state;
FIG. 31 is a schematic view of the surgical anchor system of FIG. 30, showing the first and second anchor members in an expanded state;
FIG. 32 is a cross-sectional view of the surgical anchor system of FIG. 31, showing the surgical anchor system inserted into an organ; and
FIG. 33 is a schematic view of another embodiment of a surgical anchoring system having a circular stapler and an anvil, each having a tracking device, showing the surgical anchoring system inserted through the rectum and into the colon, wherein a portion of the circular stapler passes through the rectum and into the colon, and the anvil passes through a mobilized portion of the colon;
FIG. 34 is a schematic view of the stomach;
FIG. 35 is a schematic view of a conventional surgical system having a laparoscope and an endoscope, showing the laparoscope positioned outside of the stomach and the endoscope positioned within the stomach;
FIG. 36 is a schematic view of the stomach of FIG. 35 showing a conventional wedge resection for removing a tumor from the stomach using the surgical system of FIG. 35;
FIG. 37 is a schematic view of an embodiment of a surgical system having a laparoscope, a laparoscopic instrument, and an endoscope, showing the laparoscope and laparoscopic instrument positioned outside of the stomach and the endoscope positioned within the stomach;
FIG. 38 is a schematic view of the surgical system of FIG. 37 showing the relative distances between a laparoscope, laparoscopic instrument, endoscope, and intragastric tumor;
FIG. 39 is a schematic illustration of a combined image of the surgical system of FIG. 37 from the perspective of an endoscope;
FIG. 40 is a schematic illustration of a combined image of the surgical system of FIG. 38 from a laparoscopic perspective;
FIG. 41 is a schematic view of the surgical system of FIG. 40, showing partial removal of a tumor disposed in the stomach by an endoscopically deployed instrument;
FIG. 41a is a schematic view of a combined image of the surgical system of FIG. 41 from the perspective of an endoscope;
FIG. 42 is a schematic view of the surgical system of FIG. 41, showing partial removal of a tumor from the interior tissue wall of the stomach;
FIG. 42a is a schematic illustration of a combined image of the surgical system of FIG. 42 from the perspective of an endoscope;
FIG. 43 is a schematic view of the surgical system of FIG. 42, showing mobilization of the upper portion of the stomach by a laparoscopically deployed instrument;
FIG. 43a is a schematic illustration of a combined image of the surgical system of FIG. 43 from the perspective of an endoscope;
FIG. 44a is a detailed view of another embodiment of a surgical system showing removal of a damaged portion of tissue from the colon;
FIG. 44b is a detailed view of the surgical system of FIG. 44a, showing an incision in the seromuscular layer;
FIG. 44c is a detailed view of the surgical system of FIG. 44b, showing inflation of the endoscope balloon;
FIG. 44d is a detailed view of the surgical system of FIG. 44c, showing lesion removal;
FIG. 44e is a detailed view of the surgical system of FIG. 44d, showing the closing of the colon;
FIG. 45 is a schematic view of another embodiment of a surgical system, showing removal of a tumor from the stomach by a laparoscopically deployed instrument;
FIG. 46 is a schematic illustration of a combined image of the surgical system of FIG. 45 from a laparoscopic perspective;
FIG. 47 is a schematic view of the surgical system of FIG. 45, showing tissue strain measurements using markers disposed on the interior tissue surface of the stomach; and
FIG. 48 is a schematic view of another embodiment of a surgical system showing the removal of lymph nodes from the outer tissue wall of the stomach.
Detailed Description
Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those skilled in the art will understand that the devices, systems and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.
Furthermore, in the present disclosure, similarly-named components in the various embodiments generally have similar features, and thus, in a particular embodiment, each feature of each similarly-named component is not necessarily fully elaborated upon. In addition, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that may be used in connection with such systems, devices, and methods. Those skilled in the art will recognize that equivalents to such linear and circular dimensions can readily be determined for any geometric shape. Those skilled in the art will also appreciate that a dimension may not be a precise value but may nevertheless be considered to be at about that value due to any number of factors, such as manufacturing tolerances and the sensitivity of measurement equipment. The sizes and shapes of the systems and devices and their components can depend at least on the size and shape of the components with which the systems and devices are to be used.
Surgical visualization
Generally, surgical visualization systems are configured to utilize "digital surgery" to obtain additional information about the anatomy and/or surgery of a patient. The surgical visualization system is also configured to communicate data to one or more medical practitioners in a helpful manner. Various aspects of the present disclosure provide for improved visualization of a patient's anatomy and/or surgery, and/or use of the visualization to provide for improved control of a surgical tool (also referred to herein as a "surgical device" or "surgical instrument").
"Digital surgery" may encompass robotic systems, advanced imaging, advanced instrumentation, artificial intelligence, machine learning, data analysis for performance tracking and benchmarking, connectivity both inside and outside of the Operating Room (OR), and more. Although the various surgical visualization systems described herein may be used in connection with robotic surgical systems, the surgical visualization systems are not limited to use with robotic surgical systems. In some cases, surgical visualization implemented using the surgical visualization system may be performed without a robot and/or with limited robotic assistance and/or optional robotic assistance. Similarly, digital surgery may be performed without a robot and/or with limited and/or optional robotic assistance.
In some cases, surgical systems incorporating surgical visualization systems may enable intelligent dissection in order to identify and avoid critical structures. Critical structures include anatomical structures such as ureters, arteries such as the superior mesenteric artery, veins such as the portal vein, nerves such as the phrenic nerve, tumors, and the like. In other cases, the critical structures may be foreign structures in the anatomical field, such as surgical devices, surgical fasteners, clips, tacks, bougies, bands, plates, and other foreign structures. Critical structures may be determined on a patient-by-patient and/or procedure-by-procedure basis. For example, smart dissection technology may provide improved intraoperative guidance for dissection, and critical-anatomy detection and avoidance technology may enable smarter decisions.
Surgical systems incorporating surgical visualization systems can implement smart anastomosis techniques that provide more consistent anastomosis at optimal locations with improved workflow. Surgical visualization platforms can be utilized to improve cancer localization techniques. For example, cancer localization techniques may identify and track cancer locations, orientations, and boundaries thereof. In some cases, the cancer localization techniques may compensate for movement of the surgical instrument, patient, and/or anatomy of the patient during the surgical procedure in order to provide guidance to the practitioner back to the point of interest.
The surgical visualization system may provide improved tissue characterization and/or lymph node diagnosis and mapping. For example, tissue characterization techniques may characterize tissue type and health without requiring physical haptics, particularly when dissecting and/or placing a suturing device within tissue. Certain tissue characterization techniques may be used without ionizing radiation and/or contrast agents. With respect to lymph node diagnosis and mapping, the surgical visualization platform may, for example, locate, map, and desirably diagnose the lymphatic system and/or lymph nodes involved in cancerous diagnosis and staging prior to surgery.
During surgery, information available to a practitioner via the "naked eye" and/or imaging system may provide an incomplete view of the surgical site. For example, certain structures (such as structures embedded or buried within an organ) may be at least partially concealed or hidden from view. In addition, certain dimensions and/or relative distances may be difficult to ascertain using existing sensor systems and/or difficult to perceive by the "naked eye". In addition, certain structures may be moved preoperatively (e.g., prior to surgery but after a preoperative scan) and/or intraoperatively. In such cases, the practitioner may not be able to accurately determine the location of critical structures intraoperatively.
The decision-making process of the practitioner may be hindered when the position of a critical structure is uncertain and/or when the proximity between the critical structure and a surgical tool is unknown. For example, a practitioner may avoid certain areas in order to avoid inadvertently cutting a critical structure; however, the avoided area may be unnecessarily large and/or at least partially misplaced. Due to uncertainty and/or overly cautious operation, the practitioner may not be able to access certain desired areas. For example, excessive caution may cause a practitioner to leave behind a portion of a tumor and/or other undesirable tissue in an attempt to avoid a critical structure, even if the critical structure is not in the particular area and/or would not be negatively affected by the clinician working in that particular area. In some cases, surgical outcomes may be improved by increased knowledge and/or certainty, which may allow a surgeon to be more accurate with respect to particular anatomical areas and, in some cases, less conservative/more aggressive.
The surgical visualization system may allow for intraoperative identification and avoidance of critical structures. Thus, the surgical visualization system may enable enhanced intraoperative decision-making and improved surgical outcomes. The surgical visualization system may provide advanced visualization capabilities beyond what a practitioner can see with the "naked eye" and/or beyond what an imaging system can identify and/or convey to the practitioner. The surgical visualization system may augment and enhance what a practitioner is able to know prior to tissue treatment (e.g., dissection, etc.) and thus may improve outcomes in various circumstances. As a result, the practitioner can confidently maintain momentum throughout the surgical procedure, knowing that the surgical visualization system is tracking a critical structure that may be approached during dissection, for example. The surgical visualization system may provide an indication to the practitioner in sufficient time for the practitioner to pause and/or slow the surgical procedure and evaluate proximity to the critical structure to prevent inadvertent damage thereto. The surgical visualization system may provide an ideal, optimized, and/or customizable amount of information to the practitioner to allow the practitioner to move confidently and/or quickly through tissue while avoiding inadvertent damage to healthy tissue and/or critical structures, thereby minimizing the risk of harm caused by the surgical procedure.
A surgical visualization system is described in detail below. In general, the surgical visualization system may include a first light emitter configured to emit a plurality of spectral waves, a second light emitter configured to emit a light pattern, and a receiver or sensor configured to detect visible light, molecular responses to the spectral waves (spectral imaging), and/or the light pattern. The surgical visualization system may also include an imaging system and a control circuit in signal communication with the receiver and the imaging system. Based on output from the receiver, the control circuit may determine a geometric surface map (e.g., a three-dimensional surface topography) of the visible surfaces at the surgical site and one or more distances with respect to the surgical site, such as a distance to an at least partially concealed structure. The imaging system may convey the geometric surface map and the distances to a practitioner. In such cases, the enhanced view of the surgical site provided to the practitioner may provide a representation of the concealed structure within the relevant context of the surgical site. For example, the imaging system may virtually overlay the concealed structure on the geometric surface map of the concealing and/or obstructing tissue, similar to a line drawn on the ground to indicate a utility line below the surface. Additionally or alternatively, the imaging system may convey the proximity of a surgical tool to the visible obstructing tissue and/or to the at least partially concealed structure, and/or the depth of the concealed structure below the visible surface of the obstructing tissue. For example, the visualization system may determine a distance with respect to an augmented line on the surface of the visible tissue and convey the distance to the imaging system.
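A minimal sketch of the depth bookkeeping described above, assuming the system already has a sensor-to-surface range (e.g., from structured light) and a sensor-to-structure range (e.g., from a penetrating waveform); the function name and units are illustrative, not taken from this disclosure.

```python
def structure_depth_below_surface_mm(range_to_structure_mm: float,
                                     range_to_surface_mm: float) -> float:
    """Depth of a concealed structure beneath the visible tissue surface,
    along the sensor's line of sight, is the difference of the two ranges."""
    return range_to_structure_mm - range_to_surface_mm

# e.g., a structure sensed at 62 mm behind a surface sensed at 55 mm lies
# roughly 7 mm below the visible tissue
assert structure_depth_below_surface_mm(62.0, 55.0) == 7.0
```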
Throughout this disclosure, unless visible light is specifically mentioned, any reference to "light" can include photons in the visible and/or invisible portions of the electromagnetic radiation (EMR) or EMR wavelength spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is the portion of the electromagnetic spectrum that is visible to (e.g., detectable by) the human eye, and may be referred to as "visible light" or simply "light." A typical human eye will respond to wavelengths in air of about 380 nm to about 750 nm. The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum. The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red end of the visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
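The band boundaries above translate directly into a small classification helper; this sketch simply encodes the approximate 380 nm and 750 nm limits quoted in the preceding paragraph.

```python
def classify_wavelength_nm(wavelength_nm: float) -> str:
    """Bucket an EMR wavelength using the approximate bounds of the visible
    spectrum in air (about 380 nm to about 750 nm)."""
    if wavelength_nm < 380:
        return "invisible: ultraviolet / x-ray / gamma ray"
    if wavelength_nm <= 750:
        return "visible light"
    return "invisible: infrared / microwave / radio"
```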
Fig. 1 illustrates an embodiment of a surgical visualization system 100. The surgical visualization system 100 is configured to create a visual representation of the critical structures 101 within the anatomical field. The critical structure 101 may comprise a single critical structure or a plurality of critical structures. As discussed herein, the critical structure 101 may be any of a variety of structures, such as anatomical structures (e.g., ureters, arteries such as superior mesenteric arteries, veins such as portal veins, nerves such as phrenic nerves, blood vessels, tumors, or other anatomical structures) or foreign structures (e.g., surgical devices, surgical fasteners, surgical clips, surgical tacks, bougies, surgical bands, surgical plates, or other foreign structures). As discussed herein, critical structures 101 may be identified based on different patients and/or different procedures. Embodiments of critical structures and the identification of critical structures using a visualization system are further described in U.S. Patent No. 10,792,034, entitled "Visualization Of Surgical Devices," issued on October 6, 2020, which is hereby incorporated by reference in its entirety.
In some cases, critical structures 101 may be embedded in tissue 103. Tissue 103 may be any of a variety of tissues, such as fat, connective tissue, adhesions, and/or organs. In other words, critical structures 101 may be positioned below surface 105 of tissue 103. In such cases, the tissue 103 conceals the critical structures 101 from the "naked eye" of the practitioner. Tissue 103 also shields critical structures 101 from view by imaging device 120 of surgical visualization system 100. The critical structures 101 may be partially obscured from view by the practitioner and/or the imaging device 120, rather than fully obscured.
The surgical visualization system 100 may be used for clinical analysis and/or medical intervention. In some cases, the surgical visualization system 100 may be used intraoperatively to provide real-time information to a practitioner during a surgical procedure, such as real-time information regarding proximity data, dimensions, and/or distances. Those skilled in the art will appreciate that the information may not be precisely real-time, but may be considered real-time for any of a variety of reasons, such as time delays induced by data transmission, time delays induced by data processing, and/or the sensitivity of the measurement equipment. The surgical visualization system 100 is configured to intraoperatively identify critical structures and/or to facilitate avoidance of the critical structure 101 by a surgical device. For example, by identifying the critical structure 101, a practitioner may avoid maneuvering a surgical device around the critical structure 101 and/or a region in a predefined proximity of the critical structure 101 during a surgical procedure. For another example, by identifying the critical structure 101, a practitioner may avoid cutting the critical structure 101 and/or cutting near the critical structure, thereby helping to prevent damage to the critical structure 101 and/or helping to prevent a surgical device being used by the practitioner from being damaged by the critical structure 101.
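For illustration, a proximity rule of the kind described above might be sketched as follows; the advisory states and the millimeter thresholds are invented placeholders, not values from this disclosure.

```python
def proximity_advisory(tool_to_structure_mm: float,
                       warn_mm: float = 10.0, stop_mm: float = 2.0) -> str:
    """Map a measured tool-to-critical-structure distance to an advisory
    state so the practitioner can slow or pause before inadvertent contact."""
    if tool_to_structure_mm <= stop_mm:
        return "stop"
    if tool_to_structure_mm <= warn_mm:
        return "slow"
    return "proceed"
```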
The surgical visualization system 100 is configured to incorporate tissue identification and geometric surface mapping in combination with the distance sensor system 104 of the surgical visualization system. In combination, these features of the surgical visualization system 100 can determine the position of a critical structure 101 within the anatomical field and/or the proximity of a surgical device 102 to the surface 105 of visible tissue 103 and/or to the critical structure 101. Further, the surgical visualization system 100 includes an imaging system that includes an imaging device 120 configured to provide a real-time view of the surgical site. For example, the imaging device 120 may include a spectral camera (e.g., a hyperspectral camera, a multispectral camera, or a selective spectral camera) configured to detect reflected spectral waveforms and generate a spectral cube of images based on molecular responses to different wavelengths. Views from the imaging device 120 may be provided to the practitioner in real time, such as on a display (e.g., a monitor, a computer tablet screen, etc.). The displayed views may be augmented with additional information based on the tissue identification, the surface mapping, and the distance sensor system 104. In such cases, the surgical visualization system 100 includes a plurality of subsystems: an imaging subsystem, a surface mapping subsystem, a tissue identification subsystem, and/or a distance determination subsystem. These subsystems may cooperate to intraoperatively provide advanced data synthesis and integrated information to the practitioner.
Imaging device 120 may be configured to detect visible light, spectral light waves (visible or invisible), and a structured light pattern (visible or invisible). Examples of the imaging device 120 include endoscopes, such as arthroscopes, angioscopes, bronchoscopes, choledochoscopes, colonoscopes, cystoscopes, duodenoscopes, enteroscopes, esophagogastro-duodenoscopes (gastroscopes), laryngoscopes, nasopharyngo-nephroscopes, sigmoidoscopes, thoracoscopes, ureteroscopes, and exoscopes. Scopes may be particularly useful in minimally invasive surgery. In open surgery applications, the imaging device 120 may not include a scope.
The tissue identification subsystem may be implemented using a spectral imaging system. Spectral imaging systems may rely on imaging such as hyperspectral imaging, multispectral imaging, or selective spectral imaging. An embodiment of hyperspectral imaging of tissue is further described in U.S. Patent No. 9,274,047, entitled "System And Method For Gross Anatomic Pathology Using Hyperspectral Imaging," issued on March 1, 2016, which is hereby incorporated by reference in its entirety.
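One common way a spectral-imaging tissue identification subsystem can be realized is nearest-signature matching, sketched below under the assumption that reference reflectance signatures are available per tissue type; the three-band spectra here are fabricated for illustration and do not come from this disclosure.

```python
# Fabricated reference reflectance values at three example wavelength bands.
REFERENCE_SIGNATURES: dict[str, list[float]] = {
    "ureter": [0.30, 0.55, 0.20],
    "artery": [0.60, 0.25, 0.45],
    "fat":    [0.80, 0.70, 0.65],
}

def identify_tissue(pixel_spectrum: list[float]) -> str:
    """Label a pixel with the reference tissue whose signature is closest
    in a least-squares sense."""
    def sq_err(ref: list[float]) -> float:
        return sum((p - r) ** 2 for p, r in zip(pixel_spectrum, ref))
    return min(REFERENCE_SIGNATURES,
               key=lambda name: sq_err(REFERENCE_SIGNATURES[name]))
```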
The surface mapping subsystem may be implemented using a light pattern system. Various surface mapping techniques using a light pattern (or structured light) for surface mapping may be used in the surgical visualization systems described herein. Structured light is the process of projecting a known pattern (often a grid or horizontal bars) onto a surface. In some cases, invisible (or imperceptible) structured light may be utilized, in which the structured light is used without interfering with other computer vision tasks for which the projected pattern may be confusing. For example, infrared light or extremely fast frame rates of visible light that alternate between two exactly opposite patterns may be utilized to prevent interference. Embodiments of surface mapping and surgical systems including a light source and a projector for projecting a light pattern are further described in the following patents: U.S. Patent Publication No. 2017/0055819, entitled "Set Comprising A Surgical Instrument," published March 2, 2017; U.S. Patent Publication No. 2017/0251900, entitled "Depiction System," published September 7, 2017; and U.S. Patent Application No. 16/729,751, entitled "Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto," filed December 30, 2019, which are hereby incorporated by reference in their entireties.
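The core geometry of structured-light surface mapping can be illustrated with classic stripe triangulation: a stripe projected obliquely onto raised tissue appears laterally shifted in the camera image, and the shift encodes height. The sketch below is a textbook relationship, not an implementation from this disclosure, and all parameter values are assumptions.

```python
import math

def height_from_stripe_shift_mm(observed_shift_px: float, mm_per_px: float,
                                projection_angle_deg: float) -> float:
    """With the projector tilted at a known angle to the camera's viewing
    axis, surface height above the reference plane is shift / tan(angle)."""
    shift_mm = observed_shift_px * mm_per_px
    return shift_mm / math.tan(math.radians(projection_angle_deg))
```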
The distance determination system may be incorporated into a surface mapping system. For example, structured light may be utilized to generate a three-dimensional (3D) virtual model of the visible surface 105 and determine various distances relative to the visible surface 105. Additionally or alternatively, the distance determination system may rely on time-of-flight measurements to determine one or more distances to tissue (or other structure) identified at the surgical site.
The surgical visualization system 100 also includes a surgical device 102. The surgical device 102 may be any suitable surgical device. Examples of the surgical device 102 include surgical dissectors, surgical staplers, surgical graspers, clip appliers, smoke evacuators, surgical energy devices (e.g., monopolar probes, bipolar probes, ablation probes, ultrasound devices, ultrasonic end effectors, etc.), and the like. In some embodiments, the surgical device 102 includes an end effector having opposing jaws extending from a distal end of a shaft of the surgical device 102 and configured to engage tissue therebetween.
The surgical visualization system 100 can be configured to identify the critical structures 101 and the proximity of the surgical device 102 to the critical structures 101. The imaging device 120 of the surgical visualization system 100 is configured to detect light of various wavelengths, such as visible light, spectral light waves (visible or invisible), and structured light patterns (visible or invisible). The imaging device 120 may include multiple lenses, sensors, and/or receivers for detecting different signals. For example, the imaging device 120 may be a hyperspectral, multispectral, or selective-spectrum camera, as described herein. Imaging device 120 may include a waveform sensor 122 (such as a spectral image sensor, a detector, and/or a three-dimensional camera lens). For example, the imaging device 120 may include a right lens and a left lens that are used together to record two-dimensional images simultaneously, and thus generate a three-dimensional image of the surgical site, render a three-dimensional (3D) image of the surgical site, and/or determine one or more distances at the surgical site. Additionally or alternatively, the imaging device 120 may be configured to be capable of receiving images indicative of the topography of visible tissue and the identification and orientation of hidden critical structures, as further described herein. For example, the field of view of imaging device 120 may overlap with a pattern of light (structured light) on surface 105 of tissue 103, as shown in fig. 1.
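The right-lens/left-lens arrangement mentioned above is ordinary stereo vision, in which depth follows from the disparity between the two images. The following is a generic sketch of that relationship, not an implementation from this disclosure; the calibration parameters are assumptions.

```python
def stereo_depth_mm(focal_length_px: float, baseline_mm: float,
                    disparity_px: float) -> float:
    """For a calibrated lens pair, depth is inversely proportional to
    disparity: Z = f * B / d."""
    return focal_length_px * baseline_mm / disparity_px

# e.g., f = 800 px, baseline = 5 mm, disparity = 40 px -> depth = 100 mm
assert stereo_depth_mm(800.0, 5.0, 40.0) == 100.0
```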
As in the illustrated embodiment, the surgical visualization system 100 may be incorporated into a robotic surgical system 110. The robotic surgical system 110 may have a variety of configurations, as discussed herein. In the illustrated embodiment, robotic surgical system 110 includes a first robotic arm 112 and a second robotic arm 114. The robotic arms 112, 114 each include a rigid structural member 116 and joints 118, which may include servo motor controls. The first robotic arm 112 is configured to manipulate the surgical device 102 and the second robotic arm 114 is configured to manipulate the imaging device 120. The robotic control unit of robotic surgical system 110 is configured to issue control motions to first robotic arm 112 and second robotic arm 114 that may affect surgical device 102 and imaging device 120, respectively.
In some implementations, one or more of the robotic arms 112, 114 may be separate from the host robotic system 110 used in the surgical procedure. For example, at least one of the robotic arms 112, 114 may be positioned and registered with a particular coordinate system without servo motor controls. For example, a closed loop control system and/or a plurality of sensors for the robotic arms 112, 114 may control and/or register the position of the robotic arms 112, 114 relative to a particular coordinate system. Similarly, the orientations of the surgical device 102 and the imaging device 120 may be registered with respect to a particular coordinate system.
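Registering a device pose with a particular coordinate system, as described above, amounts to applying a rigid transform that maps points from the device's local frame into a common frame. The sketch below shows the standard p' = R p + t mapping; the rotation matrix and translation vector used with it would come from calibration and are not given in this disclosure.

```python
def register_point(R: list[list[float]], t: list[float],
                   p: list[float]) -> list[float]:
    """Map point p from a device's local frame into the common frame using
    a 3x3 rotation matrix R and a translation vector t: p' = R p + t."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
```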
Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), the da Vinci® surgical system (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch™ platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and of using robotic surgical systems are further described in the following patents: U.S. Patent Publication No. 2018/0177556, entitled "Flexible Instrument Insertion Using An Adaptive Force Threshold," filed December 28, 2016; U.S. Patent Publication No. 2020/0000530, entitled "Systems And Techniques For Providing Multiple Perspectives During Medical Procedures," filed April 16, 2019; U.S. Patent Publication No. 2020/0170720, entitled "Image-Based Branch Detection And Mapping For Navigation," filed February 7, 2020; U.S. Patent Publication No. 2020/0188043, entitled "Surgical Robotics System," filed December 9, 2019; U.S. Patent Publication No. 2020/0085316, entitled "Systems And Methods For Concomitant Medical Procedures," filed September 3, 2019; U.S. Patent No. 8,831,782, entitled "Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument," filed July 15, 2013; and International Patent Publication No. WO 2014/151621, entitled "Hyperdexterous Surgical System," filed March 13, 2014, which are hereby incorporated by reference in their entireties.
The surgical visualization system 100 also includes an emitter 106. The emitter 106 is configured to emit a pattern of light, such as stripes, grid lines, and/or dots, to enable the topography or contour of the surface 105 to be determined. For example, a projected light array 130 may be used for three-dimensional scanning and registration on the surface 105. The projected light array 130 may be emitted from the emitter 106 located on the surgical device 102, on one of the robotic arms 112, 114, and/or on the imaging device 120. In one aspect, the surgical visualization system 100 uses the projected light array 130 to determine the shape defined by the surface 105 of the tissue 103 and/or the intraoperative motion of the surface 105. The imaging device 120 is configured to be able to detect the projected light array 130 reflected from the surface 105 to determine the topography of the surface 105 and various distances relative to the surface 105.
As in the illustrated embodiment, the imaging device 120 may include an optical waveform emitter 123, such as by mounting or otherwise attaching the optical waveform emitter to the imaging device 120. The optical waveform emitter 123 is configured to emit electromagnetic radiation 124 (e.g., near infrared (NIR) photons) that can penetrate the surface 105 of the tissue 103 and reach the critical structure 101. The imaging device 120 and the optical waveform emitter 123 may be positionable by the robotic arm 114. The optical waveform emitter 123 is mounted on or otherwise located on the imaging device 120 in this embodiment, but in other embodiments it may be located on a surgical device separate from the imaging device 120. The corresponding waveform sensor 122 (e.g., an image sensor, a spectrometer, or a vibration sensor) of the imaging device 120 is configured to be able to detect the effect of the electromagnetic radiation received by the waveform sensor 122. The wavelength of the electromagnetic radiation 124 emitted by the optical waveform emitter 123 is selected to enable identification of the type of anatomical and/or physical structure, such as the critical structure 101. Identification of the critical structure 101 may be accomplished by, for example, spectral analysis, photoacoustics, and/or ultrasound. In one aspect, the wavelength of the electromagnetic radiation 124 may be variable. The waveform sensor 122 and the optical waveform emitter 123 may include, for example, a multispectral imaging system and/or a selective spectral imaging system. In other cases, the waveform sensor 122 and the optical waveform emitter 123 may comprise, for example, a photoacoustic imaging system.
The distance sensor system 104 of the surgical visualization system 100 is configured to determine one or more distances at the surgical site. The distance sensor system 104 may be a time-of-flight distance sensor system that includes an emitter (such as the emitter 106 in the illustrated embodiment) and a receiver 108. In other cases, the time-of-flight emitter may be separate from the structured light emitter. The emitter 106 may comprise a very small laser source, and the receiver 108 may comprise a matching sensor. The distance sensor system 104 is configured to be able to detect the "time of flight," or the time it takes for the laser light emitted by the emitter 106 to bounce back to the sensor portion of the receiver 108. Use of a very narrow light source in the emitter 106 enables the distance sensor system 104 to determine the distance to the surface 105 of the tissue 103 directly in front of the distance sensor system 104.
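As a rough illustrative sketch of the time-of-flight relationship just described (the names are assumptions, not the disclosed implementation), the measured round-trip time of the laser pulse, halved and scaled by the speed of light, gives the one-way distance to the tissue surface:

C_M_PER_S = 299_792_458.0  # speed of light

def tof_range(round_trip_s):
    # The pulse travels emitter -> tissue -> receiver, so halve the path.
    return C_M_PER_S * round_trip_s / 2.0

# A ~0.67 ns round trip corresponds to roughly 10 cm of range.
print(tof_range(0.67e-9))  # ~0.100 m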
In the illustrated embodiment, the receiver 108 of the distance sensor system 104 is positioned on the surgical device 102, but in other embodiments the receiver 108 may be mounted on a separate surgical device rather than on the surgical device 102. For example, the receiver 108 may be mounted on a cannula or trocar through which the surgical device 102 extends to reach the surgical site. In still other embodiments, the receiver 108 for the distance sensor system 104 may be mounted on a separate robotically controlled arm of the robotic system 110 (e.g., on the second robotic arm 114) rather than on the first robotic arm 112 to which the surgical device 102 is coupled, may be mounted on a movable arm operated by another robot, or may be mounted on an operating room (OR) table or fixture. In some embodiments, the imaging device 120 includes the receiver 108, such that the line between the emitter 106 on the surgical device 102 and the imaging device 120 can be used to determine the distance from the emitter 106 to the surface 105 of the tissue 103. For example, the distance d_e may be triangulated based on the known positions of the emitter 106 (on the surgical device 102) and the receiver 108 (on the imaging device 120) of the distance sensor system 104. The three-dimensional position of the receiver 108 may be known and/or intraoperatively registered to the robot coordinate plane.
As in the illustrated embodiment, the position of the emitter 106 of the distance sensor system 104 may be controlled by the first robotic arm 112, and the position of the receiver 108 of the distance sensor system 104 may be controlled by the second robotic arm 114. In other embodiments, the surgical visualization system 100 may be used separately from a robotic system. In such cases, the distance sensor system 104 may be independent of the robotic system.
In fig. 1, d_e is the emitter-tissue distance from the emitter 106 to the surface 105 of the tissue 103, and d_t is the device-tissue distance from the distal end of the surgical device 102 to the surface 105 of the tissue 103. The distance sensor system 104 is configured to determine the emitter-tissue distance d_e. The device-tissue distance d_t may be obtained from the known position of the emitter 106 on the shaft of the surgical device 102, proximal to the device's distal end, relative to that distal end. In other words, when the distance between the emitter 106 and the distal end of the surgical device 102 is known, the device-tissue distance d_t may be determined from the emitter-tissue distance d_e. In some embodiments, the shaft of the surgical device 102 can include one or more articulation joints and can articulate relative to the emitter 106 and jaws at the distal end of the surgical device 102. The articulating configuration may include, for example, a multi-joint vertebrae-like structure. In some implementations, a three-dimensional camera may be used to triangulate one or more distances to the surface 105.
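A minimal sketch of the d_t computation just described, assuming the shaft axis points toward the measured surface spot and using illustrative names:

def device_tissue_distance(d_e, emitter_to_tip_m):
    # The emitter sits a known distance proximal of the device tip,
    # so the tip is that much closer to the tissue surface.
    return d_e - emitter_to_tip_m

print(device_tissue_distance(0.050, 0.012))  # 0.038 m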
In fig. 1, d_w is the camera-critical structure distance from the optical waveform emitter 123 located on the imaging device 120 to the surface of the critical structure 101, and d_A is the depth of the critical structure 101 below the surface 105 of the tissue 103 (i.e., the distance between the portion of the surface 105 closest to the surgical device 102 and the critical structure 101). The time of flight of the optical waveforms emitted from the optical waveform emitter 123 located on the imaging device 120 can be used to determine the camera-critical structure distance d_w.
As shown in fig. 2, the depth d_A of the critical structure 101 relative to the surface 105 of the tissue 103 may be determined by triangulating from the distance d_w and the known positions of the emitter 106 on the surgical device 102 and the optical waveform emitter 123 on the imaging device 120 (and thus the known distance d_x therebetween) to determine the distance d_y, which is the sum of the distances d_e and d_A. Additionally or alternatively, the time of flight from the optical waveform emitter 123 may be used to determine the distance from the optical waveform emitter 123 to the surface 105 of the tissue 103. For example, a first waveform (or waveform range) may be used to determine the camera-critical structure distance d_w, and a second waveform (or waveform range) may be used to determine the distance to the surface 105 of the tissue 103. In such cases, the different waveforms may be used to determine the depth of the critical structure 101 below the surface 105 of the tissue 103.
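One way such a triangulation might be sketched, under the assumption that both emitter poses are registered to a common robot coordinate frame and each measured range lies along a known beam direction (the function and variable names, and the geometry simplifications, are illustrative assumptions rather than the disclosed algorithm):

import numpy as np

def depth_below_surface(emitter_pos, emitter_dir, d_e,
                        waveform_pos, waveform_dir, d_w):
    def march(origin, direction, distance):
        d = np.asarray(direction, float)
        return np.asarray(origin, float) + distance * d / np.linalg.norm(d)
    surface_pt = march(emitter_pos, emitter_dir, d_e)      # point on tissue
    structure_pt = march(waveform_pos, waveform_dir, d_w)  # hidden structure
    # Equals d_A when the structure lies directly beneath the surface point.
    return float(np.linalg.norm(structure_pt - surface_pt))

# Structured-light emitter at the origin aimed straight down; the optical
# waveform emitter 0.04 m away (the known d_x) aimed at the structure.
print(depth_below_surface([0, 0, 0], [0, 0, -1], 0.05,
                          [0.04, 0, 0], [-0.04, 0, -0.09],
                          float(np.hypot(0.04, 0.09))))
# -> 0.04 (the structure sits 0.04 m beneath the surface point)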
Additionally or alternatively, the distance d_A may be determined by ultrasound, registered Magnetic Resonance Imaging (MRI), or Computed Tomography (CT) scanning. In still other cases, the distance d_A may be determined using spectral imaging, because the detection signal received by the imaging device 120 may vary based on the type of material (e.g., the type of tissue 103). For example, fat may decrease the detection signal in a first manner or amount, and collagen may decrease the detection signal in a second, different manner or amount.
In another embodiment of a surgical visualization system 160 shown in fig. 3, the surgical device 162 (rather than the imaging device 120) includes the optical waveform emitter 123 and the waveform sensor 122 configured to detect the reflected waveforms. The optical waveform emitter 123 is configured to emit waveforms for determining the distances d_t and d_w from a common device, such as the surgical device 162, as described herein. In such cases, the distance d_A from the surface 105 of the tissue 103 to the surface of the critical structure 101 may be determined as follows:
d_A = d_w - d_t
The surgical visualization system 100 includes a control system configured to control various aspects of the surgical visualization system 100. Fig. 4 illustrates one embodiment of a control system 133 that may be used as a control system for the surgical visualization system 100 (or other surgical visualization systems described herein). The control system 133 includes a control circuit 132 configured to be in signal communication with a memory 134. The memory 134 is configured to be capable of storing instructions executable by the control circuit 132, such as instructions for determining and/or identifying critical structures (e.g., critical structure 101 of fig. 1), determining and/or calculating one or more distances and/or three-dimensional digital representations, and communicating information to a practitioner. Thus, the instructions stored within memory 134 constitute a computer program product comprising instructions that when executed by a processor cause the processor to perform as described above. Such instructions may also be stored on any computer-readable medium (such as an optical disk, SD card, USB drive, etc., or the memory of a separate device) from which the instructions may be copied into memory 134 or executed directly. The process of copying or directly executing involves the creation of a data carrier signal carrying a computer program product. As in the illustrated embodiment, memory 134 may store surface mapping logic 136, imaging logic 138, tissue identification logic 140, and distance determination logic 141, but memory 134 may store any combination of logic 136, 138, 140, 141 and/or may combine various logic together. The control system 133 also includes an imaging system 142 that includes a camera 144 (e.g., the imaging system includes the imaging device 120 of fig. 1), a display 146 (e.g., a monitor, a computer tablet screen, etc.), and controls 148 for the camera 144 and the display 146. The camera 144 includes an image sensor 135 (e.g., waveform sensor 122) configured to receive signals from various light sources (e.g., visible light, spectral imagers, three-dimensional lenses, etc.) that emit light in various visible and invisible spectrums. The display 146 is configured to be able to depict real, virtual, and/or virtual augmented images and/or information to a practitioner.
In an exemplary implementation, the image sensor 135 is a solid-state electronic device containing up to millions of discrete photodetector sites, referred to as pixels. Image sensor 135 technology falls into one of two categories: charge-coupled device (CCD) and complementary metal-oxide semiconductor (CMOS) imagers; more recently, short-wave infrared (SWIR) imagers are an emerging technology. Another type of image sensor 135 employs a hybrid CCD/CMOS architecture (sold under the name "sCMOS") and consists of CMOS readout integrated circuits (ROICs) bump-bonded to a CCD imaging substrate. CCD and CMOS image sensors 135 are sensitive to wavelengths in the range of about 350 nm to about 1050 nm, such as wavelengths in the range of about 400 nm to about 1000 nm. A person skilled in the art will appreciate that a value may not be precisely at a stated value but may nevertheless be considered to be about that value for any of a number of reasons, such as the sensitivity of measurement equipment and manufacturing tolerances. Generally, CMOS sensors are more sensitive to IR wavelengths than CCD sensors. Solid-state image sensors 135 are based on the photoelectric effect and thus cannot distinguish colors. Accordingly, there are two types of color CCD cameras: single-chip and three-chip. Single-chip color CCD cameras offer a common, low-cost imaging solution and use a mosaic (e.g., Bayer) optical filter to separate incoming light into a series of colors, with each color directed to a different set of pixels, and employ an interpolation algorithm to resolve full-color images. Three-chip color CCD cameras provide higher resolution by employing a prism to direct each section of the incident spectrum to a different chip. More accurate color reproduction is possible because each point in the space of the object has separate RGB intensity values, rather than an algorithm being used to determine the color. Three-chip cameras offer extremely high resolution.
The control system 133 also includes an emitter (e.g., the emitter 106) that includes a spectral light source 150 and a structured light source 152, each operatively coupled to the control circuit 132. A single source may be pulsed to emit light in the wavelength range of the spectral light source 150 and light in the wavelength range of the structured light source 152. Alternatively, a single light source may be pulsed to provide both light in the invisible spectrum (e.g., infrared spectral light) and light at wavelengths in the visible spectrum. The spectral light source 150 may be, for example, a hyperspectral light source, a multispectral light source, and/or a selective spectral light source. The tissue identification logic 140 is configured to be able to identify critical structures (e.g., the critical structure 101 of fig. 1) via data from the spectral light source 150 received by the image sensor 135 of the camera 144. The surface mapping logic 136 is configured to be able to determine the surface contours of visible tissue (e.g., the tissue 103) based on reflected structured light. Using time-of-flight measurements, the distance determination logic 141 is configured to be able to determine one or more distances to the visible tissue and/or critical structures. Outputs from the surface mapping logic 136, the tissue identification logic 140, and the distance determination logic 141 are provided to the imaging logic 138 and combined, blended, and/or overlaid by the imaging logic 138 for conveyance to a medical practitioner via the display 146 of the imaging system 142.
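As a loose sketch of how the three logic outputs might be merged before display (the function names and returned structure are assumptions for illustration, not the disclosed design):

def compose_view(structured_frame, spectral_frame, tof_samples,
                 surface_mapping, tissue_id, distance_logic):
    # Each logic block consumes the signal it owns ...
    surface = surface_mapping(structured_frame)   # tissue surface contour
    labels = tissue_id(spectral_frame)            # identified structures
    ranges = distance_logic(tof_samples)          # measured distances
    # ... and the imaging logic blends the results into one overlay.
    return {"surface": surface, "labels": labels, "ranges": ranges}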
The control circuit 132 may have a variety of configurations. Fig. 5 illustrates one embodiment of a control circuit 170 that may be used as the control circuit 132 configured to control aspects of the surgical visualization system 100. The control circuitry 170 is configured to enable the various processes described herein. The control circuit 170 includes a microcontroller that includes a processor 172 (e.g., a microprocessor or microcontroller) that is operatively coupled to a memory 174. The memory 174 is configured to store machine executable instructions that, when executed by the processor 172, cause the processor 172 to execute the machine instructions to implement the various processes described herein. Processor 172 may be any one of several single-core or multi-core processors known in the art. Memory 174 may include volatile and nonvolatile storage media. The processor 172 includes an instruction processing unit 176 and an arithmetic unit 178. Instruction processing unit 176 is configured to receive instructions from memory 174.
The surface mapping logic 136, imaging logic 138, tissue identification logic 140, and distance determination logic 141 may have a variety of configurations. Fig. 6 illustrates one embodiment of a combinational logic circuit 180 configured to enable control of aspects of the surgical visualization system 100 using logic components such as one or more of the surface mapping logic 136, the imaging logic 138, the tissue identification logic 140, and the distance determination logic 141. The combinational logic circuit 180 comprises a finite state machine that includes a combinational logic component 182 configured to receive data associated with a surgical device (e.g., the surgical device 102 and/or the imaging device 120) at an input 184, process the data with the combinational logic component 182, and provide an output to a control circuit (e.g., the control circuit 132).
Fig. 7 illustrates one embodiment of a sequential logic circuit 190 configured to control aspects of the surgical visualization system 100 using logic components such as one or more of the surface mapping logic 136, the imaging logic 138, the tissue identification logic 140, and the distance determination logic 141. Sequential logic circuit 190 includes a finite state machine including combinational logic component 192, memory 194, and clock 196. The memory 194 is configured to be capable of storing the current state of the finite state machine. Sequential logic circuit 190 may be synchronous or asynchronous. The combinational logic 192 is configured to receive data associated with a surgical device (e.g., the surgical device 102 and/or the imaging device 120) at an input 426, process the data by the combinational logic 192, and provide an output 499 to control circuitry (e.g., the control circuitry 132). In some implementations, the sequential logic circuit 190 may include a combination of a processor (e.g., the processor 172 of fig. 5) and a finite state machine to implement the various processes herein. In some implementations, the finite state machine may include a combination of combinational logic circuitry (e.g., combinational logic circuitry 192 of fig. 7) and sequential logic circuitry 190.
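A toy illustration of the finite-state-machine structure described above, with combinational functions computing the output and next state and a state register updated on each clock tick (all names and the example condition are illustrative assumptions):

class SequentialLogic:
    def __init__(self, next_state, output, initial_state):
        self.next_state = next_state   # combinational: (state, data) -> state
        self.output = output           # combinational: (state, data) -> out
        self.state = initial_state     # held in "memory" between ticks

    def tick(self, data):
        # One synchronous clock edge: emit, then latch the next state.
        out = self.output(self.state, data)
        self.state = self.next_state(self.state, data)
        return out

# Example: flag the tick on which a monitored distance first drops below 5 mm.
fsm = SequentialLogic(
    next_state=lambda s, d: "near" if d < 5.0 else "far",
    output=lambda s, d: s == "far" and d < 5.0,
    initial_state="far",
)
print([fsm.tick(d) for d in (9.0, 6.0, 4.0, 3.0)])  # [False, False, True, False]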
Fig. 8 illustrates another embodiment of a surgical visualization system 200. The surgical visualization system 200 is generally similar in construction and use to the surgical visualization system 100 of fig. 1 and includes, for example, a surgical device 202 and an imaging device 220. The imaging device 220 comprises a spectral light emitter 223 configured to be capable of emitting spectral light at a plurality of wavelengths to obtain a spectral image of, for example, hidden structures. The imaging device 220 may also include a three-dimensional camera and associated electronic processing circuitry. The surgical visualization system 200 is shown being used intraoperatively to identify and facilitate avoidance of certain critical structures that are not visible on the surface 205 of an organ 203 (in this embodiment, the uterus), such as a ureter 201a and blood vessels 201b within the organ 203.
The surgical visualization system 200 is configured to determine an emitter-tissue distance d_e from an emitter 206 on the surgical device 202 to the surface 205 of the uterus 203 via structured light. The surgical visualization system 200 is configured to extrapolate a device-tissue distance d_t from the surgical device 202 to the surface 205 of the uterus 203 based on the emitter-tissue distance d_e. The surgical visualization system 200 is also configured to determine a tissue-ureter distance d_A from the ureter 201a to the surface 205 and a camera-ureter distance d_w from the imaging device 220 to the ureter 201a. As described herein with respect to the surgical visualization system 100 of fig. 1, the surgical visualization system 200 is configured to determine the distance d_w using, for example, spectral imaging and time-of-flight sensors. In various embodiments, the surgical visualization system 200 can determine (e.g., triangulate) the tissue-ureter distance d_A (or depth) based on the other distances and/or the surface mapping logic described herein.
As described above, a surgical visualization system includes a control system configured to control various aspects of the surgical visualization system. The control system may have a variety of configurations. Fig. 9 illustrates one embodiment of a control system 600 for a surgical visualization system, such as the surgical visualization system 100 of fig. 1, the surgical visualization system 200 of fig. 8, or another surgical visualization system described herein. The control system 600 is a conversion system that integrates spectral signature tissue identification and structured light tissue positioning to identify critical structures, especially when those structures are obscured by other tissue, such as fat, connective tissue, and/or blood, and/or by other organs, and/or to detect tissue variability, such as differentiating tumors and/or unhealthy tissue from healthy tissue within an organ.
The control system 600 is configured to implement a hyperspectral imaging and visualization system in which molecular responses are utilized to detect and identify anatomical structures in a surgical field of view. The control system 600 includes conversion logic 648 configured to enable conversion of tissue data into information usable by the surgeon and/or other medical practitioners. For example, variable reflectance based on wavelength, relative to the obscuring material, may be utilized to identify critical structures within the anatomy. Furthermore, the control system 600 is configured to be able to combine the identified spectral signatures and the structured light data in an image. For example, the control system 600 may be used to create a three-dimensional data set for surgical use in a system with enhanced image overlays. The technique may be used both intraoperatively and preoperatively using additional visual information. In various embodiments, the control system 600 is configured to provide a warning to a practitioner when one or more critical structures are approached. Various algorithms may be employed to guide robotic automated and semi-automated approaches based on the surgical procedure and proximity to critical structures.
The projected light array is used by the control system 600 to determine tissue shape and motion intraoperatively. Alternatively, flash lidar may be used for surface mapping of tissue.
The control system 600 is configured to be able to detect the critical structure(s), as described above, provide an image overlay of the critical structure(s), and measure the distance to the surface of visible tissue and the distance to embedded/buried critical structure(s). In other instances, the control system 600 may measure the distance to the surface of visible tissue, or it may detect the critical structure(s) and provide an image overlay of the critical structure(s).
The control system 600 includes a spectrum control circuit 602. The spectrum control circuit 602 may be a field-programmable gate array (FPGA) or another suitable circuit configuration, such as the configurations described with respect to figs. 6, 7, and 8. The spectrum control circuit 602 includes a processor 604 configured to receive a video input signal from a video input processor 606. The processor 604 may be configured for hyperspectral processing and may utilize C/C++ code, for example. The video input processor 606 is configured to receive video input of control (metadata) data, such as shutter time, wavelength, and sensor analytics. The processor 604 is configured to process the video input signal from the video input processor 606 and provide a video output signal to a video output processor 608, which includes a hyperspectral video output of interface control (metadata) data, for example. The video output processor 608 is configured to provide the video output signal to an image overlay controller 610.
The video input processor 606 is operatively coupled to a camera 612 at the patient side via a patient isolation circuit 614. The camera 612 includes a solid-state image sensor 634. The patient isolation circuit 614 may include a plurality of transformers so that the patient is isolated from other circuits in the system. The camera 612 is configured to receive intraoperative images through optics 632 and the image sensor 634. The image sensor 634 may comprise, for example, a CMOS image sensor, or may comprise another image sensor technology, such as those discussed herein in connection with fig. 4. The camera 612 is configured to output an image signal 613 at 14 bits per pixel. A person skilled in the art will appreciate that higher or lower pixel resolutions may be employed. The isolated camera output signal 613 is provided to a color RGB fusion circuit 616, which in the illustrated embodiment employs a hardware register 618 and a Nios2 coprocessor 620 configured to process the camera output signal 613. The color RGB fusion output signal is provided to the video input processor 606 and a laser pulse control circuit 622.
The laser pulse control circuit 622 is configured to control the laser engine 624. The laser engine 624 is configured to output light at a plurality of wavelengths (λ1, λ2, λ3 … … λn) including Near Infrared (NIR). The laser engine 624 may operate in a variety of modes. For example, the laser engine 624 may operate in two modes. In a first mode (e.g., normal operation mode), the laser engine 624 is configured to output an illumination signal. In a second mode (e.g., identification mode), the laser engine 624 is configured to output RGBG and NIR light. In various embodiments, the laser engine 624 may operate in a polarization mode.
Light output 626 from laser engine 624 is configured to illuminate a targeted anatomical structure in intraoperative surgical site 627. The laser pulse control circuit 622 is also configured to control a laser pulse controller 628 for a laser pattern projector 630 configured to project a laser pattern 631 (such as a grid or pattern of lines and/or points) of a predetermined wavelength (λ2) onto the surgical tissue or organ at the surgical site 627. The camera 612 is configured to be able to receive patterned light as well as reflected light output by the camera optics 632. The image sensor 634 is configured to be able to convert the received light into a digital signal.
The color RGB fusion circuit 616 is also configured to output signals to the image overlay controller 610 and a video input module 636 for reading the laser pattern 631 projected by the laser pattern projector 630 onto the targeted anatomical structure at the surgical site 627. A processing module 638 is configured to process the laser pattern 631 and output a first video output signal 640 representative of the distance to the visible tissue at the surgical site 627. The data is provided to the image overlay controller 610. The processing module 638 is also configured to output a second video signal 642 representative of a three-dimensional rendered shape of the tissue or organ of the targeted anatomy at the surgical site.
The first video output signal 640 and the second video output signal 642 include data representative of the position of the critical structures on a three-dimensional surface model, which is provided to an integration module 643. In combination with data from the video output processor 608 of the spectrum control circuit 602, the integration module 643 is configured to be able to determine the distance to buried critical structures (e.g., the distance d_A of fig. 1), e.g., via a triangulation algorithm 644, and that distance may be provided to the image overlay controller 610 via a video output processor 646. The conversion logic portion of the system may encompass the conversion logic 648, an intermediate video monitor 652, and the camera 624/laser pattern projector 630 positioned at the surgical site 627.
In various cases, pre-operative data 650, such as from a CT or MRI scan, may be employed to register or align certain three-dimensional deformable tissues. Such pre-operative data 650 may be provided to the integration module 643 and ultimately to the image overlay controller 610 so that this information may be overlaid with the views from the camera 612 and provided to the video monitors 652. Embodiments of registration of pre-operative data are further described in U.S. patent publication No. 2020/0015907, entitled "Integration Of Imaging Data," filed September 11, 2018, which is hereby incorporated by reference in its entirety.
The video monitors 652 are configured to output the integrated/augmented views from the image overlay controller 610. A practitioner may select and/or toggle between different views on one or more displays. On a first display 652a (which in the illustrated embodiment is a monitor), the practitioner may toggle between (a) a view in which a three-dimensional rendering of the visible tissue is depicted and (b) an augmented view in which one or more hidden critical structures are depicted over the three-dimensional rendering of the visible tissue. On a second display 652b (which in the illustrated embodiment is a monitor), the practitioner may toggle distance measurements to one or more hidden critical structures and/or to the surface of visible tissue, for example.
The various surgical visualization systems described herein may be used to visualize a variety of different types of tissue and/or anatomical structures, including tissue and/or anatomical structures that may be obscured from visualization by EMR in the visible portion of the spectrum. The surgical visualization system may utilize a spectral imaging system as described above, which may be configured to be able to visualize different types of tissue based on varying combinations of constituent materials of the different types of tissue. In particular, the spectral imaging system may be configured to be able to detect the presence of various constituent materials within the tissue being visualized based on the absorption coefficients of the tissue at various EMR wavelengths. The spectral imaging system may be configured to be able to characterize a tissue type of the tissue being visualized based on a particular combination of constituent materials.
Fig. 10 shows a graph 300 depicting how the absorption coefficients of various biological materials vary across the EMR wavelength spectrum. In the graph 300, the vertical axis 302 represents the absorption coefficient of the biological material (in cm^-1), and the horizontal axis 304 represents the EMR wavelength (in µm). A first line 306 in the graph 300 represents the absorption coefficient of water at various EMR wavelengths, a second line 308 represents the absorption coefficient of protein, a third line 310 represents the absorption coefficient of melanin, a fourth line 312 represents the absorption coefficient of deoxyhemoglobin, a fifth line 314 represents the absorption coefficient of oxyhemoglobin, and a sixth line 316 represents the absorption coefficient of collagen, each at various EMR wavelengths. Different tissue types have different combinations of constituent materials, so the tissue types visualized by the surgical visualization system can be identified and differentiated based on the particular combination of constituent materials detected. Accordingly, the spectral imaging system of the surgical visualization system may be configured to emit EMR at a plurality of different wavelengths, determine the constituent materials of the tissue based on the EMR absorption responses detected at the different wavelengths, and then characterize the tissue type based on the particular detected combination of constituent materials.
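Purely by way of illustration, tissue characterization from multi-wavelength absorption responses might be sketched as a nearest-fingerprint lookup. The reference values below are placeholders rather than values read from fig. 10, and all names are assumptions:

import numpy as np

# Illustrative absorption fingerprints (cm^-1) at three probe wavelengths.
REFERENCE_SPECTRA = {
    "water": np.array([0.02, 0.5, 12.0]),
    "collagen": np.array([0.9, 1.5, 6.0]),
    "oxyhemoglobin": np.array([150.0, 4.0, 1.0]),
}

def classify_material(measured_coeffs):
    # Compare in log space because absorption coefficients span several
    # orders of magnitude across the EMR wavelength spectrum.
    m = np.log10(np.asarray(measured_coeffs, float))
    return min(REFERENCE_SPECTRA,
               key=lambda name: float(np.linalg.norm(
                   np.log10(REFERENCE_SPECTRA[name]) - m)))

print(classify_material([120.0, 3.0, 1.4]))  # -> "oxyhemoglobin"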
Fig. 11 illustrates an embodiment utilizing spectral imaging techniques to visualize different tissue types and/or anatomical structures. In fig. 11, a spectral emitter 320 (e.g., spectral light source 150 of fig. 4) is used by the imaging system to visualize a surgical site 322. EMR emitted by the spectral emitter 320 and reflected from tissue and/or structure at the surgical site 322 is received by an image sensor (e.g., image sensor 135 of fig. 4) to visualize the tissue and/or structure, which may be visible (e.g., at the surface of the surgical site 322) or obscured (e.g., underneath other tissue and/or structure at the surgical site 322). In this embodiment, the imaging system (e.g., imaging system 142 of fig. 4) visualizes the tumor 324, artery 326, and various abnormalities 328 (e.g., tissue that does not conform to known or expected spectral characteristics) based on spectral characteristics characterized by different absorption characteristics (e.g., absorption coefficients) of the constituent materials of each of the different tissue/structure types. The visualized tissues and structures may be displayed on a display screen associated with or coupled to the imaging system (e.g., display 146 of imaging system 142 of fig. 4), on a main display (e.g., main display 819 of fig. 19), on a non-sterile display (e.g., non-sterile displays 807, 809 of fig. 19), on a display of a surgical hub (e.g., display of surgical hub 806 of fig. 19), on a device/instrument display, and/or on another display.
The imaging system may be configured to customize or update the displayed surgical site visualization according to the identified tissue and/or structure types. For example, as shown in fig. 11, the imaging system may display a margin 330 associated with the tumor 324 being visualized on a display screen associated with or coupled to the imaging system, on a primary display, on a non-sterile display, on a display of a surgical hub, on a device/instrument display, and/or on another display. The margin 330 may indicate the area or amount of tissue that should be resected to ensure complete removal of the tumor 324. The control system of the surgical visualization system (e.g., the control system 133 of fig. 4) may be configured to control or update the dimensions of the margin 330 based on the tissues and/or structures identified by the imaging system. In the illustrated embodiment, the imaging system has identified a plurality of abnormalities 328 within the field of view (FOV). Accordingly, the control system may adjust the displayed margin 330 to a first updated margin 332 having sufficient dimensions to encompass the abnormalities 328. The imaging system has also identified the artery 326 partially overlapping the initially displayed margin 330 (as indicated by a highlighted region 334 of the artery 326). Accordingly, the control system may adjust the displayed margin to a second updated margin 336 having sufficient dimensions to encompass the relevant portion of the artery 326.
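A toy two-dimensional sketch of the margin-adjustment idea, in which a circular margin is grown just enough to cover each detected abnormality (real margins would follow the imaged tissue geometry; all names are illustrative):

import math

def expand_margin(tumor_center, margin_radius_mm, abnormality_points):
    # Grow the displayed margin until every abnormality falls inside it.
    r = margin_radius_mm
    for pt in abnormality_points:
        r = max(r, math.dist(tumor_center, pt))
    return r

# A 10 mm margin grows to 14 mm to cover an abnormality 14 mm out.
print(expand_margin((0.0, 0.0), 10.0, [(6.0, 8.0), (0.0, 14.0)]))  # 14.0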
In addition to or instead of the absorption characteristics of tissues and/or structures described above with respect to figs. 10 and 11, tissues and/or structures may be imaged or characterized according to their reflectance characteristics across the EMR wavelength spectrum. For example, figs. 12, 13, and 14 illustrate various graphs of the reflectance of different types of tissues or structures at different EMR wavelengths. Fig. 12 is a graphical representation 340 of an illustrative ureter signature versus obscurants. Fig. 13 is a graphical representation 342 of an illustrative artery signature versus obscurants. Fig. 14 is a graphical representation 344 of an illustrative nerve signature versus obscurants. The curves in figs. 12, 13, and 14 show the reflectance of the particular structures (ureter, artery, and nerve) as a function of wavelength (nm) relative to the respective reflectances of fat, lung tissue, and blood at those wavelengths. These graphs are for illustrative purposes only, and it should be understood that other tissues and/or structures may have corresponding detectable reflectance signatures that would allow the tissues and/or structures to be identified and visualized.
Selected wavelengths for spectral imaging (e.g., "selective spectral" imaging) may be identified and utilized based on expected critical structures and/or obscurations at the surgical site. By utilizing selective spectral imaging, the amount of time required to obtain a spectral image can be minimized so that information can be obtained in real-time and utilized in surgery. These wavelengths may be selected by the practitioner or by the control circuitry based on user (e.g., practitioner) input. In some cases, the wavelength may be selected based on big data that the machine learning and/or control circuitry may access via, for example, a cloud or a surgical hub.
Fig. 15 illustrates one embodiment of spectral imaging of tissue being utilized intraoperatively to measure the distance between a waveform emitter and a critical structure obscured by tissue. Fig. 15 shows an embodiment of a time-of-flight sensor system 404 utilizing waveforms 424, 425. The time-of-flight sensor system 404 may be incorporated into a surgical visualization system, e.g., as the sensor system 104 of the surgical visualization system 100 of fig. 1. The time-of-flight sensor system 404 includes a waveform emitter 406 and a waveform receiver 408 located on the same surgical device 402 (e.g., the emitter 106 and the receiver 108 located on the same surgical device 102 of fig. 1). The emitted wave 424 extends from the emitter 406 to the critical structure 401 (e.g., the critical structure 101 of fig. 1), and the received wave 425 is reflected back to the receiver 408 from the critical structure 401. In the illustrated embodiment, the surgical device 402 is positioned through a trocar 410 that extends into a cavity 407 in a patient. Although the trocar 410 is used in the illustrated embodiment, other trocars or other access devices may be used, or no access device may be used.
The waveforms 424, 425 are configured to be able to penetrate obscuring tissue 403, such as by having wavelengths in the NIR or SWIR spectrum. A spectral signal (e.g., hyperspectral, multispectral, or selective spectral) or a photoacoustic signal is emitted from the emitter 406 (as indicated by a first, distally directed arrow 407) and can penetrate the tissue 403 in which the critical structure 401 is concealed. The emitted waveform 424 is reflected by the critical structure 401, as indicated by a proximally directed second arrow 409. The received waveform 425 may be delayed due to the distance d between the distal end of the surgical device 402 and the critical structure 401. The waveforms 424, 425 may be selected to target the critical structure 401 within the tissue 403 based on the spectral signature of the critical structure 401, as described herein. The emitter 406 is configured to provide a binary signal, on and off, for example as shown in fig. 16, which may be measured by the receiver 408.
Based on the delay between the emitted wave 424 and the received wave 425, the time-of-flight sensor system 404 is configured to be able to determine the distance d. A time-of-flight timing diagram 430 for the emitter 406 and the receiver 408 of fig. 15 is shown in fig. 16. The delay is a function of the distance d, and the distance d is given by:
d = (c·t/2)·(q2/(q1 + q2))
where c = the speed of light, t = the length of the pulse, q1 = the charge accumulated while light is emitted, and q2 = the charge accumulated while light is not emitted.
The time of flight of the waveforms 424, 425 corresponds to the distance d in fig. 15. In various cases, additional emitters/receivers and/or pulsing of the signal from the emitter 406 may be configured to emit a non-penetrating signal. The non-penetrating signal may be used to determine the distance from the emitter 406 to the surface 405 of the obscuring tissue 403. In various cases, the depth of the critical structure 401 may be determined by:
d_A = d_w - d_t
where d_A = the depth of the critical structure 401; d_w = the distance from the emitter 406 to the critical structure 401 (d in fig. 15); and d_t = the distance from the emitter 406 (on the distal end of the surgical device 402) to the surface 405 of the obscuring tissue 403.
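Purely as an illustrative sketch of the charge-ratio relationship given above (the names and sample values are assumptions, and the exact expression used in any particular implementation may differ):

C_M_PER_S = 299_792_458.0  # speed of light

def charge_ratio_distance(pulse_len_s, q1, q2):
    # The later the echo arrives, the more of its charge accumulates in
    # the window after the emitter switches off (q2), so q2/(q1 + q2)
    # encodes the delay as a fraction of the pulse length.
    return (C_M_PER_S * pulse_len_s / 2.0) * (q2 / (q1 + q2))

d_w = charge_ratio_distance(1e-9, q1=700.0, q2=300.0)  # penetrating signal
d_t = charge_ratio_distance(1e-9, q1=900.0, q2=100.0)  # surface signal
print(d_w - d_t)  # d_A = d_w - d_t, roughly 0.030 m here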
Fig. 17 illustrates another embodiment of a time-of-flight sensor system 504 utilizing waves 524a, 524b, 524c, 525a, 525b, 525c. The time-of-flight sensor system 504 may be incorporated into a surgical visualization system, e.g., as the sensor system 104 of the surgical visualization system 100 of fig. 1. The time-of-flight sensor system 504 includes a waveform emitter 506 and a waveform receiver 508 (e.g., the emitter 106 and the receiver 108 of fig. 1). The waveform emitter 506 is positioned on a first surgical device 502a (e.g., the surgical device 102 of fig. 1), and the waveform receiver 508 is positioned on a second surgical device 502b. The surgical devices 502a, 502b are positioned through a first trocar 510a and a second trocar 510b, respectively, which extend into a cavity 507 in a patient. Although the trocars 510a, 510b are used in the illustrated embodiment, other trocars or other access devices may be used, or no access device may be used. The emitted waves 524a, 524b, 524c extend from the emitter 506 toward the surgical site, and the received waves 525a, 525b, 525c are reflected back to the receiver 508 from various structures and/or surfaces at the surgical site.
The different emitted waves 524a, 524b, 524c are configured to be able to target different types of material at the surgical site. For example, the wave 524a targets obscuring tissue 503, the wave 524b targets a first critical structure 501a (e.g., the critical structure 101 of fig. 1), which in the illustrated embodiment is a blood vessel, and the wave 524c targets a second critical structure 501b (e.g., the critical structure 101 of fig. 1), which in the illustrated embodiment is a cancerous tumor. The wavelengths of the waves 524a, 524b, 524c may be in the visible light, NIR, or SWIR wavelength spectrum. For example, visible light may be reflected off the surface 505 of the tissue 503, and NIR and/or SWIR waveforms may penetrate the surface 505 of the tissue 503. In various aspects, a spectral signal (e.g., hyperspectral, multispectral, or selective spectral) or a photoacoustic signal may be emitted from the emitter 506, as described herein. The waves 524b, 524c may be selected to target the critical structures 501a, 501b within the tissue 503 based on the spectral signatures of the critical structures 501a, 501b, as described herein. Photoacoustic imaging is further described in various U.S. patent applications that are incorporated by reference in the present disclosure.
The emitted waves 524a, 524b, 524c are reflected off the targeted material, namely the surface 505, the first critical structure 501a, and the second critical structure 501b, respectively. The received waveforms 525a, 525b, 525c may be delayed due to the distances d1a, d2a, d3a, d1b, d2b, d2c.
In the time-of-flight sensor system 504, in which the emitter 506 and the receiver 508 are independently positionable (e.g., on separate surgical devices 502a, 502b and/or controlled by separate robotic arms), the various distances d1a, d2a, d3a, d1b, d2b, d2c can be calculated from the known positions of the emitter 506 and the receiver 508. For example, these positions may be known when the surgical devices 502a, 502b are robotically controlled. Knowledge of the positions of the emitter 506 and the receiver 508, the time at which the photon stream targets a given tissue, and the information received by the receiver 508 for that particular response may allow the distances d1a, d2a, d3a, d1b, d2b, d2c to be determined. In one aspect, the distance to the obscured critical structures 501a, 501b may be triangulated using penetrating wavelengths. Because the speed of light is constant for any wavelength of visible or invisible light, the time-of-flight sensor system 504 can determine the various distances.
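For the separated emitter/receiver arrangement, one hedged sketch of the triangulation: with both device positions registered to a common coordinate frame and the emitter's beam direction known, the measured time of flight fixes the total emitter-to-target-to-receiver path length, and the emitter-to-target range then follows from bistatic (ellipse) geometry. The names and the closed-form approach are illustrative assumptions, not the disclosed algorithm:

import numpy as np

def emitter_target_range(emitter_pos, receiver_pos, beam_dir, path_len):
    # Solve |e + d*u - r| = path_len - d for d, the distance from the
    # emitter to the target along the unit beam direction u.
    e = np.asarray(emitter_pos, float)
    r = np.asarray(receiver_pos, float)
    u = np.asarray(beam_dir, float)
    u = u / np.linalg.norm(u)
    b = e - r
    return float((path_len**2 - b.dot(b)) / (2.0 * (path_len + u.dot(b))))

# Emitter at the origin aimed along +z; receiver 2 cm away. A target 1 cm
# down the beam gives a total path of 0.01 + hypot(0.02, 0.01).
total = 0.01 + float(np.hypot(0.02, 0.01))
print(emitter_target_range([0, 0, 0], [0.02, 0, 0], [0, 0, 1], total))  # ~0.01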
In a view provided to the practitioner, such as on a display, the receiver 508 may be rotated such that the center of mass of the target structure in the resulting image remains constant, e.g., in a plane perpendicular to the axis of a selected target structure 503, 501a, or 501b. Such an orientation may quickly communicate one or more relevant distances and/or perspectives with respect to the target structure. For example, as shown in fig. 17, the surgical site is displayed from a viewpoint in which the critical structure 501a is perpendicular to the viewing plane (e.g., the blood vessel is oriented in/out of the page). Such an orientation may be a default setting; however, the view may be rotated or otherwise adjusted by the practitioner. In some cases, the practitioner may toggle between different surfaces and/or target structures that define the viewpoint of the surgical site provided by the imaging system.
As in the illustrated embodiment, the receiver 508 may be mounted on the trocar 510b (or other access device) through which the surgical device 502b is positioned. In other embodiments, the receiver 508 may be mounted on a separate robotic arm for which the three-dimensional position is known. In various cases, the receiver 508 may be mounted on a movable arm that is separate from the robotic surgical system controlling the surgical device 502a, or may be mounted to an operating room (OR) table or fixture that is intraoperatively registerable to the robot coordinate plane. In such cases, the positions of the emitter 506 and the receiver 508 may be registered to the same coordinate plane such that the distances can be triangulated from outputs of the time-of-flight sensor system 504.
A time-of-flight sensor system combined with near-infrared spectroscopy (NIRS), referred to as TOF-NIRS, which is capable of measuring the time-resolved profiles of near-infrared light with nanosecond resolution, is described in "Time-Of-Flight Near-Infrared Spectroscopy For Nondestructive Measurement Of Internal Quality In Grapefruit," Journal of the American Society for Horticultural Science, May 2013, vol. 138, no. 3, pp. 225-228, which is hereby incorporated by reference in its entirety.
Embodiments of visualization systems and aspects and uses thereof are further described in the following patents: U.S. patent publication No. 2020/0015923, entitled "Surgical Visualization Platform," filed September 11, 2018; U.S. patent publication No. 2020/0015900, entitled "Controlling An Emitter Assembly Pulse Sequence," filed September 11, 2018; U.S. patent publication No. 2020/0015668, entitled "Singular EMR Source Emitter Assembly," filed September 11, 2018; U.S. patent publication No. 2020/0015925, entitled "Combination Emitter And Camera Assembly," filed September 11, 2018; U.S. patent publication No. 2020/0015899, entitled "Surgical Visualization With Proximity Tracking Features," filed September 11, 2018; U.S. patent publication No. 2020/0015903, entitled "Surgical Visualization Of Multiple Targets," filed September 11, 2018; U.S. patent No. 10,792,034, entitled "Visualization Of Surgical Devices," filed September 11, 2018; U.S. patent publication No. 2020/0015897, entitled "Operative Communication Of Light," filed September 11, 2018; U.S. patent publication No. 2020/0015924, entitled "Robotic Light Projection Tools," filed September 11, 2018; U.S. patent publication No. 2020/0015898, entitled "Surgical Visualization Feedback System," filed September 11, 2018; U.S. patent publication No. 2020/0015906, entitled "Surgical Visualization And Monitoring," filed September 11, 2018; U.S. patent publication No. 2020/0015907, entitled "Integration Of Imaging Data," filed September 11, 2018; U.S. patent No. 10,925,598, entitled "Robotically-Assisted Surgical Suturing Systems," filed September 11, 2018; U.S. patent publication No. 2020/0015901, entitled "Safety Logic For Surgical Suturing Systems," filed September 11, 2018; U.S. patent publication No. 2020/0015914, entitled "Robotic Systems With Separate Photoacoustic Receivers," filed September 11, 2018; U.S. patent publication No. 2020/0015902, entitled "Force Sensor Through Structured Light Deflection," filed September 11, 2018; U.S. patent publication No. 2019/0201136, entitled "Method Of Hub Communication," filed December 4, 2018; U.S. patent application Ser. No. 16/729,772, entitled "Analyzing Surgical Trends By A Surgical System," filed December 30, 2019; U.S. patent application Ser. No. 16/729,747, entitled "Dynamic Surgical Visualization Systems," filed December 30, 2019; U.S. patent application Ser. No. 16/729,744, entitled "Visualization Systems Using Structured Light," filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue," filed December 30, 2019; U.S. patent application Ser. No. 16/729,729, entitled "Surgical Systems For Proposing And Corroborating Organ Portion Removals," filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ," filed December 30, 2019; U.S. patent application Ser. No. 16/729,751, entitled "Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto," filed December 30, 2019; U.S. patent application Ser. No. 16/729,740, entitled "Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data," filed December 30, 2019; U.S. patent application Ser. No. 16/729,737, entitled "Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics," filed December 30, 2019; U.S. patent application Ser. No. 16/729,796, entitled "Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics," filed December 30, 2019; U.S. patent application Ser. No. 16/729,803, entitled "Adaptive Visualization By A Surgical System," filed December 30, 2019; U.S. patent application Ser. No. 16/729,807, entitled "Method Of Using Imaging Devices In Surgery," filed December 30, 2019; U.S. provisional patent application No. 63/249,652, entitled "Surgical Devices, Systems, and Methods Using Fiducial Identification And Tracking," filed September 29, 2021; U.S. provisional patent application No. 63/249,658, entitled "Surgical Devices, Systems, and Methods for Control of One Visualization with Another," filed September 29, 2021; U.S. provisional patent application No. 63/249,870, entitled "Methods And Systems for Controlling Cooperative Surgical Instruments," filed September 29, 2021; U.S. provisional patent application No. 63/249,881, entitled "Methods and Systems for Controlling Cooperative Surgical Instruments with Variable Surgical Site Access Trajectories," filed September 29, 2021; U.S. provisional patent application No. 63/249,877, entitled "Methods And Systems for Controlling Cooperative Surgical Instruments," filed September 29, 2021; and U.S. provisional patent application No. 63/249,980, entitled "Cooperative Access," filed September 29, 2021, which are hereby incorporated by reference in their entirety.
Surgical hub
The various visualization or imaging systems described herein may be incorporated into a system that includes a surgical hub. Generally, the surgical hub can be a component of an integrated digital medical system capable of spanning multiple medical facilities and configured to provide integrated comprehensive improved medical care to a large number of patients. The integrated digital medical system includes a cloud-based medical analysis system configured to be capable of interconnection to a plurality of surgical hubs located across a number of different medical facilities. The surgical hub is configured to be interconnectable with one or more elements, such as one or more surgical instruments for performing a medical procedure on a patient and/or one or more visualization systems used during performance of the medical procedure. Surgical hubs provide a wide variety of functions to improve the outcome of medical procedures. Data generated by various surgical devices, visualization systems, and surgical hubs about patients and medical procedures may be transmitted to a cloud-based medical analysis system. This data can then be aggregated with similar data collected from many other surgical hubs, visualization systems, and surgical instruments located at other medical facilities. Various patterns and correlations may be discovered by analyzing the collected data via a cloud-based analysis system. As a result, improvements in the techniques used to generate the data may be generated, and these improvements may then be propagated to various surgical hubs, visualization systems, and surgical instruments. Due to the interconnection of all of the foregoing components, improvements in medical procedures and practices may be found that would otherwise not be found if many of the components were not so interconnected.
Examples of surgical hubs configured to receive, analyze, and output data, and methods of using such surgical hubs, are further described in the following patents: U.S. patent publication No. 2019/0200844, entitled "Method Of Hub Communication, Processing, Storage And Display," filed December 4, 2018; U.S. patent publication No. 2019/0200981, entitled "Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws," filed December 4, 2018; U.S. patent publication No. 2019/0201046, entitled "Method For Controlling Smart Energy Devices," filed December 4, 2018; U.S. patent publication No. 2019/0201114, entitled "Adaptive Control Program Updates For Surgical Hubs," filed March 29, 2018; U.S. patent publication No. 2019/0201140, entitled "Surgical Hub Situational Awareness," filed March 29, 2018; U.S. patent publication No. 2019/0206004, entitled "Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities," filed March 29, 2018; U.S. patent publication No. 2019/0206555, entitled "Cloud-based Medical Analytics For Customization And Recommendations To A User," filed March 29, 2018; and U.S. patent publication No. 2019/0207857, entitled "Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs," filed November 6, 2018, which are hereby incorporated by reference in their entirety.
Fig. 18 illustrates one embodiment of a computer-implemented interactive surgical system 700 that includes one or more surgical systems 702 and a cloud-based system (e.g., a cloud 704 that may include a remote server 713 coupled to a storage device 705). Each surgical system 702 includes at least one surgical hub 706 in communication with the cloud 704. In one example, as shown in fig. 18, the surgical system 702 includes a visualization system 708, a robotic system 710, and an intelligent (or "smart") surgical instrument 712, which are configured to communicate with one another and/or with the hub 706. The intelligent surgical instrument 712 can include an imaging device. The surgical system 702 may include M hubs 706, N visualization systems 708, O robotic systems 710, and P intelligent surgical instruments 712, where M, N, O, and P are each integers greater than or equal to one that may or may not be equal to one another. Various exemplary intelligent surgical instruments and robotic systems are described herein.
The data collected by the surgical hub from the surgical visualization system may be used in any of a variety of ways. In an exemplary embodiment, the surgical hub may receive data from a surgical visualization system used with a patient in a surgical environment (e.g., used in an operating room during performance of a surgical procedure). The surgical hub may use the received data in any of one or more ways, as discussed herein.
The surgical hub may be configured to analyze the received data in real time with use of the surgical visualization system and to adjust control of one or more of the surgical visualization system and/or one or more intelligent surgical instruments in use with the patient based on the analysis of the received data. Such adjustments may include, for example, adjusting one or more operational control parameters of an intelligent surgical instrument, causing one or more sensors of an intelligent surgical instrument to take a measurement to help gain an understanding of the patient's current physiological condition and/or the current operational state of the intelligent surgical instrument, and other adjustments. Control and regulation of the operation of intelligent surgical instruments is discussed further below. Examples of operational control parameters of an intelligent surgical instrument include motor speed, cutting element speed, time, duration, level of energy application, and light emission. Examples of surgical hubs and of controlling and adjusting intelligent surgical instrument operation are further described in the previously mentioned U.S. patent application Ser. No. 16/729,772, entitled "Analyzing Surgical Trends By A Surgical System," filed December 30, 2019; U.S. patent application Ser. No. 16/729,747, entitled "Dynamic Surgical Visualization Systems," filed December 30, 2019; U.S. patent application Ser. No. 16/729,744, entitled "Visualization Systems Using Structured Light," filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue," filed December 30, 2019; U.S. patent application Ser. No. 16/729,729, entitled "Surgical Systems For Proposing And Corroborating Organ Portion Removals," filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ," filed December 30, 2019; U.S. patent application Ser. No. 16/729,751, entitled "Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto," filed December 30, 2019; U.S. patent application Ser. No. 16/729,740, entitled "Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data," filed December 30, 2019; U.S. patent application Ser. No. 16/729,737, entitled "Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics," filed December 30, 2019; U.S. patent application Ser. No. 16/729,796, entitled "Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics," filed December 30, 2019; U.S. patent application Ser. No. 16/729,803, entitled "Adaptive Visualization By A Surgical System," filed December 30, 2019; U.S. patent application Ser. No. 16/729,807, entitled "Method Of Using Imaging Devices In Surgery," filed December 30, 2019; U.S. patent application Ser. No. 17/068,857, entitled "Adaptive Responses From Smart Packaging Of Drug Delivery Absorbable Adjuncts," filed October 13, 2020; U.S. patent application Ser. No. 17/068,858, entitled "Drug Administration Devices That Communicate With Surgical Hubs," filed October 13, 2020; U.S. patent application Ser. No. 17/068,859, entitled "Controlling Operation Of Drug Administration Devices Using Surgical Hubs," filed October 13, 2020; U.S. patent application Ser. No. 17/068,863, entitled "Patient Monitoring Using Drug Administration Devices," filed October 13, 2020; U.S. patent application Ser. No. 17/068,865, entitled "Monitoring And Communicating Information Using Drug Administration Devices," filed October 13, 2020; and U.S. patent application Ser. No. 17/068,867, entitled "Aggregating And Analyzing Drug Administration Data," filed October 13, 2020, which are hereby incorporated by reference in their entirety.
The surgical hub may be configured to enable visualization of the received data to be provided on a display in the surgical environment such that a practitioner in the surgical environment may view the data and thereby gain an understanding of the operation of the imaging device being used in the surgical environment. Such information provided via visualization may include text and/or images.
Fig. 19 illustrates one embodiment of a surgical system 802 that includes a surgical hub 806 (e.g., the surgical hub 706 of fig. 18 or other surgical hubs described herein), a robotic surgical system 810 (e.g., the robotic surgical system 110 of fig. 1 or other robotic surgical systems described herein), and a visualization system 808 (e.g., the visualization system 100 of fig. 1 or other visualization systems described herein). As discussed herein, the surgical hub 806 may be in communication with the cloud. Fig. 19 shows the surgical system 802 being used to perform a surgical procedure on a patient lying on an operating table 814 in an operating room 816. The robotic system 810 includes a surgeon's console 818, a patient side cart 820 (surgical robot), and a robotic system surgical hub 822. The robotic system surgical hub 822 is generally configured similar to the surgical hub 806 and may communicate with the cloud. In some embodiments, the robotic system surgical hub 822 and the surgical hub 806 may be combined. The patient side cart 820 may manipulate the intelligent surgical tool 812 through a minimally invasive incision in the patient's body while a medical practitioner (e.g., a surgeon, nurse, and/or other medical practitioner) views the surgical site through the surgeon's console 818. An image of the surgical site may be obtained by an imaging device 824 (e.g., the imaging device 120 of fig. 1 or other imaging devices described herein), which may be maneuvered by the patient side cart 820 to orient the imaging device 824. The robotic system surgical hub 822 may be used to process the image of the surgical site for subsequent display to the surgeon via the surgeon's console 818.
The main display 819 is positioned in the sterile field of the operating room 816 and is configured to be visible to an operator at the operating table 814. In addition, as in the illustrated embodiment, the visualization tower 818 may be positioned outside the sterile field. The visualization tower 818 includes a first non-sterile display 807 and a second non-sterile display 809 facing away from each other. The visualization system 808, guided by the surgical hub 806, is configured to utilize the displays 807, 809, 819 to coordinate the flow of information to medical practitioners inside and outside the sterile field. For example, the surgical hub 806 can cause the visualization system 808 to display a snapshot and/or video of the surgical site, as obtained by the imaging device 824, on one or both of the non-sterile displays 807, 809 while maintaining a real-time feed of the surgical site on the main display 819. The snapshot and/or video on the non-sterile displays 807, 809 may allow a non-sterile practitioner to perform diagnostic steps related to the surgical procedure, for example.
The surgical hub 806 is configured to route diagnostic inputs or feedback entered by the non-sterile practitioner at the visualization tower 818 to a main display 819 within the sterile field that can be viewed by the sterile practitioner at the operating table 814. For example, the input may be in the form of modifications to the snapshots and/or videos displayed on the non-sterile displays 807 and/or 809, which may be routed through the surgical hub 806 to the main display 819.
The surgical hub 806 is configured to coordinate the flow of information to a display of the intelligent surgical instrument 812, as described in various U.S. patent applications incorporated by reference in this disclosure. Diagnostic inputs or feedback entered by a non-sterile practitioner at the visualization tower 818 may be routed by the surgical hub 806 to the main display 819 within the sterile field, where it may be viewed by the operator of the surgical instrument 812 and/or by other medical practitioners in the sterile field.
The intelligent surgical instrument 812 and the imaging device 824 (which is also an intelligent surgical tool) are used with the patient during the surgical procedure as part of the surgical system 802. Other intelligent surgical instruments 812a that may be used in the surgical procedure, e.g., removably coupled to the patient side cart 820 and in communication with the robotic surgical system 810 and the surgical hub 806, are also shown in fig. 19 as being available. Non-intelligent (or "dumb") surgical instruments 817 (e.g., scissors, trocars, cannulas, scalpels, etc.) that are not capable of communicating with the robotic surgical system 810 and the surgical hub 806 are also shown in fig. 19 as being available.
Operating an intelligent surgical instrument
The smart surgical device may have an algorithm stored thereon (e.g., in a memory thereof) configured to be executable on the smart surgical device, such as by a processor thereof, to control operation of the smart surgical device. In some embodiments, the algorithm may be stored on a surgical hub configured to communicate with the intelligent surgical device, such as in a memory thereof, in addition to or in lieu of being stored on the intelligent surgical device.
The algorithm is stored in the form of one or more sets of multiple data points defining and/or representing instructions, notifications, signals, etc. that control functions of the intelligent surgical device. In some embodiments, data collected by the intelligent surgical device may be used by the intelligent surgical device, e.g., by its processor, to change at least one variable parameter of the algorithm. As discussed above, the surgical hub may be in communication with the intelligent surgical device, so that data collected by the intelligent surgical device, and/or data collected by another device in communication with the surgical hub, may be transmitted to the surgical hub, and data may be transmitted from the surgical hub to the intelligent surgical device. Thus, instead of or in addition to the intelligent surgical device being configured to change a stored variable parameter, the surgical hub may be configured to communicate the changed at least one variable parameter to the intelligent surgical device, alone or as part of an algorithm, and/or the surgical hub may communicate an instruction to the intelligent surgical device to change the at least one variable parameter as determined by the surgical hub.
The at least one variable parameter is among the algorithm's data points, e.g., included in instructions for operating the intelligent surgical device, and thus each variable parameter can be changed by changing one or more of the stored plurality of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm proceeds in accordance with the changed algorithm. Thus, by taking into account the actual condition of the patient and the actual condition and/or outcome of the surgical procedure in which the intelligent surgical device is being used, the operation of the intelligent surgical device over time may be managed for the patient to improve the beneficial use of the intelligent surgical device. The change to the at least one variable parameter is automatic, to improve patient outcomes. Accordingly, the intelligent surgical device may be configured to provide personalized medicine based on the patient and the patient's surrounding conditions, thereby providing a smart system. In a surgical environment in which the intelligent surgical device is used during performance of a surgical procedure, the automatic change of the at least one variable parameter may allow the intelligent surgical device to be controlled based on data collected during the performance of the surgical procedure, which may help ensure that the intelligent surgical device is used efficiently and correctly and/or may help reduce the chance of patient harm, e.g., by avoiding injury to critical anatomy.
The at least one variable parameter may be any of a number of different parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, and the like.
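For illustration only, the following is a minimal sketch (in Python) of how an algorithm with variable parameters among its stored data points might be represented; the parameter names, default values, and behavior are hypothetical assumptions for this example and are not taken from this disclosure.

from dataclasses import dataclass

@dataclass
class InstrumentAlgorithm:
    # Variable parameters among the algorithm's stored data points
    # (hypothetical names and defaults).
    motor_speed_rpm: float = 1200.0
    load_threshold_n: float = 45.0
    jaw_closure_rate_mm_s: float = 2.0

    def set_parameter(self, name: str, value: float) -> None:
        # Changing a variable parameter changes the stored data point,
        # so subsequent executions proceed per the changed algorithm.
        if name not in vars(self):
            raise KeyError(f"unknown variable parameter: {name}")
        setattr(self, name, value)

    def execute(self) -> dict:
        # Placeholder for driving the instrument; a real device would
        # command its motor, energy source, etc. from these values.
        return vars(self).copy()

In use, a hub or the instrument's own processor could call set_parameter("motor_speed_rpm", 900.0), after which execute() proceeds according to the changed value, mirroring the behavior described above.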
Fig. 20 illustrates one embodiment of a smart surgical instrument 900 that includes a memory 902 having an algorithm 904 stored therein that includes at least one variable parameter. Algorithm 904 may be a single algorithm or may include multiple algorithms, e.g., separate algorithms for different aspects of the operation of the surgical instrument, where each algorithm includes at least one variable parameter. The intelligent surgical instrument 900 may be the surgical device 102 of fig. 1, the imaging device 120 of fig. 1, the surgical device 202 of fig. 8, the imaging device 220 of fig. 8, the surgical device 402 of fig. 15, the surgical device 502a of fig. 17, the surgical device 502b of fig. 17, the surgical device 712 of fig. 18, the surgical device 812 of fig. 19, the imaging device 824 of fig. 19, or other intelligent surgical instrument. The surgical instrument 900 further includes a processor 906 configured to execute an algorithm 904 to control operation of at least one aspect of the surgical instrument 900. To execute the algorithm 904, the processor 906 is configured to run a program stored in the memory 902 to access a plurality of data points of the algorithm 904 in the memory 902.
The surgical instrument 900 also includes a communication interface 908 (e.g., a wireless transceiver or other wired or wireless communication interface) configured to communicate with another device, such as a surgical hub 910. The communication interface 908 may be configured to allow one-way communication, such as providing data to a remote server (e.g., a cloud server or other server) and/or to a local surgical hub server, and/or receiving instructions or commands from a remote server and/or a local surgical hub server, or two-way communication, such as providing information, messages, data, etc. about the surgical instrument 900 and/or data stored thereon, and receiving instructions, such as instructions from a physician, a software update from a remote server, a software update from a local surgical hub server, etc.
The surgical instrument 900 is simplified in fig. 20 and may include additional components such as a bus system, a handle, an elongate shaft with an end effector at its distal end, a power source, and the like. The processor 906 may also be configured to execute instructions stored in the memory 902 to generally control the apparatus 900, including other electronic components thereof, such as the communication interface 908, audio speakers, user interface, etc.
The processor 906 is configured to be capable of changing at least one variable parameter of the algorithm 904 such that subsequent execution of the algorithm 904 will occur in accordance with the changed at least one variable parameter. To change at least one variable parameter of the algorithm 904, the processor 906 is configured to be able to modify or update the data points of the at least one variable parameter in the memory 902. The processor 906 may be configured to change at least one variable parameter of the algorithm 904 in real-time during performance of a surgical procedure using the surgical device 900, thereby adapting to real-time conditions.
In addition to or in lieu of the processor 906 changing at least one variable parameter, the processor 906 may be configured to change the algorithm 904 and/or at least one variable parameter of the algorithm 904 in response to instructions received from the surgical hub 910. In some embodiments, the processor 906 is configured to change at least one variable parameter only after communicating with the surgical hub 910 and receiving instructions from the surgical hub, which may help ensure coordinated actions of the surgical instrument 900 with other aspects of the surgical procedure in which the surgical instrument 900 is being used.
In an exemplary embodiment, the processor 906 executes the algorithm 904 to control the operation of the surgical instrument 900, alters at least one variable parameter of the algorithm 904 based on real-time data, and executes the algorithm 904 to control the operation of the surgical instrument 900 after altering the at least one variable parameter.
Fig. 21 illustrates one embodiment of a method 912 of using the surgical instrument 900 that includes changing at least one variable parameter of the algorithm 904. The processor 906 controls 914 operation of the surgical instrument 900 by executing the algorithm 904 stored in the memory 902. Based on any subsequently known and/or subsequently collected data, the processor 906 changes 916 at least one variable parameter of the algorithm 904, as discussed above. After changing the at least one variable parameter, the processor 906 controls 918 operation of the surgical instrument 900 by executing the algorithm 904, now with the at least one variable parameter changed. The processor 906 may change 916 the at least one variable parameter any number of times during performance of a surgical procedure, e.g., zero, one, two, three, or more times. During any portion of the method 912, the surgical instrument 900 can communicate with one or more computer systems (e.g., the surgical hub 910, a remote server such as a cloud server, etc.) using the communication interface 908 to provide data thereto and/or receive instructions therefrom.
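A hedged sketch of the flow of the method 912 follows; the dictionary-based parameter store, the closure-force readings, and the halving rule are illustrative assumptions only, not the method defined in this disclosure.

def execute(algorithm: dict) -> None:
    # Placeholder for steps 914/918: control the instrument per the
    # algorithm's current data points.
    print("executing with", algorithm)

def method_912(algorithm: dict, closure_force_readings_n: list) -> None:
    execute(algorithm)  # step 914: control per the stored algorithm
    for force_n in closure_force_readings_n:
        if force_n > algorithm["load_threshold_n"]:
            # step 916: change a variable parameter (hypothetical rule)
            algorithm["motor_speed_rpm"] *= 0.5
            execute(algorithm)  # step 918: control per the changed algorithm

method_912({"motor_speed_rpm": 1200.0, "load_threshold_n": 45.0}, [30.0, 52.0])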
Situational awareness
The operation of an intelligent surgical instrument may be altered based on situational awareness of the patient. The operation of the intelligent surgical instrument may be altered manually, such as by a user of the instrument handling it differently, providing different inputs to it, ceasing use of it, and so forth. Additionally or alternatively, the operation of an intelligent surgical instrument may be changed automatically by changing an algorithm of the instrument (e.g., by changing at least one variable parameter of the algorithm). As described above, the algorithm may be adjusted automatically without user input requesting the change. Automating the adjustment during performance of a surgical procedure may help save time, may allow practitioners to focus on other aspects of the surgical procedure, and/or may simplify the practitioner's use of the surgical instrument, each of which may improve patient outcomes, such as by avoiding critical structures and by controlling the surgical instrument with the type of tissue the instrument is being used on and/or near taken into account.
The visualization systems described herein may be used as part of a situational awareness system that may be embodied in or executed by a surgical hub (e.g., the surgical hub 706, the surgical hub 806, or other surgical hubs described herein). In particular, characterizing, identifying, and/or visualizing surgical instruments (including their positions, orientations, and motions), tissues, structures, users, and/or other things located within the surgical field or the operating room may provide contextual data that can be utilized by the situational awareness system to infer various information, such as the type of surgical procedure or a step thereof being performed, the type of tissue and/or structure that the surgeon or other practitioner is manipulating, and other information. The situational awareness system may then utilize the contextual data to provide alerts to a user, suggest subsequent steps or actions for the user to undertake, prepare surgical devices for use (e.g., activate an electrosurgical generator in anticipation of an electrosurgical instrument being used in a subsequent step of the surgical procedure, etc.), control intelligent surgical instruments (e.g., customize surgical instrument operational parameters of an algorithm, as discussed further below), and the like.
While an intelligent surgical device including an algorithm that responds to sensed data (e.g., by changing at least one variable parameter of the algorithm) may be an improvement over a "dumb" device that operates without regard to sensed data, some sensed data may be incomplete or inconclusive when considered in isolation, e.g., in the absence of the type of surgical procedure being performed or the type of tissue being operated on. Without the surgical context (e.g., knowledge of the type of tissue being operated on or the type of procedure being performed), the algorithm may control the surgical device incorrectly or suboptimally given the particular context-free sensed data. For example, the optimal manner in which an algorithm controls a surgical instrument in response to a particular sensed parameter may vary depending on the particular tissue type being operated on. This is because different tissue types have different properties (e.g., resistance to tearing, ease of being cut) and thus respond differently to actions taken by the surgical instrument. It may therefore be desirable for the surgical instrument to take different actions even when the same measurement of a particular parameter is sensed. As one example, the optimal way to control a surgical stapler in response to the stapler sensing an unexpectedly high force to close its end effector will vary depending on whether the tissue type is prone or resistant to tearing. For tissue that is prone to tearing, such as lung tissue, the instrument's control algorithm would optimally slow the motor in response to an unexpectedly high force to close, to avoid tearing the tissue (e.g., change a variable parameter controlling motor speed or torque to make the motor slower). For tissue that is resistant to tearing, such as stomach tissue, the instrument's algorithm would optimally speed up the motor in response to an unexpectedly high force to close, to ensure that the end effector is properly clamped on the tissue (e.g., change a variable parameter controlling motor speed or torque to make the motor faster). Without knowing whether lung tissue or stomach tissue has been clamped, the algorithm may change suboptimally or not at all.
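The tissue-dependent response described above might be sketched as follows; the tissue names, the expected-force baseline, and the scaling factors are assumptions for illustration, not values from this disclosure.

def respond_to_high_closure_force(params: dict, tissue_type: str,
                                  closure_force_n: float,
                                  expected_force_n: float = 30.0) -> dict:
    # Same sensed measurement, different optimal action per tissue type.
    if closure_force_n > expected_force_n:
        if tissue_type == "lung":       # prone to tearing: slow the motor
            params["motor_speed_rpm"] *= 0.7
        elif tissue_type == "stomach":  # tear resistant: speed up to clamp
            params["motor_speed_rpm"] *= 1.3
        # Unknown tissue type: no principled change can be made.
    return params

print(respond_to_high_closure_force({"motor_speed_rpm": 1000.0}, "lung", 40.0))
print(respond_to_high_closure_force({"motor_speed_rpm": 1000.0}, "stomach", 40.0))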
The surgical hub may be configured to derive information about the surgical procedure being performed based on data received from the various data sources, and then control the modular device accordingly. In other words, the surgical hub may be configured to infer information about the surgical procedure from the received data and then control a modular device operatively coupled to the surgical hub based on the inferred surgical context. The modular device may include any surgical device controllable by a situational awareness system, such as a visualization system device (e.g., camera, display screen, etc.), smart surgical instrument (e.g., ultrasonic surgical instrument, electrosurgical instrument, surgical stapler, smoke extractor, endoscope, etc.). The modular device may include a sensor configured to be able to detect a parameter associated with a patient in which the device is being used and/or a parameter associated with the modular device itself.
The contextual information derived or inferred from the received data may include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure the surgeon (or other practitioner) is performing, the type of tissue being operated on, or the body cavity that is the subject of the surgical procedure. The situational awareness system of the surgical hub may be configured to derive the contextual information from the data received from the data sources in a number of different ways. In an exemplary embodiment, the contextual information received by the situational awareness system of the surgical hub is associated with a particular control adjustment or set of control adjustments for one or more modular devices. The control adjustments each correspond to a variable parameter. In one example, the situational awareness system includes a pattern recognition system or machine learning system (e.g., an artificial neural network) that has been trained on training data to correlate various inputs (e.g., data from databases, patient monitoring devices, and/or modular devices) with corresponding contextual information about a surgical procedure. In other words, the machine learning system may be trained to accurately derive contextual information about a surgical procedure from the provided inputs. In another example, the situational awareness system may include a look-up table storing pre-characterized contextual information about a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the look-up table may return the corresponding contextual information that the situational awareness system uses to control at least one modular device. In another example, the situational awareness system includes an additional machine learning system, look-up table, or other such system that generates or retrieves one or more control adjustments for one or more modular devices when provided contextual information as input.
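The look-up-table variant might be sketched as follows; the input tuples, context labels, and control adjustments are hypothetical placeholders used only to illustrate the query pattern.

# Hypothetical pre-characterized tables: inputs -> context -> adjustments.
CONTEXT_TABLE = {
    ("thoracic_access", "insufflation_off"): "lung_resection",
    ("abdominal_access", "insufflation_on"): "gastric_procedure",
}
ADJUSTMENT_TABLE = {
    "lung_resection": {"stapler.motor_speed_rpm": 900.0},
    "gastric_procedure": {"stapler.motor_speed_rpm": 1500.0},
}

def infer_context(inputs: tuple):
    # Query: one or more inputs -> pre-characterized contextual information.
    return CONTEXT_TABLE.get(inputs)

def control_adjustments(context) -> dict:
    # Context -> control adjustments for one or more modular devices.
    return ADJUSTMENT_TABLE.get(context, {})

ctx = infer_context(("thoracic_access", "insufflation_off"))
print(ctx, control_adjustments(ctx))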
A surgical hub that includes a situational awareness system may provide any number of benefits to a surgical system. One benefit includes improved interpretation of sensed and collected data, which in turn improves the accuracy of processing and/or using the data during a surgical procedure. Another benefit is that the situational awareness system of the surgical hub may improve surgical outcomes by allowing surgical instruments (and other modular devices) to be adjusted for the particular context of each surgical procedure (such as adjusting for different tissue types) and by verifying actions during the surgical procedure. Yet another benefit is that the situational awareness system may improve the efficiency of the surgeon and/or other practitioners performing a surgical procedure by automatically suggesting next steps, providing data, and adjusting displays and other modular devices in the operating room according to the particular context of the procedure. Another benefit includes proactively and automatically controlling modular devices according to the particular step of the surgical procedure being performed, to reduce the number of times practitioners must interact with or control the surgical system during the course of the surgical procedure, such as the situationally aware surgical hub proactively activating the generator to which an RF electrosurgical instrument is connected upon determining that a subsequent step of the surgical procedure requires use of the RF electrosurgical instrument. Proactively activating the energy source allows the instrument to be ready for use as soon as the preceding step of the procedure is completed.
For example, a situation-aware surgical hub may be configured to be able to determine the type of tissue being operated on. Thus, upon detecting an unexpectedly high force for closing an end effector of a surgical instrument, the situation-aware surgical hub may be configured to properly accelerate or decelerate a motor of the surgical instrument for a tissue type, for example, by changing or causing a change in at least one variable parameter of an algorithm of the surgical instrument regarding motor speed or torque.
For another example, the type of tissue being operated on may affect the adjustments made to the compression rate and load threshold of a surgical stapler for a particular tissue gap measurement. The situational awareness surgical hub may be configured to infer whether the surgical procedure being performed is a thoracic or abdominal procedure, allowing the hub to determine whether the tissue clamped by the end effector of the surgical stapler is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub may then be configured to cause the compression rate and load threshold of the surgical stapler to be adjusted appropriately for the tissue type, e.g., by changing or causing a change in at least one variable parameter of the surgical stapler's algorithm regarding compression rate and load threshold.
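As a non-authoritative sketch, this thoracic-versus-abdominal stapler adjustment could be expressed as follows; the rates, thresholds, and gap scaling are illustrative assumptions only.

def stapler_adjustments(procedure_type: str, tissue_gap_mm: float) -> dict:
    # Hypothetical per-context baselines: gentler compression for lung
    # (thoracic), firmer clamping for stomach (abdominal).
    base = {"thoracic": (1.0, 30.0), "abdominal": (3.0, 60.0)}
    if procedure_type not in base:
        return {}
    compression_rate, load_threshold = base[procedure_type]
    # Hypothetical: scale the load threshold with the measured tissue gap.
    return {"compression_rate_mm_s": compression_rate,
            "load_threshold_n": load_threshold * (tissue_gap_mm / 2.0)}

print(stapler_adjustments("thoracic", 2.0))   # lung tissue settings
print(stapler_adjustments("abdominal", 2.0))  # stomach tissue settings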
As yet another example, the type of body cavity being operated on during an insufflation procedure may affect the function of a smoke extractor. The situational awareness surgical hub may be configured to determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and to determine the procedure type. Because a given type of procedure is generally performed in a particular body cavity, the surgical hub may be configured to control the motor rate of the smoke extractor appropriately for the body cavity being operated on, e.g., by changing or causing a change in at least one variable parameter of the smoke extractor's algorithm regarding motor rate. The situational awareness surgical hub can thereby provide consistent smoke evacuation for both thoracic and abdominal procedures.
As yet another example, the type of procedure being performed may affect the optimal energy level at which an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument operates. For example, arthroscopic procedures require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is submerged in fluid. The situational awareness surgical hub may be configured to determine whether the surgical procedure is an arthroscopic procedure. The surgical hub may be configured to adjust the RF power level or ultrasonic amplitude of the generator (e.g., adjust the energy level) to compensate for the fluid-filled environment, e.g., by changing or causing a change in at least one variable parameter of an algorithm of the instrument and/or the generator regarding energy level. Relatedly, the type of tissue being operated on may affect the optimal energy level at which an ultrasonic surgical instrument or RF electrosurgical instrument operates. The situational awareness surgical hub may be configured to determine the type of surgical procedure being performed and then to customize the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument, respectively, according to the expected tissue profile of the surgical procedure, e.g., by changing or causing a change in at least one variable parameter of an algorithm of the instrument and/or the generator regarding energy level. Further, the situational awareness surgical hub may be configured to adjust the energy level of the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than on only a procedure-by-procedure basis. The situational awareness surgical hub may be configured to determine the step of the surgical procedure being performed or to be performed subsequently and then to update the control algorithms of the generator and/or the ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the expected tissue type according to the surgical step.
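A sketch of step-wise energy-level selection under these inferences follows; the step names, base levels, and fluid-compensation factor are assumptions for illustration, not values from this disclosure.

# Hypothetical base energy levels per surgical step.
STEP_ENERGY_LEVELS = {"dissection": 3.0, "vessel_sealing": 5.0,
                      "transection": 4.0}

def energy_level_for(step: str, arthroscopic: bool = False) -> float:
    level = STEP_ENERGY_LEVELS.get(step, 3.0)
    # Submerged end effector (arthroscopic, fluid-filled site): scale up.
    return level * 1.5 if arthroscopic else level

for step in ("dissection", "vessel_sealing", "transection"):
    print(step, energy_level_for(step, arthroscopic=True))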
As another example, a situational awareness surgical hub may be configured to determine whether a current or subsequent step of a surgical procedure requires a different view or magnification on a display according to features that a surgeon and/or other practitioner expects to view at a surgical site. The surgical hub is configured to actively change the displayed view accordingly (e.g., as provided by an imaging device for a visualization system) such that the display is automatically adjusted throughout the surgical procedure.
As yet another example, the situational awareness surgical hub is configured to be able to determine which step of the surgical procedure is being performed or will be performed subsequently and whether specific data or a comparison between data is required for that step of the surgical procedure. The surgical hub may be configured to automatically invoke the data screen based on the step of the surgical procedure being performed without waiting for the surgeon or other practitioner to request that particular information.
As another example, the situational awareness surgical hub may be configured to determine whether the surgeon and/or other practitioner is making an error or otherwise deviating from the expected course of action during a surgical procedure, e.g., as provided in a preoperative surgical plan. For example, the surgical hub may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding list of steps or order of device usage, and then compare the steps being performed or the devices being used during the surgical procedure against the expected steps or devices determined by the surgical hub for the type of surgical procedure being performed. The surgical hub may be configured to provide an alert (visual, audible, and/or haptic) indicating that an unexpected action is being performed or an unexpected device is being utilized at a particular step in the surgical procedure.
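A minimal sketch of the step-comparison logic described above follows, assuming a hypothetical list of expected steps for a given procedure type.

# Hypothetical expected step sequences retrieved from memory.
EXPECTED_STEPS = {
    "lobectomy": ["access", "mobilization", "vessel_ligation",
                  "bronchus_transection", "closure"],
}

def check_step(procedure_type: str, step_index: int, observed_step: str):
    # Compare the observed step with the expected sequence; a deviation
    # would trigger a visual, audible, and/or haptic alert.
    expected = EXPECTED_STEPS.get(procedure_type, [])
    if step_index >= len(expected) or observed_step != expected[step_index]:
        return f"ALERT: unexpected action '{observed_step}' at step {step_index}"
    return None

print(check_step("lobectomy", 1, "mobilization"))     # None: as planned
print(check_step("lobectomy", 1, "vessel_ligation"))  # deviation alert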
In some cases, operation of a robotic surgical system (such as any of the various robotic surgical systems described herein) may be controlled by a surgical hub based on its situational awareness and/or feedback from its components and/or based on information from a cloud (e.g., cloud 713 of fig. 18).
Embodiments of situational awareness systems and of using situational awareness systems during performance of a surgical procedure are further described in the following previously mentioned U.S. patent applications: U.S. patent application Ser. No. 16/729,772, entitled "Analyzing Surgical Trends By A Surgical System", filed December 30, 2019; U.S. patent application Ser. No. 16/729,747, entitled "Dynamic Surgical Visualization Systems", filed December 30, 2019; U.S. patent application Ser. No. 16/729,744, entitled "Visualization Systems Using Structured Light", filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue", filed December 30, 2019; U.S. patent application Ser. No. 16/729,729, entitled "Surgical Systems For Proposing And Corroborating Organ Portion Removals", filed December 30, 2019; U.S. patent application Ser. No. 16/729,778, entitled "Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ", filed December 30, 2019; U.S. patent application Ser. No. 16/729,751, entitled "Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto", filed December 30, 2019; U.S. patent application Ser. No. 16/729,740, entitled "Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data", filed December 30, 2019; U.S. patent application Ser. No. 16/729,737, entitled "Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics", filed December 30, 2019; U.S. patent application Ser. No. 16/729,796, entitled "Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics", filed December 30, 2019; U.S. patent application Ser. No. 16/729,803, entitled "Adaptive Visualization By A Surgical System", filed December 30, 2019; and U.S. patent application Ser. No. 16/729,807, entitled "Method Of Using Imaging Devices In Surgery", filed December 30, 2019.
Integrated anchoring element
In certain embodiments, a surgical anchoring system is configured to be used for endoluminal access and to enable atraumatic retraction or manipulation of a surgical site to improve access thereto (e.g., for visualization and/or operative purposes). Unlike conventional systems (e.g., systems that use laparoscopically deployed instruments such as graspers to grasp the delicate external tissue surfaces of an organ), the present surgical anchoring systems are designed to manipulate an organ using an anchoring member that not only has a larger surface area than a conventional grasper but is also configured to apply a manipulation force to an internal tissue layer of the organ, which is generally tougher and less delicate than the organ's external tissue layers. This internal manipulation force may increase mobilization of the organ at the treatment site, thereby improving access and movement (e.g., for dissection and resection) without damaging the organ's external tissue layers or reducing blood flow to the treatment site. In some embodiments the organ includes a plurality of natural body lumens (e.g., the bronchioles of a lung), while in other embodiments the organ includes a single natural body lumen (e.g., the colon).
In one exemplary embodiment, a surgical anchoring system may include a surgical instrument (e.g., an endoscope) configured for endoluminal access, the surgical instrument including an outer sleeve defining a working channel therethrough and at least one channel arm configured to extend through the working channel. The at least one channel arm includes at least one anchoring member coupled thereto and configured to be movable between an expanded state and an unexpanded state, and at least one control actuator extending along the at least one channel arm and operably coupled to the at least one anchoring member. The at least one control actuator is further operably coupled to a drive system configured to control movement of the at least one channel arm. The at least one anchoring member may be configured to be at least partially disposable within a natural body lumen such that, when in the expanded state, the at least one anchoring member contacts an inner surface of the natural body lumen and thereby anchors the at least one channel arm to the natural body lumen. Movement of the channel arm may thus selectively manipulate the natural body lumen anchored thereto (e.g., manipulate it internally), and thus the organ associated with the natural body lumen.
In another exemplary embodiment, a surgical anchoring system may include a surgical instrument (e.g., an endoscope) configured for endoluminal access, the surgical instrument including dual-coupled deployable fixation elements. The dual-coupled deployable fixation elements are configured to interact with both a fixed anatomical location and a movable anatomical location to manipulate and reposition an organ. The surgical instrument can include a first deployable fixation element that is deployed at a natural body orifice of the organ, which serves as the fixed anatomical location. The surgical instrument can include a second deployable fixation element that is deployed at a movable anatomical location spaced apart from the fixed anatomical location. The surgical instrument may be configured to manipulate and reposition the organ to improve access and visibility from the opposite side of the organ wall. Because the first deployable fixation element is coupled to the fixed anatomical location, the forces and constraints of the fixed anatomical location may be transferred to the second deployable fixation element to allow induced lateral forces and movements of the organ.
The term "expanding" is intended to mean increasing the size of the anchoring member by a desired amount by mechanical means or fluid pressure. These terms are not intended to mean that the anchor member must be completely or 100% filled with fluid when the anchor member is "expanded" (however, such embodiments are within the scope of the term "filled"). Similarly, the term "unexpanded" does not necessarily mean that the anchoring member is completely empty or at 0 pressure. Some fluid may be present and the anchor member may have a non-zero pressure in the "unexpanded" state. "unexpanded" anchor member is intended to mean that the anchor member is mechanically collapsed to a size smaller than the expanded size, or does not include fluid under the amount or pressure that would be desired after the anchor member is filled.
An exemplary surgical anchoring system may include a variety of features, as described herein and shown in the accompanying drawings. However, those skilled in the art will appreciate that the surgical anchor system may include only some of these features and/or it may include a variety of other features known in the art. The surgical anchoring systems described herein are intended to represent only certain exemplary embodiments. Furthermore, while surgical anchoring systems are shown and described in connection with the lungs and colon, those skilled in the art will appreciate that these surgical anchoring systems may be used in connection with any other suitable natural body cavity or organ.
Lung resection (e.g., lobectomy) is a surgical procedure in which all or part of a lung (e.g., one or more lobes) is resected. Lung resection is performed to treat lung tissue damaged or diseased by lung cancer, emphysema, bronchiectasis, or the like. During a lung resection, the lung or lungs are first deflated, and then one or more incisions are made between the ribs on the patient's side to reach the lung laparoscopically. Instruments, such as graspers and a laparoscope, are inserted through the incisions. Once an infected or damaged area of the lung is identified, the area is cut away from the lung and removed through one of the incisions. The resection site and the one or more incisions may then be closed, for example, with a surgical stapler or sutures.
Because the lung is deflated during the procedure, the lung, or some portion thereof, may need to be mobilized to allow the instruments to reach the surgical site. Such mobilization may be performed by grasping the outer tissue layer of the lung with a grasper and applying force to the lung through the grasper. However, the pleura and parenchyma of the lung are very fragile and can therefore easily tear or rip under the applied force. Additionally, during mobilization, the grasper may cut off the blood supply to one or more areas of the lung.
Fig. 22 illustrates an exemplary embodiment of a surgical anchoring system 2100 configured for endoluminal access into a lung 2010. As will be described in greater detail below, the surgical anchoring system 2100 is used to manipulate the lung 2010 through contact with a natural body lumen (e.g., the first bronchiole 2022) within the lung 2010. For simplicity, certain components of the surgical anchoring system 2100 and the lung 2010 are not shown.
As shown, the lung 2010 includes an outer tissue surface 2012, a trachea 2014, a right bronchus 2016, and bronchioles 2018. The trachea 2014, right bronchus 2016, and bronchioles 2018 are in fluid communication with one another. In addition, the lung 2010 includes an upper lobe 2020, which includes a first bronchiole 2022, and a middle lobe 2023, which includes a second bronchiole 2024. As shown in fig. 22, the lung 2010 is in an expanded state and the surgical anchoring system 2100 is initially inserted into the lung 2010. When operating in the chest cavity, the lung 2010 is collapsed to provide sufficient working space between the chest wall and the lung for laparoscopically deployed instruments to access and manipulate the lung 2010. In use, as described in more detail below, the surgical anchoring system 2100 can manipulate (e.g., mobilize) a portion of the lung 2010.
The surgical anchoring system 2100 includes a surgical instrument 2102 configured for endoluminal access through the trachea 2014 and into the lung 2010. Surgical instruments can have a variety of configurations. For example, in the illustrated embodiment, the surgical instrument 2102 includes an outer sleeve 2103 and first and second channel arms 2106, 2108. Although two channel arms 2106, 2108 are shown, in other embodiments the surgical instrument can include a single channel arm or more than two channel arms. The outer sleeve 2103 is configured to be inserted through the patient's mouth (not shown) and down the trachea 2014. The outer sleeve 2103 includes a working channel 2104 configured to allow the first channel arm 2106 and the second channel arm 2108 to be inserted through the outer sleeve 2103 and into the lung 2010. The first channel arm 2106 and the second channel arm 2108 may be configured to be movable independently of the working channel 2104.
Each of the first and second channel arms 2106, 2108 may include at least one anchoring member coupled thereto and configured to be movable between an expanded state and an unexpanded state. When in the expanded state, the at least one anchoring member is configured to be disposed at least partially within a second natural body lumen that is in communication with the first natural body lumen within which the outer sleeve is partially disposed. In the illustrated embodiment, a first anchoring member 2113 (see fig. 23A, 23B, and 24) is coupled to the first channel arm 2106, and a second anchoring member 2115 (see fig. 26) is coupled to the second channel arm 2108. In addition, as shown in fig. 22 and 23, the first natural body lumen is the right bronchus 2016 and the second natural body lumen is the first bronchiole 2022.
In addition, each of the first and second channel arms 2106, 2108 further includes control actuators and a fluid tube extending along the length of the channel arm and further extending from the proximal end 2103p of the outer sleeve 2103. As shown in fig. 22, the first channel arm 2106 includes three control actuators 2106a, 2106b, 2106c and a first fluid tube 2107. The second channel arm 2108 includes three control actuators 2108a, 2108b, 2108c and a second fluid tube 2109. As described in more detail below, the control actuators of each channel arm are configured to allow manipulation of the lung 2010, and the fluid tubes 2107, 2109 are configured to provide fluid to the first and second anchoring members 2113, 2115 coupled to the first and second channel arms 2106, 2108.
In use, as shown in fig. 22, the outer sleeve 2103 is passed through the patient's mouth (not shown) and into the trachea 2014. With the outer sleeve in place, the anchoring member 2105 is moved to an expanded state in which the anchoring member 2105 at least partially contacts the inner surface 2017 of the right bronchus 2016. By contacting the inner surface 2017, the outer sleeve 2103 is secured to the trachea 2014 and the right bronchus 2016. The first channel arm 2106 passes through the outer sleeve 2103 and the right bronchus 2016 into the first bronchiole 2022 of the upper lobe 2020, and the second channel arm 2108 passes through the right bronchus 2016 into the second bronchiole 2024 of the middle lobe 2023. Once the first and second channel arms 2106, 2108 are properly positioned within the first and second bronchioles 2022, 2024, respectively, the first and second anchoring members 2113, 2115 may be expanded to at least partially contact the inner surfaces of the bronchioles 2022, 2024. For simplicity, the following description refers to the first anchoring member 2113; however, those skilled in the art will appreciate that the discussion also applies to the second anchoring member 2115, which is similar in structure to the first anchoring member 2113, as shown in fig. 26. A detailed partial view of the first anchoring member 2113 is shown in an unexpanded state (fig. 23A) and an expanded state (fig. 23B).
As shown, the first anchoring member 2113 is disposed distally of the distal end 2103d of the outer sleeve 2103 such that the first anchoring member 2113 is positionable within the first bronchiole 2022. The first anchoring member 2113 is configured to be movable between an unexpanded state (fig. 23A) and an expanded state (fig. 23B). The first anchoring member 2113 can have a variety of configurations. For example, in some embodiments the first anchoring member may be an inflatable balloon, while in other embodiments the first anchoring member may be a mechanically expandable stent.
As shown in fig. 23A and 23B, the first anchoring member 2113 includes three bladders 2113a, 2113b, 2113c disposed about the outer surface of the first fluid tube 2107. The bladders 2113a, 2113b, 2113c are separated from one another (e.g., by the control actuators 2106a, 2106b, 2106c disposed between them). The bladders 2113a, 2113b, 2113c are expanded by the ingress of fluid through the first fluid tube 2107, which is in fluid communication with each bladder 2113a, 2113b, 2113c, and are collapsed by the egress of fluid from the bladders 2113a, 2113b, 2113c through the first fluid tube 2107. In some embodiments, each bladder 2113a, 2113b, 2113c extends along the length of the first channel arm 2106. Alternatively, in certain embodiments, the bladders 2113a, 2113b, 2113c are disposed along only a portion of the length of the first channel arm 2106.
Alternatively or additionally, at least one of the first channel arm 2106 and the second channel arm 2108 may include an optical sensor. By way of example, fig. 25 shows a partial view of the distal end 2106d of the first channel arm 2106. As shown, the distal end 2106d of the channel arm 2106 can include a scope 2114 having an optical sensor 2110 disposed thereon. The optical sensor 2110 can be configured to allow a user to determine the position of the first channel arm 2106 within the lung 2010 and to assist the user in positioning the distal end 2106d within a desired bronchiole (such as the first bronchiole 2022). The view from the optical sensor 2110 may be provided to the user (e.g., the surgeon) in real-time, such as on a display (e.g., a monitor, a computer tablet screen, etc.). The scope 2114 may also include a light 2111 and a working and/or fluid channel 2112 configured to permit insertion and removal of a surgical instrument and/or passage of fluid to and from a treatment site within the lung 2010. Those skilled in the art will appreciate that the second channel arm may alternatively or additionally include a scope similar to the scope 2114 of fig. 25.
In some embodiments, the outer sleeve 2103 may include additional elements. For example, as shown in fig. 22, and in greater detail in fig. 25, the outer sleeve 2103 includes an anchoring member 2105 disposed proximate the distal end 2103d of the outer sleeve 2103. In other embodiments, the anchoring member 2105 may be disposed at the distal end 2103d.
Fig. 25 shows a detailed partial view of the distal end 2103d of the outer sleeve 2103 and the channel arms 2106, 2108. As shown, the channel arms 2106, 2108 extend outwardly from the distal end 2103d of the outer sleeve 2103. In some embodiments, the channel arms 2106, 2108 are movable relative to each other and relative to the outer sleeve 2103. As described above, the anchoring member 2105 is disposed on the outer sleeve 2103. The anchoring member 2105 is configured to be movable between an expanded state and an unexpanded state. In the expanded state, the anchoring member 2105 is configured to at least partially contact the inner surface 2017 of the right bronchus 2016. By contacting the inner surface 2017, the outer sleeve 2103 may be secured to the trachea 2014 and the right bronchus 2016. Such fixation may allow steering forces (e.g., torsional forces) to be applied to the lung 2010 through the channel arms 2106, 2108. The lung 2010 may thereby be mobilized relative to the trachea 2014 and the right bronchus 2016.
Increasing the inner surface area with which the anchoring members interact increases the distribution of the forces applied to the lung 2010 and decreases the tissue interaction pressure. The anchoring elements are configured to expand to the inner diameter of the bronchial passage. By expanding to the full inner diameter, and with the channel arms extending from the distal end of the outer sleeve, the surgical anchoring system acts as a scaffolding system within the lung. Moving the outer sleeve and/or the channel arms moves the bronchioles or bronchi, thereby moving the lung. Because the outer sleeve and the anchoring elements are deployed over a large area, the force applied to the lung is less concentrated than when the lung is manipulated with a small grasper from the laparoscopic side. In addition, the cylindrical shape and wall strength of the bronchi make them better suited to instrument interaction for gross lung movement or repositioning without collateral damage to the surrounding, softer and weaker pleura and parenchyma.
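The pressure argument can be made concrete with a simple worked example using illustrative numbers (not values from this disclosure): tissue interaction pressure is applied force divided by contact area, so distributing the same steering force over a much larger expanded anchor surface lowers the pressure on the tissue.

def interaction_pressure_kpa(force_n: float, contact_area_mm2: float) -> float:
    # pressure = force / area; 1 N/mm^2 = 1000 kPa
    return force_n / contact_area_mm2 * 1000.0

force_n = 5.0             # hypothetical steering force
grasper_area_mm2 = 50.0   # hypothetical small grasper jaw contact area
anchor_area_mm2 = 1000.0  # hypothetical expanded intraluminal anchor area

print(interaction_pressure_kpa(force_n, grasper_area_mm2))  # 100.0 kPa
print(interaction_pressure_kpa(force_n, anchor_area_mm2))   # 5.0 kPa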
In an exemplary embodiment, a bifurcated portion of the surgical anchoring system extending from the outer sleeve 2103 down into two separate distal branches may be used to better retain a larger, more triangulated region of the lung 2010. In addition to the channel arms 2106, 2108 extending from the working channel, a portion of the outer sleeve 2103 may also be expandable. Additionally, the outer sleeve 2103 may include radially expandable elements that provide additional contact area within the trachea 2014, allowing the surgical anchoring system 2100 to fully control not only the bending of the lung 2010 but also its torsion, expansion, and/or compression. This would enable the surgical anchoring system 2100 not only to guide the lung to the correct location and orientation within the chest cavity but also to control the shape of the lung so that dissection and/or transection can be performed from the chest side.
The anchoring member 2105 can have a variety of configurations. For example, in some embodiments, the anchoring member 2105 may be an inflatable anchoring balloon. In embodiments in which the anchoring member 2105 is an inflatable anchoring balloon, the anchoring member 2105 is configured to be expanded or collapsed by the ingress or egress of fluid through a fluid tube (not shown) in fluid communication with the anchoring member 2105. The fluid tube extends along the length of the outer sleeve 2103 and may be controlled from outside the patient's body. In other embodiments, the anchoring member 2105 may be a mechanically expandable stent.
With the channel arms 2106, 2108 properly disposed within the bronchioles 2022, 2024, the lung 2010 is collapsed. This results in a significant reduction in the size of the lung relative to its expanded state. The lung 2010 is shown in fig. 26 in a collapsed state, with the previously expanded state represented by the dashed boundary IS. In use, when in the expanded state as shown in fig. 26, the first anchoring member 2113 is configured to at least partially contact the inner surface 2022a of the first bronchiole 2022. This contact secures the first anchoring member 2113 to the first bronchiole 2022 and, thus, to the lung 2010. The first anchoring member 2113 can be alternated between its unexpanded and expanded states by passing fluid into, or removing fluid from, the bladders 2113a, 2113b, 2113c through the fluid tube 2107 extending along the length of the channel arm 2106. The fluid flowing into or out of the bladders 2113a, 2113b, 2113c may be any suitable fluid (e.g., saline, carbon dioxide gas, etc.). The proximal-most end (not shown) of the fluid tube 2107 is configured to be coupled to a fluid system that can be used to control the flow of fluid into or out of the bladders 2113a, 2113b, 2113c. The fluid system may include a pump and a fluid reservoir. The pump creates pressure that pushes fluid into the bladders 2113a, 2113b, 2113c to expand them, and creates suction that pulls fluid out of the bladders 2113a, 2113b, 2113c to collapse them. Those skilled in the art will appreciate that the second anchoring member 2115 may be moved between its expanded and unexpanded states within the second bronchiole 2024 in a manner similar to that described above with respect to the first anchoring member 2113.
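A sketch of the pump-and-reservoir control just described follows; the class, method names, and volumes are hypothetical, intended only to illustrate the expand/collapse cycle.

class FluidSystem:
    # Hypothetical fluid system coupled to the proximal end of the fluid
    # tube; volumes are illustrative only.
    def __init__(self, reservoir_ml: float = 100.0):
        self.reservoir_ml = reservoir_ml
        self.bladder_ml = 0.0

    def expand(self, volume_ml: float) -> None:
        # Pump pressure pushes fluid into the bladders to expand them.
        moved = min(volume_ml, self.reservoir_ml)
        self.reservoir_ml -= moved
        self.bladder_ml += moved

    def collapse(self) -> None:
        # Suction pulls fluid back out to collapse the bladders.
        self.reservoir_ml += self.bladder_ml
        self.bladder_ml = 0.0

fs = FluidSystem()
fs.expand(12.0)  # expanded state: anchoring member contacts the bronchiole
fs.collapse()    # unexpanded state before withdrawal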
In addition, in use, other surgical instruments 2120, 2122 may be introduced laparoscopically into the chest cavity for visualizing and/or manipulating the lung 2010 from the extraluminal space. The surgical instruments 2120, 2122 may include a variety of surgical tools, such as graspers 2123, optical sensors 2124, and/or electrosurgical tools 2125. In an exemplary embodiment in which the surgical instrument 2122 is or includes an optical sensor 2124, a user (e.g., a surgeon) can visually inspect the collapsed lung 2010 (fig. 26) while performing an incision on the lung 2010 using the grasper 2123 or the electrosurgical tool 2125.
Further, in use, when the anchoring members 2113, 2115 are in the expanded state, steering forces may be applied to the lung 2010 through the control actuators 2106a, 2106b, 2106c, 2108a, 2108b, 2108c. In some embodiments, the surgical anchoring system 2100 includes a controller 2050 configured to coordinate movement of the channel arms 2106, 2108 within the bronchioles 2022, 2024 with movement of the at least one instrument 2120, 2122 outside the lung 2010 to prevent tearing of the bronchioles 2022, 2024 or of the external tissue surface 2012 of the lung 2010. The controller 2050 is communicatively coupled to robotic arms (not shown) to which the instruments 2120, 2122 are connected, and to actuators 2052, 2054. The actuator 2052 is configured to apply steering forces F1, F2, F3 to the control actuators 2106a, 2106b, 2106c, and the actuator 2054 is configured to apply steering forces F4, F5, F6 to the control actuators 2108a, 2108b, 2108c.
In use, the steering force F1 is applied to the control actuator 2106a, the steering force F2 is applied to the control actuator 2106b, the steering force F3 is applied to the control actuator 2106c, the steering force F4 is applied to the control actuator 2108a, the steering force F5 is applied to the control actuator 2108b, and the steering force F6 is applied to the control actuator 2108c. By applying the steering forces to the lung 2010, the horizontal fissure between the upper lobe 2020 and the middle lobe 2023 can be enlarged to form a gap G. The gap G allows access into the lung 2010, enabling the horizontal fissure to be opened further. The steering forces cause the channel arms 2106, 2108 to move in opposite directions, causing the upper lobe 2020 to move away from the middle lobe 2023. The anchoring member 2105, in its expanded state within the right bronchus 2016, prevents inadvertent torsion of the lung 2010 while the steering forces are applied. The lung 2010 may thus be steered in a single plane to increase the gap G in an efficient manner. Upon completion of the maneuver, the anchoring members 2113, 2115 are deflated, released from the bronchioles 2022, 2024, and withdrawn through the outer sleeve 2103. The anchoring member 2105 is also deflated, which allows the outer sleeve 2103 to be removed from the trachea 2014, causing little or no damage to the trachea 2014 or the lung 2010 as compared with conventional procedures that mobilize the lung using only graspers.
If the surgeon deploys at least one channel arm 2106, 2108 within a bronchiole and places the grasper 2123 on the laparoscopic side of the lung 2010, the channel arm and the instrument may be driven together, moving in the same direction or in opposite directions. Moving both in the same direction allows supported movement of the section gripped between them. Moving both in opposite directions creates tissue tension, which makes dissection or tissue plane separation easier. Movement may also be coordinated in a coupled or opposing manner while a defined sustainable force is maintained between the channel arm and the instrument, wherein either the channel arm or the instrument acts as the driver of the coupling and the other acts as the follower. In some embodiments, other forms of synchronized motion may include a maximum threshold on the coupling force, position control, and/or speed matching.
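The driver/follower coupling with a force ceiling and speed matching might be sketched as follows; the threshold and velocities are illustrative assumptions, not values from this disclosure.

MAX_COUPLING_FORCE_N = 8.0  # hypothetical sustainable-force ceiling

def follower_velocity(driver_velocity_mm_s: float, coupling_force_n: float,
                      same_direction: bool = True) -> float:
    # Speed matching: the follower mirrors the driver unless the measured
    # coupling force exceeds the ceiling, in which case it stops to avoid
    # tearing tissue.
    if coupling_force_n > MAX_COUPLING_FORCE_N:
        return 0.0
    return driver_velocity_mm_s if same_direction else -driver_velocity_mm_s

print(follower_velocity(2.0, 3.5))                        # supported movement
print(follower_velocity(2.0, 3.5, same_direction=False))  # tissue tension
print(follower_velocity(2.0, 9.0))                        # force limit reached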
Fig. 27 shows a schematic view of a surgical anchor system 2200 disposed within a collapsed lung 2201. The surgical anchoring system 2200 and collapsed lung 2201 may be similar to the surgical anchoring system 2100 in fig. 22 and 26 and the collapsed lung 2010 in fig. 26, except for the differences described in detail below, and thus common features are not described in detail herein.
Surgical anchor system 2200 includes surgical instrument 2202, outer sleeve 2203, anchor member 2205 coupled to outer sleeve 2203, first channel arm 2206, and second channel arm 2208. The first channel arm 2206 includes control actuators 2206a, 2206b, 2206c extending along a length of the channel arm 2206 and configured to provide steering forces to the lung 2201 (e.g., through the first bronchiole 2022). The second channel arm 2208 includes control actuators 2208a, 2208b, 2208c extending along a length of the second channel arm 2208 and configured to provide steering forces to the lung 2201 (e.g., through the second bronchioles 2024).
As shown in fig. 27, the first channel arm 2206 includes anchor members 2213a, 2213b, 2213c, 2213d, 2213e disposed on a clutch actuator 2207 that extends through the first channel arm 2206. Each of the anchor members 2213a, 2213b, 2213c, 2213d, 2213e is configured to be axially movable along the length of the channel arm 2206. The clutch actuator 2207 is configured to selectively position the anchor members 2213a, 2213b, 2213c, 2213d, 2213e at axial locations along the length of the first channel arm 2206. Like the anchor members 2113, the anchor members 2213a, 2213b, 2213c, 2213d, 2213e each include an inflatable balloon that can be mechanically expanded or filled with fluid through a fluid passage extending through the length of the clutch actuator 2207.
In addition, the second channel arm 2208 includes anchor members 2215a, 2215b, 2215c, 2215d, 2215e disposed on the clutch actuator 2209. Each of the anchor members 2215a, 2215b, 2215c, 2215d, 2215e is configured to be axially movable along the length of the second channel arm 2208. The clutch actuator 2209 is configured to selectively position the anchor members 2215a, 2215b, 2215c, 2215d, 2215e at axial locations along the length of the second channel arm 2208. Like the anchor members 2115, the anchor members 2215a, 2215b, 2215c, 2215d, 2215e each include an inflatable balloon that can be mechanically expanded or filled with fluid through a fluid passage extending through the length of the clutch actuator 2209.
In use, the outer sleeve 2203 is inserted and the anchor member 2205 is moved to an expanded state to contact the inner tissue surface of the right bronchus 2016. Prior to collapse of the lung 2201, the first and second channel arms 2206, 2208 are inserted and disposed within the bronchioles 2022, 2024. After the lung 2201 collapses, the anchoring members 2213a, 2213b, 2213c, 2213d, 2213e, 2215a, 2215b, 2215c, 2215d, 2215e move to an expanded state to contact the inner tissue surfaces of the bronchioles 2022, 2024.
With the anchor members 2213a, 2213b, 2213c, 2213d, 2213e, 2215a, 2215b, 2215c, 2215d, 2215e in the expanded state, the clutch actuators 2207, 2209 may be axially displaced relative to the outer sleeve 2203, thereby pushing the clutch actuators 2207, 2209 further into the first and second bronchioles 2022, 2024. In some implementations, the anchor members 2213a, 2213b, 2213c, 2213d, 2213e, 2215a, 2215b, 2215c, 2215d, 2215e can slide along the clutch actuators 2207, 2209 by a prescribed amount before being coupled to the clutch actuators 2207, 2209. This allows a space to form between each of the anchor members 2213a, 2213b, 2213c, 2213d, 2213e, 2215a, 2215b, 2215c, 2215d, 2215e, thereby tensioning the loose tissue around the first bronchiole 2022 and the second bronchiole 2024, as modeled in the sketch below. As the clutch actuators 2207, 2209 are retracted from the lung 2201, the gaps between each of the anchor members 2213a, 2213b, 2213c, 2213d, 2213e, 2215a, 2215b, 2215c, 2215d, 2215e decrease, collapsing the tissue surrounding the first and second bronchioles 2022, 2024.
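The prescribed-slide behavior can be modeled as a simple slack chain: each anchor member slides freely along the clutch actuator for its prescribed amount before coupling and being carried along. The sketch below is a minimal illustration under the assumption that the prescribed slack decreases toward the distal-most anchor member, so that advancing the clutch opens the gaps between anchors; all values are hypothetical.

```python
# Minimal slack-chain model of the prescribed slide. Each anchor member
# moves only after the clutch advance exceeds its prescribed free travel;
# positions and slack values (mm) are hypothetical.
def anchor_positions(initial_positions, slack_per_anchor, clutch_advance):
    positions = []
    for pos, slack in zip(initial_positions, slack_per_anchor):
        moved = max(0.0, clutch_advance - slack)  # travel after coupling
        positions.append(pos + moved)
    return positions

# Proximal-to-distal anchors, with slack decreasing distally.
print(anchor_positions([0, 10, 20, 30, 40], [20, 15, 10, 5, 0], 12))
# -> [0.0, 10.0, 22.0, 37.0, 52.0]: the gaps grow from 10 mm to as much
#    as 15 mm, tensioning the loose tissue between anchors; retracting
#    the clutch reverses the motion and the gaps close again.
```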
In addition to manipulating the lung 2201 using the control actuators of the first channel arm 2206, the first clutch actuator 2207 can also be used to move the anchor members 2213a, 2213b, 2213c, 2213d, 2213e axially along the length of the first channel arm 2206. By axially moving the anchor members 2213a, 2213b, 2213c, 2213d, 2213e, the upper lobe 2020 and the first bronchiole 2022 are partially expanded by the mechanical expansion of the anchor members 2213a, 2213b, 2213c, 2213d, 2213e. As shown, the outer tissue surface 2021 of the upper lobe 2020 at the horizontal fissure is tensioned by the axial expansion of the anchoring members 2213a, 2213b, 2213c, 2213d, 2213e, when compared to the outer tissue surface 2025 of the middle lobe 2023, which is bunched due to the collapsed state of the lung 2201. Axial expansion of the anchoring members 2213a, 2213b, 2213c, 2213d, 2213e also places the upper lobe 2020 in a shape similar to that of the lung 2201 when inflated, as indicated by the expanded state line IS.
Similarly, in addition to manipulating the lung 2201 using the control actuators of the second channel arm 2208, the second clutch actuator 2209 can also be used to move the anchor members 2215a, 2215b, 2215c, 2215d, 2215e axially along the length of the second channel arm 2208. By axially moving the anchor members 2215a, 2215b, 2215c, 2215d, 2215e, the middle lobe 2023 and the second bronchiole 2024 may be partially expanded, similar to the upper lobe 2020. Axial expansion of the anchoring members 2215a, 2215b, 2215c, 2215d, 2215e may also place the middle lobe 2023 in a shape similar to that of the lung 2201 when inflated, as indicated by the expanded state line IS.
In other embodiments, the amount of axial extension of the anchor members may be guided by the user, but with a force limit corresponding to the amount of force that can be applied between two anchor members. In addition, there may be a maximum limit on the amount of displacement between two anchoring members to prevent over-expansion of the organ. In certain embodiments, the anchoring member itself may also be load-limited by controlling the maximum expansion force, thereby limiting the friction it can generate. The anchor members may have integrated sensors that limit the applied force between the anchor members when the anchor members are axially displaced. The radially applied force may be made proportional to the applied longitudinal force, preventing accidental diametral stretch damage even when only small, fine stretching motions are applied. Alternatively or in addition, the surgical anchoring system can be operated in a load/creep control mode, in which a predetermined force is maintained and extension then automatically continues in proportion to creep in the tissue of the organ, as sketched below. This allows the viscoelastic properties of the tissue of the organ to assist, rather than hinder, the expansion of the organ.
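A minimal sketch of such a load/creep control loop is shown below, assuming a proportional extension law and hypothetical force targets, gains, and limits; the actual control scheme is not specified by the disclosure.

```python
# Hypothetical load/creep control step: hold a predetermined inter-anchor
# force and extend only as the tissue relaxes (creeps), up to a hard
# displacement limit that guards against over-expansion of the organ.
TARGET_FORCE_N = 2.0
MAX_DISPLACEMENT_MM = 25.0
GAIN_MM_PER_N = 0.2  # assumed proportional extension gain

def creep_control_step(measured_force_n, displacement_mm, extend):
    """Run one control iteration; returns the updated displacement."""
    error = TARGET_FORCE_N - measured_force_n
    if error > 0 and displacement_mm < MAX_DISPLACEMENT_MM:
        # Force dropped below target because the tissue crept: take up
        # the slack, never exceeding the displacement limit.
        step = min(GAIN_MM_PER_N * error,
                   MAX_DISPLACEMENT_MM - displacement_mm)
        extend(step)
        displacement_mm += step
    return displacement_mm
```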
In certain embodiments, the tissue may be scanned with structured light prior to actuating the axial movement of the anchoring members, thereby providing a 3D surface model of the pre-stretched anatomy of the organ. The scan may be stored and overlaid onto the stretched state of the organ, thereby providing visual information about how the shape of the organ has changed and offering insight into the unseen branches of the organ below the external tissue surface.
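As one hedged illustration of comparing the pre-stretch scan with the stretched state, the sketch below computes a simple nearest-neighbor displacement field between two structured-light point clouds, assuming both are already expressed in a common coordinate frame.

```python
# Nearest-neighbor displacement field between a stored pre-stretch scan
# and the stretched state; point clouds are assumed co-registered.
import numpy as np

def displacement_field(pre_pts, post_pts):
    """Distance (mm) from each pre-stretch point to the nearest point on
    the stretched surface -- a simple proxy for local shape change."""
    d = np.linalg.norm(pre_pts[:, None, :] - post_pts[None, :, :], axis=2)
    return d.min(axis=1)

# Demo: a flat 50 mm x 50 mm patch displaced 5 mm outward.
xs, ys = np.meshgrid(np.arange(0.0, 50.0, 2.0), np.arange(0.0, 50.0, 2.0))
pre = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
post = pre + np.array([0.0, 0.0, 5.0])
print(displacement_field(pre, post).mean())  # 5.0
```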
As noted above, the present surgical anchoring system may be configured to manipulate other natural body cavities or organs. For example, as discussed below, the present surgical anchoring system may be configured to endoscopically manipulate one or more portions of the colon.
Surgery is often the primary treatment for early-stage colon cancer. The type of surgery used depends on the stage (extent) of the cancer, its location in the colon, and the goals of the surgery. Some early colon cancers (stage 0 and some early stage I tumors) and most polyps can be removed during colonoscopy. However, if the cancer has progressed, a partial resection or colectomy, i.e., a surgical operation to resect all or part of the colon, may be required. In some cases, nearby lymph nodes are also removed. If only a portion of the colon is removed, a hemicolectomy or partial colectomy may be performed. In a segmental resection of the colon, the surgeon removes the diseased portion of the colon and a small non-diseased section of the colon on either side. Typically, about one-fourth to one-third of the colon is resected, depending on the size and location of the cancer. The primary resections of the colon are shown in fig. 28, wherein A-B is a right hemicolectomy, A-C is an extended right hemicolectomy, B-C is a transverse colectomy, C-E is a left hemicolectomy, D-E is a sigmoidectomy, D-F is an anterior resection, D-G is an (ultra)low anterior resection, D-H is an abdominoperineal resection, A-D is a subtotal colectomy, A-E is a total colectomy, and A-H is a total proctocolectomy. Once the resection is complete, the remaining intact portions of the colon are reattached.
In laparoscopic-assisted colectomy procedures, it is often difficult to obtain a sufficient surgical field. Typically, the dissection is made deep in the pelvis, which makes it difficult to obtain adequate visualization of this area. Thus, during mobilization, the lower rectum must be lifted and rotated to access the veins and arteries surrounding both sides of the rectum. During manipulation of the lower rectum, bunching of tissue and/or excessive stretching of tissue may occur. Additionally, intrarectal tumors may cause surrounding pelvic adhesions, and thus the rectal stump may need to be released and the mesentery and blood supply mobilized prior to transection and removal of the tumor.
In addition, as shown in fig. 29, multiple graspers 2300, 2302, 2304, 2306 and a laparoscope 2301 are required to position the tumor 2308 for removal from the colon 2310. During dissection of the colon 2310, the tumor 2308 should be kept under tension, which requires grasping and stretching the healthy tissue 2312, 2314, 2316 surrounding the colon 2310. However, because the graspers 2300, 2302, 2304, 2306 exert high grasping forces on the tissues 2312, 2314, 2316, manipulation of the tissues 2312, 2314, 2316 surrounding the tumor 2308 may result in reduced blood flow and trauma. Additionally, during a colectomy, it may be necessary to mobilize the transverse and ascending colon so that the intact residual colon can be brought down to connect to the rectum 2318 after the segment of the colon 2310 containing the tumor 2308 is transected and removed. Surgical instruments that can safely manipulate the colon to provide the surgeon with better visualization of, and access to, arteries and veins during mobilization would help prevent injury to surrounding areas and blood loss during colectomy.
Fig. 30 and 31 illustrate one embodiment of a surgical anchoring system 2400 configured for endoluminal access into and manipulation of a colon 2310. As will be described in greater detail below, the surgical anchoring system 2400 is used to manipulate and tension a portion (e.g., segment F) of the colon 2310. For simplicity, certain components of the surgical anchoring system 2400 and the colon 2310 are not shown. Although the surgical anchoring system 2400 is shown and described in connection with manipulation of segment F of the colon 2310, those skilled in the art will appreciate that the surgical anchoring system 2400 may additionally or alternatively be used to manipulate other segments of the colon 2310.
As shown in fig. 30 and 31, the surgical anchoring system 2400 may have a variety of configurations. In some embodiments, the surgical anchoring system 2400 includes a tubular member 2402 configured for endoluminal access through a natural orifice (such as the rectum 2318) and into the colon 2310. The tubular member 2402 includes a central lumen 2404 disposed therein and configured to receive an endoscope. In addition, the tubular member 2402 includes a plurality of working channels 2406a, 2406b, 2406c, 2408a, 2408b, 2408c, 2410a, 2410b, 2410c extending therethrough. In other embodiments, the tubular member may have other suitable configurations and shapes.
The surgical anchoring system 2400 also includes an anchor assembly 2418 coupled to the tubular member 2402 and extending distally from the distal end 2402d of the tubular member 2402. The anchor assembly 2418 includes a first anchor member 2420 and a second anchor member 2430. The first anchor member 2420 is coupled to the distal end 2402d of the tubular member 2402 and is configured to engage a first anatomical location and fix the first anatomical location relative to the tubular member 2402 (fig. 32). The first anchor member 2420 includes a first plurality of expandable anchor elements 2422 extending between a proximal collar 2424 and a distal collar 2426. The distal collar 2426 is configured to be axially movable relative to the proximal collar 2424 such that, when the distal collar 2426 is moved axially toward the proximal collar 2424, the expandable anchor elements 2422 expand radially outward relative to the axis along which the distal collar 2426 moves. By expanding radially outward, the expandable anchor elements 2422 are configured to at least partially contact an internal tissue surface of a natural body cavity or organ when in an expanded state.
To axially displace the distal collar 2426 toward the proximal collar 2424, a first plurality of actuators 2412 are connected to the distal collar 2426. The first plurality of actuators 2412 includes actuators 2412a, 2412b, 2412c, wherein the actuator 2412a passes through the working channel 2406a, the actuator 2412b passes through the working channel 2406b, and the actuator 2412c passes through the working channel 2406c. As the actuators 2412a, 2412b, 2412c are tensioned and pulled through or rotated within the working channel, the distal collar 2426 is axially displaced toward the proximal collar 2424, expanding the expandable anchor element 2422. To interact the actuators 2412a, 2412b, 2412c with the distal collar 2426, the actuators 2412a, 2412b, 2412c pass through a plurality of working channels (not shown) within the proximal collar 2424.
The second anchor member 2430 is movable relative to the first anchor member 2420 and is positioned distally of the first anchor member 2420 by a distance D1. The second anchor member 2430 is configured to engage a second anatomical location and is movable relative to the first anatomical location (fig. 32). The second anchor member 2430 includes a second plurality of expandable anchor elements 2432 extending between a proximal collar 2434 and a distal collar 2436. The distal collar 2436 is configured to be axially movable relative to the proximal collar 2434 such that, when the distal collar 2436 is moved axially toward the proximal collar 2434, the expandable anchor elements 2432 expand radially outward relative to the axis along which the distal collar 2436 moves. By expanding radially outward, the expandable anchor elements 2432 are configured to at least partially contact an internal tissue surface of a natural body lumen or organ when in an expanded state.
To axially displace the distal collar 2436 toward the proximal collar 2434, a second plurality of actuators 2414 is connected to the distal collar 2436. The second plurality of actuators 2414 includes actuators 2414a, 2414b, 2414c, wherein the actuator 2414a passes through the working channel 2408a, the actuator 2414b passes through the working channel 2408b, and the actuator 2414c passes through the working channel 2408c. As the actuators 2414a, 2414b, 2414c are tensioned and pulled through or rotated within the working channels, the distal collar 2436 is axially displaced toward the proximal collar 2434, expanding the expandable anchor elements 2432. To interact the actuators 2414a, 2414b, 2414c with the distal collar 2436, the actuators 2414a, 2414b, 2414c are passed through the working channel 2438 within the proximal collar 2434 and the plurality of working channels (not shown) within the proximal collar 2424 and the distal collar 2426 of the first anchor member 2420.
As shown in fig. 31 and 32, with the first anchor member 2420 and the second anchor member 2430 in the expanded state, the first anchor member 2420 is engaged with a first anatomical location 2320 and the second anchor member 2430 is engaged with a second anatomical location 2322. To axially displace the second anchor member 2430 relative to the first anchor member 2420, the actuators 2416a, 2416b, 2416c are rotated to unscrew the actuators 2416a, 2416b, 2416c from the threaded sheaths 2417a, 2417b, 2417c. The actuator 2416a is threaded within the threaded sheath 2417a, the actuator 2416b is threaded within the threaded sheath 2417b, and the actuator 2416c is threaded within the threaded sheath 2417c. Because the actuators and the threaded sheaths have complementary threads, rotation of the actuators 2416a, 2416b, 2416c results in an increase in the distance between the first anchor member 2420 and the second anchor member 2430. In some embodiments, one of the actuators 2416a, 2416b, 2416c may be rotated more than the other actuators, thereby creating a bending length D2 between the first anchor member 2420 and the second anchor member 2430 that is greater than D1, as modeled in the sketch below.
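The bend produced by differential rotation can be approximated with elementary geometry: the extension spread across the actuator circle sets the tilt of the distal anchor member. The sketch below assumes a hypothetical thread pitch and actuator circle radius.

```python
# Approximate tilt of the distal anchor member produced by rotating one
# threaded actuator more than the others. Pitch and radius are assumed.
import math

THREAD_PITCH_MM = 1.0     # advance per actuator revolution (assumed)
ACTUATOR_RADIUS_MM = 8.0  # actuators sit on a circle of this radius (assumed)

def bend_angle_deg(revs_a, revs_b, revs_c):
    ext = [r * THREAD_PITCH_MM for r in (revs_a, revs_b, revs_c)]
    spread = max(ext) - min(ext)  # extension difference across the circle
    return math.degrees(math.atan2(spread, 2.0 * ACTUATOR_RADIUS_MM))

print(bend_angle_deg(10, 10, 14))  # ~14 degrees of tilt toward one side
```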
In use, the bending length D2 can be used to create tension on one side of the colon 2310, such as at the location of a tumor. Because the first anatomical location 2320 is engaged with the first anchor member 2420 through the expandable anchor elements 2422 and the second anatomical location 2322 is engaged with the second anchor member 2430 through the expandable anchor elements 2432, the second anatomical location 2322 is selectively repositioned relative to the first anatomical location 2320 when the actuators 2416a, 2416b, 2416c are rotated and unthreaded.
As shown in fig. 32, the tissue wall 2324 is tensioned to a greater extent than the tissue wall 2326 disposed opposite the tissue wall 2324. By tensioning the tissue wall 2324, on which the tumor 2308 is located, the tumor can be visualized and removed by laparoscopically deployed instruments 2332, 2334. In addition, to further aid in visualizing the tumor 2308, an endoscope 2330 is disposed within the central lumen 2404 of the tubular member 2402.
Sensing surgical instruments
During certain surgical procedures, it may be advantageous to be able to track the position and orientation of certain surgical instruments within a patient. For example, during a colectomy, the mobilized portion of the colon must be aligned and connected with the rectum in order to reattach the colon to the rectum. In certain surgical systems, at least one of the surgical instruments may include an integrated tracking and coordination device that identifies the position of the surgical instruments relative to each other.
In some embodiments, the surgical instrument may include one or more markers (e.g., attachable or integrated markers) that may be used to track the surgical instrument. This may allow the surgical instrument to cooperate directly with a dual sensing and cooperative control system. Thus, the surgical instrument can be inserted directly into the body (e.g., into a natural orifice) without the need for a scope (e.g., an endoscope), and used similarly to a scope.
Fig. 33 illustrates another embodiment of a surgical anchor system 2500. The surgical anchor system 2500 includes attachable or integrated markers and sensing means that enable instruments introduced through a natural orifice to cooperate with a dual sensing and cooperative control system without the need for another scope.
The surgical anchor system 2500 includes a laparoscopically deployed instrument 2502 having a sensing array 2504. The sensing array 2504 is configured to wirelessly interact with a first collar 2506 and a second collar 2508 in order to align a circular stapler 2510 disposed within the rectum 2318 with an anvil 2512 disposed within the remainder of the colon 2310. The first collar 2506 is disposed within the circular stapler 2510 and emits a magnetic field 2514. The second collar 2508 is disposed on the anvil 2512 and emits a magnetic field 2516. Both magnetic fields 2514, 2516 can be detected by the sensing array 2504. The magnetic fields 2514, 2516 are configured to relay position and orientation data regarding the circular stapler 2510 and the anvil 2512 in order to align the colon 2310 with the rectum 2318.
The anvil 2512 includes a post 2518 that is grasped by an instrument 2520 in order to mobilize the colon 2310. As the anvil 2512 is moved by the instrument 2520, the sensing array 2504 collects magnetic field data and determines the distance and misalignment between the stapler trocar axis 2522 and the anvil trocar axis 2524, as sketched below. When the stapler trocar axis 2522 is aligned with the anvil trocar axis 2524, the anvil 2512 can be positioned over the post 2526 of the circular stapler 2510. To assist in positioning the post 2518 over the post 2526, the post 2518 may include alignment features 2528. In certain embodiments, once the posts 2518, 2526 are aligned with each other, the circular stapler 2510 can be rotated, thereby coupling the anvil 2512 to the circular stapler 2510 such that the colon 2310 can be stapled to the rectum 2318.
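One way the sensing array's computation could be expressed is sketched below: given tracked poses for the two collars (represented here, by assumption, as an origin plus a unit axis vector in the sensing array's frame), the distance and angular misalignment between the two trocar axes follow directly.

```python
# Distance and angular misalignment between the stapler trocar axis and
# the anvil trocar axis, from tracked collar poses. The pose format
# (origin + unit axis vector in the sensing-array frame) is an assumption.
import numpy as np

def axis_misalignment(stapler_origin, stapler_axis, anvil_origin, anvil_axis):
    """Return (origin separation in mm, axis angle in degrees)."""
    distance = float(np.linalg.norm(anvil_origin - stapler_origin))
    cos_a = float(np.clip(np.dot(stapler_axis, anvil_axis), -1.0, 1.0))
    return distance, float(np.degrees(np.arccos(cos_a)))

d, a = axis_misalignment(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                         np.array([5.0, 2.0, 30.0]), np.array([0.0, 0.1, 0.995]))
print(d, a)  # guide the user until both values approach zero, then dock
```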
The instrument 2502 can include an optical sensor disposed on its distal end to relay images of the treatment area to an external screen visible to the user, thereby helping the user adjust and align the circular stapler from the laparoscopic side to the correct position for anvil attachment.
The surgical anchor systems disclosed herein may be designed to be disposed of after a single use, or may be designed for multiple uses. In either case, however, the surgical anchor system may be reconditioned for reuse after at least one use. Reconditioning may include any combination of disassembly of the surgical anchor system, followed by cleaning or replacement of particular parts, and subsequent reassembly. In particular, the surgical anchor system can be disassembled, and any number of the particular pieces or parts of the surgical anchor system can be selectively replaced or removed in any combination. After cleaning and/or replacement of particular parts, the surgical anchor system may be reassembled for subsequent use either at a reconditioning facility or by a surgical team immediately prior to a surgical procedure. Those skilled in the art will appreciate that reconditioning of the surgical anchor system may utilize a variety of techniques for disassembly, cleaning/replacement, and reassembly. Use of such techniques, and the resulting reconditioned devices, are within the scope of the present application.
Instrument control imaging system
The devices, systems, and methods for multi-source imaging provided herein allow for collaborative surgical visualization. Generally, in collaborative surgical visualization, a first imaging system and a second imaging system (e.g., a first endoscopic device and a second endoscopic device) each collecting images of a surgical site are configured to cooperate to provide enhanced imaging of the surgical site. Collaborative surgical visualization may improve visualization of patient anatomy at a surgical site and/or improve control of surgical instruments at the surgical site.
The surgical visualization system may allow for intra-operative identification of critical structures (e.g., diseased tissue, anatomical structures, surgical instruments, etc.). Thus, the surgical visualization system may enable enhanced intraoperative decision-making and improved surgical outcomes. The surgical visualization system may provide advanced visualization capabilities beyond what the practitioner sees with the "naked eye" and/or beyond what the imaging system can identify and/or communicate to the practitioner. The surgical visualization system may enhance and strengthen the information a medical practitioner has before treating tissue (e.g., by dissection, etc.), and thus may improve outcomes in various circumstances. Thus, knowing that the surgical visualization system is tracking critical structures that may be approached during dissection, for example, the practitioner can confidently maintain momentum throughout the surgical procedure. The surgical visualization system may provide an indication to the practitioner with sufficient time for the practitioner to pause and/or slow the surgical procedure and assess proximity to critical structures to prevent accidental damage thereto. The surgical visualization system may provide the practitioner with an ideal, optimized, and/or customizable amount of information to allow the practitioner to move through tissue confidently and/or quickly while avoiding accidental damage to healthy tissue and/or critical structures, thereby minimizing the risk of injury caused by the surgical procedure.
The surgical systems provided herein generally include: a first scope device configured to transmit image data of a first scene within its field of view; a second scope device configured to transmit image data of a different, second scene within its field of view; a tracking device associated with one of the first or second scope devices and configured to transmit a signal indicative of a position of that scope device relative to the other scope device; and a controller configured to receive the transmitted data and signals, determine a relative distance between the first and second scope devices, and provide a combined image. The combined image may show at least a portion of each of the first and second scope devices in a single scene, wherein at least one of the first and second scope devices in the combined image is a representative depiction thereof. The combined image thereby provides two independent perspectives of the surgical site, which conveniently allows a practitioner to view a single display rather than multiple displays. In addition, within that one display, the combined image allows the practitioner to coordinate the relative position and/or orientation of at least the first scope device disposed at or near the surgical site.
The first endoscopic device is configured to be at least partially disposed within a natural body cavity or an organ (e.g., a lung, stomach, colon, or small intestine), and the second endoscopic device is configured to be at least partially disposed outside of the natural body cavity or organ. In certain embodiments, the first endoscopic device is an endoscope and the second endoscopic device is a laparoscope. The natural body cavity or organ may be any suitable natural body cavity or organ, non-limiting examples of which include the stomach, lung, colon, and small intestine.
The surgical systems provided herein may also be used with various robotic surgical systems, such as those discussed above, and may incorporate various tracking and/or imaging mechanisms, such as electromagnetic (EM) tracking tips, fiber Bragg gratings, virtual tags, fiducial markers, the use of probes, the identification of known anatomy, various 3D scanning techniques (such as using structured light), the various sensors and/or imaging systems previously discussed, etc., to aid in tracking movement of instruments, endoscopes, and laparoscopes relative to each other and/or relative to the overall system. The tracking mechanisms may be configured to transmit tracking data from both the laparoscope and the endoscope so that the position of either scope relative to the other can be determined. In addition, critical structures (e.g., diseased tissue, surgical instruments, anatomical structures) within the field of view of either scope device may be tracked by the scope having such critical structures within its field of view. In summary, the surgical systems herein can track objects within the field of view of each scope as well as the relative position of each scope. The entirety of the tracking data thus allows the system to calculate the distance of a critical structure from a scope that does not have the critical structure in its field of view, based on the tracking data collected by the other scope, as sketched below.
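The cross-scope calculation can be illustrated as a composition of tracked poses. The sketch below assumes 4x4 homogeneous transforms in a common tracker frame, which is one conventional representation rather than the disclosed implementation.

```python
# Re-expressing a structure seen only by scope A in scope B's frame.
# Poses are assumed to be 4x4 homogeneous transforms (camera -> common
# tracker frame), one conventional representation.
import numpy as np

def distance_from_scope_b(pose_w_a, pose_w_b, structure_in_a):
    """Distance from scope B to a critical structure visible only to A."""
    p_world = (pose_w_a @ np.append(structure_in_a, 1.0))[:3]
    return float(np.linalg.norm(p_world - pose_w_b[:3, 3]))

pose_a = np.eye(4)                      # scope A at the tracker origin
pose_b = np.eye(4); pose_b[:3, 3] = [0.0, 40.0, 0.0]
print(distance_from_scope_b(pose_a, pose_b, np.array([0.0, 0.0, 30.0])))
# -> 50.0 (a 3-4-5 check: 40 mm and 30 mm legs)
```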
In some embodiments, the surgical system may include a tracking device associated with one of the first or second endoscopic devices and configured to transmit a signal indicative of a position of the one of the first or second endoscopic devices relative to the other of the first or second endoscopic devices.
In various embodiments, the surgical systems provided herein include a controller. The surgical system, controller, display, and/or various instruments, endoscopes, and laparoscopes may also be incorporated into a number of different robotic surgical systems and/or may be part of a surgical hub, such as any of the systems and surgical hubs discussed above. The controller is typically configured to merge the first and second scenes from the endoscope and laparoscope, respectively, to visually create a merged image of the two scenes. The controller is configured to receive the tracking data described in detail above and, in combination with the first scene and the second scene, generate a combined image containing at least a representative depiction of the endoscope or laparoscope and any structures within the field of view of the endoscope that are visually obscured by the tissue wall. For example, if the combined image is from the perspective of the endoscope, the combined image is the real-time image stream that the endoscope is viewing, including a superposition of the orientation and position of the laparoscopically deployed surgical instruments and the laparoscope (if present).
In some embodiments, the controller may be configured to receive the transmitted image data of the first scene and the second scene from the first scope device and the second scope device and the transmitted signal from the tracking device; determine a relative distance between the first and second scope devices based on the transmitted signals; and provide a combined image of at least a portion of the first and second scope devices in a single scene based on the transmitted image data and the relative distance between the first and second scope devices, wherein at least one of the first and second scope devices in the combined image is a representative depiction thereof.
An exemplary surgical system may include a variety of features, as described herein and shown in the drawings. However, those skilled in the art will appreciate that the surgical system may include only some of these features and/or it may include a variety of other features known in the art. The surgical systems described herein are intended to represent only certain exemplary embodiments. Furthermore, while the surgical systems are shown and described in connection with the stomach, those skilled in the art will appreciate that these surgical systems may be used in connection with any other suitable natural body cavity or organ.
Surgery is the most common treatment for gastric cancer. When gastric cancer requires surgery, the goal is to remove the entire tumor along with a good margin of healthy stomach tissue surrounding it. Different procedures may be used to remove gastric cancer. The type of procedure used depends on the part of the stomach in which the cancer is located and how far it has grown into nearby areas. For example, Endoscopic Mucosal Resection (EMR) and Endoscopic Submucosal Dissection (ESD) are gastric procedures that can be used to treat some early cancers. These procedures do not require cutting the skin; rather, the surgeon passes an endoscope down the patient's throat and into the stomach. A surgical tool (e.g., a MEGADYNE™ tissue dissector or electrosurgical pencil) is then passed through the working channel of the endoscope to remove the tumor and some layers of the normal stomach wall below and around it.
Other surgical procedures include a subtotal (partial) gastrectomy or a total gastrectomy, which may be performed as an open procedure (e.g., inserting surgical instruments through a large incision in the skin of the abdomen) or as a laparoscopic procedure (inserting surgical instruments into the abdomen through several small incisions). Laparoscopic gastrectomy procedures typically involve insufflating the abdominal cavity with carbon dioxide gas to a pressure of about 15 millimeters of mercury (mm Hg). The abdominal wall is pierced, and a straight tube cannula or trocar, 5 mm to 10 mm in diameter, is then inserted into the abdominal cavity. A laparoscope connected to an operating room monitor is used to visualize the operative field and is placed through one of the trocars. Laparoscopic instruments are placed through two or more additional trocars for manipulation by the surgeon and surgical assistants to remove the desired portion of the stomach.
Fig. 34 shows a schematic depiction of a stomach 3000. The stomach 3000 may include an esophageal sphincter 3001, a greater curvature 3002, a lesser curvature 3003, a pyloric sphincter 3004, a duodenum 3005, and a duodenojejunal flexure 3006. In addition, the stomach 3000 includes an inner tissue wall 3007 and an outer tissue wall 3008. The esophageal sphincter 3001 connects the stomach to the esophagus 3009 and allows an endoscope to pass through the patient's mouth, down the esophagus 3009, and through the esophageal sphincter 3001 to access the luminal space of the stomach 3000. The pyloric sphincter 3004, the duodenum 3005, and the duodenojejunal flexure 3006 connect the stomach 3000 to the small intestine (not shown).
A conventional surgical procedure to remove a tumor from the stomach is called a wedge resection, in which the portion of the stomach where the tumor is located is removed entirely. Fig. 35 and 36 illustrate an exemplary embodiment of a conventional surgical system configured to access the stomach 3050 endoluminally and laparoscopically to remove a tumor 3056. Similar to the stomach 3000, the stomach 3050 includes an esophageal sphincter 3051, a greater curvature 3052, a lesser curvature 3053, a pyloric sphincter 3054, a duodenum 3055, an inner tissue wall 3057, and an outer tissue wall 3058. As shown, the tumor 3056 is located on the inner tissue wall 3057 of the greater curvature 3052, remote from the esophageal sphincter 3051 and the esophagus 3059. To remove the tumor 3056, an endoscope 3060 is disposed within the intraluminal space, and a laparoscope 3062 is disposed in the extraluminal space.
While both the endoscope 3060 and the laparoscope 3062 provide image data to a display so that the surgeon can properly position each scope and operate on the stomach 3050, the images from the two scopes are separate, requiring the surgeon to view two different monitors or a picture-in-picture arrangement. A problem arises when the endoscope 3060 and the laparoscope 3062 must work cooperatively to form the incision line I to remove the wedge W with the tumor 3056 attached. The surgeon typically relies on experience and knowledge of the anatomy to ensure that the endoscope 3060 and the laparoscope 3062 work cooperatively and are disposed in the correct positions on either side of the inner tissue wall 3057 and the outer tissue wall 3058.
Conventional surgical systems cannot provide a unified visual image of the connected or engaged surgical treatment site. Instead, the user must either monitor multiple displays simultaneously and infer the orientation of, and distance between, the scopes and the various surgical instruments involved in the same procedure, or incorporate additional vision systems into the procedure in an attempt to track the scopes and instruments. The surgical systems provided herein avoid these problems by integrating imaging from both the endoscope and the laparoscope into a single visual display to simplify the alignment and deployment of the various surgical instruments and scopes.
Fig. 37 illustrates an exemplary embodiment of a surgical system 3100 configured for endoluminal access and laparoscopic access to a stomach 3000. As will be described in greater detail below, the surgical system 3100 can transmit data from different endoscopic devices in order to create a merged image of a single scene at a surgical site, which can include a representative depiction of at least one of the endoscopic devices in the merged image. For simplicity, certain components of surgical system 3100 and stomach 3000 are not shown.
As shown, the stomach 3000 includes a tumor 3040 located on the greater curvature 3002. When operating on the stomach 3000, it may be necessary to use a laparoscopically deployed instrument to manipulate (e.g., mobilize) the blood vessel 3064 in order to properly access the tumor 3040. In use, as described in greater detail below, the surgical system 3100 can provide a combined image so that the endoscope and laparoscope can operate cooperatively even though neither scope can see the other within its field of view (e.g., due to the stomach wall positioned therebetween).
Surgical system 3100 includes an endoscope 3102 configured for endoluminal access through esophagus 3009 and into stomach 3000. The endoscope 3102 may have a variety of configurations. For example, in the illustrated embodiment, the endoscope 3102 includes a first optical sensor 3106 (e.g., a camera) and an illumination element 3108. Alternatively or in addition, the endoscope 3102 can include a working channel (not shown) disposed along the length of the endoscope 3102 to endoluminal delivery of instruments into the stomach 3000. In some embodiments, the endoscope 3102 may include an outer sleeve (not shown) configured to be inserted through a patient's mouth (not shown) and down the esophagus 3009. The outer sleeve may include a working channel configured to allow the endoscope 3102 to be inserted through the outer sleeve and into the stomach 3000. In certain embodiments, the endoscope 3102 can include a working channel extending therethrough. The working channel can be configured to receive one or more surgical instruments and/or allow fluid to pass therethrough to insufflate a cavity or organ (e.g., stomach).
In addition, surgical system 3100 includes a laparoscope 3104 configured for laparoscopic access through an abdominal wall (not shown) and into the extraluminal anatomical space adjacent to stomach 3000. Laparoscope 3104 can have a variety of configurations. For example, in the illustrated embodiment, the laparoscope 3104 includes a second optical sensor 3110 (e.g., a camera) and an illumination element 3112. Alternatively or in addition, the laparoscope 3104 can include working channels (not shown) disposed along the length of the laparoscope 3104 to deliver instruments into the extraluminal space via the laparoscope. In some embodiments, the laparoscope 3104 can be inserted into the extraluminal anatomical space through a trocar or multiport (not shown) positioned within and through the tissue wall. The trocar or multiport may include a port for delivering the laparoscope 3104 and/or other surgical instruments into the extraluminal anatomical space to access the stomach 3000.
As shown in fig. 37, the endoscope 3102 includes a first tracking device 3109 disposed on or within the endoscope 3102. The first tracking device 3109 is configured to transmit a signal indicative of the position of the endoscope 3102 relative to the laparoscope 3104 (e.g., to the controller 3130). In addition, the laparoscope 3104 includes a second tracking device 3113 disposed on or within the laparoscope 3104. The second tracking device 3113 is configured to transmit a signal indicative of the position of the laparoscope 3104 relative to the endoscope 3102 (e.g., to the controller 3130). In other embodiments, only one of the endoscope 3102 and the laparoscope 3104 includes a tracking device.
Alternatively or in addition, the transmitted signal (or an additional transmitted signal) from the first tracking device 3109 may further indicate the orientation of the endoscope 3102 relative to the laparoscope 3104. Alternatively or in addition, the transmitted signal (or an additional transmitted signal) from the second tracking device 3113 may further indicate the orientation of the laparoscope 3104 relative to the endoscope 3102.
In some embodiments, the first tracking device 3109 and the second tracking device 3113 are configured to detect the position, orientation, or both of the endoscope 3102 and the laparoscope 3104, respectively, using magnetic sensing or radio-frequency sensing (e.g., when the endoscope 3102 and the laparoscope 3104 are positioned on opposite sides of the tissue wall of the stomach 3000). Alternatively, the first tracking device 3109 and the second tracking device 3113 are configured to detect the position, orientation, or both of the endoscope 3102 and the laparoscope 3104, respectively, using a common anatomical landmark (e.g., when the endoscope 3102 and the laparoscope 3104 are positioned on opposite sides of the tissue wall of the stomach 3000). The first tracking device 3109 and the second tracking device 3113 may each transmit signals to a controller (e.g., the controller 3130). For example, various embodiments of magnetic fiducial markers and their use in detecting position are further discussed in U.S. patent application Ser. No. 63/249,658, entitled "Surgical Devices, Systems, and Methods For Control Of One Visualization With Another," filed on 9, 2021.
As further shown in fig. 37 and 38, the surgical system 3100 includes a first surgical instrument 3114 and a second surgical instrument 3118 each configured for laparoscopic access through the abdominal wall and into the extraluminal anatomical space surrounding the stomach 3000. First surgical instrument 3114 and second surgical instrument 3118 can have a variety of configurations. For example, in the illustrated embodiment, the first surgical instrument 3114 and the second surgical instrument 3118 each include a pair of jaws 3116, 3120, respectively, configured to manipulate the stomach 3000 from the laparoscopic side. Although two surgical instruments 3114, 3118 are shown, in other embodiments, the surgical system 3100 may include one surgical instrument or more than two surgical instruments. In some embodiments, the first surgical instrument 3114 and the second surgical instrument 3118 may be passed through ports of the same trocar and/or multiport device through which the laparoscope 3104 is positioned.
Surgical system 3100 further includes a controller 3130 communicatively coupled to endoscope 3102 and laparoscope 3104 and configured to receive the transmitted image data of the first scene and the second scene from first optical sensor 3106 and second optical sensor 3110, respectively. The controller 3130 is also communicatively coupled to the first tracking device 3109 and the second tracking device 3113 and is configured to receive the transmitted signals from the first tracking device 3109 and the second tracking device 3113, respectively. The controller 3130 is configured to be able to determine at least the relative distance between the endoscope 3102 and the laparoscope 3104 upon receiving the transmitted signals. In certain embodiments, the controller 3130 may also be configured to be able to determine the relative orientation between the endoscope 3102 and the laparoscope 3104.
As shown in fig. 38, the relative distance between the endoscope 3102 and the laparoscope 3104 is shown by a dashed arrow 3122. Based on both the transmitted image data and the relative distance between the endoscope 3102 and the laparoscope 3104, the controller 3130 is configured to provide a combined image to a display, for example, on the first display 3132, the second display 3134, or both, of the surgical system 3100. In this combined image, at least one of the endoscope 3102 and the laparoscope 3104 is a representative depiction thereof.
The first display 3132 and the second display 3134 may have a variety of configurations. For example, in some embodiments, the first display may be configured to display the first scene and the second display may be configured to display the second scene, and the first display, the second display, or both may be further configured to display the merged image. In another embodiment, the surgical system 3100 can include a third display 3136 (fig. 37) that can be used to display the combined image, while the first display 3132 and the second display 3134 are used only to display the transmitted image data from the optical sensors 3106, 3110, respectively, without any modification. In this embodiment, the surgeon can access real-time scenes from both the endoscope 3102 and the laparoscope 3104 on the first display 3132 and the second display 3134, while also accessing the combined image on the third display 3136.
As described above, the endoscope 3102 includes the first optical sensor 3106. The first optical sensor 3106 is configured to transmit image data of a first scene within the field of view of the endoscope 3102 to the controller 3130. In the illustrated embodiment, the tumor 3040 is located within the field of view of the endoscope 3102. Accordingly, the controller 3130 may determine a relative distance between the endoscope 3102 and the tumor 3040 based on the transmitted image data. As shown in fig. 38, the relative distance between the endoscope 3102 and the tumor 3040 is shown by a dashed arrow 3127. In some embodiments, the relative distance 3127 may be determined by using structured light projected onto the tumor 3040 (e.g., via the illumination element 3108) and tracked by the first optical sensor 3106. Additionally, in some embodiments, the controller 3130 may calculate the relative distance between the laparoscope 3104 and the tumor 3040 based on the determined relative distance 3122 (between the endoscope 3102 and the laparoscope 3104) and the determined relative distance 3127 (between the endoscope 3102 and the tumor 3040).
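Note that the scalar distances 3122 and 3127 alone are not sufficient to compute the laparoscope-to-tumor distance; the controller must combine displacement vectors expressed in a common frame. The following sketch makes that point with illustrative values.

```python
# Scalar distances cannot be combined directly; displacement vectors in a
# common frame can. Values (mm) are illustrative only.
import numpy as np

endo_to_lap = np.array([0.0, 25.0, 10.0])    # vector behind distance 3122
endo_to_tumor = np.array([5.0, -8.0, 40.0])  # vector behind distance 3127

lap_to_tumor = endo_to_tumor - endo_to_lap   # vector subtraction
print(np.linalg.norm(lap_to_tumor))          # ~44.9 mm, suitable for display
```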
In addition, the laparoscope 3104 includes a second optical sensor 3110. The second optical sensor 3110 is configured to transmit image data of a second scene within the field of view of the laparoscope 3104 to the controller 3130. The first surgical instrument 3114 and the second surgical instrument 3118 are disposed within a field of view of the laparoscope 3104. Accordingly, the controller 3130 may determine a relative distance between the laparoscope 3104 and each of the first and second surgical instruments 3114, 3118 based on the transmitted image data. In certain embodiments, the controller 3130 can also be configured to determine the relative orientation between the laparoscope 3104 and each of the first and second surgical instruments 3114, 3118.
As shown in fig. 38, the relative distance between the laparoscope 3104 and the first surgical instrument 3114 is shown by dashed arrow 3125, and the relative distance between the laparoscope 3104 and the second surgical instrument 3118 is shown by dashed arrow 3126. In some embodiments, the relative distances 3125, 3126 may be determined by using structured light projected onto the surgical instrument 3114, 3118 (e.g., through the illumination element 3112) and tracked by the second optical sensor 3110.
Based on the relative distance 3122 (between the endoscope 3102 and the laparoscope 3104), the relative distance 3125 (between the laparoscope 3104 and the first surgical instrument 3114), the relative distance 3126 (between the laparoscope 3104 and the second surgical instrument 3118), and the relative distance 3127 (between the endoscope 3102 and the tumor 3040), the controller 3130 may determine, for example, the relative distance between the endoscope 3102 and each of the first surgical instrument 3114 and the second surgical instrument 3118, the relative distance between the tumor 3040 and each of the first instrument 3114 and the second instrument 3118, and the like. As shown in fig. 38, the relative distance from the endoscope 3102 to the first surgical instrument 3114 is shown by dashed arrow 3123, the relative distance from the endoscope 3102 to the second surgical instrument 3118 is shown by dashed arrow 3124, and the relative distance from the tumor 3040 to the second surgical instrument 3118 is shown by dashed arrow 3128. Based on the determined relative distances 3123, 3124, 3128 and the transmitted image data (e.g., image data of the first scene, the second scene, or both), the controller can create a combined image that is projected onto the first display 3132, the second display 3134, or both. Because each instrument is imaged directly by its respective camera, and because the system is able to determine the exact type of device in use (e.g., grasper, cutter, since the instrument has been scanned into or otherwise identified to the surgical hub so that the system can interact with the device), the system can create a 3D model reconstruction of each instrument. Using the measured relative distances, or at least one coupled 3D axis registration, the system may display the devices seen by the occluded camera and transform them as necessary to show their position, orientation, and status in real time. These 3D models may even be refined with details imaged directly by the camera that views the occluded cooperating instrument.
Additionally, in certain embodiments, the controller can also determine the relative orientation between the endoscope 3102 and the laparoscope 3104, the relative orientation of the first instrument 3114 and/or the second instrument 3118 with respect to the endoscope 3102 and/or with respect to the tumor 3040, and the like. Based on the determined relative orientations and the transmitted image data (e.g., image data of the first scene, the second scene, or both), the combined image may show not only the position of one or more of the endoscope 3102, the laparoscope 3104, the first surgical instrument 3114, the second surgical instrument 3118, and the tumor 3040, but also their orientation. As discussed above, a fully generated 3D model of an instrument may be overlaid into the image of the view that cannot see it. Since the representative depiction is a generated image, various properties of the image (e.g., transparency, color) may also be manipulated to make clear that it is not part of the real-time visualization video feed but is instead a construct from the other view. If the user switches between imaging systems, the opposite view may likewise bring the constructed instrument within its field of view. In some embodiments, these overlays may be generated in another way: the occluded camera may isolate the instruments in its stream from the surrounding anatomy, mirror the isolated image, align it to a known common axis point, and then overlay only that real-time image of the occluded view into the display feed of the non-occluded camera, as sketched below. Similar to the other representative depictions described above, such overlays may be shaded, translucent, or otherwise modified to ensure that the user can distinguish the directly imaged view from the overlaid view in order to reduce confusion. The same may be done with key aspects of the anatomy (e.g., a tumor that can be seen by one camera but not by the other). The system may utilize a common reference between cameras to display a landmark, point of interest, or critical surgical anatomy aspect, and even highlight it, to allow better access and interaction even when it is occluded.
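The masked, aligned, alpha-blended overlay described above can be sketched as a simple compositing step. Geometric alignment is assumed to have been done upstream (reducing it here to a precomputed instrument render and mask); frames and values are placeholders.

```python
# Alpha-blended compositing of an occluded instrument into the live feed
# of the non-occluded camera. Alignment is assumed done upstream, leaving
# a pre-rendered overlay image and its mask; arrays are placeholders.
import numpy as np

def composite_overlay(live_frame, overlay_rgb, overlay_mask, alpha=0.4):
    """Blend the masked instrument render semi-transparently so the user
    can tell the construct apart from directly imaged tissue."""
    out = live_frame.astype(np.float32)
    m = (overlay_mask > 0)[..., None]  # instrument pixels only
    blended = (1.0 - alpha) * out + alpha * overlay_rgb
    return np.where(m, blended, out).astype(np.uint8)

live = np.zeros((480, 640, 3), np.uint8)        # placeholder video frame
render = np.full((480, 640, 3), 255, np.uint8)  # aligned instrument render
mask = np.zeros((480, 640), np.uint8)
mask[200:280, 300:340] = 1                      # where the instrument projects
frame = composite_overlay(live, render, mask)
```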
Fig. 39 shows an exemplary embodiment of a merged image. The combined image shows a real-time first scene within the field of view of the endoscope 3102 with an overlaid representative depiction of a portion of the laparoscopic side of the stomach (e.g., the blood vessel 3064, the laparoscope 3104, and/or the first and second surgical instruments 3114 and 3118). Those skilled in the art will understand that the phrase "representative depiction" as used herein refers to a virtual overlay over an actual depiction from a camera, where the virtual overlay corresponds to the position and orientation of an object that is disposed within the field of view of the camera but is not visible to the camera as a result of an obstacle disposed between the camera and the object, and the phrase "actual depiction" as used herein refers to an unmodified real-time image or video stream from the camera. Based on the transmitted image data of the first scene in combination with the determined relative distances 3122, 3123, 3124, the controller 3130 may provide a combined image from the perspective of the endoscope 3102, wherein the laparoscope 3104 and the surgical instruments 3114, 3118 are shown as representative depictions corresponding in real time to their locations in the extraluminal space. In the illustrated embodiment, the representative depictions are shown in dashed outline corresponding to the blood vessel 3064, the laparoscope 3104, and the surgical instruments 3114, 3118. However, other forms of representative depictions may be used, such as simple geometric shapes, to represent non-visible instruments and anatomy within the extraluminal space.
Alternatively or in addition, the controller 3130 may generate the combined image from the perspective of the laparoscope 3104. For example, in fig. 40, the merged image shows a real-time second scene within the field of view of the laparoscope 3104 with an overlaid representative depiction of a portion of the endoscopic side of the stomach (e.g., the tumor 3040 and/or the endoscope 3102). Based on the transmitted image data of the second scene in combination with the determined relative distances and orientations, the controller 3130 may provide a combined image from the perspective of the laparoscope 3104, wherein the endoscope 3102 and the tumor 3040 are shown as representative depictions corresponding in real time to their positions in the intraluminal space. In the illustrated embodiment, the representative depictions are shown in dashed outline corresponding to the tumor 3040 and the endoscope 3102. However, other forms of representative depictions may be used, such as simple geometric shapes, to represent non-visible instruments and anatomy within the intraluminal space.
In some embodiments, the internal and external portions of interconnected surgical instruments may be monitored in order to image both the internal and external interactions of a surgical instrument with adjacent surgical instruments. In certain embodiments, the surgical instrument includes an articulation actuation system external to the body. Additionally, the surgical instrument may be configured to be coupled to an electromechanical arm of a robotic system. The tracking means may be used to ensure that the robotic arms of the different instruments do not contact each other outside the body, even when the portions of the instruments inside the body do not contact each other. The system can thus be used to control the intended interactions of laparoscopically deployed instruments and prevent their accidental interactions by monitoring both the in vivo and ex vivo aspects of the same instrument.
In other embodiments, coordination of the internal and external views of portions of the surgical instrument may be achieved by two separate imaging systems. This would enable monitoring of external interactions of multiple surgical instruments while controlling and tracking internal interactions of those same surgical instruments. The system may minimize unintended external interactions between surgical instruments while improving the internal operating envelope of the same surgical instrument.
Instrument control imaging system for visualizing an impending surgical procedure
The devices, systems, and methods for multi-source imaging provided herein allow for collaborative surgical visualization that enables coordination of instruments based on a surgical plan for a particular procedure. Generally, the present surgical systems provide images of both the endoluminal and extraluminal anatomical spaces, and provide a combined image based on these images in which certain surgical steps performed via the endoscope may be coordinated with known surgical sites in subsequent steps performed via the laparoscope, and vice versa.
For a given surgery, there is a corresponding surgical plan that the surgeon follows as the surgery progresses. The steps in the surgical plan may be performed in a linear fashion in order to achieve a desired result, such as removing a tumor from the stomach. From the surgical plan, the following steps are known in advance: (i) the tumor must be partially resected from the inner tissue wall of the stomach; (ii) the stomach must be flipped over to access the tumor from the laparoscopic side, while holding the stomach in an upright orientation to prevent gastric acid spillage; and (iii) an incision must be made via the laparoscope in order to access the tumor. These pieces of information indicate that two different incisions must be made in the stomach: one for partially removing the tumor and one for making an opening in the stomach wall to access the tumor. Based on the knowledge that two separate incisions must be formed at substantially the same location, an algorithm may calculate where the first incision and the second incision should be located so that the second incision aligns with the first incision and the incisions are as small and efficient as possible.
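The disclosure does not specify the algorithm itself; the sketch below shows one plausible form such a calculation could take, assuming the tumor boundary has already been mapped into a coordinate frame shared by both scopes. All function names, margins, and clearances are hypothetical.

```python
import numpy as np

def plan_aligned_incisions(tumor_pts, margin_mm=5.0, clearance_mm=10.0):
    """Given tumor boundary points (in a frame shared by both scopes),
    return (resection_center, resection_radius, access_center, access_length).

    The endoscopic resection path is a circle enclosing the tumor plus a
    safety margin; the laparoscopic access incision is the shortest cut,
    centered on the same point, that the freed tumor can pass through.
    """
    center = tumor_pts.mean(axis=0)
    tumor_radius = np.max(np.linalg.norm(tumor_pts - center, axis=1))
    resection_radius = tumor_radius + margin_mm
    # Shortest through-wall incision: roughly the resected disc's diameter.
    access_length = 2.0 * resection_radius + clearance_mm
    return center, resection_radius, center, access_length

# Illustrative 20 mm tumor boundary.
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
boundary = np.c_[10 * np.cos(theta), 10 * np.sin(theta), np.zeros_like(theta)]
c, r, ac, al = plan_aligned_incisions(boundary)
print(f"resect r={r:.1f} mm around {c.round(1)}, access cut {al:.1f} mm")
```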
In one exemplary embodiment, a surgical system may include: an energy application surgical instrument configured to apply energy to a natural body cavity or organ; a first endoscopic device configured to be capable of transmitting image data of a first scene within its field of view; a second endoscopic device configured to be capable of transmitting image data of a second scene within its field of view; and a controller configured to be able to receive the transmitted image data of the first scene and the second scene and to provide a combined image of the first scene and the second scene. The combined image thus provides two independent perspectives of the surgical site, which allows the practitioner to coordinate the location of energy to be applied to the inner surface of the tissue wall at the surgical site relative to the intended interaction location of a second instrument on the outer surface of the tissue wall in a subsequent surgical step at the surgical site.
The controller is configured to be capable of generating a combined image of the first scene and the second scene. The controller receives an actual depiction from each of the first imaging system and the second imaging system. The actual depiction may be a photograph or a real-time video feed of what each imaging system attached to each endoscopic device is seeing in real time. Each of the first scene and the second scene depicts certain critical structures that are not visible to the other imaging system. For example, a first imaging system disposed via an endoscope may have the tumor and the energy application surgical instrument within its field of view. Additionally, the second imaging system may have a laparoscopic instrument disposed within its field of view. In addition, as will be discussed in more detail, the merged image facilitates coordination of the location of energy to be applied by the energy application surgical instrument to the inner surface of the tissue wall at the surgical site relative to the intended interaction location of the second instrument on the outer surface of the tissue wall in a subsequent surgical step at the surgical site.
In some embodiments, the system will need to be coupled to a "known" point. These known points would likely be fixed aspects (e.g., instrument or endoscopic device features, because they are on a rigid and predictable system) or linked anatomical landmarks (e.g., known anatomical sphincters, ligaments, or arteries that can be seen directly by both systems). The tumor may be visible or partially visible to one imaging system. In hollow organ surgery, the tissue wall is typically thin and the tumor is superficial on at least one side of the organ. One example is lung cancer, in which the tumor will be visible either in the parenchyma (i.e., approached from outside the airway) or in the bronchial wall (i.e., approached endoscopically). The system then only needs to identify one scope in 3D space with respect to the other scope, or to identify anatomical landmarks that both scopes can see from different perspectives, in order to overlay the tumor from the side where it can be seen onto the imaging system from which it cannot be seen.
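When three or more shared landmarks are visible to both systems, one conventional way to express one scope's coordinates in the other's frame is rigid point-set alignment, e.g., the Kabsch algorithm. The sketch below is an illustrative implementation of that general technique; the disclosure does not name a specific registration method, and all identifiers are hypothetical.

```python
import numpy as np

def register_scopes(landmarks_endo, landmarks_lap):
    """Estimate the rigid transform mapping endoscope coordinates to
    laparoscope coordinates from >= 3 shared landmarks (e.g., sphincters,
    ligaments, arteries) seen by both scopes. Returns rotation R and
    translation t via the Kabsch algorithm."""
    ce, cl = landmarks_endo.mean(0), landmarks_lap.mean(0)
    H = (landmarks_endo - ce).T @ (landmarks_lap - cl)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cl - R @ ce
    return R, t

def to_lap_frame(R, t, p_endo):
    """Map a point seen only by the endoscope (e.g., the tumor) into the
    laparoscope frame so it can be overlaid where it cannot be seen."""
    return R @ p_endo + t

# Illustrative check: landmarks related by a 90-degree turn plus a shift.
endo = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
lap = endo @ Rz.T + np.array([5.0, 2.0, 1.0])
R, t = register_scopes(endo, lap)
print(to_lap_frame(R, t, np.array([3.0, 4.0, 0.0])))  # tumor in lap frame
```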
The first endoscopic device is configured to be at least partially disposable within at least one of a natural body cavity and an organ (e.g., a lung, stomach, colon, or small intestine), and the second endoscopic device is configured to be at least partially disposable outside of at least one of the natural body cavity and the organ. In certain embodiments, the first endoscopic device is an endoscope and the second endoscopic device is a laparoscope. The natural body cavity or organ may be any suitable natural body cavity or organ. Non-limiting examples include the stomach, lung, colon, or small intestine.
An exemplary surgical system may include a variety of features, as described herein and shown in the drawings. However, those skilled in the art will appreciate that the surgical system may include only some of these features and/or it may include a variety of other features known in the art. The surgical systems described herein are intended to represent only certain exemplary embodiments. Furthermore, while the surgical systems are shown and described in connection with the stomach, those skilled in the art will appreciate that these surgical systems may be used in connection with any other suitable natural body cavity or organ.
Fig. 41 and 42 illustrate an exemplary embodiment of a surgical system 3200 configured for endoluminal access and laparoscopic access to a stomach 3000. The surgical system 3200 may be similar to the surgical system 3100 (fig. 37 and 38) except for the differences described in detail below, and thus common features are not described in detail herein. For simplicity, certain components of the surgical system 3200 and the stomach 3000 are not shown.
As shown, the stomach 3000 includes an esophageal sphincter 3001, a greater curvature 3002, a lesser curvature 3003, a pyloric sphincter 3004, a duodenum 3005, and a duodenojejunal flexure 3006. In addition, the stomach includes an inner tissue wall 3007 and an outer tissue wall 3008. As shown, the stomach 3000 includes a tumor 3040 located on the greater curvature 3002. When operating on the stomach 3000, it may be necessary to use a laparoscopically deployed instrument to manipulate (e.g., mobilize) the blood vessel 3064 in order to properly access the tumor 3040. In use, as described in more detail below, the surgical system 3200 may provide a merged image such that energy application and incision in subsequent surgical steps may be coordinated and visualized.
Surgical system 3200 includes an endoscope 3202 configured for endoluminal access through esophagus 3009 and into stomach 3000. Endoscope 3202 may have a variety of configurations. For example, in the illustrated embodiment, the endoscope 3202 includes an optical sensor 3206 (e.g., a camera) and an illumination element 3208. In addition, the endoscope 3202 includes a working channel 3203 disposed along a length of the endoscope 3202. Working channel 3203 is configured to receive one or more surgical instruments and/or allow fluid to pass therethrough to insufflate a cavity or organ (e.g., stomach). In some embodiments, the endoscope 3202 may include an outer sleeve (not shown) configured to be inserted through a patient's mouth (not shown) and down the esophagus 3009. The outer sleeve may include a working channel configured to allow an endoscope 3202 to be inserted through the outer sleeve and into the stomach 3000.
Surgical system 3200 also includes a laparoscope 3204 configured for laparoscopic access through the abdominal wall (not shown) and into the extraluminal anatomical space adjacent to stomach 3000. Laparoscope 3204 can have a variety of configurations. For example, in the illustrated embodiment, the laparoscope 3204 includes an optical sensor 3210 (e.g., a camera) and an illumination element 3212. Alternatively or additionally, the laparoscope 3204 may include a working channel (not shown) disposed along the length of the laparoscope 3204 to deliver instruments into the extraluminal anatomical space via the laparoscope. In some embodiments, the laparoscope 3204 may be inserted into the extraluminal anatomical space through a trocar or multiport (not shown) positioned within and through the tissue wall. The trocar or multiport may include ports for delivering the laparoscope 3204 and/or other surgical instruments into the extraluminal anatomical space to access the stomach 3000.
As shown in fig. 41 and 42, surgical system 3200 includes an energy application surgical instrument 3240 that passes through the working channel 3203 of the endoscope 3202 and into the stomach 3000. While the energy application surgical instrument can have a variety of configurations, in this illustrated embodiment, the energy application surgical instrument 3240 includes a knife 3242 at a distal end thereof. The knife 3242 can have a variety of configurations. For example, in some embodiments, the knife may be in the form of a monopolar RF knife or an ultrasonic knife. Exemplary embodiments of energy application surgical instruments that may be used with the present system are further described in U.S. patent No. 10,856,928, which is incorporated herein by reference in its entirety. One example is the GEM knife, MEGADYNE's intelligent monopolar knife, an advanced monopolar knife capable of sensing tissue and applying the appropriate RF energy required for a task, much like advanced bipolar or intelligent ultrasonic controls. Those skilled in the art will appreciate that the type of surgical instrument, including its end effector, and the structural configuration of the surgical instrument will depend at least on the surgical site and the surgical procedure to be performed.
As further shown in fig. 41, the energy application surgical instrument 3240 includes a force sensor 3209 (e.g., the force sensor 3209 may be coupled to one or more motors (not shown) of the instrument 3240 or to a robotic arm (not shown) coupled to the instrument 3240). During use, the force sensor 3209 is configured to sense an amount of force applied by the knife 3242 to tissue of the stomach 3000 as the knife 3242 moves (e.g., cuts) through the tissue. The force sensor 3209 is further configured to be capable of transmitting force data to the controller 3230 of the surgical system 3200. The controller 3230 can aggregate the received feedback inputs (e.g., force data), perform any necessary calculations, and provide output data to achieve any adjustments that may be needed (e.g., adjust power level, travel speed, etc.). Additional details regarding the force sensor 3209 and the controller 3230 are further described in the previously mentioned U.S. patent No. 10,856,928, which is incorporated herein by reference in its entirety. In some embodiments, the force sensor 3209 may be omitted.
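By way of illustration only, one simple control policy consistent with this description is proportional derating of power and travel speed once the sensed force exceeds a budget. The limit and scaling law below are assumptions, not values from the disclosure.

```python
FORCE_LIMIT_N = 2.5   # assumed safe cutting-force budget; not from the disclosure

def force_feedback_step(force_n, power_w, speed_mm_s):
    """One controller iteration: take the latest reading from the instrument's
    force sensor and derate power/travel speed proportionally once the
    reading exceeds the budget."""
    if force_n <= FORCE_LIMIT_N:
        return power_w, speed_mm_s                      # within budget, no change
    overshoot = force_n / FORCE_LIMIT_N
    return power_w / overshoot, speed_mm_s / overshoot  # proportional derating

power, speed = 30.0, 4.0
for reading in (1.0, 2.0, 3.5, 2.4):                    # simulated sensor stream
    power, speed = force_feedback_step(reading, power, speed)
    print(f"force={reading:.1f} N -> power={power:.1f} W, speed={speed:.2f} mm/s")
```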
Alternatively or in addition, the controller 3230 is configured to calculate an insertion depth of the knife 3242 of the energy application surgical instrument 3240 within tissue of the stomach 3000 based on the transmitted image data from the endoscope 3202 and/or the laparoscope 3204. For example, during endoscopic dissection of the stomach wall, the optical sensor 3210 of the laparoscope 3204 may monitor the dissection site from outside the stomach. Based on this image data transmitted to the controller 3230, the controller 3230 can determine the depth of the knife 3242, which helps prevent accidental full-thickness penetration that could lead to leakage. Additionally, the laparoscope 3204 may also monitor heat (via IR wavelengths) and collateral thermal damage (tissue refractive index and composition) in the stomach at the site where the energy application surgical instrument is active. Such laparoscopic thermal and weld monitoring may be used to further prevent unnecessary damage to the stomach tissue (e.g., to help trigger power adjustments to the energy application surgical instrument). For example, various embodiments of thermal and weld monitoring for preventing unwanted damage to tissue in surgical systems are further discussed in U.S. patent application No. 63/249,658, entitled "Surgical Devices, Systems, and Methods For Control Of One Visualization With Another," filed in September 2021.
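A minimal sketch of such a power adjustment might gate the generator output on the serosal-side temperature observed by the laparoscope, as below; the thresholds and the linear derating band are illustrative assumptions, not values from the disclosure.

```python
def gate_energy(tissue_temp_c, requested_power_w, warn_c=60.0, cutoff_c=75.0):
    """Derate or cut generator power based on the outside-wall temperature
    observed by the laparoscope while the endoscopic knife is active."""
    if tissue_temp_c >= cutoff_c:
        return 0.0                                  # stop: risk of through-wall injury
    if tissue_temp_c >= warn_c:
        frac = (cutoff_c - tissue_temp_c) / (cutoff_c - warn_c)
        return requested_power_w * frac             # linear derating band
    return requested_power_w                        # below the warning band

for t in (45.0, 62.0, 70.0, 80.0):                  # simulated IR readings (deg C)
    print(f"{t:.0f} C -> {gate_energy(t, 30.0):.1f} W")
```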
The surgical system 3200 includes a first surgical instrument 3214 and a second surgical instrument 3218, each configured for laparoscopic access through the abdominal wall and into the extraluminal anatomical space surrounding the stomach 3000. The first surgical instrument 3214 and the second surgical instrument 3218 can have a variety of configurations. For example, in the illustrated embodiment, the first surgical instrument 3214 and the second surgical instrument 3218 each include a pair of jaws 3216, 3220, respectively, configured to manipulate the stomach 3000 from the laparoscopic side. Although two surgical instruments 3214, 3218 are shown, in other embodiments, the surgical system 3200 may include one surgical instrument or more than two surgical instruments. In some embodiments, the first surgical instrument 3214 and the second surgical instrument 3218 may be passed through ports of the same trocar and/or multiport device through which the laparoscope 3204 is positioned.
As described above, the endoscope 3202 includes the first optical sensor 3206. The first optical sensor 3206 is configured to transmit image data of a first scene within the field of view of the endoscope 3202 to the controller 3230. In the illustrated embodiment, the tumor 3040 is located within the field of view of the endoscope 3202. As shown in fig. 41, the energy application surgical instrument 3240 is inserted into the working channel of the endoscope 3202 and the knife 3242 is advanced toward the tumor 3040. In conventional surgical systems, the surgeon would use the knife 3242 to partially remove the tumor 3040 based solely on the endoscopic view, and then proceed to blindly perform a partial gastric flip (e.g., using only the laparoscopic view) to remove the tumor 3040 laparoscopically through an incision in the stomach wall. The surgeon cannot precisely coordinate the endoscopic and laparoscopic incisions and instead roughly estimates where the tumor is located during the flip, which may result in an inaccurate incision that removes more tissue than is needed. However, in the present system 3200, because both the endoscope 3202 and the laparoscope 3204 can provide image data from the surgical site in both the endoluminal anatomical space and the extraluminal anatomical space, anatomical boundaries (e.g., where energy is to be applied by the energy application surgical instrument 3240 to partially remove the tumor) can be coordinated with a second incision (e.g., where laparoscopic cutting is to be performed in a subsequent surgical step to remove or separate the tumor 3040 from the stomach).
Surgical system 3200 also includes a controller 3230 that is communicatively coupled to the endoscope 3202 and the laparoscope 3204. The controller 3230 is configured to be able to receive transmitted image data of the first and second scenes from the first and second optical sensors 3206 and 3210, and to provide a combined image of the first and second scenes. This combined image facilitates the coordination of the location of energy to be applied by the energy application surgical instrument 3240 to the inner tissue wall 3007 of the stomach at the surgical site 3245 relative to the intended interaction location of a second instrument (e.g., a cutting instrument 3248 having an end effector 3250 in fig. 43) on the outer tissue wall 3008 of the stomach in a subsequent surgical step at the surgical site 3245.
The controller 3230 is configured to provide the combined image to a display, for example, the first display 3232, the second display 3234, or both, of the surgical system 3200. The first display 3232 and the second display 3234 may have a variety of configurations. For example, in some embodiments, the first display may be configured to be capable of displaying the first scene and the second display may be configured to be capable of displaying the second scene, and the first display, the second display, or both may be further configured to be capable of displaying the merged image. In another embodiment, the surgical system 3200 may include a third display 3236 that may be used to display the combined image, with the first display 3232 and the second display 3234 used only to display the transmitted image data from the first optical sensor 3206 and the second optical sensor 3210, respectively, without any modification. In this embodiment, the surgeon can access real-time scenes from both the endoscope 3202 and the laparoscope 3204 on the first display 3232 and the second display 3234, while also accessing the combined image on the third display 3236.
As shown in fig. 41a, display 3232 depicts a scene from the endoscope 3202, whose optical sensor 3206 has the tumor 3040, the energy application surgical instrument 3240, and the knife 3242 in its field of view. Based on subsequent steps of the surgical plan, the controller 3230 can provide a combined image in which a first interaction location 3244, including a start location 3244a and an end location 3244b, is depicted relative to the tumor 3040 as a representation of the expected interaction location of the energy application surgical instrument 3240. The start location 3244a corresponds to the start point of an incision for partially removing the tumor 3040 from the inner tissue wall 3007 of the stomach 3000, and the end location 3244b corresponds to the end point of the incision initiated at the start location 3244a. Thus, the surgeon is able to visualize where the incision should begin and end based on the subsequent surgical steps. In some embodiments, one or more of the subsequent steps are based on a surgical plan. In some embodiments, one or more of the subsequent surgical steps may be adjusted based on the actual surgical steps that have been performed relative to the surgical plan (e.g., akin to GPS map directions being recalculated based on user actions en route). In some embodiments, the blood vessel 3064 and the surgical instruments 3214, 3218 may be shown as representative depictions in the combined image, similar to the combined image of fig. 39.
As shown in fig. 42, energy is applied via the knife 3242 to the tissue surrounding the tumor 3040, partially freeing the tumor 3040 from the inner tissue wall 3007 of the stomach. As shown in fig. 42a, the knife 3242 traverses from the start position 3244a to the end position 3244b. Once the knife 3242 has reached the end position 3244b, the energy application is terminated so that the tumor 3040 is not completely removed from the inner tissue wall 3007.
As described above, surgical system 3200 further includes a controller 3230 communicatively coupled to endoscope 3202 and laparoscope 3204 and configured to receive transmitted image data of the first scene and the second scene from first optical sensor 3206 and second optical sensor 3210, respectively. Similar to tracking devices 3109, 3113, controller 3230 is also communicatively coupled to first tracking device 3252 and second tracking device 3254 disposed within the endoscope and laparoscope, and is configured to be capable of receiving transmitted signals from the first tracking device and the second tracking device, respectively. The controller 3230 is configured to determine at least a relative distance between the endoscope 3202 and the laparoscope 3204 upon receipt of the transmitted signals. In certain embodiments, the controller 3230 can also be configured to determine a relative orientation between the endoscope 3202 and the laparoscope 3204.
In some embodiments, the first tracking device 3252 and the second tracking device 3254 are configured to be able to detect the position, orientation, or both of the endoscope 3202 and the laparoscope 3204, respectively, using magnetic sensing or radio frequency sensing (e.g., when the endoscope 3202 and the laparoscope 3204 are positioned on opposite sides of the tissue wall of the stomach 3000). Alternatively, the first tracking device 3252 and the second tracking device 3254 are configured to be able to detect the position, orientation, or both of the endoscope 3202 and the laparoscope 3204, respectively, using a common anatomical landmark (e.g., when the endoscope 3202 and the laparoscope 3204 are positioned on opposite sides of the tissue wall of the stomach 3000). The first tracking device 3252 and the second tracking device 3254 may each transmit signals to a controller (e.g., the controller 3230). For example, various embodiments of magnetic fiducial markers and their use in detecting position are further discussed in U.S. patent application No. 63/249,658, entitled "Surgical Devices, Systems, and Methods For Control Of One Visualization With Another," filed in September 2021.
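If each tracking device reports a pose in a common field-generator frame, the relative distance (and orientation) between the scopes follows from composing the two poses. A minimal sketch, with illustrative poses; all names and numbers are hypothetical.

```python
import numpy as np

def relative_pose(T_field_endo, T_field_lap):
    """Given each tracking device's 4x4 pose in a common magnetic
    field-generator frame, return the laparoscope pose expressed in the
    endoscope frame plus the scalar scope-to-scope distance."""
    T_endo_lap = np.linalg.inv(T_field_endo) @ T_field_lap
    distance = np.linalg.norm(T_endo_lap[:3, 3])
    return T_endo_lap, distance

T_e = np.eye(4)                                  # endoscope at the origin
T_l = np.eye(4)
T_l[:3, 3] = [0.0, 25.0, 40.0]                   # illustrative offset in mm
T_rel, d = relative_pose(T_e, T_l)
print(f"scope-to-scope distance: {d:.1f} mm")    # ~47.2 mm
```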
As shown in fig. 43, the relative distance between the endoscope 3202 and the laparoscope 3204 is shown by dashed arrow 3222. The controller 3230 is configured to provide a combined image to a display, for example, the first display 3232, the second display 3234, or both, of the surgical system 3200 based on both the transmitted image data and the relative distance between the endoscope 3202 and the laparoscope 3204. In this combined image, at least one of the endoscope 3202 and the laparoscope 3204 is shown as a representative depiction.
As described above, the endoscope 3202 includes the first optical sensor 3206. The first optical sensor 3206 is configured to be capable of transmitting image data of a first scene within a field of view of the endoscope 3202 to the controller 3230.
In the illustrated embodiment, the tumor 3040 is located within the field of view of the endoscope 3202. Accordingly, the controller 3230 can determine a relative distance between the endoscope 3202 and the tumor 3040 based on the transmitted image data. As shown in fig. 43, the relative distance between the endoscope 3202 and the tumor 3040 is shown by dashed arrow 3227. In some embodiments, the relative distance 3227 may be determined by using structured light projected onto the tumor 3040 (e.g., via the illumination element 3208) and tracked by the first optical sensor 3206. Additionally, in some embodiments, the controller 3230 can calculate a relative distance between the laparoscope 3204 and the tumor 3040 based on the determined relative distance 3222 (between the endoscope 3202 and the laparoscope 3204) and the determined relative distance 3227 (between the endoscope 3202 and the tumor 3040).
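One way structured light can yield such a distance is to treat the illumination element and the optical sensor as a projector/camera pair with a known baseline, so that the pixel shift (disparity) of the projected pattern encodes depth. The sketch below shows that standard triangulation relationship; modeling the scope hardware this way, and all numbers used, are assumptions for illustration only.

```python
def structured_light_depth(baseline_mm, focal_px, disparity_px):
    """Depth from a projected pattern: the projector/camera pair is separated
    by a known baseline, and the pattern's observed pixel shift (disparity)
    encodes distance via similar triangles: z = b * f / d."""
    return baseline_mm * focal_px / disparity_px

# Illustrative numbers: 5 mm baseline, 800 px focal length.
for disp in (100.0, 80.0, 50.0):
    z = structured_light_depth(5.0, 800.0, disp)
    print(f"disparity {disp:>5.1f} px -> {z:.1f} mm")
```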
In addition, the laparoscope 3204 includes a second optical sensor 3210. The second optical sensor 3210 is configured to transmit image data of a second scene within a field of view of the laparoscope 3204 to the controller 3230. The cutting instrument 3248 is disposed within the field of view of the laparoscope 3204. Accordingly, the controller 3230 can determine a relative distance between the laparoscope 3204 and the cutting instrument 3248 based on the transmitted image data. In certain embodiments, the controller 3230 can also be configured to determine a relative orientation between the laparoscope 3204 and the cutting instrument 3248.
As shown in fig. 43, the relative distance between the laparoscope 3204 and the cutting instrument 3248 is shown by dashed arrow 3225. In some embodiments, the relative distance 3225 may be determined using structured light projected onto the cutting instrument 3248 (e.g., via the illumination element 3212) and tracked by the second optical sensor 3210.
Based on the relative distance 3222 (between the endoscope 3202 and the laparoscope 3204), the relative distance 3225 (between the laparoscope 3204 and the cutting instrument 3248), and the relative distance 3227 (between the endoscope 3202 and the tumor 3040), the controller 3230 can determine, for example, the relative distance between the tumor 3040 and the cutting instrument 3248 and the cutting plane of the cutting instrument 3248. As shown in fig. 43, the relative distance from the tumor 3040 to the cutting instrument 3248 is shown by dashed arrow 3223. Based on the determined relative distance 3223 and the transmitted image data (e.g., image data of the first scene, the second scene, or both), the controller may create a combined image that is projected onto the first display 3232, the second display 3234, or both. Because each of the instrument sets is imaged directly by its respective camera, and because the system can determine the exact type of device (e.g., grasper, cutter) in use (the instrument having been scanned into, or otherwise identified to, the surgical hub so that the system can interact with it), the system can create a 3D model reconstruction of each of the instruments. Using the measured relative distances, or at least one coupled 3D-axis registration, the system may display the devices hidden from a given camera, transforming them as necessary to show their position, orientation, and status in real time. These 3D models may even be refined with details imaged directly by the camera that can see the otherwise occluded device.
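The chaining of relative distances described here is, in effect, vector addition in a single registered frame: the tumor-to-cutter vector is the path back from the tumor to the endoscope, across to the laparoscope, and out to the cutting instrument. A minimal sketch with illustrative numbers; the vectors and values are not from the disclosure.

```python
import numpy as np

# All vectors expressed in one registered frame; values are illustrative.
v_endo_lap   = np.array([0.0, 25.0, 40.0])   # endoscope -> laparoscope  (arrow 3222)
v_endo_tumor = np.array([0.0, -5.0, 30.0])   # endoscope -> tumor        (arrow 3227)
v_lap_cutter = np.array([0.0, -8.0, -15.0])  # laparoscope -> cutter     (arrow 3225)

# Tumor -> cutter (arrow 3223): back to the endoscope, across to the
# laparoscope, then out to the cutting instrument.
v_tumor_cutter = -v_endo_tumor + v_endo_lap + v_lap_cutter
print(f"tumor-to-cutter distance: {np.linalg.norm(v_tumor_cutter):.1f} mm")
```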
As shown in fig. 43a, the position of the first interaction location 3244 is coordinated with the position of the second interaction location 3246 such that the first interaction location 3244 abuts the second interaction location 3246. The controller 3230 can provide a combined image, shown on the display 3234, wherein the second interaction location 3246 is depicted relative to the tumor 3040 and the first interaction location 3244 as a representation of the intended interaction location of the surgical instrument 3218. In the illustrated embodiment, the second interaction location 3246 corresponds to an incision for opening the stomach 3000, after a portion of the stomach has undergone a flip procedure, to remove the tumor 3040 from the stomach 3000 via the laparoscope.
Because of this coordination and alignment of the first interaction location 3244 and the second interaction location 3246, there is minimal damage to the surrounding tissue of the stomach 3000 when the interaction locations 3244, 3246 are used as guides to form the incisions. The second interaction location 3246 can be placed at the exact location of the tumor 3040 even when the tumor is not visible from the laparoscopic side, since the endoscope 3202 is capable of visualizing the tumor and is in communication with the controller 3230. In the illustrated embodiment, the interaction locations 3244, 3246 are shown in dashed outline. However, other forms of representative depictions may be used, such as simple geometric shapes.
In some embodiments, coordination of lesion removal may be achieved with externally supported orientation control via a laparoscopic instrument or retractor. Alternatively or in addition, coordination of lesion removal may be achieved with internally supported, balloon-controlled orientation and closure. For example, a surgical system 3150 may be provided that is configured for lesion removal using combined endoscopic and laparoscopic methods together with an endoscopically supported balloon, as shown in fig. 44a. This is a procedure that alternates between intraluminal and extraluminal interactions. Submucosal dissection and isolation is performed within the colon, and the dissection is stopped before it becomes fully circumferential. An incision is then made in the colon wall and the tumor is flipped out into the extraluminal space. An endocutter is then used simultaneously to transect the tumor's remaining attachment and to close the incision defect. This is done to minimize invasiveness and trauma and to seal and remove the tumor, since the procedure is not performed entirely within the lumen. This requires the same collaboration and interaction from the devices and markers on both sides of the organ wall, which are visible from only one side at a time.
The surgical system 3150 includes a surgical instrument 3152 having a cutting tip 3154. The cutting tip 3154 is disposed at a distal end of the surgical instrument 3152. As shown in fig. 44a, an initial mucosal incision 3158 can be made in the colon 3151 from the endoscopic side by the surgical instrument 3152. The mucosal incision 3158 is formed around the lesion 3156 in preparation for removal of the lesion 3156, wherein the mucosal incision 3158 only partially surrounds the lesion 3156. As shown in fig. 44b, after the mucosal incision 3158 is made, an incision 3160 may be made in the serosal layer of the colon 3151 that completely surrounds the lesion 3156. As shown in fig. 44c, balloons 3162, 3164 are endoscopically disposed on either side of the region of the colon 3151 where the lesion 3156 is located. Although not shown in fig. 44a and 44b, the balloons 3162, 3164 are present and inflated during the formation of the mucosal and serosal incisions 3158, 3160. The balloons 3162, 3164 provide tension to the colon 3151 to allow for cleaner cuts and also reduce the likelihood that the lesion 3156 will contact the contents of the colon 3151 during removal by the "coronary method". As shown in fig. 44d, the lesion 3156 may be removed by laparoscopically deployed instruments 3166, 3168 using the mucosal incision 3158 and the serosal incision 3160. With the lesion 3156 removed, the hole 3170 left by the removal may be closed with staples 3172, as shown in fig. 44e.
Coordinated instrument control system
Surgical systems that allow coordinated imaging, such as the surgical systems described above, may also coordinate the instruments at specific steps of the operation. Because the surgical system can provide images of both the endoluminal space and the extraluminal space, certain surgical steps may be performed in which both endoscopically and laparoscopically deployed instruments are coordinated with known surgical sites.
The surgical system includes the surgical imaging system described above, which can be used to track and position the various scopes and instruments disposed on opposite sides of a tissue wall and to provide a combined image. Since the combined image shows the orientation and position of the instruments and scopes disposed on opposite sides of the tissue wall that are not visible to each scope, instruments may be disposed on either side of the tissue wall and their movements coordinated from either side of the tissue wall.
For a given surgery, there may be surgical steps that require coordination between instruments deployed via an endoscope and via a laparoscope. For example, during surgery to remove a tumor from the stomach, an incision must be made via a laparoscope to access the tumor, and then the tumor must be transferred from the endoluminal space to the extraluminal space for removal. However, the endoscopically deployed instrument and the laparoscopic instrument used to pass the tumor through the incision are not visible to each other when the hand-off occurs. With coordinated endoscopic and laparoscopic imaging systems, however, the instruments can be visualized through the stomach wall and thereby coordinated to align with the incision in the stomach wall so that the tumor passes through the incision.
In one exemplary embodiment, a surgical system may include a first endoscopic device configured to transmit image data of a first scene. In addition, a second endoscopic device is configured to be capable of transmitting image data of a second scene, the first scene being different from the second scene. A tracking device is associated with one of the first and second endoscopic devices and is configured to transmit a signal indicative of the position of that endoscopic device relative to the other. A first surgical instrument is configured to interact with an inner side of a target tissue structure. A second surgical instrument is configured to interact with an outer side of the target tissue structure. A controller is configured to be able to receive the transmitted image data and the transmitted signal. Based on the transmitted signals and image data, the controller may determine a first relative distance from the first endoscopic device to the second endoscopic device, a second relative distance from the first endoscopic device to the first surgical instrument positioned within the at least one natural body cavity and organ, and a third relative distance from the second endoscopic device to the second surgical instrument positioned outside of the at least one natural body cavity and organ. The relative movements of the instruments are coordinated based on the determined relative distances.
The controller is further configured to be capable of generating a combined image of the first scene and the second scene. The controller receives an actual depiction from each of the first imaging system and the second imaging system. The actual depiction may be a photograph or a real-time video feed of what each imaging system attached to each endoscopic device is seeing in real time. Each of the first scene and the second scene depicts certain critical structures that are not visible to the other imaging system. For example, a first imaging system disposed via an endoscope may have a tumor and a surgical instrument within its field of view. Additionally, the second imaging system may have a laparoscopic instrument disposed within its field of view. In addition, as will be discussed in more detail, the combined image facilitates coordinating the relative movement of both the endoscopic and laparoscopic instruments at the surgical site.
An exemplary surgical system may include a variety of features, as described herein and shown in the drawings. However, those skilled in the art will appreciate that the surgical system may include only some of these features and/or it may include a variety of other features known in the art. The surgical systems described herein are intended to represent only certain exemplary embodiments. Furthermore, while the surgical systems are shown and described in connection with the stomach, those skilled in the art will appreciate that these surgical systems may be used in connection with any other suitable natural body cavity or organ.
Fig. 45 illustrates an exemplary embodiment of a surgical system 3300 configured for endoluminal access and laparoscopic access to the stomach 3000. The surgical system 3300 may be similar to the surgical system 3100 (fig. 37 and 38) except for the differences described in detail below, and thus common features are not described in detail herein. For simplicity, certain components of surgical system 3300 and stomach 3000 are not shown.
As shown, the stomach 3000 includes an esophageal sphincter 3001, a greater curvature 3002, a lesser curvature 3003, a pyloric sphincter 3004, a duodenum 3005, and a duodenojejunal flexure 3006. In addition, the stomach includes an inner tissue wall 3007 and an outer tissue wall 3008. As shown, the stomach 3000 includes a tumor 3040 located on the greater curvature 3002. When operating on the stomach 3000, it may be necessary to manipulate (e.g., mobilize) the blood vessel 3064, such as by using a laparoscopically deployed instrument, to properly access the tumor 3040. In use, as described in more detail below, the surgical system 3300 may provide a merged image such that instrument movements in subsequent surgical steps may be coordinated and visualized.
Surgical system 3300 includes an endoscope 3302 configured for endoluminal access through esophagus 3009 and into stomach 3000. The endoscope 3302 may have a variety of configurations. For example, in the illustrated embodiment, the endoscope 3302 includes an optical sensor 3306 (e.g., a camera) and an illumination element 3308. In addition, the endoscope 3302 includes a working channel 3303 disposed along the length of the endoscope 3302. Working channel 3303 is configured to receive one or more surgical instruments and/or allow fluid to pass therethrough to insufflate a cavity or organ (e.g., stomach). In some embodiments, the endoscope 3302 may include an outer sleeve (not shown) configured to be inserted through a patient's mouth (not shown) and into the esophagus 3009. The outer sleeve may include a working channel configured to allow insertion of an endoscope 3302 through the outer sleeve and into the stomach 3000.
Surgical system 3300 also includes a laparoscope 3304 configured for laparoscopic access through the abdominal wall (not shown) and into the extraluminal anatomical space adjacent to stomach 3000. Laparoscope 3304 can have a variety of configurations. For example, in the illustrated embodiment, the laparoscope 3304 includes an optical sensor 3310 (e.g., a camera) and an illumination element 3312. Alternatively or in addition, the laparoscope 3304 can include working channels (not shown) disposed along the length of the laparoscope 3304 to deliver instruments via the laparoscope into the extraluminal anatomical space. In some embodiments, the laparoscope 3304 can be inserted into the extraluminal anatomical space through a trocar or multiport (not shown) positioned within and through the tissue wall. The trocar or multiport may include ports for delivering the laparoscope 3304 and/or other surgical instruments into the extraluminal anatomical space to access the stomach 3000.
The endoscope 3302 includes a tracking device 3309 disposed within the endoscope 3302. The tracking device 3309 is configured to be able to transmit a signal indicative of the position of the endoscope 3302 relative to the laparoscope 3304. In addition, the laparoscope 3304 includes a tracking device 3313 associated with the laparoscope 3304. The tracking device 3313 is configured to transmit a signal indicative of the position of the laparoscope 3304 relative to the endoscope 3302. In some embodiments, the tracking devices 3309, 3313 are configured to detect the position and orientation of the endoscope 3302 and the laparoscope 3304 disposed on opposite sides of the tissue wall of the stomach 3000 using magnetic or radio frequency sensing. Alternatively, the tracking devices 3309, 3313 are configured to be able to detect the position and orientation of the endoscope 3302 and the laparoscope 3304 disposed on opposite sides of the tissue wall of the stomach 3000 using common anatomical landmarks. The tracking devices 3309, 3313 may determine a relative distance, represented by dashed arrow 3341, that indicates the position of one of the endoscope 3302 and the laparoscope 3304 relative to the other.
In some embodiments, the first tracking device 3309 and the second tracking device 3313 are configured to be able to detect the position, orientation, or both of the endoscope 3302 and the laparoscope 3304, respectively, using magnetic sensing or radio frequency sensing (e.g., when the endoscope 3302 and the laparoscope 3304 are positioned on opposite sides of the tissue wall of the stomach 3000). Alternatively, the first tracking device 3309 and the second tracking device 3313 are configured to be able to detect the position, orientation, or both of the endoscope 3302 and the laparoscope 3304, respectively, using a common anatomical landmark (e.g., when the endoscope 3302 and the laparoscope 3304 are positioned on opposite sides of the tissue wall of the stomach 3000). The first tracking device 3309 and the second tracking device 3313 may each transmit a signal to a controller (e.g., the controller 3330). For example, various embodiments of magnetic fiducial markers and their use in detecting position are further discussed in U.S. patent application No. 63/249,658, entitled "Surgical Devices, Systems, and Methods For Control Of One Visualization With Another," filed in September 2021.
As shown in fig. 45, surgical system 3300 includes a surgical instrument 3360 that passes through working channel 3303 of endoscope 3302 and into stomach 3000. While the surgical instrument can have a variety of configurations, in this illustrated embodiment, the surgical instrument 3360 includes a grasper 3362 at its distal end. Those skilled in the art will appreciate that the type of surgical instrument that includes the end effector and the structural configuration of the surgical instrument will depend at least on the surgical site and the surgical procedure to be performed. Although only one surgical instrument 3360 is shown, in other embodiments, surgical system 3300 may include more than one surgical instrument disposed in a working channel of an endoscope.
As further shown in fig. 45, the surgical instrument 3360 includes a force sensor 3319 (e.g., the force sensor 3319 may be coupled to one or more motors (not shown) of the instrument 3360 or a robotic arm (not shown) coupled to the instrument 3360). During use, the force sensor 3319 is configured to sense an amount of force applied by the grasper 3362 to tissue of the stomach 3000 as the grasper 3362 manipulates the tissue. The force sensor 3319 is further configured to transmit force data to the controller 3330 of the surgical system 3300. The controller 3330 may aggregate the received feedback inputs (e.g., force data), perform any necessary calculations, and provide output data to implement any adjustments that may be needed (e.g., adjust power level, travel speed, etc.). Additional details regarding the force sensor 3319 and controller 3330 are further described in the previously mentioned U.S. patent No. 10,856,928, which is incorporated herein by reference in its entirety. In some embodiments, the force sensor 3319 may be omitted.
Surgical system 3300 includes a first surgical instrument 3314 and a second surgical instrument 3318, each configured for laparoscopic access through the abdominal wall and into the extraluminal anatomical space around the stomach 3000. The first surgical instrument 3314 and the second surgical instrument 3318 may have a variety of configurations. For example, in the illustrated embodiment, the surgical instruments 3314, 3318 include graspers 3316, 3320, respectively. Although two surgical instruments 3314, 3318 are shown, in other embodiments, the surgical system 3300 may include more than two surgical instruments. The surgical instruments 3314, 3318 are configured to be inserted through the abdominal wall and into the extraluminal space to grasp and/or manipulate the stomach 3000 from the laparoscopic side. In some embodiments, the first surgical instrument 3314 and the second surgical instrument 3318 may be passed through ports of the same trocar and/or multiport device through which the laparoscope 3304 is positioned.
The surgical instrument 3314 includes a force sensor 3317 associated with the surgical instrument 3314. The force sensor 3317 is configured to sense a force applied by the surgical instrument 3314 to a target tissue structure. In addition, the surgical instrument 3318 includes a force sensor 3321 associated with the surgical instrument 3318. The force sensor 3321 is configured to sense a force applied to the target tissue structure by the surgical instrument 3318. The controller 3330 is further configured to determine, via the force sensors 3317, 3321, an amount of strain applied to the stomach 3000 by at least one of the surgical instruments 3314, 3318.
As described above, the endoscope 3302 includes the optical sensor 3306. The optical sensor 3306 is configured to be capable of transmitting image data of a first scene within the field of view of the endoscope 3302 to the controller 3330. As shown in fig. 45, the surgical instrument 3360 is inserted into the working channel of the endoscope 3302 and advanced toward the tumor 3040. In conventional surgical systems, the surgeon would blindly perform a partial gastric flip (e.g., using only the laparoscopic view) to laparoscopically remove the tumor 3040 through an incision in the stomach wall. The surgeon cannot accurately coordinate the endoscopic and laparoscopic instruments and instead roughly estimates the location of the tumor and instruments during the flip, potentially resulting in inaccurate tumor removal and removal of more tissue than is needed. However, in the present system 3300, since both the endoscope 3302 and the laparoscope 3304 can provide image data from the surgical site in both the endoluminal and extraluminal anatomical spaces, the hand-off of the tumor from the endoluminal space to the extraluminal space through the incision can be coordinated between the two sets of instruments.
As shown in fig. 45, the endoscope 3302 may determine the location of the tumor 3040 and the incision 3340 based on the relative distance 3343 and the location of the surgical instrument 3360, which are sensed by the optical sensor 3306 and determined by the controller 3330. In some embodiments, the relative distance 3343 is determined using structured light projected onto the tumor 3040 and/or the surgical instrument 3360 and tracked by the optical sensor 3306. In addition, the laparoscope 3304 includes an optical sensor 3310. The optical sensor 3310 is configured to be capable of transmitting image data of the second scene within the field of view of the laparoscope 3304. The surgical instruments 3314, 3318 are disposed within the field of view of the laparoscope 3304. As shown in fig. 45, the laparoscope 3304 can determine the position of the surgical instrument 3318 based on a relative distance 3344, which is sensed by the optical sensor 3310 and determined by the controller 3330. In some embodiments, the relative distance 3344 is determined by using structured light projected onto the surgical instrument 3318 and tracked by the optical sensor 3310.
Surgical system 3300 also includes a controller 3330 that is communicatively coupled to the endoscope 3302 and the laparoscope 3304. The controller 3330 is configured to be able to receive transmitted image data of the first scene and the second scene from the optical sensors 3306, 3310. Based on the transmitted signals, the controller 3330 is also configured to determine a relative distance from the endoscope 3302 to the laparoscope 3304, indicated by dashed arrow 3341; a relative distance from the tumor 3040 to the surgical instrument 3318, indicated by dashed arrow 3342; a relative distance from the endoscope 3302 to the tumor 3040, indicated by dashed arrow 3343; and a relative distance from the laparoscope 3304 to the surgical instrument 3318 positioned outside the at least one natural body cavity and organ, indicated by dashed arrow 3344.
As shown in fig. 46, based on the determined relative distances, the combined image is provided by the controller 3330 to depict the scene within the field of view of the laparoscope 3304 while also overlaying representative depictions of objects (such as the tumor 3040 and the surgical instrument 3360) that are disposed only within the field of view of the endoscope 3302. The optical sensor 3310 has the surgical instruments 3314, 3318 and the outer tissue wall 3008 in its field of view and is unable to visually detect the tumor 3040, the endoscope 3302, or the surgical instrument 3360.
The controller 3330 is configured to provide the combined image to a display. The display may be configured in a variety of configurations. For example, in some embodiments, the first display may be configured to be capable of displaying a first scene and the second display may be configured to be capable of displaying a second scene, and the first display, the second display, or both may be further configured to be capable of displaying a merged image. In another embodiment, the surgical system 3300 may include a third display that may be used to display the combined image, and the first display and the second display are used only to display the transmitted image data from the first optical sensor 3306 and the second optical sensor 3310, respectively, without any modification. In this embodiment, the surgeon can access real-time scenes from both the endoscope 3302 and the laparoscope 3304 on the first display and the second display, while also accessing the combined image on the third display.
Based on the relative distances 3341, 3342, 3343, 3344 determined by the controller 3330, the controller 3330 may provide a combined image from the perspective of the laparoscope 3304, wherein the endoscope 3302 and the surgical instrument 3360 are shown as representative depictions corresponding to their real-time positions in the intra-luminal space. In the illustrated embodiment, the representative depictions are shown as dashed outlines of the endoscope 3302 and the surgical instrument 3360. However, other forms of representative depictions may be used, such as simple geometric shapes representing the non-visible instruments and anatomy within the intra-luminal space. By using the combined image, the surgeon can place the surgical instruments in the correct positions for manipulation of the stomach 3000. With the tumor 3040, the endoscope 3302, and the surgical instrument 3360 rendered in the combined image, the surgical instrument 3360 may be coordinated through user-entered movement commands to align the partially freed tumor 3040 with the incision made in the stomach wall. The surgical instrument 3318 may likewise be coordinated through user-entered movement commands to align the surgical instrument 3318 with the incision 3340 on the laparoscopic side. Thus, as the tumor 3040 is passed by the surgical instrument 3360 from the endoluminal space at least partially through the incision 3340 into the extraluminal space, the surgical instrument 3318 may grasp the tumor 3040 and facilitate its removal from the stomach 3000.
In use, the controller 3330 may be configured to limit movement of the surgical instrument 3314 and the surgical instrument 3318 relative to each other at the target tissue structure (e.g., the tumor 3040) based on the transmitted image data of the first and second scenes and on the relative distances 3341, 3342 of the robotic arms to which the surgical instruments are attached.
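One illustrative way to realize such a movement limit is to strip the closing component from a commanded velocity once the instruments come within a minimum gap, as sketched below; the gap threshold and the clamping law are assumptions, not parameters from the disclosure.

```python
import numpy as np

def limit_motion(cmd_velocity, p_inst, p_other, min_gap_mm=5.0):
    """Scale a user-commanded velocity so two instruments working on opposite
    sides of the tissue wall cannot be driven into each other at the target."""
    gap = p_other - p_inst
    dist = np.linalg.norm(gap)
    direction = gap / dist
    closing = np.dot(cmd_velocity, direction)     # speed toward the other tool
    if dist <= min_gap_mm and closing > 0.0:
        cmd_velocity = cmd_velocity - closing * direction  # remove closing part
    return cmd_velocity

v = limit_motion(np.array([0.0, 0.0, 2.0]),       # commanded 2 mm/s toward tool
                 np.array([0.0, 0.0, 0.0]),       # this instrument's tip
                 np.array([0.0, 0.0, 4.0]))       # other instrument, 4 mm away
print(v)  # closing component stripped: [0. 0. 0.]
```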
As shown in fig. 47, the controller 3330 is further configured to determine an amount of strain applied to the stomach 3000 by at least one of the surgical instruments 3314, 3318 using visual markers 3350, 3352 associated with the stomach 3000. The visual markers 3350, 3352 are at least one of one or more local tissue markers on the stomach 3000, one or more projected light markers on the stomach 3000, or one or more anatomical features of the stomach 3000. The visual markers 3350, 3352 are detected by the optical sensor 3306 of the endoscope 3302 or the optical sensor 3310 of the laparoscope 3304. In use, the optical sensor 3306 or 3310 senses movement of the visual marker as it moves from position 3350 to position 3352.
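As an illustration, engineering strain can be estimated from the change in separation of a tracked marker pair as it moves between the two imaged positions. A minimal sketch, with hypothetical marker coordinates:

```python
import numpy as np

def tissue_strain(markers_before_mm, markers_after_mm):
    """Engineering strain between a marker pair tracked by the optical sensor
    as the pair moves (e.g., from positions 3350 to 3352)."""
    l0 = np.linalg.norm(markers_before_mm[1] - markers_before_mm[0])
    l1 = np.linalg.norm(markers_after_mm[1] - markers_after_mm[0])
    return (l1 - l0) / l0

before = np.array([[0.0, 0.0], [10.0, 0.0]])   # marker pair, 10 mm apart
after  = np.array([[0.0, 0.0], [11.5, 0.0]])   # pair after tissue is pulled
print(f"strain = {tissue_strain(before, after):.0%}")  # 15% stretch
```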
As shown in fig. 48, a surgical system 3400 may be used in an optical temperature sensing method to sense the exterior temperature of the stomach 3000 as ablation occurs internally, thereby ensuring that certain layers of the stomach 3000 are not damaged. The surgical system 3400 may be similar to the surgical system 3300 (fig. 45) except for the differences described in detail below, and thus common features are not described in detail herein. The temperature monitoring method may be used to limit the application of energy by the energy application surgical instrument 3440, which is endoscopically disposed through an endoscope 3402. In addition, a laparoscope 3404 and a surgical instrument 3418 are laparoscopically disposed in the extraluminal space. The laparoscope 3404 includes an optical sensor 3410 and an illumination element 3412.
In use, to remove a lymph node 3080, the energy application surgical instrument 3440 may apply energy to the inner wall 3007 of the stomach 3000. The laparoscopically deployed surgical instrument 3418 may be arranged to grasp the lymph node 3080. When the energy application instrument applies energy, the optical sensor 3410 may detect the temperature of the tissue of the stomach 3000, and the amount of energy applied may be reduced if the temperature becomes too high, in order to prevent tissue damage.
In some embodiments, the surgical system 3400 may be used to control medium-thickness ablation (e.g., thermal, electrical, or microwave) controlled by one imaging access system by coordinating it with a second system viewing the site from a different perspective, similar to the surgical systems described above. In addition, after tumor removal, the boundary around the tumor site can be enlarged using a final ablation from the endoscope side to help ensure complete removal of the cancer. For example, in the case of cancerous tumors close to the esophageal sphincter, it is important to preserve the sphincter to prevent acid reflux, and it is therefore useful to preserve as much healthy tissue as possible and to avoid unnecessarily extensive dissection and resection.
The surgical systems disclosed herein may be designed to be disposed of after a single use, or may be designed for multiple uses. In either case, however, the surgical system may be reconditioned for reuse after at least one use. Reconditioning may include any combination of disassembly of the surgical system, followed by cleaning or replacement of particular pieces, and subsequent reassembly. In particular, the surgical system may be disassembled, and any number of the particular pieces or accessories of the surgical system may be selectively replaced or removed in any combination. After cleaning and/or replacement of particular accessories, the surgical system may be reassembled for subsequent use either at a reconditioning facility or by a surgical team immediately prior to a surgical procedure. Those skilled in the art will appreciate that reconditioning of surgical systems may utilize a variety of techniques for disassembly, cleaning/replacement, and reassembly. The use of such techniques and the resulting reconditioned devices are all within the scope of the application.
Furthermore, in the present disclosure, similarly-named components in various embodiments typically have similar features, and thus, in particular embodiments, each feature of each similarly-named component is not necessarily set forth entirely. In addition, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that may be used in connection with such systems, devices, and methods. Those skilled in the art will recognize that equivalent dimensions of such linear and circular dimensions can be readily determined for any geometry. The size and shape of the systems and devices and their components may depend at least on the anatomy of the subject in which the systems and devices are to be used, the size and shape of the components with which the systems and devices are to be used, and the method and procedure in which the systems and devices are to be used.
It should be understood that the terms "proximal" and "distal" are used herein with respect to a user, such as a clinician, grasping the handle of the instrument. It should be understood that the terms "proximal" and "distal" are used herein with respect to the top end (e.g., the end furthest from the surgical site during use) and the bottom end (e.g., the end closest to the surgical site during use) of the surgical instrument, respectively. Other spatial terms such as "anterior" and "posterior" similarly correspond to distal and proximal, respectively. It will also be appreciated that for convenience and clarity, spatial terms such as "vertical" and "horizontal" are used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these spatial terms are not intended to be limiting and absolute.
Herein, a value or range may be expressed as "about" and/or from "about" one particular value to another particular value. When such a value or range is expressed, other embodiments of the disclosure include the recited particular value and/or the range from one particular value to another particular value. Similarly, when values are expressed as approximations by use of the antecedent "about," it will be understood that the particular value forms another embodiment. It should also be understood that a number of values are disclosed herein, and that each value is also disclosed herein as "about" that particular value, in addition to the value itself. In embodiments, "about" can mean, for example, within 10% of the recited value, within 5% of the recited value, or within 2% of the recited value.
For the purposes of describing and defining the present teachings, it is noted that, unless otherwise indicated, the term "substantially" is utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. The term "substantially" may also be utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
From the above embodiments, those skilled in the art will recognize additional features and advantages of the present application. Accordingly, the application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated by reference in their entirety. Any patent, publication, or other disclosure material, in whole or in part, that is said to be incorporated by reference herein is incorporated herein only to the extent that the incorporated material does not conflict with existing definitions, statements, or other disclosure material set forth herein. As such, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference.
Claims (20)
1. A surgical anchoring system comprising:
a surgical instrument configured for endoluminal access, the surgical instrument having:
an outer sleeve defining a working channel therethrough, the outer sleeve configured to be at least partially disposed within a first natural body lumen; and
at least one channel arm configured to be extendable through the working channel, the at least one channel arm having:
at least one anchoring member coupled to the at least one channel arm and configured to be movable between an expanded state and an unexpanded state, wherein, when in the expanded state, the at least one anchoring member is configured to be at least partially disposed within a second natural body lumen, the second natural body lumen being in communication with the first natural body lumen; and
at least one control actuator extending along the at least one channel arm and operatively coupled to the at least one anchoring member, the at least one control actuator operatively coupled to a drive system configured to control movement of the at least one channel arm to selectively manipulate organs associated with the first and second natural body lumens.
2. The surgical anchoring system of claim 1, wherein the surgical instrument includes an anchoring balloon disposed proximal to a distal end of the outer sleeve.
3. The surgical anchoring system of claim 2, wherein the anchoring balloon is configured to expand and at least partially contact an inner surface of the first natural body lumen.
4. The surgical anchoring system of any one of claims 1-3, wherein the at least one anchoring member is configured to expand and at least partially contact an inner surface of the second natural body lumen.
5. The surgical anchoring system of any preceding claim, wherein the at least one channel arm is configured to apply a force to the second natural body lumen via the at least one anchoring member so as to manipulate the second natural body lumen relative to the first natural body lumen.
6. The surgical anchoring system of any preceding claim, wherein the at least one channel arm comprises an optical sensor disposed at a distal end of the at least one channel arm.
7. The surgical anchoring system of any preceding claim, further comprising a controller configured to coordinate movement of the at least one channel arm within the second natural body lumen and movement of at least one instrument outside the second natural body lumen to prevent tearing of the second natural body lumen.
8. The surgical anchoring system of any preceding claim, wherein the at least one anchoring member is configured to move axially along a length of the at least one channel arm.
9. The surgical anchoring system of claim 8, wherein the at least one anchoring member is configured to be selectively lockable at an axial position along the length of the at least one channel arm by a releasable locking mechanism.
10. A surgical anchoring system comprising:
a tubular member configured for endoluminal access and having a central lumen therein configured to allow an endoscope to pass therethrough;
an anchoring assembly coupled to and extending distally from a distal portion of the tubular member, the anchoring assembly comprising:
a first anchoring member coupled to the tubular member and configured to engage a first anatomical location and fix the first anatomical location relative to the tubular member; and
a second anchoring member movable relative to the first anchoring member, positioned distal to the first anchoring member, and configured to engage a second anatomical location movable relative to the first anatomical location;
wherein movement of the second anchoring member relative to the first anchoring member is effective to selectively reposition the second anatomical location relative to the first anatomical location.
11. The surgical anchoring system of claim 10, wherein the first anchoring member includes a first plurality of expandable anchoring elements and a first plurality of working channels extending through the first anchoring member.
12. The surgical anchoring system of claim 11, wherein the second anchoring member includes a second plurality of expandable anchoring elements and a second plurality of working channels extending through the second anchoring member.
13. The surgical anchoring system of claim 12, wherein the tubular member further comprises a third plurality of working channels extending through the tubular member.
14. The surgical anchoring system of claim 13, further comprising a plurality of first actuators passing through a first working channel of the first plurality of working channels of the first anchoring member and a first working channel of the third plurality of working channels of the tubular member.
15. The surgical anchoring system of claim 14, wherein the plurality of first actuators are configured to rotate to expand the first plurality of expandable anchoring elements.
16. The surgical anchoring system of any one of claims 13-15, further comprising a plurality of second actuators passing through a second working channel of the first plurality of working channels of the first anchoring member, a first working channel of the second plurality of working channels of the second anchoring member, and a second working channel of the third plurality of working channels of the tubular member.
17. The surgical anchoring system of claim 16, wherein the plurality of second actuators are configured to rotate to expand the second plurality of expandable anchoring elements.
18. The surgical anchoring system of any one of claims 13-17, further comprising a plurality of third actuators passing through a third working channel of the first plurality of working channels of the first anchoring member and a third working channel of the third plurality of working channels of the tubular member, the plurality of third actuators terminating at a proximal surface of the second anchoring member.
19. The surgical anchoring system of claim 18, wherein the plurality of third actuators are configured to rotate to axially displace the second anchoring member relative to the first anchoring member.
20. The surgical anchoring system of claim 18 or claim 19, wherein the plurality of third actuators are configured to extend, retract, or bend to selectively reposition the second anatomical location relative to the first anatomical location.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US63/249,980 | 2021-09-29 | | |
| US17/449,767 (US20230095278A1) | 2021-09-29 | 2021-10-01 | Surgical Anchoring Systems for Endoluminal Access |
| US17/449,767 | | 2021-10-01 | |
| PCT/IB2022/059109 (WO2023052958A1) | | 2022-09-26 | Surgical anchoring systems for endoluminal access |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN118317739A | 2024-07-09 |
Family
ID=91733853
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202280078170.4A (CN118317739A, pending) | Surgical anchoring system for endoluminal access | | 2022-09-26 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN118317739A (en) |
2022-09-26: CN application CN202280078170.4A filed (publication CN118317739A); status: Pending.
Similar Documents
| Publication | Title |
|---|---|
| US11992200B2 (en) | Instrument control surgical imaging systems |
| JP2024537059A (en) | Instrument-controlled imaging system for visualization of future surgical procedures |
| JP2024536150A (en) | Method and system for controlling coordinated surgical instruments |
| CN118317739A (en) | Surgical anchoring system for endoluminal access |
| CN118338862A (en) | Instrument control imaging system for visualizing an impending surgical procedure |
| CN118284383A (en) | Coordinated appliance control system |
| US20230105509A1 (en) | Surgical devices, systems, and methods using multi-source imaging |
| JP2024537055A (en) | Surgical fixation systems for endoluminal access |
| JP2024537060A (en) | Collaborative Equipment Control System |
| WO2023052955A1 (en) | Instrument control surgical imaging systems |
| JP2024536151A (en) | Method and system for controlling coordinated surgical instruments |
| WO2023052962A1 (en) | Methods and systems for controlling cooperative surgical instruments |
| WO2023052938A1 (en) | Methods and systems for controlling cooperative surgical instruments |
| CN118042993A (en) | Method and system for controlling a collaborative surgical instrument |
| CN118284368A (en) | Surgical system with devices for endoluminal and extraluminal access |
| CN118302122A (en) | Surgical system for independently insufflating two separate anatomical spaces |
| CN118139578A (en) | Surgical devices, systems, and methods using multi-source imaging |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |