
CN113795214A - Input controls for robotic surgery - Google Patents

Input controls for robotic surgery

Info

Publication number
CN113795214A
Authority
CN
China
Prior art keywords
sensor
input
control
surgical tool
proximity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080034358.XA
Other languages
Chinese (zh)
Inventor
C·W·戴林格
G·W·约翰森
C·J·谢伊布
J·S·斯韦兹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cilag GmbH International
Original Assignee
Cilag GmbH International
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/354,417 (US11666401B2)
Priority claimed from US16/354,420 (US20200289228A1)
Priority claimed from US16/354,422 (US11992282B2)
Application filed by Cilag GmbH International filed Critical Cilag GmbH International
Publication of CN113795214A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/28 Surgical forceps
    • A61B17/29 Forceps for use in minimally invasive surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/04 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
    • A61B18/12 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B18/14 Probes or electrodes therefor
    • A61B18/1442 Probes having pivoting end effectors, e.g. forceps
    • A61B18/1445 Probes having pivoting end effectors, e.g. forceps at the distal end of a shaft, e.g. forceps or scissors at the end of a rigid rod
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/77 Manipulators with motion or force scaling
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/02 Hand grip control means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00022 Sensing or detecting at the treatment site
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00022 Sensing or detecting at the treatment site
    • A61B2017/00057 Light
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00022 Sensing or detecting at the treatment site
    • A61B2017/00057 Light
    • A61B2017/00061 Light spectrum
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00199 Electrical control of surgical instruments with a console, e.g. a control panel with a display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00207 Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00212 Electrical control of surgical instruments using remote controls
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00221 Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/0042 Surgical instruments, devices or methods, e.g. tourniquets with special provisions for gripping
    • A61B2017/00424 Surgical instruments, devices or methods, e.g. tourniquets with special provisions for gripping ergonomic, e.g. fitting in fist
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/0091 Handpieces of the surgical instrument or device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/0091 Handpieces of the surgical instrument or device
    • A61B2018/00916 Handpieces of the surgical instrument or device with means for switching or controlling the main function of the instrument or device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/0091 Handpieces of the surgical instrument or device
    • A61B2018/00916 Handpieces of the surgical instrument or device with means for switching or controlling the main function of the instrument or device
    • A61B2018/00958 Handpieces of the surgical instrument or device with means for switching or controlling the main function of the instrument or device for switching between different working modes of the main function
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A61B2034/742 Joysticks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/061 Measuring instruments not otherwise provided for for measuring dimensions, e.g. length
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A61B2090/065 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/067 Measuring instruments not otherwise provided for for measuring angles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 Accessories or related features not otherwise provided for
    • A61B2090/0801 Prevention of accidental cutting or pricking
    • A61B2090/08021 Prevention of accidental cutting or pricking of the patient or his organs
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 Accessories or related features not otherwise provided for
    • A61B2090/0807 Indication means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 Accessories or related features not otherwise provided for
    • A61B2090/0807 Indication means
    • A61B2090/0811 Indication means for the position of a particular part of an instrument with respect to the rest of the instrument, e.g. position of the anvil of a stapling instrument
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 Accessories or related features not otherwise provided for
    • A61B2090/0813 Accessories designed for easy sterilising, i.e. re-usable
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B46/00 Surgical drapes
    • A61B46/10 Surgical drapes specially adapted for instruments, e.g. microscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/60 Supports for surgeons, e.g. chairs or hand supports
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 Identification means for patients or instruments, e.g. tags
    • A61B90/98 Identification means for patients or instruments, e.g. tags using electromagnetic means, e.g. transponders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Robotics (AREA)
  • General Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Automation & Control Theory (AREA)
  • Plasma & Fusion (AREA)
  • Otolaryngology (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an input control device. The input control device includes a central portion that is coupled to a multi-axis force and torque sensor and is configured to receive input control motions from a surgeon. The central portion is flexibly supported on a base. The input control device also includes a rotary joint coupled to a rotation sensor. The input control device is configured to provide control motions to a robotic arm and/or a robotic tool based on the input controls detected by the multi-axis force and torque sensor and the rotation sensor.

Description

Input controls for robotic surgery
Background
Surgical systems often incorporate an imaging system that may allow a clinician to view the surgical site and/or one or more portions thereof on one or more displays, such as a monitor. The display may be local and/or remote to the operating room. The imaging system may include a scope having a camera that views the surgical site and transmits the view to a display viewable by the clinician. Imaging systems may be limited by the information they are able to identify and/or communicate to the clinician. For example, certain imaging systems may not be able to identify certain hidden structures, physical contours, and/or dimensions within three-dimensional space intraoperatively. Additionally, some imaging systems may not be able to communicate and/or convey certain information to the clinician during the procedure.
The robotic system may be actuated or remotely controlled by one or more clinicians located at the console. The input motion at the console may correspond to actuation of the robotic arm and/or a robotic tool coupled thereto. In various instances, the robotic system and/or clinician may rely on the views and/or information provided by the imaging system to determine desired robotic actuation and/or corresponding appropriate input movements. Certain imaging systems are unable to provide certain visualization data and/or information, which may present challenges and/or limitations to the decision-making process of the clinician and/or control of the robotic system.
Disclosure of Invention
In various aspects, a control system for a surgical robot is disclosed, the control system comprising: a base; a central portion flexibly supported by the base; a wrist longitudinally offset from and rotationally coupled to the central portion; a multi-axis sensor arrangement configured to detect a user input force applied to the central portion; a rotation sensor configured to detect a user input motion applied to the wrist; a memory; and a processor communicatively coupled to the memory. The processor is configured to receive a plurality of first input signals from the multi-axis sensor arrangement, provide a plurality of first output signals to the surgical robot based on the plurality of first input signals, receive a plurality of second input signals from the rotation sensor, and provide a plurality of second output signals to the surgical robot based on the plurality of second input signals.
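As a rough illustration of the signal flow described in this aspect, the following Python sketch shows a processor polling the two sensor arrangements and forwarding corresponding output signals to the surgical robot. All class and method names here are hypothetical and chosen for illustration; the patent does not specify an implementation.

```python
# Hypothetical sketch of the dual-sensor control loop described above.
# The sensor and robot objects and their methods are illustrative only.

class InputControlProcessor:
    def __init__(self, multi_axis_sensor, rotation_sensor, robot):
        self.multi_axis_sensor = multi_axis_sensor  # forces/torques on the central portion
        self.rotation_sensor = rotation_sensor      # rotation applied to the wrist
        self.robot = robot

    def step(self):
        # First input signals: user force applied to the flexibly supported central portion.
        force_torque = self.multi_axis_sensor.read()  # e.g. (Fx, Fy, Fz, Tx, Ty, Tz)
        self.robot.command_translation_rotation(force_torque)  # first output signals

        # Second input signals: user motion applied to the rotationally coupled wrist.
        wrist_motion = self.rotation_sensor.read()
        self.robot.command_wrist_rotation(wrist_motion)  # second output signals
```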
In various aspects, a control system for a surgical robot is disclosed, the control system comprising: a first control input comprising a flexibly supported joystick; a memory; and a control circuit communicatively coupled to the memory. The memory stores instructions executable by the control circuit to cause the control system to switch between a first mode and a second mode, receive a plurality of first input signals from the first control input, scale the plurality of first input signals by a first multiplier in the first mode, and scale the plurality of first input signals by a second multiplier in the second mode. The second multiplier is different from the first multiplier.
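A minimal sketch of the mode-dependent scaling described above follows; the multiplier values are assumptions for illustration, as the patent requires only that the two multipliers differ.

```python
# Hypothetical mode-dependent input scaling; the multiplier values are assumed.
FIRST_MODE_MULTIPLIER = 1.0    # e.g. gross positioning of the tool
SECOND_MODE_MULTIPLIER = 0.25  # e.g. fine manipulation near tissue

def scale_inputs(first_input_signals, mode):
    """Scale the joystick input signals by the multiplier for the active mode."""
    multiplier = FIRST_MODE_MULTIPLIER if mode == "first" else SECOND_MODE_MULTIPLIER
    return [signal * multiplier for signal in first_input_signals]
```

Scaling the same physical input differently per mode lets one joystick serve both coarse approach and fine manipulation without changing hardware.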
In various aspects, a control system for a surgical robot is disclosed, the control system comprising: a first input comprising a flexibly supported joystick and a multi-axis force and torque sensor arrangement configured to detect a user input force and torque applied to the flexibly supported joystick; a second input comprising a rotary joint and a rotation sensor configured to detect a user input motion applied to the rotary joint; and a control unit. The control unit is configured to provide a first plurality of output signals to the surgical robot based on actuation of the first input and a second plurality of output signals to the surgical robot based on actuation of the second input.
In various aspects, a control system is disclosed, the control system comprising: a robotic surgical tool; a tissue proximity detection system configured to intraoperatively detect a distance between the robotic surgical tool and an anatomical structure; and a user input device. The user input device includes: a base including a force sensor; a forearm support member movably coupled to the base and movable relative to the base within a travel zone; a shaft extending distally from the forearm support member; a handpiece extending distally from the shaft and including jaws; and a jaw sensor configured to detect pivotal movement of the jaws. The forearm support member, the shaft, and the handpiece are movable together as a collective unit as the forearm support member moves relative to the base within the travel zone. The user input device also includes a displacement sensor configured to detect movement of the collective unit. The control system also includes a control circuit communicatively coupled to the force sensor, the displacement sensor, and the jaw sensor. The control circuit is configured to receive a first input signal from the force sensor, a second input signal from the displacement sensor, and a third input signal from the jaw sensor, and to switch the user input device from a first mode to a second mode in response to an input from the tissue proximity detection system indicating that the distance between the robotic surgical tool and the anatomical structure has decreased to less than a threshold distance. The first input signal controls movement of the robotic surgical tool in the first mode, and the second and third input signals control movement of the robotic surgical tool in the second mode.
In various aspects, a control system is disclosed that includes a tissue proximity detection system and a user input device. The user input device includes: a base; a forearm support member movably coupled to the base and movable relative to the base within a travel zone; a shaft extending distally from the forearm support member; a handpiece extending distally from the shaft and including a jaw configured to pivot relative to the shaft; and a plurality of sensors. The plurality of sensors includes: a first sensor arrangement configured to detect a user input force applied to the base; a second sensor arrangement configured to detect a displacement of the forearm support member; and a third sensor arrangement configured to detect pivotal movement of the jaw. The control system also includes control circuitry configured to receive a proximity data signal from the tissue proximity detection system, receive a first input signal from the first sensor arrangement, receive a second input signal from the second sensor arrangement, receive a third input signal from the third sensor arrangement, and switch the user input device from a first mode to a second mode in response to the proximity data signal indicating a predefined tissue proximity. The first input signal controls movement of the robotic surgical tool in the first mode, and the second and third input signals control movement of the robotic surgical tool in the second mode.
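The mode-switching logic common to the two aspects above can be sketched as follows. The threshold value and every name in this sketch are illustrative assumptions rather than values taken from the patent.

```python
# Hypothetical proximity-triggered mode switch for the aspects described above.
TISSUE_PROXIMITY_THRESHOLD_MM = 10.0  # assumed predefined tissue proximity

def update_mode(device, distance_mm):
    """Switch to the second mode when the tool comes within the threshold distance
    of the anatomical structure; otherwise remain in the first mode."""
    device.mode = "second" if distance_mm < TISSUE_PROXIMITY_THRESHOLD_MM else "first"

def route_inputs(device, first_signal, second_signal, third_signal):
    """In the first mode the force-sensor input drives the tool; in the second mode
    the displacement and jaw inputs drive it instead."""
    if device.mode == "first":
        return device.robot.move(first_signal)
    return device.robot.move(second_signal, jaw=third_signal)
```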
In various aspects, a user input device for controlling a robotic surgical tool is disclosed, the user input device comprising: a base comprising a first sensor arrangement; and a forearm support member movably coupled to the base. The forearm support member is movable relative to the base within a travel zone and includes a second sensor arrangement. The user input device also includes control circuitry configured to receive a first input signal from the first sensor arrangement, receive a second input signal from the second sensor arrangement, and switch the user input device between a first mode, in which the first input signal controls movement of the robotic surgical tool, and a second mode, in which the second input signal controls movement of the robotic surgical tool.
In various aspects, a control system for a robotic surgical tool is disclosed, the control system comprising an untethered handpiece that includes: a body; a joystick extending from the body; a rotatable shaft extending from the body; and a plurality of sensors. The plurality of sensors includes a body sensor embedded in the body and configured to detect motion of the body in three-dimensional space, a multi-axis force sensor configured to detect a force applied to the joystick, and a shaft sensor configured to detect rotational displacement of the shaft relative to the body. The control system also includes a control circuit communicatively coupled to the plurality of sensors and to a proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system indicating the proximity of the robotic surgical tool to tissue, receive input control signals from the plurality of sensors, switch between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity has decreased to less than a threshold, provide a coarse motion control signal to the robotic surgical tool based on the input control signal from the multi-axis force sensor in the coarse motion mode, and provide a fine motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode.
In various aspects, a control system for a robotic surgical tool is disclosed, the control system comprising an untethered handpiece that includes: a body; an actuator extending from the body; a rotatable shaft extending from the body; and a plurality of sensors. The plurality of sensors includes a body sensor embedded in the body and configured to detect motion of the body in three-dimensional space, a force sensor configured to detect a force applied to the actuator, and a shaft sensor configured to detect rotational displacement of the shaft relative to the body. The control system further includes a proximity detection system configured to detect the proximity of the robotic surgical tool to tissue, and a control circuit communicatively coupled to the plurality of sensors and the proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system, receive input control signals from the plurality of sensors, switch between a first mode and a second mode in response to receiving a proximity signal indicating that the proximity has decreased to less than a threshold value, provide a first motion control signal to the robotic surgical tool based on the input control signal from the force sensor in the first mode, and provide a second motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the second mode. The proximity signal indicates the proximity of the robotic surgical tool to tissue, and the control circuit scales the first motion control signal based on the proximity signal.
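The proximity-based scaling of the first motion control signal might look like the following sketch, where the linear falloff and range constant are assumptions; the patent states only that the signal is scaled based on the proximity signal.

```python
# Hypothetical proximity-based scaling: the commanded motion shrinks as the tool
# approaches tissue. The linear falloff and the 50 mm range are assumed values.
def scale_by_proximity(motion_signal, proximity_mm, full_speed_range_mm=50.0):
    factor = min(proximity_mm / full_speed_range_mm, 1.0)  # 0 near tissue, 1 far away
    return [component * factor for component in motion_signal]
```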
In various aspects, a control system for a robotic surgical tool is disclosed, the control system comprising an untethered handpiece that includes: a body; an actuation arm extending proximally from the body; a shaft extending distally from the body; and a plurality of sensors. The plurality of sensors includes a body sensor embedded in the body and configured to detect motion of the body in three-dimensional space, an arm sensor configured to detect a force applied to the actuation arm, and a shaft sensor configured to detect rotational displacement of the shaft relative to the body. The control system also includes a control circuit communicatively coupled to the plurality of sensors and to a proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system indicating the proximity of the robotic surgical tool to tissue, receive input control signals from the plurality of sensors, switch between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity has decreased to less than a threshold, provide an output control signal to the robotic surgical tool based on the input control signal from the arm sensor in the coarse motion mode, and provide an output control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode.
In various aspects, a control system for a robotic surgical tool is disclosed, the control system comprising an untethered handpiece that includes a coarse motion controller comprising a multi-axis sensor, a fine motion controller comprising an embedded motion sensor, and control circuitry communicatively coupled to the multi-axis sensor, the embedded motion sensor, and a proximity detection system. The control circuitry is configured to receive a proximity signal from the proximity detection system indicative of the proximity of the robotic surgical tool to tissue and, in response to receiving a proximity signal indicating that the proximity has decreased to less than a threshold, switch between a coarse motion mode, in which the robotic surgical tool is controlled with the input control signal from the coarse motion controller, and a fine motion mode, in which the robotic surgical tool is controlled with the input control signal from the fine motion controller.
Drawings
The novel features of the various aspects are set forth with particularity in the appended claims. However, the aspects described, both as to organization and method of operation, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a plan view of a robotic surgical system for performing a surgical procedure in accordance with at least one aspect of the present disclosure.
Fig. 2 is a perspective view of a surgeon console of the robotic surgical system of fig. 1, according to at least one aspect of the present disclosure.
Fig. 3 is a schematic view of a robotic surgical system according to at least one aspect of the present disclosure.
Fig. 4 is a perspective view of a surgeon console of a robotic surgical system according to at least one aspect of the present disclosure.
Fig. 5 is a perspective view of a user input device at a surgeon's console according to at least one aspect of the present disclosure.
Fig. 6 is a perspective view of a user input device for a robotic surgical system according to at least one aspect of the present disclosure.
Fig. 7 is a plan view of the user input device of fig. 6 in accordance with at least one aspect of the present disclosure.
Fig. 8 is a rear elevation view of the user input device of fig. 6, in accordance with at least one aspect of the present disclosure.
Fig. 9 is a side elevation view of the user input device of fig. 6, in accordance with at least one aspect of the present disclosure.
Fig. 10 is a perspective view of a user's hand engaging the user input device of fig. 6 in accordance with at least one aspect of the present disclosure.
Fig. 11 is a rear elevation view of a user's hand engaging the user input device of fig. 6, in accordance with at least one aspect of the present disclosure.
Fig. 11A is a control logic flow diagram for the user input device of fig. 6 in accordance with at least one aspect of the present disclosure.
Fig. 11B is a table depicting control parameters of the operating mode of the user input device of fig. 6 in accordance with at least one aspect of the present disclosure.
Fig. 11C illustrates a control circuit configured to control aspects of the user input device of fig. 6, in accordance with at least one aspect of the present disclosure.
Fig. 11D illustrates a combinational logic circuit configured to control aspects of the user input device of fig. 6, in accordance with at least one aspect of the present disclosure.
Fig. 11E illustrates a sequential logic circuit configured to control aspects of the user input device of fig. 6 in accordance with at least one aspect of the present disclosure.
Fig. 12 is a perspective view of an end effector of a surgical tool operatively controllable by control motions provided to the user input device of fig. 6, according to at least one aspect of the present disclosure.
Fig. 12A is a perspective view of the end effector of fig. 12 depicting the end effector in an articulated configuration, in accordance with at least one aspect of the present disclosure.
Fig. 13A and 13B depict the end effector of the surgical tool and the user input device of fig. 6 in a corresponding open configuration, in accordance with at least one aspect of the present disclosure, wherein fig. 13A is a plan view of the end effector and fig. 13B is a plan view of the user input device.
Fig. 14A and 14B depict the end effector and user input device of fig. 13A and 13B in a corresponding partially closed configuration, where fig. 14A is a plan view of the end effector and fig. 14B is a plan view of the user input device, in accordance with at least one aspect of the present disclosure.
Fig. 15A and 15B depict the end effector and user input device of fig. 13A and 13B in a corresponding closed configuration, where fig. 15A is a plan view of the end effector and fig. 15B is a plan view of the user input device, in accordance with at least one aspect of the present disclosure.
Fig. 16 is a perspective view of a workspace comprising the two user input devices of fig. 6 positioned on a surface in accordance with at least one aspect of the present disclosure.
Fig. 17 is another perspective view of the workspace of fig. 16 in accordance with at least one aspect of the present disclosure.
Fig. 17A is a detail view of a portion of the workspace of fig. 17, in accordance with at least one aspect of the present disclosure.
Fig. 18 is an exploded perspective view of an input device including first and second plate members, a light shield, a detent arrangement, and a cover, according to at least one aspect of the present disclosure.
Fig. 19 is an exploded top perspective view of the first and second plate members and light shield of fig. 18, according to at least one aspect of the present disclosure.
Fig. 20 is an exploded bottom perspective view of the first and second plate members and light shield of fig. 19 according to at least one aspect of the present disclosure.
Fig. 21 is a plan view of pin members of the stop arrangement of fig. 18 positioned in openings in the second plate member of fig. 18 in a rotated configuration in accordance with at least one aspect of the present disclosure.
Fig. 22 is a cross-sectional elevation view of the first and second plate members, light shield, detent arrangement, and cover of fig. 18 in an angled configuration in accordance with at least one aspect of the present disclosure.
Fig. 23 is a cross-sectional elevation view of a user input device according to at least one aspect of the present disclosure.
Fig. 24 is a schematic view of a surgical visualization system including an imaging device and a surgical device configured to identify critical structures below a tissue surface in accordance with at least one aspect of the present disclosure.
Fig. 25 is a schematic diagram of a control system for a surgical visualization system configured to receive input from a user input device, according to at least one aspect of the present disclosure.
Fig. 26 illustrates control circuitry configured to control aspects of a surgical visualization system according to at least one aspect of the present disclosure.
Fig. 27 illustrates combinational logic circuitry configured to control aspects of a surgical visualization system, in accordance with at least one aspect of the present disclosure.
Fig. 28 illustrates a sequential logic circuit configured to control aspects of a surgical visualization system in accordance with at least one aspect of the present disclosure.
Fig. 29 is a schematic representation depicting triangulation to determine a depth dA of critical structures below a tissue surface, in accordance with at least one aspect of the present disclosure.
Fig. 30 is a schematic view of a surgical visualization system configured to identify critical structures below a tissue surface, wherein the surgical visualization system includes a pulsed light source for determining the depth dA of a critical structure below the tissue surface, in accordance with at least one aspect of the present disclosure.
Fig. 31 is a schematic view of a surgical visualization system including a three-dimensional camera configured to identify critical structures embedded within tissue in accordance with at least one aspect of the present disclosure.
Fig. 32A and 32B are views of key structures captured by the three-dimensional camera of fig. 31, where fig. 32A is a view from a left lens of the three-dimensional camera and fig. 32B is a view from a right lens of the three-dimensional camera, according to at least one aspect of the present disclosure.
Fig. 33 is a schematic view of the surgical visualization system of fig. 31, wherein a camera-to-critical-structure distance dw from the three-dimensional camera to the critical structure can be determined, in accordance with at least one aspect of the present disclosure.
Fig. 34 is a schematic diagram of a surgical visualization system utilizing two cameras to determine the position of an embedded critical structure in accordance with at least one aspect of the present disclosure.
Fig. 35A is a schematic view of a surgical visualization system utilizing a camera that is moved axially between a plurality of known orientations to determine the orientation of an embedded critical structure, in accordance with at least one aspect of the present disclosure.
Fig. 35B is a schematic view of the surgical visualization system of fig. 35A, wherein the camera is moved axially and rotationally between a plurality of known orientations to determine the orientation of the embedded critical structure, in accordance with at least one aspect of the present disclosure.
Fig. 36 is a schematic view of a control system for a surgical visualization system according to at least one aspect of the present disclosure.
Fig. 37 is a schematic view of a structured light source for a surgical visualization system, according to at least one aspect of the present disclosure.
Fig. 38-40 depict exemplary hyperspectral identifying features for distinguishing anatomical structures from occlusions in accordance with at least one aspect of the present disclosure, where fig. 38 is a graphical representation of ureter features and occlusions, fig. 39 is a graphical representation of arterial features and occlusions, and fig. 40 is a graphical representation of neural features and occlusions.
Fig. 41 is a schematic diagram of a Near Infrared (NIR) time-of-flight measurement system including a transmitter (emitter) and a receiver (sensor) positioned on a common device configured to be capable of sensing distance to critical anatomical structures, in accordance with at least one aspect of the present disclosure.
FIG. 42 is a schematic illustration of the transmitted waves, the received waves, and the delay between the transmitted and received waves of the NIR time-of-flight measurement system of FIG. 41, according to at least one aspect of the present disclosure.
Fig. 43 illustrates a NIR time-of-flight measurement system configured to sense distances to different structures, the system including transmitters (emitters) and receivers (sensors) on separate devices, in accordance with at least one aspect of the present disclosure.
Fig. 44 is a perspective view of an input control device for a robotic surgical system according to at least one aspect of the present disclosure.
Fig. 45 is another perspective view of the input control device of fig. 44, in accordance with at least one aspect of the present disclosure.
Fig. 46 is a front elevation view of the input control device of fig. 44, in accordance with at least one aspect of the present disclosure.
Fig. 47 is a side elevational view of the input control device of fig. 44 in a first configuration shown in solid lines, and further depicting the input control device in a second configuration shown in phantom lines, wherein a lower portion or base of the input control device remains stationary and an upper portion of the input control device is displaced along the longitudinal axis between the first and second configurations, in accordance with at least one aspect of the present disclosure.
Fig. 48 is a perspective view of a user's hand and forearm engaging the input control device of fig. 44, in accordance with at least one aspect of the present disclosure.
Fig. 49 is a front elevation view of a user's hand and forearm engaging the input control device of fig. 44, in accordance with at least one aspect of the present disclosure.
Fig. 50 is a logic diagram of a control circuit utilized in conjunction with the input control device of fig. 44 in accordance with at least one aspect of the present disclosure.
Fig. 51 is a perspective view of an input control device for a robotic surgical system, according to at least one aspect of the present disclosure.
Fig. 52 is a rear elevation view of the input control device of fig. 51, according to an aspect of the present disclosure.
Fig. 53 is a perspective view of an input control device for a robotic surgical system, according to at least one aspect of the present disclosure.
Fig. 54 is a side elevational view of the input control device of fig. 53 in a first configuration shown in solid lines, and further depicting the input control device in a second configuration shown in phantom lines, wherein a lower portion or base of the input control device remains stationary and an upper portion of the input control device is displaced along the longitudinal axis between the first and second configurations, in accordance with at least one aspect of the present disclosure.
Fig. 55 is a perspective view of a user's hand and forearm engaging the input control device of fig. 53, in accordance with at least one aspect of the present disclosure.
Fig. 56 is a side elevation view of a user's hand and forearm engaging the input control device of fig. 53, in accordance with at least one aspect of the present disclosure.
Fig. 57 is a perspective view of an input control device according to at least one aspect of the present disclosure.
Fig. 58 is another perspective view of the input control device of fig. 57, in accordance with at least one aspect of the present disclosure.
Fig. 59 is a front view of the input control device of fig. 57, according to at least one aspect of the present disclosure.
Fig. 59A is another elevation view of the input control device of fig. 57 with certain features removed for clarity, and schematically depicting control circuitry therein, in accordance with at least one aspect of the present disclosure.
Fig. 60 is a perspective view of a user's hand engaged with the input control device of fig. 57 and positioned to deliver input control motions to its various input controllers in accordance with at least one aspect of the present disclosure.
Fig. 61 is a front view of a user's hand engaged with the input control device of fig. 57 and positioned to deliver input control motions to its various input controllers in accordance with at least one aspect of the present disclosure.
Fig. 62 is a hypothetical graphical representation of the input control sensitivity of the input control device of fig. 57 relative to tissue proximity in accordance with at least one aspect of the present disclosure.
Fig. 63 is a perspective view of an input control device positioned in a starting position within a coarse motion region in accordance with at least one aspect of the present disclosure.
Fig. 64 is another perspective view of the input control device of fig. 63, in accordance with at least one aspect of the present disclosure.
Fig. 65 is a perspective view of a user's hand engaging the input control device of fig. 63, in accordance with at least one aspect of the present disclosure.
Fig. 66 is another perspective view of a user's hand engaging the input control device of fig. 63, in accordance with at least one aspect of the present disclosure.
Fig. 67 is a front view of a user's hand engaging the input control device of fig. 63, in accordance with at least one aspect of the present disclosure.
Fig. 68 is a hypothetical graphical representation of a robot tool velocity relative to a displacement of the input control of fig. 63 within the coarse motion region of fig. 63 in accordance with at least one aspect of the present disclosure.
Detailed Description
The applicant of the present application owns the following U.S. patent applications, filed on March 15, 2019, each of which is incorporated herein by reference in its entirety:
Attorney docket number END9052USNP1/180620-1 entitled "INPUT CONTROLS FOR ROBOTIC SURGERY";
Attorney docket number END9052USNP2/180620-2 entitled "DUAL MODE CONTROLS FOR ROBOTIC SURGERY";
Attorney docket number END9052USNP3/180620-3 entitled "MOTION CAPTURE CONTROLS FOR ROBOTIC SURGERY";
Attorney docket number END9053USNP1/180621-1 entitled "ROBOTIC SURGICAL SYSTEMS WITH MECHANISMS FOR SCALING SURGICAL TOOL MOTION ACCORDING TO TISSUE PROXIMITY";
Attorney docket number END9053USNP2/180621-2 entitled "ROBOTIC SURGICAL SYSTEMS WITH MECHANISMS FOR SCALING CAMERA MAGNIFICATION ACCORDING TO PROXIMITY OF SURGICAL TOOL TO TISSUE";
Attorney docket number END9053USNP3/180621-3 entitled "ROBOTIC SURGICAL SYSTEMS WITH SELECTIVELY LOCKABLE END EFFECTORS";
Attorney docket number END9053USNP4/180621-4 entitled "SELECTIVELY VARIABLE RESPONSE OF SHAFT MOTION OF SURGICAL ROBOTIC SYSTEMS";
Attorney docket number END9054USNP1/180622-1 entitled "SEGMENTED CONTROL INPUTS FOR SURGICAL ROBOTIC SYSTEMS";
Attorney docket number END9055USNP1/180623-1 entitled "ROBOTIC SURGICAL CONTROLS HAVING FEEDBACK CAPABILITIES";
Attorney docket number END9055USNP2/180623-2 entitled "ROBOTIC SURGICAL CONTROLS WITH FORCE FEEDBACK"; and
Attorney docket number END9055USNP3/180623-3 entitled "JAW COORDINATION OF ROBOTIC SURGICAL CONTROLS".
The applicant of the present application also owns the following U.S. patent applications, filed on September 11, 2018, each of which is incorporated herein by reference in its entirety:
U.S. patent application 16/128,179 entitled "SURGICAL VISUALIZATION PLATFORM";
U.S. patent application 16/128,180 entitled "CONTROLLING AN EMITTER ASSEMBLY PULSE SEQUENCE";
U.S. patent application 16/128,198 entitled "SINGULAR EMR SOURCE EMITTER ASSEMBLY";
U.S. patent application 16/128,207 entitled "COMBINATION EMITTER AND CAMERA ASSEMBLY";
U.S. patent application 16/128,176 entitled "SURGICAL VISUALIZATION WITH PROXIMITY TRACKING FEATURES";
U.S. patent application 16/128,187 entitled "SURGICAL VISUALIZATION OF MULTIPLE TARGETS";
U.S. patent application 16/128,192 entitled "VISUALIZATION OF SURGICAL DEVICES";
U.S. patent application 16/128,163 entitled "OPERATIVE COMMUNICATION OF LIGHT";
U.S. patent application 16/128,197 entitled "ROBOTIC LIGHT PROJECTION TOOLS";
U.S. patent application 16/128,164 entitled "SURGICAL VISUALIZATION FEEDBACK SYSTEM";
U.S. patent application 16/128,193 entitled "SURGICAL VISUALIZATION AND MONITORING";
U.S. patent application 16/128,195 entitled "INTEGRATION OF IMAGING DATA";
U.S. patent application No. 16/128,170 entitled "ROBOTIC-ASSISTED SURGICAL SUTURING SYSTEMS";
U.S. patent application 16/128,183 entitled "SAFETY LOGIC FOR SURGICAL SUTURING SYSTEMS";
U.S. patent application 16/128,172 entitled "ROBOTIC SYSTEM WITH SEPARATE PHOTOACOUSTIC RECEIVER"; and
U.S. patent application 16/128,185 entitled "FORCE SENSOR THROUGH STRUCTURED LIGHT DEFLECTION".
The applicant of the present application also owns the following U.S. patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety:
U.S. patent application 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application 15/940,676 entitled "AUTOMATIC TOOL ADJUSTMENT FOR ROBOT-ASSISTED SURGICAL PLATFORMS";
U.S. patent application 15/940,711 entitled "SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS"; and
U.S. patent application 15/940,722 entitled "CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY".
Before explaining aspects of a robotic surgical platform in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented or incorporated in other aspects, variations and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation. Moreover, it is to be understood that one or more of the following described aspects, expressions of aspects, and/or examples may be combined with any one or more of the other following described aspects, expressions of aspects, and/or examples.
Robotic system
An exemplary robotic system 110 is depicted in fig. 1. The robotic system 110 is a Minimally Invasive Robotic Surgery (MIRS) system typically used to perform minimally invasive diagnostics or surgery on a patient 112 lying on an operating table 114. The robotic system 110 includes a surgeon's console 116 for use by a surgeon 118 during surgery. One or more assistants 120 may also participate in the procedure. The robotic system 110 may also include a patient side cart 122 (i.e., a surgical robot) and an electronics cart 124. As the surgeon 118 views the surgical site through the console 116, the surgical robot 122 may manipulate at least one removably coupled tool assembly 126 (hereinafter "tool") through a minimally invasive incision in the patient 112. Images of the surgical site may be obtained by an imaging device, such as a stereoscopic endoscope 128, which may be manipulated by the surgical robot 122 to orient the endoscope 128. Alternative imaging devices are also contemplated.
The electronics cart 124 may be used to process images of the surgical site for subsequent display to the surgeon 118 via the surgeon's console 116. In some cases, the electronics of the electronics cart 124 may be incorporated into another structure in the operating room, such as the operating table 114, the surgical robot 122, the surgeon's console 116, and/or another control station. The number of robotic tools 126 used at one time will generally depend on factors such as the diagnostic or surgical procedure and the space constraints within the operating room. If it is necessary to change one or more of the robotic tools 126 used during the procedure, the assistant 120 may remove the robotic tool 126 from the surgical robot 122 and replace it with another tool 126 from a tray 130 in the operating room.
Referring primarily to fig. 2, the surgeon's console 116 includes a left eye display 132 and a right eye display 134 for presenting the surgeon 118 with coordinated stereoscopic views of the surgical site that enable depth perception. The console 116 also includes one or more input control devices 136 that, in turn, cause the surgical robot 122 to manipulate one or more tools 126. The input control devices 136 may provide the same degrees of freedom as their associated tools 126 to provide the surgeon with telepresence, or the perception that the input control devices 136 are integral with the robotic tools 126, so that the surgeon has a strong sense of directly controlling the robotic tools 126. To this end, position, force, and tactile sensations from the robotic tools 126 may be transmitted back to the surgeon's hands through the input control devices 136 using position sensors, force sensors, and tactile feedback sensors. The surgeon's console 116 may be located in the same room as the patient 112 so that the surgeon 118 may directly monitor the procedure, be physically present if necessary, and speak directly to the assistant 120 rather than over a telephone or other communication medium. However, the surgeon 118 may be located in a different room, a completely different building, or another location remote from the patient 112, thereby allowing for remote surgery. A sterile field may be defined around the surgical site. In various instances, the surgeon 118 may be positioned outside of the sterile field.
Referring again to fig. 1, the electronics cart 124 may be coupled with the endoscope 128, and may include a processor to process the captured images for subsequent display, such as to the surgeon on the surgeon's console 116 or on another suitable display located locally and/or remotely. For example, when using the stereoscopic endoscope 128, the electronics cart 124 may process the captured images to present the surgeon with a coordinated stereoscopic image of the surgical site. Such coordination may include alignment between the opposing images, and may include adjusting a stereoscopic working distance of the stereoscopic endoscope. As another example, image processing may include using previously determined camera calibration parameters to compensate for imaging errors of the image capture device, such as optical aberrations. In various instances, the robotic system 110 may incorporate a surgical visualization system as further described herein such that an enhanced view of the surgical site including hidden critical structures, three-dimensional topography, and/or one or more distances may be communicated to the surgeon at the surgeon's console 116.
Fig. 3 schematically illustrates a robotic surgical system 150, such as the MIRS system 110 (fig. 1). As discussed herein, a surgeon's console 152, such as surgeon's console 116 (fig. 1 and 2), may be used by the surgeon to control a surgical robot 154, such as surgical robot 122 (fig. 1), during minimally invasive surgery. The surgical robot 154 may use an imaging device, such as a stereoscopic endoscope, to capture images of the surgical site and output the captured images to an electronics cart 156, such as the electronics cart 124 (fig. 1). As discussed herein, the electronics cart 156 may process the captured image in various ways prior to any subsequent display. For example, the electronics cart 156 may overlay the captured images with a virtual control interface before displaying the combined images to the surgeon via the surgeon's console 152. The surgical robot 154 may output the captured images for processing outside of the electronics cart 156. For example, the surgical robot 154 may output the captured images to a processor 158, which may be used to process the captured images. These images may also be processed by a combination of the electronics cart 156 and the processor 158, which may be coupled together to process the captured images collectively, sequentially, and/or combinations thereof. One or more separate displays 160 may also be coupled with the processor 158 and/or the electronics cart 156 for local and/or remote display of images, such as images of a surgical site or other related images.
The reader will understand that a variety of robotic tools may be used with the surgical robot 122, and that an exemplary robotic tool is described herein. Referring again to fig. 1, the surgical robot 122 is shown providing for manipulation of three robotic tools 126 and an imaging device 128, such as a stereoscopic endoscope for capturing images of a surgical site. Manipulation is provided by a robotic mechanism having a plurality of robotic joints. The imaging device 128 and the robotic tools 126 may be positioned and manipulated through incisions in the patient so as to maintain a remote center of motion, or virtual pivot, at each incision to minimize the size of the incision. The image of the surgical site may include an image of the distal ends of the robotic tools 126 while located within a field of view (FOV) of the imaging device 128. Each tool 126 is detachable from and carried by a respective surgical manipulator at a distal end of one or more of the robotic joints. The surgical manipulator provides a movable platform for moving the entire tool 126 relative to the surgical robot 122 via movement of the robotic joints. The surgical manipulator also provides power to operate the robotic tool 126 using one or more mechanical and/or electrical interfaces. In various instances, one or more motors may be housed in the surgical manipulator to generate the control motions. One or more transmission devices may be employed to selectively couple the motors to various actuation systems in the robotic tool.
The robotic system described above is further described in U.S. patent application 15/940,627 entitled "DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS", filed on March 29, 2018, which is incorporated herein by reference in its entirety. Alternative robotic systems are also contemplated.
Referring now to FIG. 4, a surgeon's console or control unit 250 is shown. The surgeon's console 250 may be used in conjunction with the robotic system to control two surgical tools coupled to the robotic system. These surgical tools may be controlled by the handle assemblies 256 of the surgeon's console 250. For example, the handle assemblies 256 and the robotic arms have a master-slave relationship such that movement of a handle assembly 256 produces a corresponding movement of the surgical tool. The controller 254 receives input signals from the handle assemblies 256, calculates corresponding movements of the surgical tools, and provides output signals to move the robotic arms and the surgical tools.
The handle assembly 256 is located near a surgeon's chair 258 and is coupled to the controller 254. The controller 254 may include one or more microprocessors, memory devices, drivers, etc., that convert input information from the handle assembly 256 into output control signals that move the robotic arm and/or actuate the surgical tool. The surgeon's chair 258 and handle assembly 256 may be located in front of the video console 248, which may be connected to an endoscope to provide video images of the patient. The surgeon's console 250 may also include a screen 260 coupled to the controller 254. The screen 260 may display a Graphical User Interface (GUI) that allows the surgeon to control various functions and parameters of the robotic system.
Each handle assembly 256 includes a handle/wrist assembly 262. The handle/wrist assembly 262 has a handle 264 coupled to a wrist 266. The wrist 266 is connected to a forearm link 268 that slides along a slide bar 270. The slide bar 270 is pivotally connected to an elbow joint 272. The elbow joint 272 is pivotally connected to a shoulder joint 274 that is attached to the controller 254. A surgeon seated at the surgeon's console 250 may provide input control motions to the handle assembly 256 to effect movement and/or actuation of a surgical tool communicatively coupled thereto. For example, the surgeon may advance the forearm link 268 along the slide bar 270 to advance the surgical tool toward the surgical site. Rotation at the wrist 266, elbow joint 272, and/or shoulder joint 274 may effect rotation and/or articulation of the surgical tool about the corresponding axis. The robotic system and surgeon's console 250 are further described in U.S. patent 6,951,535 entitled "TELE-MEDICINE SYSTEM THAT TRANSMITS AN ENTIRE STATE OF A SUBSYSTEM", issued on October 4, 2005, the entire disclosure of which is incorporated herein by reference.
A handle assembly for use at a surgeon's console is further depicted in fig. 5. The handle assembly of fig. 5 includes a control input wrist 352 and a touch-sensitive handle 325. The control input wrist 352 is a gimbal device that pivotally supports the touch-sensitive handle 325 to generate control signals for controlling the robotic surgical manipulator and the robotic surgical tool. A pair of control input wrists 352 and touch-sensitive handles 325 may be supported by a pair of control input arms in the workspace of the surgeon's console.
The control input wrist 352 includes first, second, and third gimbal members 362, 364, and 366, respectively. The third gimbal member 366 is rotatably mounted to the control input arm. The touch-sensitive handle 325 includes a tubular support structure 351, a first grip 350A, and a second grip 350B. The first and second grips 350A, 350B are supported at one end by the tubular support structure 351. The touch-sensitive handle 325 is rotatable about an axis G. The grips 350A, 350B may be squeezed or pinched together about the tubular support structure 351. The "pinching" or grasping degree of freedom of the grips is indicated by arrows Ha and Hb.
The touch-sensitive handle 325 is rotatably supported by the first gimbal member 362 by way of the rotary joint 356g. The first gimbal member 362 is, in turn, rotatably supported by the second gimbal member 364 by means of the rotary joint 356f. Similarly, the second gimbal member 364 is rotatably supported by the third gimbal member 366 using the rotary joint 356e. In this way, the control input wrist 352 allows the touch-sensitive handle 325 to be moved and oriented in the workspace using three degrees of freedom.
Movements in the gimbal members 362, 364, 366 of the control input wrist 352 that reorient the touch-sensitive handle 325 in space may be converted into control signals to control the robotic surgical manipulator and robotic surgical tool. Movements in the grips 350A and 350B of the touch-sensitive handle 325 may also be converted into control signals to control the robotic surgical manipulator and the robotic surgical tool. In particular, the squeezing motion of the grips 350A and 350B over their degree of freedom, indicated by arrows Ha and Hb, may be used to control the end effector of the robotic surgical tool.
To sense movement in the touch-sensitive handle 325 and generate control signals, sensors may be mounted in the handle 325 and the first gimbal member 362 of the control input wrist 352. Exemplary sensors may be, for example, pressure sensors, Hall effect transducers, potentiometers, and/or encoders. The robotic surgical system and handle assembly of fig. 5 are further described in U.S. patent 8,224,484 entitled "METHODS OF USER INTERFACE WITH ALTERNATE TOOL MODE FOR ROBOTIC SURGICAL TOOLS", issued on July 17, 2012, the entire disclosure of which is incorporated herein by reference.
Existing robotic systems may incorporate surgical visualization systems, as further described herein. In such cases, additional information about the surgical site may be determined and/or communicated to a clinician in the operating room, such as to a surgeon positioned at the surgeon's console. For example, the clinician may view a realistic augmented view of the surgical site that includes additional information such as various contours of the tissue surface, hidden critical structures, and/or one or more distances relative to the anatomical structures. In various instances, the proximity data may be utilized to improve one or more operations of and/or control of a robotic surgical system, as further described herein.
Input control device
Referring again to the robotic system 150 in fig. 3, the surgeon's console 152 allows the surgeon to provide manual input commands to the surgical robot 154 to effect control of the surgical tool and its various actuations. Movement of the input control device by the surgeon at the surgeon's console 152 within a predefined working volume, or working envelope, causes a corresponding movement or operation of the surgical tool. For example, referring again to fig. 2, the surgeon may engage each input control device 136 with one hand and move the input control device 136 within the working envelope to provide control motions to the surgical tool. The surgeon's console (e.g., the surgeon's console 116 in fig. 1 and 2 and the surgeon's console 250 in fig. 4) may be expensive and require a large footprint. For example, the working volume of the user input devices at the surgeon's console (e.g., the handle/wrist assembly 262 in fig. 4 and the control input wrist 352 and touch-sensitive handle 325 in fig. 5) may require a large footprint, which can impact the available space, training modalities, and collaboration protocols in an operating room (OR), for example. Such a large footprint may preclude the option of having multiple control stations in the OR (such as additional control stations for training or for use by an assistant). In addition, surgeon's consoles of such size and bulk may not be convenient to reposition within, or move between, operating rooms, for example.
Ergonomics is an important consideration for surgeons, who may spend many hours each day performing surgery and/or stationed at the surgeon's console. Excessive, repetitive motion during surgery can cause fatigue and chronic injury to the surgeon. It is desirable to maintain a comfortable posture and/or body position while providing input to the robotic system. However, in some cases, the surgeon's posture and/or position may be compromised to ensure proper positioning of the surgical tool. For example, surgeons often tend to twist their hands and/or extend their arms for long durations. In one instance, the coarse control motions used to move the surgical tool to the surgical site can leave the surgeon's arms awkwardly tucked in and/or overly extended upon reaching the surgical site. In some cases, the poor ergonomic posture assumed during the coarse control motions may be maintained during the subsequent fine control motions (e.g., while manipulating tissue at the surgical site), which can further exacerbate the surgeon's poor ergonomics. Existing input control devices present a one-size-fits-all approach that does not account for the surgeon's anthropometry; the ergonomic impact on surgeons may therefore vary, and the architecture of existing input control devices may place a greater burden on certain body types.
In some cases, the input control device may be constrained within a working envelope that defines its range of motion. For example, the configuration of the linkages on the surgeon's console and/or the input control device may limit the range of motion of the input control device. In some cases, the input control device may reach the end of its range of motion before the surgical tool is properly positioned. In such cases, a clutch mechanism may be required to reposition the input control device within the working envelope to complete the positioning of the surgical tool. For example, a hypothetical working envelope 280 is shown in fig. 4. In various instances, the surgeon may be required to actuate a clutch (typically in the form of a foot pedal or an additional button on the handle of the input control device) to temporarily disengage the input control device from the surgical tool while the input control device is repositioned to a desired position within the working envelope. This non-surgical arm motion of the surgeon at the surgeon's console may be referred to as a "rowing" motion, which appropriately repositions the user input device within the working envelope. Upon release of the clutch, movement of the input control device may again control the surgical tool.
Clutching the input control device to maintain a proper position within the working envelope can place additional cognitive load on the surgeon. In such cases, the surgeon is required to constantly monitor the position and orientation of his/her hands relative to the boundaries of the working envelope. In addition, the clutching or "rowing" motion can be cumbersome for the surgeon, and this monotonous, repetitive motion has no counterpart in the workflow of surgical procedures outside the context of robotic surgery. Clutching also requires the surgeon to match the previous orientation of the handle when re-engaging the system. For example, upon completing a complex range of motion, when the surgeon "rows" or clutches the input control device back to a comfortable starting position, the surgeon and/or surgical robot must match the orientation of the handle of the input control device in the starting position with the previous orientation of the handle in the extended position, which can be challenging and/or require complex logic and/or mechanics.
The need for a clutch mechanism may also limit the availability of controls on the handle of the input control device. For example, the clutch actuator may occupy valuable real estate on the handle, which may cognitively and physically limit the availability of other controls on the handle. In turn, the complexity of other subsystems, such as foot pedal boards, is increased, and the surgeon may be required to utilize multiple input systems to accomplish a simple task.
A non-clutched alternative to such an input control device may, for example, reduce the footprint and cost of the surgeon's console, improve the surgeon's ergonomic experience, eliminate the physical and cognitive burden associated with clutching, and/or provide additional real estate on the input control device for additional input controls. An exemplary clutchless input control device is further described herein. Such non-clutched input control devices may be used with a variety of robotic systems. Furthermore, as further described herein, the clutchless input control device may utilize information from various distance determination subsystems also disclosed herein. For example, real-time structured light and three-dimensional shape modeling may inform logic components of such non-clutched input control devices such that a first mode and/or first set of controls are enabled outside a predefined distance of the anatomical surface and/or critical structures and a second mode and/or second set of controls are enabled within a predefined distance of the anatomical structure and/or critical structures. Various tissue proximity applications are further described herein.
Referring now to fig. 6-11, an input control device 1000 is shown. The input control device 1000 is a clutchless input control device, as further described herein. The input control device 1000 may be utilized at a surgeon's console or at a workspace of a robotic surgical system. For example, the input control device 1000 may be incorporated into a surgical system, such as the surgical system 110 (fig. 1) or the surgical system 150 (fig. 3), to provide control signals to a surgical robot and/or a surgical tool coupled thereto. The input control device 1000 includes input controls for moving the robotic arm and/or surgical tool in three-dimensional space. For example, a surgical tool controlled by the input control device 1000 may be configured to move and/or rotate relative to the X, Y, and Z axes.
An exemplary surgical tool 1050 is shown in fig. 12. The surgical tool 1050 is a grasper including an end effector 1052 having opposed jaws 1054 configured to releasably grasp tissue. The surgical tool 1050 can be manipulated in three dimensions by translating the surgical tool 1050 along its Xt, Yt, and Zt axes. The surgical tool 1050 also includes a plurality of joints such that the surgical tool may be rotated and/or articulated to a desired configuration. The surgical tool 1050 may be configured to rotate, or roll, about the Xt axis defined by the longitudinal axis of the surgical tool 1050, to rotate, or articulate, about a first articulation axis parallel to the Yt axis, and to rotate, or articulate, about a second articulation axis parallel to the Zt axis. Rolling about the Xt axis corresponds to a rolling motion of the end effector 1052 in the direction Rt, articulation about the first articulation axis corresponds to a pitch motion of the end effector 1052 in the direction Pt, and articulation about the second articulation axis corresponds to a yaw or twisting motion in the direction Tt.
An input control device (such as the input control device 1000) may be configured to control translation and rotation of the end effector 1052. To control such movements, the input control device 1000 includes corresponding input controls. For example, the input control device 1000 includes input controls for at least six degrees of freedom: movement of the surgical tool 1050 along the Xt, Yt, and Zt axes in three-dimensional space, rolling of the end effector 1052 about the Xt axis, and articulation of the end effector 1052 about the first and second articulation axes. In addition, the input control device 1000 includes an end effector actuator for actuating the opposing jaws of the end effector 1052 to manipulate or clamp tissue. Additional features of the input control device 1000 related to a surgical tool (such as the surgical tool 1050) are further described herein.
Referring again to fig. 6-11, the input control device 1000 includes a multi-dimensional spatial joint 1006 having a central portion 1002 supported on a base 1004. The base 1004 is structured to rest on a surface, such as a table or work surface at a surgeon's console/workspace or at a patient's bedside. The base 1004 defines a circular base having contoured edges; however, alternative geometries are envisaged. The base 1004 may remain in a fixed, stationary position relative to the underlying surface when input controls are applied thereto. In some cases, the base 1004 may be releasably secured and/or clamped to the underlying surface using fasteners, such as threaded fasteners. In other cases, fasteners may not be required to hold the base 1004 to the underlying surface. In various instances, the base 1004 may include an adhesive or cohesive bottom surface and/or suction features (e.g., suction cups or magnets) for gripping an underlying surface. In some cases, the base 1004 may include a ribbed and/or grooved bottom surface for engaging a complementary underlying support surface.
The spatial joint 1006 is configured to receive a multi-dimensional manual input from a surgeon (e.g., from a surgeon's hand or arm) that corresponds to a control motion of the surgical tool in multi-dimensional space. The central portion 1002 of the spatial joint 1006 is configured to receive input forces in multiple directions, such as along and/or about the X, Y, and Z axes. The central portion 1002 may comprise a cylinder, shaft, or hemisphere protruding from the base 1004 that can be raised, lowered, and rotated. The central portion 1002 is flexibly supported relative to the base 1004 such that the cylinder, shaft, and/or hemisphere is configured to move, or float, within a small predefined area upon receipt of a force control input thereto. For example, the central portion 1002 may be a floating shaft supported on the base 1004 by one or more elastomeric members (such as springs). The central portion 1002 may be configured to move, or float, within a predefined three-dimensional volume. For example, an elastomeric coupling may allow the central portion 1002 to move relative to the base 1004; however, limiting plates, pins, and/or other structures may be configured to limit the range of motion of the central portion 1002 relative to the base 1004. In one aspect, movement of the central portion 1002 relative to the base 1004 from a center, or "home," position may be permitted in a range of about 1.0mm to about 5.0mm in any direction (up, down, left, right, backward, and forward). In other cases, movement of the central portion 1002 relative to the base 1004 may be limited to less than 1.0mm or may exceed 5.0mm. In some cases, the central portion 1002 may move about 2.0mm in all directions relative to the base 1004. In various cases, the spatial joint 1006 may resemble a multi-dimensional mouse or space mouse. An exemplary space mouse is provided by 3Dconnexion, Inc., and is described at www.3dconnexion.com.
In various instances, the space joint 1006 includes a multi-axis force and/or torque sensor arrangement 1048 (see fig. 8 and 9) configured to detect input forces and moments applied to the central portion 1002 and transferred to the space joint 1006. The sensor arrangement 1048 is positioned on one or more surfaces at the interface between the central portion 1002 and the base 1004. In other cases, the sensor arrangement 1048 may be embedded in the central portion 1002 or the base 1004. In other cases, the sensor arrangement 1048 can be positioned on a floating member that is positioned intermediate the central portion 1002 and the base 1004.
For example, the sensor arrangement 1048 may include one or more resistive strain gauges, optical force sensors, optical distance sensors, miniature cameras with dimensions in the range of about 1.0mm to about 3.0mm, and/or time-of-flight sensors utilizing pulsed light sources. In one aspect, the sensor arrangement 1048 comprises a plurality of resistive strain gauges configured to be able to detect different force vectors applied thereto. For example, the strain gauges may define a wheatstone bridge configuration. Additionally or alternatively, the sensor arrangement 1048 may comprise a plurality of photosensors, such as a measurement unit comprising a position sensitive detector illuminated by light emitting elements (such as LEDs). Alternative force detection sensor arrangements are also envisaged. Exemplary multi-dimensional input devices and/or sensor arrangements are further described in the following references, which are incorporated herein by reference in their entirety:
U.S. Pat. No. 4,785,180 entitled "OPTOELECTRIC SYSTEM HOUSED IN A PLASTIC SPHERE" published on 15/11/1988;
U.S. patent 6,804,012 entitled "ARRANGEMENT FOR THE DETECTION OF RELATIVE MOVEMENTS OR RELATIVE POSITION OF TWO OBJECTS", issued on October 12, 2004;
European patent application 1,850,210 entitled "OPTOELECTRONIC DEVICE FOR DETERMINING RELATIVE MOVEMENTS OR RELATIVE POSITIONS OF TWO OBJECTS", published on October 31, 2007;
U.S. patent application publication 2008/0001919 entitled "USER INTERFACE DEVICE", published on January 3, 2008; and
U.S. patent 7,516,675 entitled "JOYSTICK SENSOR APPARATUS", issued on April 14, 2009.
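Returning to the sensor arrangement 1048, a brief illustrative sketch may be helpful. The following Python snippet (the calibration matrix, scale factor, and function name are hypothetical assumptions; the present disclosure does not prescribe an implementation) shows one common way raw Wheatstone-bridge voltages from a multi-axis strain-gauge arrangement could be converted into a six-component force/moment vector:

    import numpy as np

    # Hypothetical 6x6 calibration matrix mapping six bridge voltages to
    # forces (N) and moments (N*mm); a real device would use
    # factory-calibrated coefficients.
    CALIBRATION = 0.85 * np.eye(6)

    def read_wrench(bridge_voltages, zero_offsets):
        # Subtract the zero-load readings captured while the joystick rests
        # in its home position, then apply the calibration matrix.
        raw = np.asarray(bridge_voltages, dtype=float) - np.asarray(zero_offsets, dtype=float)
        return CALIBRATION @ raw  # (Fx, Fy, Fz, Mx, My, Mz)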
Referring again to the input control device 1000 of fig. 6-11, a joystick 1008 extends from the central portion 1002. The force applied to the central portion 1002 via the joystick 1008 defines the input motion detected by the sensor arrangement 1048. For example, the sensor arrangement 1048 (fig. 8 and 9) in the base 1004 can be configured to detect input forces and moments applied to the joystick 1008 by the surgeon. The joystick 1008 may be spring biased toward a center, or starting, position in which the joystick 1008 is aligned with the Z axis (i.e., a vertical axis passing through the joystick 1008, the central portion 1002, and the spatial joint 1006). Actuating (e.g., pushing and/or pulling) the joystick 1008 in any direction away from the Z axis can be configured to "drive" the end effector of the associated surgical tool in a corresponding direction. When the external driving force is removed, the joystick 1008 can be configured to return to the center, or starting, position, at which point movement of the end effector stops. For example, the central portion 1002 and the joystick 1008 may be spring biased toward the starting position.
In various instances, the spatial joint 1006 and the joystick 1008 coupled thereto define a six degree-of-freedom input control. Referring again to the end effector 1052 of the surgical tool 1050 in FIG. 12, a force on the joystick 1008 of the input control device 1000 in the X direction corresponds to displacement of the end effector 1052 along its Xt axis (e.g., longitudinally), a force on the joystick 1008 in the Y direction corresponds to displacement of the end effector 1052 along its Yt axis (e.g., laterally), and a force on the joystick 1008 in the Z direction corresponds to displacement of the end effector 1052 along its Zt axis (e.g., vertically/up and down). Moreover, a moment R applied to the joystick 1008 about the X axis effects rotation of the end effector 1052 about its Xt axis (e.g., a rolling motion about the longitudinal axis in the direction Rt), a moment P applied to the joystick 1008 about the Y axis effects articulation of the end effector 1052 about its Yt axis (e.g., a pitch motion in the direction Pt), and a moment T applied to the joystick 1008 about the Z axis effects articulation of the end effector 1052 about its Zt axis (e.g., a yaw or twisting motion in the direction Tt). In such cases, the input control device 1000 includes a six degree-of-freedom joystick configured to receive and detect forces along the X, Y, and Z axes and moments about the X, Y, and Z axes. The forces may correspond to translational inputs to the end effector 1052 of the associated surgical tool 1050, and the moments may correspond to rotational inputs. Six degree-of-freedom input devices are described further herein. Additional joints supported by the joystick 1008 can provide additional degrees of freedom (e.g., for actuating the jaws of the end effector or rolling the end effector about a longitudinal axis), as further described herein.
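As a concrete illustration of the force-to-translation and moment-to-rotation mapping just described, the following Python sketch (the gains, deadband, and names are hypothetical assumptions, not values from the present disclosure) converts a detected six degree-of-freedom wrench into an end effector velocity command:

    import numpy as np

    TRANSLATION_GAIN = 0.5  # mm/s commanded per newton of applied force
    ROTATION_GAIN = 0.02    # rad/s commanded per newton-millimeter of moment
    DEADBAND = 0.25         # suppresses sensor noise while the joystick is at rest

    def wrench_to_tool_velocity(wrench):
        # wrench = (Fx, Fy, Fz, Mx, My, Mz) from the sensor arrangement 1048.
        w = np.array(wrench, dtype=float)
        w[np.abs(w) < DEADBAND] = 0.0    # spring-centered home position -> no motion
        v = TRANSLATION_GAIN * w[:3]     # forces -> Xt/Yt/Zt translation
        omega = ROTATION_GAIN * w[3:]    # moments -> Rt/Pt/Tt rotation
        return np.concatenate([v, omega])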
In various instances, the input control device 1000 includes a wrist, or joint, 1010 offset from the spatial joint 1006. The wrist 1010 is offset from the spatial joint 1006 by a shaft or lever 1012 extending along a shaft axis S, which is parallel to the X axis in the configuration shown in fig. 6. For example, the joystick 1008 may extend vertically upright from the central portion 1002 and the base 1004, and the joystick 1008 may support the shaft 1012.
As further described herein, the spatial joint 1006 may define input control motions for multiple degrees of freedom. For example, the spatial joint 1006 may define input control motions for translation of the surgical tool in three-dimensional space and articulation of the surgical tool about at least one axis. The scrolling motion may also be controlled by input to the spatial joint 1006, as further described herein. Further, wrist 1010 may define an input control motion for at least one degree of freedom. For example, the wrist 1010 may define an input control motion for a roll motion of the end effector. In addition, the wrist 1010 may support an end effector actuator 1020 (which will be described further herein) to apply opening and closing motions to the end effector.
In some cases, the roll, yaw, and pitch motions of the input control device 1000 are translated into corresponding input control motions of the associated end effector. In various instances, the input control device 1000 may utilize an adjustable scaling and/or gain such that the motion of the end effector may be scaled relative to the control motion delivered at the wrist 1010.
In one aspect, the input control device 1000 includes a plurality of mechanical joints, which may be, for example, elastically coupled components, sliders, journaled shafts, hinges, and/or rotational bearings. These mechanical joints include a first joint 1040 (at the spatial joint 1006) intermediate the base 1004 and the central portion 1002, which allows the central portion 1002 to rotate and tilt relative to the base 1004, and a second joint 1044, which allows rotation of the wrist 1010 relative to the joystick 1008. In various circumstances, the six degrees of freedom (e.g., three-dimensional translation and rotation about three different axes) of the robotic end effector may be controlled by, for example, user input at only these two joints 1040, 1044. With respect to movement at the first joint 1040, the central portion 1002 may be configured to float relative to the base 1004 at an elastic coupling, as further described herein. With respect to the second joint 1044, the wrist 1010 may be rotatably coupled to the shaft 1012 such that the wrist 1010 may rotate about the shaft axis S in the direction R (fig. 6). Rotation of the wrist 1010 relative to the shaft 1012 may correspond to a rolling motion of the end effector about a central tool axis, such as rolling of the end effector 1052 about the Xt axis. Rotation of the wrist 1010 by the surgeon to roll the end effector can provide control of the roll motion at the surgeon's fingertips and corresponds to first-person perspective control of the end effector (i.e., from the surgeon's perspective, the surgeon is "positioned" at the jaws of the remotely located end effector at the surgical site). Such an arrangement and perspective can be utilized to provide precise control motions to the input control device 1000 during portions of a surgical procedure (e.g., in a precise motion mode), as further described herein.
The various rotary joints of the input control device may include sensor arrangements configured to be able to detect the rotary input control applied thereto. Wrist 1010 may include a rotation sensor (e.g., sensor 1049 in fig. 25) which may be, for example, a rotational force/torque sensor and/or transducer, a rotational strain gauge and/or strain gauge on a spring, a rotational encoder, and/or an optical sensor to detect rotational displacement at the joint.
In some cases, the input control device 1000 can include one or more additional joints and/or hinges to impart rotational input motions corresponding to the articulation of the end effector. For example, the input control device 1000 may include a hinge along the shaft 1012 and/or between the shaft 1012 and the joystick 1008. In one case, hinged input motions at such joints can be detected by another sensor arrangement and converted to rotational input control motions of the end effector, such as yaw or pitch articulation motions of the end effector. Such an arrangement requires one or more additional sensor arrangements and would increase the mechanical complexity of the input control device.
The input control device 1000 also includes an end effector actuator 1020. The end effector actuator 1020 includes opposing fingers 1022 that extend from the wrist 1010 toward the joystick 1008 and the central portion 1002 of the spatial joint 1006. The opposing fingers 1022 extend distally beyond the spatial joint 1006. In such cases, the wrist 1010 is proximal to the spatial joint 1006 and the distal ends 1024 of the opposing fingers 1022 are distal to the spatial joint 1006, which mirrors, for example, jaws positioned distal to an articulation joint of a robotic tool. An actuation force applied to the opposing fingers 1022 constitutes an input control of the surgical tool. For example, referring again to fig. 12, application of a compressive force to the opposing fingers 1022 can close and/or clamp the jaws 1054 of the end effector 1052 (see arrow C in fig. 12). In various circumstances, application of a spreading force can open and/or release the jaws 1054 of the end effector 1052, such as for spreading tissue during a dissection task. The end effector actuator 1020 may include at least one sensor for detecting input control motions applied to the opposing fingers 1022. For example, the end effector actuator may include a displacement sensor and/or a rotary encoder for detecting input control motions applied to pivot the opposing fingers 1022 relative to the shaft 1012.
In various instances, the end effector actuator 1020 may include one or more finger rings 1030 that are sized and positioned to receive a surgeon's fingers. For example, referring primarily to fig. 10 and 11, the surgeon's thumb T is positioned through one of the rings 1030 and the surgeon's middle finger M is positioned through the other ring 1030. In such instances, the surgeon may pinch and/or spread his thumb T and middle finger M to actuate the end effector actuator 1020. In other cases, the structure of the rings 1030 may be designed to receive more than one finger, and different fingers may engage the rings depending on the arrangement of the rings 1030. In various circumstances, the finger rings 1030 may facilitate spreading-dissection functionality and/or translation of the robotic tool up or down (i.e., application of a vertical force, for example, at the spatial joint 1006). In some cases, a ring 1030 may define a complete loop; however, in other cases, a partial loop (e.g., a semicircle) may be utilized. In other cases, the end effector actuator 1020 may not include the rings 1030. For example, the end effector actuator 1020 may be spring biased outward such that rings are not required to pull the opposing fingers 1022 apart, such as for spreading-dissection functions.
The opposing fingers 1022 of the end effector actuator 1020 define a line of symmetry aligned with the longitudinal shaft axis S, along which the shaft 1012 extends, when the fingers 1022 are in the unactuated position. The line of symmetry is parallel to the X axis through the multi-dimensional spatial joint 1006. Further, the central axis of the joystick 1008 is aligned with the line of symmetry. In various cases, the movements of the opposing fingers 1022 may be independent. In other words, the opposing fingers 1022 may be asymmetrically displaced relative to the longitudinal shaft axis S during actuation. The displacement of the opposing fingers 1022 may depend on, for example, the force applied by the surgeon. With certain surgical tools, the jaws of the end effector can be pivoted about an articulation axis such that the various closed positions of the jaws are not longitudinally aligned with the shaft of the surgical tool. Further, in some instances, it may be desirable to hold one jaw stationary (such as against delicate tissue and/or critical structures) and move the other jaw relative to the non-moving jaw. To accommodate such closing motions, for example, the range of motion of the opposing fingers 1022 on the input control device 1000 can be greater than the range of motion of the jaws of the end effector. For example, referring to fig. 12A, the surgical tool 1050' is shown in an articulated configuration in which the jaws may be clamped together out of alignment with the longitudinal shaft axis of the surgical tool 1050'. In such instances, the jaws, and thus the fingers 1022 (fig. 6-11) on the input control device 1000, would be asymmetrically actuated to move the jaws of the end effector 1052 into the closed configuration.
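One way to realize the independent, asymmetric finger-to-jaw mapping described above is sketched below in Python (the angular limits and scaling are hypothetical assumptions). Each finger drives its jaw independently, and the larger finger range is scaled onto the smaller jaw range, so one jaw can be held stationary while the other closes:

    def fingers_to_jaw_angles(finger_a_deg, finger_b_deg, finger_max=40.0, jaw_max=25.0):
        # Clamp each finger displacement (measured from the shaft axis S) to
        # the input range, then scale onto the jaw range.
        scale = jaw_max / finger_max
        clamp = lambda angle: max(0.0, min(finger_max, angle))
        return clamp(finger_a_deg) * scale, clamp(finger_b_deg) * scale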
Referring now to fig. 13A-15B, the various control motions applied to the end effector actuator 1020 and the corresponding actuation of the end effector 1062 are illustrated. The end effector 1062 includes opposing jaws 1064 that are movable between an open configuration (fig. 13A), an intermediate configuration (fig. 14A), and a closed configuration (fig. 15A) as the opposing fingers 1022 of the end effector actuator 1020 move between the open configuration (fig. 13B), the intermediate configuration (fig. 14B), and the closed configuration (fig. 15B), respectively.
The input control device 1000 also includes at least one additional actuator, such as actuation buttons 1026, 1028, that can provide additional controls at the surgeon's fingertips. For example, the actuation buttons 1026, 1028 are positioned on the joystick 1008 of the input control device 1000 such that the surgeon can contact the buttons 1026, 1028 with a finger (such as the index finger I). The actuation buttons 1026, 1028 may correspond to buttons for activating surgical tools, such as firing, extending, activating, translating, and/or retracting a scalpel, energizing one or more electrodes, adjusting an energy modality, effecting diagnostics, biopsy sampling, ablation, and/or other surgical tasks. In other cases, actuating the buttons 1026, 1028 may provide input to the imaging system to adjust the view of the surgical tool, such as zooming in/out, panning, tracking, tilting, and/or rotating. In some cases, an actuator may be positioned in a different location than the actuation buttons 1026, 1028, such as for use by a thumb or another finger. Additionally or alternatively, an actuator may be provided on, for example, a touch screen and/or a foot switch.
Referring now primarily to fig. 10 and 11, a user may position his or her hand relative to the input control device 1000 such that the wrist 1010 is proximal to the spatial joint 1006. More specifically, the user's palm is positioned near the wrist 1010 and the user's fingers extend distally toward the joystick 1008 and the central portion 1002 of the spatial joint 1006. The distally extending fingers 1022 (for actuating the jaws) and the actuation buttons 1026, 1028 (for actuating surgical functions at the jaws) are distal to the spatial joint 1006 and the wrist 1010. This configuration mirrors that of the surgical tool, in which the end effector is distal to the more proximal articulation joint and/or rotatable shaft, and thus provides an intuitive arrangement that facilitates surgeon training and adoption of the input control device 1000.
In various instances, a clutchless input control device including a six degree-of-freedom input control, an end effector actuator, and additional actuation buttons may define alternative geometries to that of the input control device 1000. In other words, the clutchless input control device is not limited to a particular form of joystick assembly. Instead, a wide variety of interfaces may be designed based on formative testing and user preferences. In various circumstances, the robotic system may allow a user to choose from a number of different forms to select the style that best suits his/her needs. For example, pliers-style, pistol-grip, ball, pen, and/or hybrid grips, among other input controls, may be supported. Alternative designs are further described herein and in various commonly owned patent applications, which are incorporated by reference in their entirety.
In each case, the input control of the input control device 1000 is segmented between a first control motion and a second control motion. For example, a first control motion and/or parameter thereof may be actuated in a first mode and a second control motion and/or parameter thereof may be actuated in a second mode. The mode may be based on factors provided by the surgeon and/or the surgical robotic control system and/or detected during the surgical procedure. For example, the pattern may depend on the proximity of the surgical tool to the tissue, such as the proximity of the surgical tool to the tissue surface and/or to critical structures. Various distance determination systems for determining proximity to one or more exposed and/or at least partially hidden critical structures will be further described herein.
In one case, referring now to fig. 25, the input control device 1000 is communicably coupled to a control circuit 832 of a control system 833, which will be further described herein. In the control system 833, the control circuitry 832 may receive input signals from the input control device 1000, such as feedback detected by various sensors therein and related to control inputs at the joystick 1008 and/or wrist 1010 and/or outputs from various sensors thereon (e.g., the sensor arrangement 1048 and/or the rotation sensor 1049 at the wrist 1010). For example, signals detected by sensor arrangement 1048 (i.e., the multi-axis force and torque sensors of space joint 1006) may be provided to control circuitry 832. In addition, a signal detected by sensor 1049 (i.e., a rotation sensor of wrist 1010) may be provided to control circuit 832. The memory 834 of the control system 833 further includes control logic components for implementing input control provided to the input control device 1000 and detected by various sensors (e.g., sensors 1048 and 1049).
Referring now to fig. 11A, control logic component 1068 of input control device 1000 may implement first mode 1070 when the distance determined by the distance determination subsystem is greater than or equal to the threshold distance and may implement second mode 1072 when the distance determined by the distance determination subsystem is less than the threshold distance. The control logic may be used, for example, in control circuit 832, control circuit 1400 (fig. 11C), combinatorial logic 1410 (fig. 11D), and/or sequential logic 1420 (fig. 11E), where inputs are provided by inputs to input control device 1000 (fig. 6-11) and/or the surgical visualization system or distance determination subsystem thereof, as further described herein.
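A minimal sketch of the threshold logic of fig. 11A follows (Python; the 10.0mm threshold and the names are hypothetical placeholders for values supplied by the distance determination subsystem):

    THRESHOLD_MM = 10.0  # hypothetical proximity threshold to the tissue/critical structure

    def select_mode(distance_mm):
        # First (e.g., coarse) mode at or beyond the threshold distance;
        # second (e.g., fine) mode within it.
        return "FIRST_MODE" if distance_mm >= THRESHOLD_MM else "SECOND_MODE"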
For example, turning to fig. 11C, the control circuit 1400 may be configured to control aspects of the input control device 1000 in accordance with at least one aspect of the present disclosure. The control circuitry 1400 may be configured to implement various processes described herein. The control circuit 1400 may include a microcontroller including one or more processors 1402 (e.g., microprocessors, microcontrollers) coupled to at least one memory circuit 1404. Memory circuit 1404 stores machine-executable instructions that, when executed by processor 1402, cause processor 1402 to execute the machine instructions to implement the various processes described herein. Processor 1402 may be any of a variety of single-core or multi-core processors known in the art. The memory circuit 1404 may include volatile storage media and nonvolatile storage media. Processor 1402 may include an instruction processing unit 1406 and an arithmetic unit 1408. The instruction processing unit 1406 may be configured to receive instructions from the memory circuit 1404 of the present disclosure.
Fig. 11D illustrates a combinational logic circuit 1410 that may be configured to control aspects of the input control device 1000 in accordance with at least one aspect of the present disclosure. Combinatorial logic circuitry 1410 may be configured to enable various processes described herein. Combinatorial logic circuitry 1410 may include a finite state machine including combinatorial logic component 1412 configured to receive data associated with input control device 1000 (fig. 6-11) and the surgical visualization system and/or its distance determination subsystem from inputs 1414, process the data through combinatorial logic component 1412, and provide output 1416.
Fig. 11E illustrates a sequential logic circuit 1420 configured to control aspects of the input control apparatus 1000 (fig. 6-11), in accordance with at least one aspect of the present disclosure. For example, sequential logic circuit 1420 or combinational logic component 1422 may be configured to enable the various processes described herein. The sequential logic circuit 1420 may include a finite state machine. The sequential logic circuit 1420 may include, for example, a combinational logic component 1422, at least one memory circuit 1424, and a clock 1429. The at least one memory circuit 1424 can store a current state of the finite state machine. In some cases, the sequential logic circuit 1420 may be synchronous or asynchronous. The combinatorial logic 1422 is configured to receive data associated with the input control device 1000 (fig. 6-11) and the surgical visualization system and/or the distance determining subsystem thereof from the input 1426, process the data through the combinatorial logic 1422, and provide an output 1428. In other aspects, a circuit may comprise a combination of a processor (e.g., processor 1402 in fig. 11C) and a finite state machine to implement various processes herein. In other aspects, the finite state machine may comprise a combination of combinational logic circuitry (e.g., combinational logic circuitry 1410 in fig. 11D) and sequential logic circuitry 1420. Control circuitry similar to control circuitry 1400, 1410, and 1420 may also be utilized to control various aspects of the surgical robot and/or surgical visualization system, as further described herein.
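For illustration, the mode selection could also be realized as clocked sequential logic that stores the current state, as in the following Python sketch (the hysteresis band is an added assumption, included to suggest one benefit of stored state, namely avoiding rapid mode chatter when the tool hovers near the threshold distance):

    class ModeStateMachine:
        def __init__(self, threshold_mm=10.0, hysteresis_mm=1.0):
            self.threshold = threshold_mm
            self.hysteresis = hysteresis_mm
            self.state = "FIRST_MODE"  # current state held in the memory circuit

        def clock(self, distance_mm):
            # One clocked update, analogous to a finite state machine driven
            # by clock 1429: the state changes only when the proximity input
            # crosses the threshold by more than the hysteresis band.
            if self.state == "FIRST_MODE" and distance_mm < self.threshold - self.hysteresis:
                self.state = "SECOND_MODE"
            elif self.state == "SECOND_MODE" and distance_mm > self.threshold + self.hysteresis:
                self.state = "FIRST_MODE"
            return self.state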
In various instances, the input control device 1000 is configured to be capable of operating in different modes (such as a coarse mode and a fine mode). Variation of the control motions in the different modes can be accomplished by selecting a preset scaling profile. For example, control motions at the multi-dimensional spatial joint 1006 may be amplified for the coarse mode such that a small force on the spatial joint 1006 causes a significant displacement of the end effector. Further, control motions at the wrist 1010 may be attenuated for the fine mode such that a large moment at the wrist 1010 causes only a fine rotational displacement of the end effector. The preset scaling profile may be user-selected and/or dependent on, for example, the type and/or complexity of the surgical procedure and/or the experience of the surgeon. Alternative modes of operation and settings are also contemplated.
Referring again to fig. 11A, in some cases, the first mode 1070 may correspond to a coarse control mode and the second mode 1072 may correspond to a fine control mode. One or more user inputs to the spatial joint 1006 may correspond to control inputs that effect coarse motions of the surgical tool in the first mode 1070, such as a large displacement of the surgical tool toward the surgical site. One or more inputs to the wrist 1010 may define a rotational displacement of the surgical tool, such as a rolling rotational displacement of the surgical end effector at the surgical site. These segmented controls may be selectively locked such that the rolling rotational input at the wrist 1010 is disabled during portions of the surgical procedure and one or more inputs at the spatial joint 1006 are disabled during other portions of the surgical procedure. For example, it may be desirable to lock the rolling rotational input during the first mode 1070 (such as when the surgical end effector is positioned outside a threshold proximity zone around the surgical site and/or a critical structure). Further, in various instances, the control motions of the spatial joint 1006 and/or the wrist 1010 may be scaled up or down based on input from the distance determination system. The scaling parameters applied to control motions provided at the spatial joint 1006 and the wrist 1010 may be different in the first mode 1070 and the second mode 1072. For example, the speed of the robotic tool may be slowed during the fine motion mode and increased during the coarse motion mode.
Referring now to FIG. 11B, a table illustrating scaling scenarios in various operating modes is depicted. An input control device, such as the input control device 1000 (fig. 6-11), may be configured to receive at least six different inputs (e.g., input A, input B, etc.) corresponding to the six degrees of freedom of a surgical tool coupled thereto. These inputs may be scaled based on the operating mode (e.g., first mode 1070, second mode 1072, etc.) determined by the input to the control circuit, such as proximity data from a distance determination subsystem of the surgical visualization system. The first rule list 1074 includes first control parameters for controlling the surgical tool based on input from the input control device 1000. The second rule list 1076 includes second control parameters for controlling the surgical tool based on input from the input control device 1000. In some cases, such as when an input is "locked," the variable values in the rule lists 1074, 1076 may be zero. Additional modes and additional rules/control parameters are also contemplated.
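The rule lists 1074, 1076 may be thought of as per-input control parameters, as in the following Python sketch (the particular scale factors are hypothetical; a zero entry represents a "locked" input):

    # Six entries, one per input (input A, input B, ...): scale factors for
    # the three translational and three rotational degrees of freedom.
    FIRST_MODE_RULES = [2.0, 2.0, 2.0, 0.0, 0.0, 0.0]   # coarse: amplified translation, rotation locked
    SECOND_MODE_RULES = [0.4, 0.4, 0.4, 1.0, 1.0, 1.0]  # fine: attenuated translation, rotation enabled

    def apply_rules(raw_inputs, rules):
        # Scale each raw input by its mode-specific control parameter before
        # the command is sent to the surgical robot.
        return [value * rule for value, rule in zip(raw_inputs, rules)]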
In various aspects, the coarse motion described in the present disclosure is a coarse translational motion characterized by a speed selected from a range of about 3 inches/second to about 4 inches/second. In at least one example, the coarse translational motion according to the present disclosure is about 3.5 inches/second. In various aspects, in contrast, the fine motion described in the present disclosure may be a fine translational motion characterized by a speed of less than or equal to 1.5 inches/second. In various aspects, the fine motion described in the present disclosure may be a fine translational motion characterized by a speed selected from the range of about 0.5 inches/second to about 2.5 inches/second.
In various aspects, the coarse motion described in this disclosure is a coarse rotational motion characterized by a speed selected from a range of about 10 radians/second to about 14 radians/second. In at least one example, the coarse rotational motion according to the present disclosure is about 12.6 radians/second. In various aspects, in contrast, the fine motion described in the present disclosure may be a fine rotational motion characterized by a speed selected from a range of about 2 radians/second to about 4 radians/second. In at least one example, the fine rotational motion according to the present disclosure is about 2.3 radians/second.
In various aspects, the coarse motion of the present disclosure is two to six times the fine motion. In various aspects, the coarse motion of the present disclosure is three to five times the fine motion.
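The following sketch collects these speed envelopes into mode-dependent limits (Python; the clamping logic and names are illustrative assumptions, while the numeric values are taken from the ranges recited above):

    COARSE_TRANSLATION_IPS = 3.5  # inches/second, from the 3-4 in/s coarse range
    FINE_TRANSLATION_IPS = 1.5    # inches/second, upper bound of the fine translational motion
    COARSE_ROTATION_RPS = 12.6    # radians/second, from the 10-14 rad/s coarse range
    FINE_ROTATION_RPS = 2.3       # radians/second, from the 2-4 rad/s fine range

    def limit_speed(value, mode, rotational=False):
        # Clamp a commanded speed to the envelope of the active mode.
        if rotational:
            cap = COARSE_ROTATION_RPS if mode == "FIRST_MODE" else FINE_ROTATION_RPS
        else:
            cap = COARSE_TRANSLATION_IPS if mode == "FIRST_MODE" else FINE_TRANSLATION_IPS
        return max(-cap, min(cap, value))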
As described herein, the spatial joint 1006 may define input control motions for six degrees of freedom. For example, the spatial joint 1006 may define input control motions for non-rotational translation of the surgical tool in three-dimensional space and rotation of the surgical tool about three different axes. In such cases, the joystick 1008 is configured to be able to receive input in three-dimensional space and about three axes of rotation. In addition, the end effector actuator 1020 (e.g., a jaw closure mechanism) is built into a six degree of freedom joystick assembly that includes the joystick 1008 and associated sensors in the base 1004. The input control motions from the spatial joint 1006 may be selectively locked and/or scaled during different portions of the surgical procedure.
An exemplary six degree-of-freedom input control device 1100 is depicted in fig. 18-22. In various instances, such input devices may be incorporated into a user input device of a surgical robot, such as input control device 1000 (fig. 6-11). The input control device 1100 comprises a frame or base 1101, which typically remains stationary on a surface (such as a table or work surface) during use, and a cover 1102, which is movably mounted on the base 1101 and forms an input mechanism by which a user can input movements, which are detected and interpreted by the input control device 1100. In particular, the cover 1102 of the input control device 1100 is designed to be grasped by a user and manipulated relative to the base 1101 to generate a desired input. To determine the relative movement or position of the cover 1102 and the base 1101, the input control device 1100 comprises a first plate member 1110 fixed relative to the base 1101 of the input control device 1100, a second plate member 1120 resiliently mounted in spaced relation to the first plate member 1110 and adapted for movement or displacement relative thereto, and a plurality of opto-electronic measurement units 1118 for determining the relative movement or displacement between the first plate member 1110 and the second plate member 1120. The second plate member 1120 is resiliently connected to the first plate 1110 by a plurality of equally spaced coil spring elements 1106.
Each measurement unit 1118 for determining the relative movement and/or position of the first plate 1110 and the second plate 1120 includes a light emitting element in the form of an Infrared Light Emitting Diode (ILED) 1113 (fig. 18 and 19) protruding from the upper side of the first plate 1110 and a Position Sensitive Infrared Detector (PSID) 1123 (fig. 20) mounted on the lower side of the second plate 1120 and facing the first plate 1110. Furthermore, a light-shielding housing 1130 is disposed between the first plate 1110 and the second plate 1120 for effectively housing the ILEDs 1113 and for shielding the PSIDs 1123 from any undesired or extraneous light that might otherwise affect the accuracy of the readings provided by the PSIDs 1123.
The light shielding housing 1130 has a generally hollow structure defining a plurality of cavities 1131 therein that form individual optical path channels between each ILED 1113 on the first panel 1110 and its corresponding PSID 1123 mounted on the second panel 1120. Further, as shown in fig. 19, the light shielding housing 1130 includes slit diaphragms 1132 formed in a top wall 1133 thereof such that each slit diaphragm 1132 is arranged in an optical path between the ILED 1113 and a corresponding PSID 1123 of intended illumination of the ILED 1113.
The light shielding housing 1130 is thus configured to be able to define a plurality of light beam paths between the ILED 1113 on the first panel 1110 and the PSID 1123 on the second panel 1120, such that each light beam path is arranged to extend at an angle in the range of about 30 ° to about 60 ° (and preferably at about 45 °) relative to the plane of the first panel 1110 (i.e. relative to a substantial reference plane of the input control device 1100). Further, the beam paths defined by the optical path channels 1131 formed along each side of the light shielding housing 1130 thus extend in three separate intersecting planes corresponding to the planes of the housing sides. That is, the beam paths of two measurement units 1118 with a common PSID 1123 may be considered to lie in the same plane. The light-shielding housing 1130 is thus designed to form a three-dimensional array of beam paths between the ILEDs 1113 and the PSIDs 1123. This in turn provides a particularly compact optoelectronic device 1100 while also providing great flexibility in modifying the shape of the light-shielding housing 1130.
With further reference to fig. 20, since each PSID 1123 is illuminated by two separate ILEDs 1113, each side of the generally three-sided light-shielding housing 1130 is divided into two separate light-path channels 1131 by a central dividing wall 1114. In this way, each PSID 1123 is illuminated by its two separate ILEDs 1113 via two separate slit diaphragms 1132. Each slit 1132 provides optical communication with the associated PSID for only one of the ILEDs 1113. That is, each ILED 1113 is provided with its own dedicated slit diaphragm 1132. Each pair of slit diaphragms 1132 is arranged substantially parallel and extends substantially perpendicular to the light-sensitive portion of the associated PSID 1123.
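For readers who want a concrete picture of how six such one-dimensional PSID readings could yield a six degree-of-freedom measurement, the sketch below assumes a small-displacement linear model with a pre-measured 6×6 calibration matrix; this model and its placeholder values are assumptions for illustration, not details taken from the device described above.

```python
# Hypothetical sketch: recovering the six degree-of-freedom displacement of
# the second plate from the six PSID readings. For small displacements each
# reading varies approximately linearly with the pose, so a pre-measured
# 6x6 calibration matrix C satisfies readings ~= C @ pose, and the pose is
# recovered by solving the linear system. The calibration values here are
# placeholders, not data from the device described above.

import numpy as np

C = np.eye(6)  # placeholder calibration matrix (measured per device)

def pose_from_psid_readings(readings):
    """Solve readings = C @ pose for the 6-DOF pose (tx, ty, tz, rx, ry, rz)."""
    return np.linalg.solve(C, np.asarray(readings, dtype=float))

pose = pose_from_psid_readings([0.01, -0.02, 0.0, 0.03, 0.0, 0.01])
```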
Referring primarily to fig. 18, the input control device 1100 further comprises a stop arrangement 1140 designed to provide a physical barrier to movement or displacement of the second plate 1120 relative to the first plate 1110 beyond certain predetermined limits. The stop arrangement 1140 thereby prevents any inadvertent overloading of the input control device 1100 during use. The stop arrangement 1140 includes a plate-like connecting member 1142 and pin members 1141.
The openings or holes 1124 formed through the second plate 1120 have a diameter substantially larger than that of the pin members 1141 they receive. In a neutral position of the second plate 1120 relative to the first plate 1110, each pin member 1141 may be positioned substantially centrally in its respective hole 1124 through the second plate 1120. By virtue of the elastic deformability of the three coil spring elements 1106 connecting the plate members 1110, 1120, the second plate 1120 is able to move laterally and rotationally, in a plane parallel to the first plate 1110, within limits defined by the sides of the holes 1124 and the pin members 1141. As shown in fig. 21, when the second plate 1120 is rotated counterclockwise from its neutral position relative to the first plate 1110 against the bias of the coil spring elements 1106, the edge of each hole 1124 eventually engages the side of the corresponding pin member 1141, which in turn acts as a stop and prevents further rotation of the second plate 1120. The same effect naturally occurs with clockwise rotation or lateral translation of the second plate 1120. In various instances, elastomeric elements 1107, for example in the form of foam blocks, may form cushions for the pin members 1141 of the stop arrangement 1140.
With particular reference to fig. 22, when a tilting (i.e., rotational) movement is applied to the second plate 1120 (via the cover 1102) as shown, the second plate 1120 will deflect until, after a predetermined amount of tilting has occurred, the second plate 1120 engages the plate-like connecting member 1142 in an angled peripheral region 1143. The fixed plate-like connecting member 1142, contacting or engaging the angled peripheral region 1143, serves to stop further relative movement of the second plate 1120 in that direction. Simultaneously, or even alternatively, the upper inner surface of the cover 1102 may engage a corresponding angled peripheral region 1143 of the plate-like connecting member 1142, as indicated in fig. 22. The first plate 1110, the light-shielding housing 1130, and the stop arrangement 1140 may all remain stationary relative to the frame of the input control device 1100, while the cover 1102 and the second plate 1120 move relative to the frame during operation of the apparatus. The input control device 1100, as well as various alternative designs and/or features thereof, is further described in European patent application 1,850,210, entitled "OPTOELECTRONIC DEVICE FOR DETERMINING RELATIVE MOVEMENTS OR RELATIVE POSITIONS OF TWO OBJECTS," published October 31, 2007, which is incorporated herein by reference in its entirety.
Some input control devices, such as the input devices at the surgeon's console 116 in figs. 1 and 2, may be bulky and require a large footprint within the operating room. In addition, the surgeon may be expected to remain at a predefined position (e.g., seated at the surgeon's console 116) for as long as the surgeon is actively involved in the surgery. Moreover, the ergonomics of such input control devices may be less than ideal for many surgeons and may be difficult to adjust and/or customize, which can be detrimental to the surgeon's health and career longevity and/or lead to fatigue during surgical procedures.
Compact input control devices requiring a smaller footprint may be incorporated into the adjustable workspace instead of the surgeon's console 116. The adjustable workspace may allow for a range of positioning of the input control device. In various instances, one or more compact input control devices may be positioned and/or moved around the operating room, such as near the patient table and/or within the sterile field, so that the surgeon may select a preferred location to control the robotic surgery without being limited to a predefined location at a bulky surgeon console. Furthermore, the adaptability of the compact input control device may allow the input control device to be positioned at an adjustable workspace.
For example, referring now to fig. 16-17A, an input control device 1000 is incorporated into a surgeon's adjustable workspace 1080. Adjustable workspace 1080 includes a surface or table 1082 and a monitor 1088 for viewing the surgical procedure via an endoscope. Table 1082 and/or monitor 1088 may be repositioned at a different height. In various circumstances, a first height may be selected so that the surgeon may stand at table 1082, and at a different time, a second height may be selected so that the surgeon may sit at table 1082. Additionally or alternatively, the sitting and standing heights may be adjusted for different surgeons. Further, table 1082 may be movable relative to monitors 1088, and monitors 1088 may be movable relative to table 1082. For example, table 1082 and/or monitor 1088 may be supported on releasably lockable wheels or casters. Similarly, the chair may move relative to table 1082 and monitor 1088. In such cases, the X, Y and Z positions of the various components of the adjustable workspace 1080 can be customized by the surgeon.
Table 1082 includes a footswitch panel 1086; in other cases, however, the footswitch panel 1086 may not be incorporated into table 1082. In some cases, the footswitch panel 1086 may be separate from table 1082 such that the position of the footswitch panel 1086 relative to table 1082 and/or the chair may also be adjusted.
In various instances, the adjustable workspace 1080 may be modular and moved toward a patient table or bedside. In such cases, adjustable workspace 1080 may be covered with a sterile barrier and positioned within the sterile field. Adjustable workspace 1080 may house and/or support a processor and/or computer for effecting teleoperation of the surgical robot by input to input control device 1000 at adjustable workspace 1080. In addition, table 1082 includes a platform or surface 1084 that is adapted to support the surgeon's arm/wrist with limited mechanical adjustment thereto.
Due to the smaller size and reduced range of motion of the input control device 1000 and the adjustability of the workspace 1080, the surgeon's console may define a small profile and require a smaller footprint in the operating room. A smaller console can free up space in the operating room. In addition, the smaller footprint may allow multiple users (e.g., an experienced surgeon and an inexperienced surgeon or trainee, such as a medical student or resident) to perform a surgical procedure together in close proximity, which may facilitate training. Small input control devices may also be used with simulators or real systems, for example, and may be located remote from the operating room and/or at the robotic surgical system.
Referring primarily to figs. 16 and 17A, adjustable workspace 1080 also supports additional ancillary devices. For example, a keyboard 1090 and a touchpad 1092 are supported on surface 1084 of table 1082. Alternative ancillary devices, such as a conventional computer mouse, and other imaging and diagnostic inputs, such as registered Magnetic Resonance Imaging (MRI) or Computed Tomography (CT) scan data, images, and medical history, are also contemplated. The ancillary devices may control a graphical user interface on monitor 1088, while the input control device 1000 may control teleoperation of the surgical robot. In such cases, the two different control inputs allow the surgeon to control teleoperational functions using the clutch-less input control device 1000 while utilizing more conventional techniques to interface with the graphical user interface on the monitor 1088. Thus, the user can interact with applications on monitor 1088 while the surgical robot is teleoperated. In addition, the dual, separate control inputs create a clear cognitive distinction between the teleoperational environment and the graphical user interface environment.
In various circumstances, an adjustable workspace for use by the surgeon may be desirable. For example, the surgeon may wish to be unconstrained by a predefined position at the surgeon's console, as further described herein. In some cases, the surgeon may wish to reposition during the surgical procedure. For example, a surgeon may wish to quickly "scrub in" during a surgical procedure and enter the sterile field in order to view the surgical procedure and/or the patient in person rather than on a video monitor. Furthermore, the surgeon may not wish to relinquish control of the surgical robot while repositioning.
A mobile input control device may allow the surgeon to reposition, and even enter the sterile field, during the surgical procedure. The mobile input control device may be, for example, modular and compatible with different docking stations within an operating room. In various instances, the mobile portion of the input control device can be a single-use device that is sterilized for use within the sterile field.
By way of example, referring now to fig. 23, an input control device 1200 is shown. The input control device 1200 includes a base 1204 that is similar in many respects to the base 1004 of the input control device 1000. The input control device 1200 may include a multi-axis force and torque sensor 1203, as described herein, configured to detect forces and moments applied to the base 1204 by a modular joystick member 1208, which is similar in many respects to the joystick 1008. The modular joystick member 1208 is releasably inserted into the base 1204 to apply forces for detection by the sensor 1203 housed therein. A shaft 1212, similar in many respects to the shaft 1012, extends from the joystick member 1208 and supports at least one movable finger 1222, similar in many respects to the finger 1022. Similar to the input control device 1000, the input control device 1200 can further include a wrist rotatably coupled to the modular joystick member 1208, which can be rotated to provide control motions (such as roll control motions) for the surgical end effector. For example, the shaft 1212 may include a wrist member at its proximal end 1225.
In operation, the input control device 1200 may be engaged by a surgeon's hand. The force applied by the surgeon's hand is detected and corresponding signals are communicated to the control unit for controlling the robotic surgical tool in signal communication with the input control device 1200. In such instances, the forces applied in the X, Y and Z directions can correspond to translation of the end effector of the surgical tool in the X, Y and Z directions, and the moment about the X, Y and Z axes can correspond to rotation of the end effector about the X, Y and Z axes. In various instances, control of the input control device 1200 may be segmented based on the detected input and/or the position of the end effector at the surgical site (e.g., proximity to anatomical and/or critical structures).
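By way of illustration only, the force/moment-to-motion mapping just described might be sketched as follows; the gains, deadband, and function names are assumptions rather than parameters of the input control device 1200.

```python
# Hypothetical sketch of the force/moment-to-motion mapping described above:
# forces along X, Y, and Z map to translation commands and moments about
# X, Y, and Z map to rotation commands. The gains and deadband are
# illustrative assumptions, not values from this disclosure.

DEADBAND = 0.5          # ignore forces/moments below this magnitude (assumed units)
TRANSLATION_GAIN = 0.002
ROTATION_GAIN = 0.01

def end_effector_command(forces, moments):
    """Map sensed forces (N) and moments (N*m) to velocity commands."""
    def shaped(value, gain):
        return 0.0 if abs(value) < DEADBAND else value * gain

    translation = [shaped(f, TRANSLATION_GAIN) for f in forces]   # X, Y, Z
    rotation = [shaped(m, ROTATION_GAIN) for m in moments]        # about X, Y, Z
    return translation, rotation
```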
The input control device 1200 includes separable components, including the base 1204, which is separable from the modular joystick member 1208. In some cases, the modular joystick member 1208 can nest and/or fit within an opening 1205 in the base 1204. In various instances, the joystick member 1208 and the base 1204 can be mechanically and electrically coupled. In various instances, the opening 1205 in the base 1204 can include a registration key that allows the joystick member 1208 to be received within the opening 1205 in a set angular orientation such that the position of the modular joystick member 1208 relative to the base 1204 is known.
In various instances, the modular joystick member 1208 and the base 1204 can include communication modules that enable communication therebetween. Because the communication does not require high-power signals, a near-field communication protocol may be utilized in various circumstances. A sterile barrier 1230 may extend between the modular components of the input control device 1200. The sterile barrier 1230 is, for example, a thin, flexible drape positioned between the modular components; near-field communication signals may pass through this layer of material. The sterile barrier 1230 may define, for example, a drape or sheet that covers the base 1204. In one aspect, the drape may comprise a thin element of plastic or elastomeric material suitable for positioning, placement, and force transfer.
In some cases, the base 1204 can be positioned in the sterile field during a surgical procedure. For example, the base 1204 may be mounted to a bedrail 1232 and/or an examination table adjacent to the patient. In some cases, the base 1204 may be a reusable or multi-use component of the input control device 1200. A plurality of pedestals 1204 can be positioned around the operating room, such as on a remote surgeon console outside the sterile field and a patient table within the sterile field, among other locations.
The joystick member 1208 may be compatible with each base 1204. In various instances, the joystick member 1208 may be a disposable and/or single-use member. In other cases, the joystick member 1208 may be re-sterilized between uses. For example, the joystick member 1208 may be sterilized (e.g., cryogenically sterilized) and sealed prior to use. When the surgeon enters the sterile field during the surgical procedure, the sealed joystick member 1208 may be unsealed and readied for use. After use, the joystick member 1208 may be disposed of or sterilized for subsequent use.
Visualization system
"digital surgery" may encompass robotic systems, advanced imaging, advanced instrumentation, artificial intelligence, machine learning, data analysis for performance tracking and benchmarking, connectivity both inside and outside of the Operating Room (OR), and more. Although the various surgical platforms described herein may be used in conjunction with robotic surgical systems, such surgical platforms are not limited to use with robotic surgical systems. In some cases, advanced surgical visualization may be performed without robotics, without telemanipulation of robotic tools, and/or with limited and/or optional robotic assistance. Similarly, digital surgery may be performed without robotics, without telemanipulation of robotic tools, and/or with limited and/or optional robotic assistance.
In one instance, a surgical visualization system can include a first light emitter configured to emit a plurality of spectral waves, a second light emitter configured to emit a light pattern, and one or more receivers or sensors configured to detect visible light, molecular responses to spectral waves (spectral imaging), and/or light patterns. The surgical visualization system may also include an imaging system and control circuitry in signal communication with the receiver and the imaging system. Based on the output from the receiver, the control circuitry may determine a geometric surface map (i.e., a three-dimensional surface topography) of the surface visible at the surgical site and one or more distances relative to the surgical site. In some cases, the control circuitry may determine one or more distances to the at least partially concealed structure. Further, the imaging system may communicate the geometric surface map and the one or more distances to the clinician. In such cases, the enhanced view of the surgical site provided to the clinician may provide a representation of at least partially obscured structures within the relevant environment of the surgical site. For example, the imaging system may virtually augment a hidden structure on a geometric surface map of hidden and/or obstructing tissue, similar to a line drawn on the ground to indicate a utility line below the surface. Additionally or alternatively, the imaging system may communicate the proximity of one or more surgical tools to the visible obstructing tissue and/or to the at least partially concealed structure and/or the depth of the concealed structure below the visible surface of the obstructing tissue. For example, the visualization system may determine a distance relative to the enhancement line on the surface of the visible tissue and communicate the distance to the imaging system. In various instances, surgical visualization systems may collect data and communicate information during surgery.
Fig. 24 depicts a surgical visualization system 500 in accordance with at least one aspect of the present disclosure. The surgical visualization system 500 may be incorporated into a robotic surgical system, such as robotic system 510. The robotic system 510 may be similar in many respects to the robotic system 110 (fig. 1) and the robotic system 150 (fig. 3). Alternative robotic systems are also contemplated. The robotic system 510 includes at least one robotic arm, such as a first robotic arm 512 and a second robotic arm 514. The robotic arms 512, 514 include rigid structural members and joints, which may include servo motor controls. The first robotic arm 512 is configured to manipulate the surgical device 502 and the second robotic arm 514 is configured to manipulate the imaging device 520. The robotic control unit may be configured to issue control motions to the robotic arms 512, 514 that may affect, for example, the surgical device 502 and the imaging device 520. The surgical visualization system 500 may create visual representations of various structures within an anatomical field. The surgical visualization system 500 may be used for clinical analysis and/or medical intervention, for example. In some cases, the surgical visualization system 500 may be used intra-operatively to provide the clinician with real-time or near real-time information regarding proximity data, size, and/or distance during a surgical procedure.
In some cases, the surgical visualization system is configured to identify one or more critical structures, such as the critical structures 501a, 501b in fig. 24, in real time during surgery and/or to facilitate avoidance of the critical structures 501a, 501b by the surgical device. In other cases, critical structures may be identified preoperatively. In this example, the critical structure 501a is a ureter and the critical structure 501b is a blood vessel in tissue 503, which is an organ, namely the uterus. Alternative critical structures are also contemplated, and numerous examples are provided herein. By identifying the critical structures 501a, 501b, a clinician may avoid maneuvering the surgical device too close to the critical structures 501a, 501b and/or into a region of predefined proximity to the critical structures 501a, 501b during the surgical procedure. The clinician may avoid dissecting and/or approaching veins, arteries, nerves, and/or vessels identified as critical structures, for example. In various instances, critical structures may be determined based on the particular procedure, and the critical structures may be patient specific.
The critical structure may be any structure of interest. For example, critical structures may include anatomical structures such as a ureter, an artery such as the superior mesenteric artery, a vein such as the portal vein, a nerve such as the phrenic nerve, and/or a tumor. In other cases, the critical structure may be a foreign structure in the anatomical field, such as a surgical device, a surgical fastener, a clip, a tack, a bougie, a band, and/or a plate, for example. Critical structures may be determined on a patient-by-patient and/or procedure-by-procedure basis. Exemplary critical structures are further described herein and in U.S. patent application No. 16/128,192, entitled "VISUALIZATION OF SURGICAL DEVICES," filed September 11, 2018, which is incorporated herein by reference in its entirety.
Referring again to fig. 24, the critical structures 501a, 501b may be embedded in the tissue 503. In other words, the critical structures 501a, 501b may be positioned below the surface 505 of the tissue 503. In such cases, the tissue 503 conceals the critical structures 501a, 501b from the clinician's view. The critical structures 501a, 501b are also obscured by the tissue 503 from the view of the imaging device 520. The tissue 503 may be, for example, fat, connective tissue, adhesions, and/or organs. In various cases, the critical structures 501a, 501b may be partially obscured from view.
Fig. 24 also depicts a surgical device 502. The surgical device 502 includes an end effector having opposing jaws extending from a distal end of a shaft of the surgical device 502. The surgical device 502 may be any suitable surgical device, such as, for example, a dissector, a stapler, a grasper, a clip applier, and/or an energy device (including a monopolar probe, a bipolar probe, an ablation probe, and/or an ultrasonic end effector). Additionally or alternatively, the surgical device 502 may include another imaging or diagnostic modality, such as an ultrasound device. In one aspect of the present disclosure, the surgical visualization system 500 may be configured to enable identification of one or more critical structures and the proximity of the surgical device 502 to the critical structures.
The surgical visualization system 500 includes an imaging subsystem including an imaging device 520, such as a camera, configured to provide a real-time view of a surgical site. The imaging device 520 may include a camera or imaging sensor configured to be capable of detecting, for example, visible light, spectral light waves (visible or invisible light), and/or structured light patterns (visible or invisible light). In various aspects of the present disclosure, an imaging system may include, for example, an imaging device, such as an endoscope. Additionally or alternatively, the imaging system may include, for example, an imaging device such as an arthroscope, angioscope, bronchoscope, cholangioscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastroduodenoscope (gastroscope), laryngoscope, nasopharyngo-nephroscope, sigmoidoscope, thoracoscope, ureteroscope, or exoscope. In other cases, such as in open surgical applications, the imaging system may not include a scope.
The imaging device 520 of the surgical visualization system 500 may be configured to be capable of emitting and detecting various wavelengths of light, such as, for example, visible light, spectral light wavelengths (visible or invisible) and structured light patterns (visible or invisible). The imaging device 520 may include multiple lenses, sensors, and/or receivers for detecting different signals. For example, the imaging device 520 may be a hyperspectral, multispectral, or selective spectral camera, as further described herein. The imaging device 520 may also include a waveform sensor 522 (such as a spectral image sensor, a detector, and/or a three-dimensional camera lens). For example, the imaging device 520 may include a right lens and a left lens used together to simultaneously record two-dimensional images, and thus generate a three-dimensional image of the surgical site, render a three-dimensional image of the surgical site, and/or determine one or more distances at the surgical site. Additionally or alternatively, the imaging device 520 may be configured to receive images indicative of the topography of visible tissue and the identification and orientation of hidden critical structures, as further described herein. For example, the field of view of the imaging device 520 may overlap with the pattern of light (structured light) formed by the light array 530 that is projected onto the surface 505 of the tissue 503, as shown in fig. 24.
Views from the imaging device 520 may be provided to the clinician, and in various aspects of the disclosure, these views may be enhanced with additional information based on tissue identification, topographic mapping, and distance sensor system 504. In such cases, the surgical visualization system 500 includes a plurality of subsystems, namely an imaging subsystem, a surface mapping subsystem, a tissue identification subsystem, and/or a distance determination subsystem, as further described herein. These subsystems may cooperate to provide advanced data synthesis and integration information to the clinician and/or control unit during surgery. For example, information from one or more of these subsystems may inform a clinician and/or a decision process of a control unit of an input control device of the robotic system.
The surgical visualization system 500 may include one or more subsystems for determining three-dimensional topographies, or surface maps, of various structures within the anatomical field, such as the surface of tissue. Exemplary surface mapping systems include lidar (light radar), structured light (SL), three-dimensional (3D) stereoscopy (stereo), deformable shape-from-motion (DSfM), shape-from-shading (SfS), simultaneous localization and mapping (SLAM), and time-of-flight (ToF). Various surface mapping systems are described further herein and in L. Maier-Hein et al., "Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery," Medical Image Analysis, vol. 17, 2013, pp. 974-996, which is incorporated herein by reference in its entirety and is available at www.sciencedirect.com/science (last accessed January 8, 2019). The surgical visualization system 500 may also determine proximity to various structures within the anatomical field, including the surface of the tissue, as further described herein.
In various aspects of the present disclosure, the surface mapping subsystem may be implemented with a light pattern system, as further described herein. Light patterns (or structured light) are known for use in surface mapping. Known surface mapping techniques may be used in the surgical visualization systems described herein.
Structured light is the process of projecting a known pattern (often a grid or horizontal bars) onto a surface. U.S. patent application publication 2017/0055819, entitled "SET COMPRISING A SURGICAL INSTRUMENT," published March 2, 2017, and U.S. patent application publication 2017/0251900, entitled "DEPICTION SYSTEM," published September 7, 2017, disclose surgical systems that include a light source and a projector for projecting a light pattern; both publications are incorporated herein by reference in their respective entireties.
Fig. 37 illustrates a structured (or patterned) light system 700 in accordance with at least one aspect of the present disclosure. As described herein, structured light in the form of stripes or lines, for example, can be projected from a light source and/or projector 706 onto the surface 705 of the target anatomy to identify the shape and contour of the surface 705. Camera 720, which may be similar in various aspects to imaging device 520 (fig. 24), for example, may be configured to be capable of detecting the projected pattern of light on surface 705. The manner in which the projection pattern deforms when striking the surface 705 allows the vision system to calculate depth and surface information for the target anatomy.
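A common formulation of structured-light depth recovery treats the projector and camera as a stereo pair: the depth of a point on the surface follows from the observed shift (disparity) of the projected feature. The sketch below uses the standard triangulation relation z ≈ f·b/d; the numeric values are placeholders and not parameters of the system 700 described above.

```python
# A minimal sketch of the standard structured-light triangulation relation:
# for a projector/camera pair separated by a baseline b, a projected stripe
# observed with pixel disparity d (relative to a flat reference) lies at
# depth z ~= f * b / d, where f is the camera focal length in pixels.
# The numbers below are placeholders, not parameters of the system above.

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Return depth in mm for one observed point of the projected pattern."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

z = depth_from_disparity(focal_px=1400.0, baseline_mm=50.0, disparity_px=35.0)  # 2000 mm
```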
In some cases, invisible (or imperceptible) structured light may be utilized. Invisible structured light can be used without interfering with other computer vision tasks for which the projected pattern might otherwise be confusing. For example, frames containing the light pattern may be isolated from the frames that are shown (e.g., enhanced outward). In other cases, interference may be prevented by using infrared light or by using a very fast visible-light frame rate that alternates between two diametrically opposed patterns. Structured light is further described at en.wikipedia.org/wiki/Structured_light.
Referring again to fig. 24, in one aspect, the surgical visualization system 500 includes an emitter 506 configured to emit a pattern of light, such as stripes, grid lines, and/or dots, to enable determination of the topography of the surface 505 of the tissue 503. For example, the projected light array 530 may be used for three-dimensional scanning and registration on the surface 505 of the tissue 503. The projected light array 530 may be emitted from the emitter 506 located on, for example, the surgical device 502, and/or one of the robotic arms 512, 514, and/or the imaging device 520. In one aspect, the projected light array 530 is used to determine the shape defined by the surface 505 of the tissue 503 and/or the motion of the surface 505 during surgery. The imaging device 520 is configured to detect the projected light array 530 reflected from the surface 505 to determine the topography of the surface 505 and various distances relative to the surface 505. One or more additional and/or alternative surface mapping techniques may also be employed.
In various aspects of the disclosure, the tissue identification subsystem may be implemented with a spectral imaging system. The spectral imaging system may rely on, for example, hyperspectral imaging, multispectral imaging, or selective spectral imaging. Hyperspectral imaging of tissue is further described in U.S. patent 9,274,047, entitled "METHODS AND APPARATUS FOR IMAGING OF OCCLUDED OBJECTS," issued March 1, 2016, which is incorporated herein by reference in its entirety.
In various instances, the imaging device 520 is a spectral camera (e.g., a hyperspectral camera, a multispectral camera, or a selective spectral camera) configured to detect reflected spectral waveforms and generate a spectral cube of an image based on molecular responses to different wavelengths. Spectral imaging will be described further herein.
In various instances, hyperspectral imaging techniques can be used to identify signatures in anatomical structures in order to differentiate critical structures from obscurants. For example, hyperspectral imaging techniques may provide a visualization system with a way to identify critical structures such as ureters and/or blood vessels, particularly when those structures are obscured by, for example, fat, connective tissue, blood, or other organs. Differences in reflectance at different wavelengths in the Infrared (IR) spectrum can be used to determine the presence of critical structures versus obscurants. Referring now to figs. 38-40, exemplary hyperspectral signatures of ureter, artery, and nerve tissue, respectively, relative to obscurants such as fat, lung tissue, and blood are depicted.
Fig. 38 is a graphical representation 950 of exemplary ureter signatures relative to obscurants; the graph plots reflectance as a function of wavelength (nm) for fat, lung tissue, blood, and the ureter. Fig. 39 is a graphical representation 952 of exemplary artery signatures relative to obscurants; the graph plots reflectance as a function of wavelength (nm) for fat, lung tissue, blood, and blood vessels. Fig. 40 is a graphical representation 954 of exemplary nerve signatures relative to obscurants; the graph plots reflectance as a function of wavelength (nm) for fat, lung tissue, blood, and nerves.
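One way such reflectance-versus-wavelength signatures could be used programmatically is a nearest-signature classifier, sketched below with placeholder spectra; the reference values are illustrative assumptions and are not data read from figs. 38-40.

```python
# Hypothetical sketch of tissue identification from reflectance spectra, in
# the spirit of FIGS. 38-40: each candidate tissue type has a reference
# reflectance curve sampled at the same wavelengths, and a measured spectrum
# is labeled by its nearest reference curve. The reference values here are
# placeholders, not data read from the figures.

import numpy as np

REFERENCE_SPECTRA = {              # reflectance sampled at fixed wavelengths
    "fat":    np.array([0.60, 0.62, 0.65, 0.63]),
    "blood":  np.array([0.10, 0.12, 0.30, 0.28]),
    "ureter": np.array([0.40, 0.45, 0.42, 0.44]),
}

def classify_spectrum(measured):
    """Return the tissue label whose reference spectrum is closest (L2)."""
    measured = np.asarray(measured, dtype=float)
    return min(REFERENCE_SPECTRA,
               key=lambda label: np.linalg.norm(REFERENCE_SPECTRA[label] - measured))

label = classify_spectrum([0.41, 0.44, 0.43, 0.45])  # -> "ureter"
```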
Referring again to fig. 24, the imaging device 520 may include an optical waveform emitter 523 configured to emit electromagnetic radiation 524 (NIR photons) that can penetrate the surface 505 of the tissue 503 and reach the critical structures 501a, 501b. The imaging device 520, and the optical waveform emitter 523 thereon, can be positioned by the robotic arms 512, 514. A corresponding waveform sensor 522 (e.g., an image sensor, spectrometer, or vibration sensor) on the imaging device 520 is configured to detect the effect of the electromagnetic radiation 524 received by the waveform sensor 522. The wavelengths of the electromagnetic radiation 524 emitted by the optical waveform emitter 523 can be configured to enable identification of the type of anatomical and/or physical structure, such as the critical structures 501a, 501b. In one aspect, the wavelengths of the electromagnetic radiation 524 may be variable. The waveform sensor 522 and the optical waveform emitter 523 may include, for example, a multispectral imaging system and/or a selective spectral imaging system.
Identification of the critical structures 501a, 501b may be achieved by, for example, spectral analysis, photoacoustics, and/or ultrasound. In some cases, the waveform sensor 522 and the optical waveform emitter 523 may comprise, for example, a photoacoustic imaging system. In various instances, the optical waveform emitter 523 may be positioned on a surgical device separate from the imaging device 520. Alternative tissue identification techniques are also contemplated. In some cases, the surgical visualization system 500 may not be configured to be able to identify hidden critical structures.
In one instance, the surgical visualization system 500 incorporates tissue identification and geometric surface mapping in combination with a distance determination subsystem, such as the distance sensor system 504. The distance sensor system 504 is configured to determine one or more distances at the surgical site; here, it is a time-of-flight system configured to determine the distance from one or more anatomical structures. Alternative distance determination subsystems are also contemplated. In combination, the tissue identification subsystem, the geometric surface mapping subsystem, and the distance determination subsystem can determine the position of the critical structures 501a, 501b within the anatomical field and/or the proximity of the surgical device 502 to the surface 505 of the visible tissue 503 and/or to the critical structures 501a, 501b.
In various aspects of the present disclosure, a distance determination system may be incorporated into a surface mapping system. For example, structured light may be utilized to generate a three-dimensional virtual model of a visible surface and to determine various distances relative to the visible surface. In other cases, the time-of-flight emitter may be separate from the structured light emitter.
In various instances, the distance determination subsystem may rely on time-of-flight measurements to determine one or more distances to tissue (or other structures) identified at the surgical site. In one aspect, the distance sensor system 504 may be a time-of-flight distance sensor system that includes an emitter, such as the emitter 506, and a receiver 508, which can be positioned on the surgical device 502. In one general aspect, the emitter 506 of the distance sensor system 504 may include a very tiny laser source, and the receiver 508 of the distance sensor system 504 may include a matching sensor. The distance sensor system 504 can detect the "time of flight," or the time it takes for the laser light emitted by the emitter 506 to bounce back to the sensor portion of the receiver 508. Use of a very narrow light source in the emitter 506 enables the distance sensor system 504 to determine the distance to the surface 505 of the tissue 503 directly in front of the distance sensor system 504.
Still referring to fig. 24, d_e is the emitter-tissue distance from the emitter 506 to the surface 505 of the tissue 503, and d_t is the device-tissue distance from the distal end of the surgical device 502 to the surface 505 of the tissue. The distance sensor system 504 may be used to determine the emitter-tissue distance d_e. The device-tissue distance d_t is obtainable from the known position of the emitter 506 on the shaft of the surgical device 502 relative to the distal end of the surgical device 502. In other words, when the distance between the emitter 506 and the distal end of the surgical device 502 is known, the device-tissue distance d_t can be determined from the emitter-tissue distance d_e.
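As a rough illustration of the time-of-flight relationship just described, the sketch below computes d_e from a round-trip pulse time and then derives d_t from the known emitter offset along the shaft; the function names, units, and example values are assumptions for illustration only and are not taken from this disclosure.

```python
# A minimal sketch of the time-of-flight relationship described above: the
# emitter-tissue distance d_e follows from the round-trip time of the laser
# pulse, and the device-tissue distance d_t follows from d_e and the known
# emitter offset along the shaft. Names and example values are illustrative.

C_MM_PER_S = 2.998e11  # speed of light in mm/s

def emitter_tissue_distance(round_trip_s: float) -> float:
    """d_e: half the round-trip path length of the emitted pulse."""
    return C_MM_PER_S * round_trip_s / 2.0

def device_tissue_distance(d_e: float, emitter_to_tip_mm: float) -> float:
    """d_t: d_e corrected by the known emitter position on the shaft."""
    return d_e - emitter_to_tip_mm

d_e = emitter_tissue_distance(round_trip_s=4.0e-10)          # ~60 mm
d_t = device_tissue_distance(d_e, emitter_to_tip_mm=20.0)    # ~40 mm
```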
In various instances, the receiver 508 of the distance sensor system 504 may be mounted on a separate surgical device rather than on the surgical device 502. For example, the receiver 508 may be mounted on a cannula or trocar through which the surgical device 502 extends to reach the surgical site. In other cases, the receiver 508 of the distance sensor system 504 may be mounted on a separate robotically controlled arm (e.g., the robotic arms 512, 514), on a movable arm operated by another robot, and/or on an Operating Room (OR) table or fixture. In some cases, the imaging device 520 includes the time-of-flight receiver 508 to determine the distance from the emitter 506 to the surface 505 of the tissue 503 using a line between the emitter 506 on the surgical device 502 and the imaging device 520. For example, the distance d_e may be triangulated based on the known positions of the emitter 506 (e.g., on the surgical device 502) and the receiver 508 (e.g., on the imaging device 520) of the distance sensor system 504. The three-dimensional position of the receiver 508 may be known and/or registered intraoperatively to the robot coordinate plane.
In some cases, the position of the emitter 506 of the distance sensor system 504 can be controlled by the first robotic arm 512, and the position of the receiver 508 of the distance sensor system 504 can be controlled by the second robotic arm 514. In other cases, the surgical visualization system 500 can be utilized apart from a robotic system. In such cases, the distance sensor system 504 can be independent of the robotic system.
In certain instances, one or more of the robotic arms 512, 514 may be separate from the primary robotic system used in the surgical procedure. At least one of the robotic arms 512, 514 may be positioned and registered with a particular coordinate system without servo motor controls. For example, a closed loop control system and/or a plurality of sensors for the robotic arms 512, 514 may control and/or register the position of the robotic arms 512, 514 relative to a particular coordinate system. Similarly, the positions of the surgical device 502 and the imaging device 520 may be registered relative to a particular coordinate system.
Still referring to fig. 24, d_w is the camera-critical structure distance from the optical waveform emitter 523 located on the imaging device 520 to the surface of the critical structure 501a, and d_A is the depth of the critical structure 501b below the surface 505 of the tissue 503 (i.e., the distance between the portion of the surface 505 closest to the surgical device 502 and the critical structure 501b). In various aspects, the time of flight of the optical waveforms emitted from the optical waveform emitter 523 located on the imaging device 520 can be used to determine the camera-critical structure distance d_w. The use of spectral imaging in conjunction with time-of-flight sensors is further described herein.
In one aspect, the surgical visualization system 500 is configured to determine the emitter-tissue distance d_e from the emitter 506 on the surgical device 502 to the surface 505 of the uterus via structured light. The surgical visualization system 500 is configured to extrapolate the device-tissue distance d_t from the surgical device 502 to the surface 505 of the uterus based on the emitter-tissue distance d_e. The surgical visualization system 500 is further configured to determine the tissue-ureter distance d_A from the critical structure (ureter) 501a to the surface 505, and the camera-ureter distance d_w from the imaging device 520 to the critical structure (ureter) 501a. As described herein, the surgical visualization system 500 may determine the distance d_w using, for example, spectral imaging and time-of-flight sensors. In various instances, the surgical visualization system 500 can determine (e.g., triangulate) the tissue-ureter distance d_A (or depth) based on other distances and/or the surface mapping logic described herein.
Referring now to fig. 29, in various aspects of the present disclosure, in a surgical visualization system 800, the depth d_A of a critical structure 801 below a surface 805 of a tissue 803 can be determined by triangulating from the distance d_w and the known positions of the emitter 806 and the optical waveform emitter 823 (and, thus, the known distance d_x between them) to determine the distance d_y, which is the sum of the distance d_e and the depth d_A.
Additionally or alternatively, the time of flight from the optical waveform emitter 823 can be used to determine the distance from the optical waveform emitter 823 to the surface 805 of the tissue 803. For example, a first waveform (or range of waveforms) can be used to determine the camera-critical structure distance d_w, and a second waveform (or range of waveforms) can be used to determine the distance to the surface 805 of the tissue 803. In such cases, the different waveforms can be used to determine the depth of the critical structure 801 below the surface 805 of the tissue 803. The spectral time-of-flight system is described further herein.
Additionally or alternatively, in some cases, the distance d_A may be determined by ultrasound, registered Magnetic Resonance Imaging (MRI), or Computed Tomography (CT) scanning. In other cases, the distance d_A may be determined with spectral imaging, because the detected signal received by the imaging device can vary based on the type of material. For example, fat can decrease the detected signal in a first way, or amount, and collagen can decrease the detected signal in a second, different way, or amount.
Referring now to the surgical visualization system 860 in fig. 30, the surgical device 862 includes an optical waveform emitter 823' and a waveform sensor 822' configured to detect reflected waveforms. The optical waveform emitter 823' may be configured to emit waveforms for determining the distances d_t and d_w from a common device, such as the surgical device 862, as further described herein. In such cases, the distance d_A from the surface 805 of the tissue 803 to the surface of the critical structure 801 can be determined as follows:
d_A = d_w - d_t
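The two depth determinations described above reduce to simple subtractions. The following sketch restates them; the function names and example values are hypothetical.

```python
# Hypothetical sketch of the two depth determinations described above. When
# d_w and d_t are measured from a common device (FIG. 30), the depth of the
# critical structure below the tissue surface is d_A = d_w - d_t. When
# triangulation instead yields d_y, which is the sum of d_e and d_A
# (FIG. 29), the depth is d_A = d_y - d_e. Function names are illustrative.

def depth_common_device(d_w: float, d_t: float) -> float:
    """d_A from waveforms emitted and sensed on the same device."""
    return d_w - d_t

def depth_from_triangulation(d_y: float, d_e: float) -> float:
    """d_A from the triangulated distance d_y = d_e + d_A."""
    return d_y - d_e

assert depth_common_device(d_w=75.0, d_t=60.0) == 15.0
assert depth_from_triangulation(d_y=75.0, d_e=60.0) == 15.0
```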
as disclosed herein, various information about visible tissue, embedded critical structures, and surgical devices may be determined by utilizing a combination method incorporating one or more time-of-flight distance sensors, spectral imaging, and/or structured light arrays in conjunction with an image sensor configured to be capable of detecting spectral wavelengths and structured light arrays. Further, the image sensor may be configured to receive visible light and thereby provide an image of the surgical site to the imaging system. Logic or algorithms are employed to identify the information received from the time-of-flight sensors, spectral wavelengths, structured light, and visible light, and render three-dimensional images of the surface tissue and underlying anatomical structures. In various instances, the imaging device 520 may include multiple image sensors.
The camera-critical structure distance d_w can also be detected in one or more alternative ways. In one aspect, the critical structure 3201 can be illuminated using, for example, fluoroscopic visualization techniques, such as fluorescent indocyanine green (ICG), as shown in figs. 31-33. A camera 3220 can include two optical waveform sensors 3222, 3224 that capture left and right images of the critical structure 3201 simultaneously (figs. 32A and 32B). In such cases, the camera 3220 can depict the glow of the critical structure 3201 below the surface 3205 of the tissue 3203, and the distance d_w can be determined from the known distance between the sensors 3222 and 3224. In some cases, distances can be determined more accurately by utilizing more than one camera or by moving a camera between multiple positions. In certain aspects, one camera can be controlled by a first robotic arm and a second camera can be controlled by another robotic arm. In such robotic systems, one camera can be a follower camera on a follower arm, for example. The follower arm, and the camera thereon, can be programmed to track the other camera and to maintain a particular distance and/or lens angle, for example.
In other aspects, the surgical visualization system 500 may employ two separate waveform receivers (i.e., cameras/image sensors) to determine d_w. Referring now to fig. 34, if a critical structure 3301 or its contents (e.g., a blood vessel or the contents of the blood vessel) can emit a signal 3302, such as with fluoroscopy, then the actual position can be triangulated from the two separate cameras 3320a, 3320b at known locations.
In another aspect, referring now to figs. 35A and 35B, the surgical visualization system may employ a camera 440 that is dithered or moved to determine the distance d_w. The camera 440 is robotically controlled such that the three-dimensional coordinates of the camera 440 at the different positions are known. In various instances, the camera 440 may pivot at the cannula or patient interface. For example, if the critical structure 401 or its contents (e.g., a blood vessel or the contents of the blood vessel) can emit a signal, such as with fluoroscopy, then the actual position can be triangulated from the camera 440 moved rapidly between two or more known positions. In fig. 35A, the camera 440 is moved axially along an axis A. More specifically, the camera 440 translates along the axis A a distance d_1 closer to the critical structure 401, to the position indicated as position 440', such as by moving in and out on a robotic arm. As the camera 440 moves the distance d_1 and the size of the view changes with respect to the critical structure 401, the distance to the critical structure 401 can be calculated. For example, an axial translation (the distance d_1) of 4.28 mm can correspond to an angle θ_1 of 6.28 degrees and an angle θ_2 of 8.19 degrees.
Additionally or alternatively, the camera 440 may rotate or sweep along an arc between different positions. Referring now to fig. 35B, the camera 440 is moved axially along the axis A and is rotated an angle θ_3 about the axis A. A pivot point 442 for rotation of the camera 440 is positioned at the cannula/patient interface. In fig. 35B, the camera 440 is translated and rotated to a position 440''. As the camera 440 moves and the edge of the view changes with respect to the critical structure 401, the distance to the critical structure 401 can be calculated. In fig. 35B, the distance d_2 can be, for example, 9.01 mm, and the angle θ_3 can be, for example, 0.9 degrees.
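One plausible reading of the dithered-camera geometry in fig. 35A is that θ_1 and θ_2 are the angles subtended by a fixed lateral offset of the critical structure before and after the axial translation d_1. Under that assumption, which is an interpretation rather than a statement of the figure's exact construction, the range follows from similar triangles, as sketched below.

```python
# Hypothetical sketch of ranging by moving the camera, per FIG. 35A. Assume
# theta_1 and theta_2 are the angles subtended by a fixed lateral offset w
# of the critical structure before and after the camera translates a
# distance d_1 closer along its axis; the exact geometry in the figure may
# differ. With w = D1*tan(theta_1) = D2*tan(theta_2) and D1 = D2 + d_1:

import math

def range_from_motion(d_1: float, theta_1_deg: float, theta_2_deg: float) -> float:
    """Return D2, the distance from the second camera position."""
    t1 = math.tan(math.radians(theta_1_deg))
    t2 = math.tan(math.radians(theta_2_deg))
    return d_1 * t1 / (t2 - t1)

# Using the example values above (4.28 mm translation, 6.28 and 8.19
# degrees) gives roughly 13.9 mm from the second position.
d2_est = range_from_motion(4.28, 6.28, 8.19)
```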
Fig. 25 is a schematic diagram of a control system 833 that can be utilized with, for example, the surgical visualization system 500 and the input control device 1000. The control system 833 includes a control circuit 832 in signal communication with a memory 834. The memory 834 stores instructions executable by the control circuit 832 to determine and/or identify critical structures (e.g., the critical structures 501a, 501b in fig. 24), determine and/or compute one or more distances and/or three-dimensional digital representations, and communicate certain information to one or more clinicians, for example. For example, the memory 834 stores surface mapping logic 836, imaging logic 838, tissue identification logic 840, or distance determining logic 841, or any combination of the logic 836, 838, 840, and 841. The memory 834 may also include input control logic for implementing the input controls provided to the input control device 1000, including, for example, scaling and/or locking certain controls in certain circumstances and/or switching between operational modes based on real-time, intraoperative tissue proximity data. The control system 833 also includes an imaging system 842 having one or more cameras 844 (such as the imaging device 520 in fig. 24), one or more displays 846, or one or more controls 848, or any combinations of these elements. The cameras 844 can include one or more image sensors 835 to receive signals from various light sources emitting light at various visible and invisible spectra (e.g., visible light, spectral imagers, three-dimensional lenses, etc.). The display 846 can include one or more screens or monitors for depicting real, virtual, and/or virtually augmented images and/or information to one or more clinicians.
In various aspects, the heart of the camera 844 is the image sensor 835. In general, modern image sensors 835 are solid-state electronic devices containing up to millions of discrete photodetector sites, called pixels. The image sensor 835 technology falls into one of two categories: Charge-Coupled Device (CCD) and Complementary Metal-Oxide-Semiconductor (CMOS) imagers; more recently, Short-Wave Infrared (SWIR) is an emerging imaging technology. Another type of image sensor 835 employs a hybrid CCD/CMOS architecture (sold under the name "sCMOS") and consists of CMOS readout integrated circuits (ROICs) bump-bonded to a CCD imaging substrate. CCD and CMOS image sensors 835 are sensitive to wavelengths of approximately 350 nm to 1050 nm, although the range is usually given as 400 nm to 1000 nm. In general, CMOS sensors are more sensitive to IR wavelengths than CCD sensors. Solid-state image sensors 835 are based on the photoelectric effect and, as a result, cannot distinguish between colors. Accordingly, there are two types of color CCD cameras: single chip and three chip. Single-chip color CCD cameras offer a common, low-cost imaging solution and use a mosaic (e.g., Bayer) optical filter to separate incoming light into a series of colors and employ an interpolation algorithm to resolve full-color images. Each color is then directed to a different set of pixels. Three-chip color CCD cameras provide higher resolution by employing a prism to direct each section of the incident spectrum to a different chip. More accurate color reproduction is possible, as each point in space of the object has separate RGB intensity values, rather than using an algorithm to determine the color. Three-chip cameras offer extremely high resolutions.
The control system 833 further includes a spectral light source 850 and a structured light source 852. In certain instances, a single source can be pulsed to emit wavelengths of light in the spectral light source 850 range and wavelengths of light in the structured light source 852 range. Alternatively, a single light source can be pulsed to provide light in the invisible spectrum (e.g., infrared spectral light) and wavelengths of light on the visible spectrum. The spectral light source 850 can be, for example, a hyperspectral light source, a multispectral light source, and/or a selective spectral light source. In various instances, the tissue identification logic 840 can identify critical structures via data from the spectral light source 850 received by the image sensor 835 portion of the camera 844. The surface mapping logic 836 can determine the surface contours of the visible tissue based on reflected structured light. With time-of-flight measurements, the distance determining logic 841 can determine one or more distances to the visible tissue and/or critical structures. One or more outputs from the surface mapping logic 836, the tissue identification logic 840, and the distance determining logic 841 can be provided to the imaging logic 838 and combined, blended, and/or overlaid for conveyance to a clinician via the display 846 of the imaging system 842.
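As a purely illustrative sketch of the data flow just described for the control system 833, the snippet below combines outputs from the surface mapping, tissue identification, and distance determining logic into a single overlay record; all names and structures are assumptions, not elements of the disclosed system.

```python
# Hypothetical sketch of the data flow described for control system 833:
# outputs of the surface mapping, tissue identification, and distance
# determining logic are combined by the imaging logic into a single
# overlay for the display. All names and structures here are illustrative.

from dataclasses import dataclass

@dataclass
class Overlay:
    surface_map: object        # 3-D surface contour from structured light
    critical_structures: list  # labels/positions from spectral data
    distances_mm: dict         # time-of-flight distances to key targets

def imaging_logic(surface_map, critical_structures, distances_mm) -> Overlay:
    """Blend subsystem outputs into one augmented view for the display."""
    return Overlay(surface_map, critical_structures, distances_mm)

frame = imaging_logic(
    surface_map="mesh-placeholder",
    critical_structures=[("ureter", (12.0, 4.5, 30.2))],
    distances_mm={"device_to_tissue": 18.0, "device_to_ureter": 24.5},
)
```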
The description now turns briefly to figs. 26-28 to describe various aspects of the control circuit 832 for controlling various aspects of the surgical visualization system 500. Turning to fig. 26, there is shown a control circuit 400 configured to control aspects of the surgical visualization system 500, according to at least one aspect of the present disclosure. The control circuit 400 can be configured to implement the various processes described herein. The control circuit 400 can include a microcontroller comprising one or more processors 402 (e.g., microprocessor, microcontroller) coupled to at least one memory circuit 404. The memory circuit 404 stores machine-executable instructions that, when executed by the processor 402, cause the processor 402 to execute machine instructions to implement the various processes described herein. The processor 402 can be any one of a number of single-core or multicore processors known in the art. The memory circuit 404 can comprise volatile and non-volatile storage media. The processor 402 can include an instruction processing unit 406 and an arithmetic unit 408. The instruction processing unit 406 can be configured to receive instructions from the memory circuit 404.
Fig. 27 illustrates a combinational logic circuit 410 configured to control aspects of a surgical visualization system 500 in accordance with at least one aspect of the present disclosure. The combinational logic circuit 410 may be configured to enable the various processes described herein. Combinatorial logic circuitry 410 may include a finite state machine including combinatorial logic 412 configured to receive data associated with a surgical instrument or tool at input 414, process the data through combinatorial logic 412, and provide output 416.
Fig. 28 illustrates a sequential logic circuit 420 configured to control aspects of a surgical visualization system 500 in accordance with at least one aspect of the present disclosure. Sequential logic circuit 420 or combinational logic component 422 may be configured to enable the various processes described herein. Sequential logic circuit 420 may include a finite state machine. Sequential logic circuit 420 may include, for example, a combinational logic component 422, at least one memory circuit 424, and a clock 429. The at least one memory circuit 424 may store a current state of the finite state machine. In some cases, sequential logic circuit 420 may be synchronous or asynchronous. The combinatorial logic 422 is configured to receive data associated with a surgical device or system from an input 426, process the data through the combinatorial logic 422, and provide an output 428. In other aspects, a circuit may comprise a combination of a processor (e.g., processor 402 in fig. 26) and a finite state machine to implement various processes herein. In other aspects, the finite state machine may comprise a combination of combinational logic circuitry (e.g., combinational logic circuitry 410, fig. 27) and sequential logic circuitry 420.
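The sequential logic circuit described above pairs combinational logic with a memory element and a clock. A minimal software analogue of such a finite state machine is sketched below; the states and transitions are invented for illustration only.

```python
# A minimal sketch of a finite state machine of the kind described for
# sequential logic circuit 420: combinational logic computes the next state
# and output from the current state and input, and a memory element holds
# the state between clock ticks. The states and transitions are invented
# for illustration only.

def next_state_and_output(state: str, data_in: int):
    """Combinational part: pure function of current state and input."""
    if state == "IDLE" and data_in:
        return "ACTIVE", 1
    if state == "ACTIVE" and not data_in:
        return "IDLE", 0
    return state, 0

state = "IDLE"                      # memory circuit holds the current state
for data_in in [0, 1, 1, 0]:        # one iteration per clock cycle
    state, out = next_state_and_output(state, data_in)
```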
Referring now to fig. 36, a schematic of a control system 600 for a surgical visualization system, such as the surgical visualization system 500, is depicted. The control system 600 is a conversion system that integrates spectral signature tissue identification and structured light tissue positioning to identify critical structures, particularly when those structures are obscured by other tissue (such as fat, connective tissue, blood, and/or other organs). Such techniques may also be used to detect tissue variability, such as to distinguish tumors and/or unhealthy tissue from healthy tissue within an organ.
The control system 600 is configured to implement a hyperspectral imaging and visualization system in which molecular responses are utilized to detect and identify anatomical structures in a surgical field of view. The control system 600 includes conversion logic 648 to convert the tissue data into information usable by the surgeon. For example, critical structures in the anatomical structure may be identified using variable reflectivity based on wavelength relative to the masking material. Further, the control system 600 combines the identified spectral features and the structured light data in an image. For example, the control system 600 may be used to create a three-dimensional data set for surgical use in a system with enhanced image overlay. The technique can be used both intra-operatively and pre-operatively using additional visual information. In various instances, the control system 600 is configured to provide a warning to a clinician when one or more critical structures are approached. Various algorithms may be employed to guide robotic automation and semi-automation methods based on surgery and proximity to critical structures.
An array of projected light is used to determine tissue shape and motion during surgery. Alternatively, a flash lidar may be used for surface mapping of tissue.
The control system 600 is configured to be able to detect and provide image overlay of critical structures and measure distances to the surface of visible tissue and to embedded/buried critical structures. In other cases, control system 600 may measure the distance to the surface of the visible tissue or detect critical structures and provide an image overlay of the critical structures.
The control system 600 includes a spectral control circuit 602. The spectral control circuit 602 may be a Field Programmable Gate Array (FPGA) or another suitable circuit configuration, as described herein in connection with figs. 26-28, for example. The spectral control circuit 602 includes a processor 604 to receive a video input signal from a video input processor 606. The processor 604 may be configured for hyperspectral processing and may utilize C/C++ code, for example. The video input processor 606 receives video input of control (metadata) data such as shutter time, wavelength, and sensor analytics, for example. The processor 604 is configured to process the video input signal from the video input processor 606 and provide a video output signal to a video output processor 608; the video output signal includes a hyperspectral video output and, for example, interface control (metadata) data. The video output processor 608 provides the video output signal to an image overlay controller 610.
The video input processor 606 is coupled to a camera 612 at the patient side via a patient isolation circuit 614. As previously described, the camera 612 includes a solid-state image sensor 634. The patient isolation circuit may include a plurality of transformers so that the patient is isolated from other circuits in the system. The camera 612 receives intraoperative images through optics 632 and the image sensor 634. The image sensor 634 may comprise a CMOS image sensor, for example, or may comprise any of the image sensor technologies described herein in connection with fig. 25, for example. In one aspect, the camera 612 outputs images as a 14-bit/pixel signal. It will be appreciated that higher or lower pixel resolutions may be employed without departing from the scope of this disclosure. The isolated camera output signal 613 is provided to a color RGB fusion circuit 616, which processes the camera output signal 613 using hardware registers 618 and a Nios2 co-processor 620. The color RGB fused output signal is provided to the video input processor 606 and a laser pulse control circuit 622.
The laser pulse control circuit 622 controls the laser engine 624. The laser engine 624 outputs light in a plurality of wavelengths (λ1, λ2, λ3, ..., λn), including near infrared (NIR). The laser engine 624 may operate in multiple modes. In one aspect, the laser engine 624 may operate in two modes, for example. In a first mode (e.g., a normal operating mode), the laser engine 624 outputs an illumination signal. In a second mode (e.g., a recognition mode), the laser engine 624 outputs RGBG and NIR light. In various cases, the laser engine 624 may operate in a polarization mode.
Light output 626 from the laser engine 624 illuminates the targeted anatomy at an intraoperative surgical site 627. The laser pulse control circuit 622 also controls a laser pulse controller 628 for a laser pattern projector 630 that projects a laser light pattern 631 (such as a grid or pattern of lines and/or dots) of a predetermined wavelength (λ) onto the surgical tissue or organ at the surgical site 627. The camera 612 receives the patterned light, as well as the reflected light, through the camera optics 632. The image sensor 634 converts the received light into a digital signal.
The color RGB fusion circuit 616 also outputs signals to the image overlay controller 610 and the video input module 636 for reading the laser pattern 631 projected by the laser pattern projector 630 onto the target anatomy at the surgical site 627. The processing module 638 processes the laser light pattern 631 and outputs a first video output signal 640 representing the distance to visible tissue at the surgical site 627. The data is provided to the image overlay controller 610. The processing module 638 also outputs a second video signal 642 representing a three-dimensional rendered shape of a tissue or organ of the target anatomy at the surgical site.
The first video output signal 640 and the second video output signal 642 include data representing the position of the critical structure on a three-dimensional surface model, which is provided to an integration module 643. In combination with data from the video output processor 608 of the spectral control circuit 602, the integration module 643 can determine the distance dA (fig. 24) to a buried critical structure (e.g., via a triangulation algorithm 644), and the distance dA can be provided to the image overlay controller 610 via a video output processor 646. The conversion logic described above may encompass the conversion logic 648 between the video monitor 652 and the camera 612, the laser engine 624, and the laser pattern projector 630 positioned at the surgical site 627.
In various instances, pre-operative data 650 from a CT or MRI scan may be employed to register or match certain three-dimensional deformable tissues. Such preoperative data 650 may be provided to the integration module 643 and ultimately to the image overlay controller 610 so that such information may be overlaid with the view from the camera 612 and provided to the video monitor 652. Registration of preoperative data is further described herein and in, for example, U.S. patent application 16/128,195, entitled "INTEGRATION OF IMAGING DATA," filed September 11, 2018, which is incorporated by reference herein in its entirety.
The video monitor 652 may output the integrated/enhanced view from the image overlay controller 610. The clinician may select and/or switch between different views on one or more monitors. On the first monitor 652a, the clinician may switch between (A) a view in which a three-dimensional rendering of the visible tissue is depicted and (B) an enhanced view in which one or more hidden critical structures are depicted over the three-dimensional rendering of the visible tissue. On the second monitor 652b, the clinician may toggle between distance measurements to one or more hidden critical structures and/or to the surface of visible tissue, for example.
The control system 600 and/or various control circuits thereof may be incorporated into various surgical visualization systems disclosed herein.
In various instances, selected wavelengths for spectral imaging (i.e., "selective spectral" imaging) may be identified and utilized based on the critical structures and/or occlusions expected at the surgical site. By utilizing selective spectral imaging, the amount of time required to obtain a spectral image can be minimized, such that the information can be obtained and utilized intraoperatively in real time or near real time. In various cases, the wavelengths may be selected by a clinician or selected by the control circuitry based on input from the clinician. In some cases, the wavelengths may be selected based on, for example, big data accessible to the control circuitry via the cloud and/or machine learning.
The foregoing application of spectral imaging to tissue can be utilized intraoperatively to measure the distance between a waveform emitter and a critical structure obscured by tissue. In one aspect of the present disclosure, referring now to figs. 41 and 42, a time-of-flight sensor system 2104 utilizing waveforms 2124, 2125 is shown. In certain instances, the time-of-flight sensor system 2104 may be incorporated into the surgical visualization system 500 (fig. 24). The time-of-flight sensor system 2104 includes a waveform emitter 2106 and a waveform receiver 2108 on the same surgical device 2102. The emitted wave 2124 extends from the emitter 2106 to the critical structure 2101, and the received wave 2125 is reflected from the critical structure 2101 back to the receiver 2108. The surgical device 2102 is positioned through a trocar 2110 that extends into a cavity 2107 in a patient.
The waveforms 2124, 2125 are configured to penetrate obscuring tissue 2103. For example, the wavelengths of the waveforms 2124, 2125 may be in the NIR or SWIR spectrum of wavelengths. In one aspect, a spectral signal (e.g., hyperspectral, multispectral, or selective spectral) or a photoacoustic signal may be emitted from the emitter 2106 and may penetrate the tissue 2103 in which the critical structure 2101 of interest is concealed. The emitted waveform 2124 may be reflected by the critical structure 2101. The received waveform 2125 may be delayed due to the distance d between the distal end of the surgical device 2102 and the critical structure 2101. In various instances, the waveforms 2124, 2125 may be selected to target the critical structure 2101 within the tissue 2103 based on the spectral signature of the critical structure 2101, as further described herein. In various instances, the emitter 2106 is configured to provide a binary signal, on and off, as shown in fig. 42, which can be measured by the receiver 2108, for example.
Based on the delay between the emitted wave 2124 and the received wave 2125, the time-of-flight sensor system 2104 is configured to determine the distance d (fig. 41). A time-of-flight timing diagram 2130 for the emitter 2106 and the receiver 2108 of fig. 41 is shown in fig. 42. The delay is a function of the distance d, and the distance d is given by:

d = (c·t/2) × q2/(q1 + q2)

wherein:

c is the speed of light;

t is the length of the pulse;

q1 is the charge accumulated while light is emitted; and

q2 is the charge accumulated while light is not emitted.
As provided herein, the time of flight of the waveforms 2124, 2125 corresponds to the distance d in fig. 41. In various instances, an additional emitter/receiver and/or a pulsed signal from the emitter 2106 may be configured to emit a non-penetrating signal. The non-penetrating signal may be configured to enable determination of the distance from the emitter to the surface 2105 of the obscuring tissue 2103. In various cases, the depth of the critical structure 2101 may be determined by:
dA = dw − dt

wherein:

dA is the depth of the critical structure 2101 below the surface 2105 of the obscuring tissue 2103;

dw is the distance from the emitter 2106 to the critical structure 2101 (d in fig. 41); and

dt is the distance from the emitter 2106 (on the distal end of the surgical device 2102) to the surface 2105 of the obscuring tissue 2103.
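A brief numerical sketch of the two relationships above, as reconstructed here, follows; the pulse length, accumulated charges, and the emitter-to-surface distance are invented inputs for illustration only, not values from the present disclosure.

```python
# Hedged sketch of pulsed time-of-flight distance and buried-structure depth.
# All numeric inputs below are assumptions for the example.

C_MM_PER_NS = 299.792458  # speed of light, in mm per nanosecond

def tof_distance_mm(pulse_ns, q1, q2):
    """d = (c*t/2) * q2/(q1 + q2) for a pulsed time-of-flight measurement."""
    return (C_MM_PER_NS * pulse_ns / 2.0) * (q2 / (q1 + q2))

def buried_depth_mm(d_w, d_t):
    """dA = dw - dt: depth of the critical structure below the tissue surface."""
    return d_w - d_t

d_w = tof_distance_mm(pulse_ns=0.2, q1=900.0, q2=600.0)  # emitter to structure
d_t = 10.0  # emitter to the surface of the obscuring tissue (assumed)
print(round(d_w, 1), "mm;", round(buried_depth_mm(d_w, d_t), 1), "mm deep")
```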
In one aspect of the present disclosure, referring now to fig. 43, a time-of-flight sensor system 2204 utilizing waves 2224a, 2224b, 2224c, 2225a, 2225b, 2225c is shown. In certain instances, the time-of-flight sensor system 2204 may be incorporated into the surgical visualization system 500 (fig. 24). The time-of-flight sensor system 2204 includes a waveform emitter 2206 and a waveform receiver 2208. The waveform emitter 2206 is positioned on a first surgical device 2202a, and the waveform receiver 2208 is positioned on a second surgical device 2202b. The surgical devices 2202a, 2202b are positioned through respective trocars 2210a, 2210b, which extend into a cavity 2207 in the patient. The emitted waves 2224a, 2224b, 2224c extend from the emitter 2206 toward the surgical site, and the received waves 2225a, 2225b, 2225c are reflected back to the receiver 2208 from various structures and/or surfaces at the surgical site.
The different emitted waves 2224a, 2224b, 2224c are configured to target different types of material at the surgical site. For example, wave 2224a targets obscuring tissue 2203, wave 2224b targets a first critical structure 2201a (e.g., a blood vessel), and wave 2224c targets a second critical structure 2201b (e.g., a cancerous tumor). The wavelengths of the waves 2224a, 2224b, 2224c may be in the visible, NIR, or SWIR spectrum of wavelengths. For example, visible light may be reflected from the surface 2205 of the tissue 2203, and the NIR and/or SWIR waveforms may be configured to penetrate the surface 2205 of the tissue 2203. In various aspects, a spectral signal (e.g., hyperspectral, multispectral, or selective spectral) or a photoacoustic signal may be emitted from the emitter 2206, as described herein. In various instances, the waves 2224b, 2224c may be selected to target the critical structures 2201a, 2201b within the tissue 2203 based on the spectral signatures of the critical structures 2201a, 2201b, as further described herein.
The emitted waves 2224a, 2224b, 2224c may be reflected from the targeted material (i.e., the surface 2205, the first critical structure 2201a, and the second critical structure 2201b, respectively). The received waveforms 2225a, 2225b, 2225c may be delayed due to the distances d1a, d2a, d3a, d1b, d2b, d3b shown in fig. 43.
In the time-of-flight sensor system 2204, in which the emitter 2206 and the receiver 2208 may be independently positioned (e.g., on separate surgical devices 2202a, 2202b and/or controlled by separate robotic arms), the various distances d1a, d2a, d3a, d1b, d2b, d3b may be calculated from the known positions of the emitter 2206 and the receiver 2208. For example, these positions may be known when the surgical devices 2202a, 2202b are robotically controlled. Knowledge of the positions of the emitter 2206 and the receiver 2208, of the time at which the photon stream is targeted at a certain tissue, and of the particular response received by the receiver 2208 can allow the distances d1a, d2a, d3a, d1b, d2b, d3b to be determined. In one aspect, the distance to the obscured critical structures 2201a, 2201b may be triangulated using the penetrating wavelengths. Because the speed of light is constant for any wavelength of visible or invisible light, the time-of-flight sensor system 2204 can determine the various distances.
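The following Python sketch illustrates one standard way such a distance could be recovered when the emitter and receiver sit on separately tracked instruments; the geometry (positions, beam direction, measured time) is invented, and this is an assumption-laden example rather than the algorithm of the present disclosure.

```python
# Sketch: solve the emitter-target-receiver path length for the distance from
# the emitter to the target, assuming the target lies along a known beam
# direction. All positions and the flight time below are made-up inputs.

import math

C_MM_PER_NS = 299.792458  # speed of light, in mm per nanosecond

def emitter_to_target_mm(emitter, beam_dir, receiver, flight_ns):
    """Solve d1 + |emitter + d1*beam_dir - receiver| = c*flight_ns for d1."""
    L = C_MM_PER_NS * flight_ns                       # total path length
    v = [r - e for r, e in zip(receiver, emitter)]    # baseline vector
    v2 = sum(x * x for x in v)
    udotv = sum(u * x for u, x in zip(beam_dir, v))
    return (L * L - v2) / (2.0 * (L - udotv))

emitter = (0.0, 0.0, 0.0)
receiver = (40.0, 0.0, 0.0)          # 40 mm baseline between instruments
beam_dir = (0.0, 0.0, 1.0)           # unit vector along the emitted wave
flight_ns = (30.0 + math.sqrt(40.0**2 + 30.0**2)) / C_MM_PER_NS  # 30 mm target
print(round(emitter_to_target_mm(emitter, beam_dir, receiver, flight_ns), 1))
```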
Still referring to fig. 43, in various instances, in the view provided to the clinician, the receiver 2208 may be rotated such that the center of mass of the target structure in the resulting image remains constant, i.e., in a plane perpendicular to the axis of the selected target structure 2203, 2201a, or 2201b. Such an orientation may quickly communicate one or more relevant distances and/or perspectives with respect to the critical structure. For example, as shown in fig. 43, the surgical site is displayed from a viewpoint in which the first critical structure 2201a is perpendicular to the viewing plane (i.e., the vessel is oriented in/out of the page). In various instances, such an orientation may be a default setting; however, the view may be rotated or otherwise adjusted by the clinician. In some cases, the clinician may toggle between different surfaces and/or target structures that define the viewing angle of the surgical site provided by the imaging system.
In various instances, the receiver 2208 can be mounted on a trocar or cannula (such as the trocar 2210b) through which the second surgical device 2202b is positioned, for example. In other cases, the receiver 2208 can be mounted on a separate robotic arm whose three-dimensional position is known. In various instances, the receiver 2208 may be mounted on a movable arm that is separate from the robot controlling the first surgical device 2202a, or may be mounted to an Operating Room (OR) table that can be intraoperatively registered to the robot coordinate plane. In such cases, the positions of the emitter 2206 and the receiver 2208 may be registered to the same coordinate plane such that the distances may be triangulated from the outputs of the time-of-flight sensor system 2204.
The combination of time-of-flight sensor systems and near-infrared spectroscopy (NIRS), known as TOF-NIRS, which is capable of measuring time-resolved profiles of NIR light with nanosecond resolution, can be found in the article entitled "TIME-OF-FLIGHT NEAR-INFRARED SPECTROSCOPY FOR NONDESTRUCTIVE MEASUREMENT OF INTERNAL QUALITY IN GRAPEFRUIT," Journal of the American Society for Horticultural Science, May 2013, Vol. 138, No. 3, page 225-.
In various instances, the time-of-flight spectral waveform is configured to enable determination of the depth of a critical structure and/or proximity of a surgical device to the critical structure. Further, various surgical visualization systems disclosed herein include surface mapping logic configured to create a three-dimensional rendering of the surface of the visible tissue. In such cases, the clinician may know the proximity (or lack thereof) of the surgical device to the critical structure even when the visible tissue blocks the critical structure. In one case, the topography of the surgical site is provided on a monitor by surface mapping logic. If the critical structures are close to the surface of the tissue, spectral imaging may communicate the position of the critical structures to the clinician. For example, spectral imaging may detect structures within 5mm or 10mm of the surface. In other cases, spectral imaging may detect structures 10mm or 20mm below the surface of the tissue. Based on the known limits of the spectral imaging system, the system is configured to be able to communicate that the critical structure is out of range if the spectral imaging system is unable to detect the critical structure at all. Thus, the clinician may continue to move the surgical device and/or manipulate tissue. When a critical structure moves into range of the spectral imaging system, the system may identify the structure and, thus, communicate that the structure is in range. In such cases, an alert may be provided when the structure is initially identified and/or further moved within a predefined proximity zone. In such cases, the clinician may be provided with proximity information (i.e., lack of proximity) even if the spectral imaging system does not identify critical structures with known boundaries/ranges.
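As a minimal sketch of the in-range/out-of-range messaging just described, consider the following Python fragment; the 10 mm detection limit and the 5 mm proximity zone are assumed values for the example, not limits of any particular spectral imaging system.

```python
# Illustrative range classification for a detected critical structure.
# Both thresholds are assumptions chosen only for this sketch.

DETECTION_LIMIT_MM = 10.0   # assumed depth limit of the spectral imaging system
PROXIMITY_ZONE_MM = 5.0     # assumed predefined proximity zone for alerts

def classify(depth_mm):
    """Return the message the system could communicate for a given depth."""
    if depth_mm is None or depth_mm > DETECTION_LIMIT_MM:
        return "critical structure out of range"
    if depth_mm <= PROXIMITY_ZONE_MM:
        return "ALERT: critical structure within proximity zone"
    return "critical structure in range"

for depth in (None, 12.0, 8.0, 4.0):
    print(depth, "->", classify(depth))
```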
Various surgical visualization systems disclosed herein may be configured to recognize the presence of and/or proximity to critical structures intraoperatively and to alert a clinician before the critical structures are damaged by inadvertent dissection and/or transection. In various aspects, the surgical visualization system is configured to identify one or more of the following critical structures, for example: ureters, bowel, rectum, nerves (including the phrenic nerve, recurrent laryngeal nerve [RLN], facial nerve, vagus nerve, and their branches), blood vessels (including the pulmonary and lobar arteries and veins, the inferior mesenteric artery [IMA] and its branches, the superior rectal artery, the sigmoid arteries, and the left colic artery), the superior mesenteric artery (SMA) and its branches (including the middle colic and right colic arteries and the ileocolic artery), the hepatic artery and its branches, the portal vein and its branches, the splenic artery/vein and their branches, the external and internal (hypogastric) iliac vessels, the short gastric arteries, the uterine arteries, the middle sacral vessels, and lymph nodes. Further, the surgical visualization system is configured to indicate the proximity of the surgical device to the critical structure and/or to alert the clinician when the surgical device approaches the critical structure.
Various aspects of the present disclosure provide intra-operative critical structure identification (e.g., identification of ureters, nerves, and/or blood vessels) and instrument proximity monitoring. For example, various surgical visualization systems disclosed herein may include spectral imaging and surgical instrument tracking, which enables visualization of critical structures, for example, below the surface of the tissue (such as 1.0cm to 1.5cm below the surface of the tissue). In other cases, the surgical visualization system may identify structures less than 1.0cm or greater than 1.5cm below the surface of the tissue. For example, even a surgical visualization system that can identify structures only within 0.2mm of the surface, for example, may be valuable if the structures are otherwise not visible due to depth. In various aspects, the surgical visualization system may enhance the clinician's view, for example, with a virtual depiction of the critical structures as a visible white light image superimposed on the surface of the visible tissue. The surgical visualization system may provide real-time three-dimensional spatial tracking of the distal tip of the surgical instrument and may provide proximity alerts, for example, when the distal tip of the surgical instrument moves within a particular range of a critical structure (such as within 1.0cm of the critical structure).
Various surgical visualization systems disclosed herein can identify when dissection is too close to a critical structure. The dissection may be "too close" to the critical structure based on temperature (i.e., the tool is hot enough near the critical structure to risk damaging/heating/melting it) and/or based on tension (i.e., there is enough tension near the critical structure to risk damaging/tearing/pulling it). Such a surgical visualization system may facilitate dissection around a vessel, for example, when the vessel is skeletonized prior to ligation. In various instances, a thermal imaging camera may be utilized to read the heat at the surgical site and provide a warning to the clinician based on the detected heat and the distance from the tool to the structure. For example, if the temperature of the tool is above a predefined threshold (such as, for example, 120°F), an alert may be provided to the clinician at a first distance (such as, for example, 10 mm), and if the temperature of the tool is less than or equal to the predefined threshold, the alert may be provided to the clinician at a second distance (such as, for example, 5 mm). The predefined threshold and/or warning distance may be default settings and/or may be programmable by the clinician. Additionally or alternatively, the proximity alert may be tied to thermal measurements made by the tool itself (such as a thermocouple that measures the heat in a distal jaw of a monopolar or bipolar dissector or vessel sealer, for example).
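Using the example values from the preceding paragraph, the temperature-dependent warning logic could be sketched as follows; this is an illustrative assumption about how such a rule might be coded, not the claimed control scheme.

```python
# Sketch of temperature-dependent warning distances, using the example
# values stated in the text (120 deg F threshold, 10 mm / 5 mm distances).

TEMP_THRESHOLD_F = 120.0
HOT_WARN_MM = 10.0    # warn farther away when the tool is hot
COOL_WARN_MM = 5.0    # warn closer when the tool is at or below threshold

def proximity_alert(tool_temp_f, distance_mm):
    """Return True when the clinician should be warned."""
    warn_at = HOT_WARN_MM if tool_temp_f > TEMP_THRESHOLD_F else COOL_WARN_MM
    return distance_mm <= warn_at

print(proximity_alert(tool_temp_f=150.0, distance_mm=9.0))   # True (hot tool)
print(proximity_alert(tool_temp_f=100.0, distance_mm=9.0))   # False
print(proximity_alert(tool_temp_f=100.0, distance_mm=4.0))   # True
```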
Various surgical visualization systems disclosed herein may provide sufficient sensitivity and specificity with respect to critical structures to enable a clinician to proceed with fast but safe dissection with confidence, based on the standard of care and/or device safety data. The system can function intraoperatively, in real time, during a surgical procedure, with minimal and, in various cases, no risk of ionizing radiation to the patient or clinician. Conversely, during fluoroscopy, the patient and clinician may be exposed to ionizing radiation via, for example, an X-ray beam used to view the anatomy in real time.
The various surgical visualization systems disclosed herein may be configured to, for example, detect and identify one or more desired types of critical structures in the forward path of the surgical device, such as when the path of the surgical device is robotically controlled. Additionally or alternatively, the surgical visualization system may be configured to detect and identify one or more types of critical structures, for example, in the surrounding area and/or in multiple planes/dimensions of the surgical device.
The various surgical visualization systems disclosed herein may be readily operated and/or interpreted. Further, various surgical visualization systems may incorporate an "override" feature that allows the clinician to override default settings and/or operations. For example, the clinician may selectively turn off alerts from the surgical visualization system and/or approach critical structures more closely than the surgical visualization system would otherwise suggest, such as when the risk posed by the critical structure is less than the risk of avoiding the region (e.g., when removing cancer around a critical structure, the risk of leaving cancerous tissue may be greater than the risk of damaging the critical structure).
The various surgical visualization systems disclosed herein may be incorporated into a surgical system and/or used during a surgical procedure with limited impact on the workflow. In other words, implementation of the surgical visualization system may not change the manner in which the surgical procedure is performed. Further, the surgical visualization system may be economical in comparison to the cost of an inadvertent transection. Data indicate that reducing inadvertent damage to critical structures can drive incremental reimbursement.
The various surgical visualization systems disclosed herein may be operated in real-time or near real-time and far enough in advance to enable a clinician to anticipate critical structures. For example, the surgical visualization system may provide sufficient time to "slow down, assess, and avoid" in order to maximize the efficiency of the surgical procedure.
Various surgical visualization systems disclosed herein may not require a contrast agent or dye to be injected into the tissue. For example, spectral imaging is configured to enable intraoperative visualization of hidden structures without the use of a contrast agent or dye. In other cases, it may be easier to inject a contrast agent into the appropriate layer of tissue than with other visualization systems. For example, the time between the injection of the contrast agent and the visualization of the critical structure may be less than two hours.
Various surgical visualization systems disclosed herein may be associated with clinical data and/or device data. For example, the data may provide a boundary as to how close the energy-enabled surgical device (or other potentially damaging device) should be to tissue that the surgeon does not want to damage. Any data module that interacts with the surgical visualization system disclosed herein may be provided integrally or separately from the robot to enable use with a stand-alone surgical device, for example, in open or laparoscopic procedures. In various instances, the surgical visualization system may be compatible with a robotic surgical system. For example, the visualization image/information may be displayed in a robot console.
Various surgical visualization systems disclosed herein may provide enhanced visualization data and additional information to a surgeon and/or a control unit of a robotic system and/or a controller thereof to improve, enhance, and/or inform input controls and/or controls of the robotic system.
Additional control system
Some surgeons may be accustomed to using a hand-held surgical instrument in which displacement of the handle portion of the surgical instrument effects corresponding displacement of the end effector portion of the surgical instrument. For example, advancing a handle of a surgical instrument one inch may cause an end effector of the surgical instrument to be advanced by a corresponding one inch. Such a one-to-one correlation between input and output may also be preferred by some surgeons utilizing robotic applications. For example, a one-to-one correlation between input and output motions can provide intuitive control motions as the robotic surgical end effector is moved around tissue. While a one-to-one correlation may be desirable in some circumstances, such input motions may not be feasible or practical when displacing a surgical tool across a large distance without the assistance of a clutch mechanism. Furthermore, a one-to-one correlation may not be necessary or desirable in some cases; however, a surgeon may prefer displacement input motions (translation and/or rotation) when controlling a robotic surgical tool in certain situations, such as during a precision motion mode.
The clutchless input control device may allow limited translation of a portion thereof during the fine motion mode and may rely on force sensing techniques, such as a space joint 1006 and a sensor arrangement 1048 (figs. 8 and 9), during the coarse motion mode. Tissue proximity data may cause the input control device to switch between the fine motion mode and the coarse motion mode. In such cases, the surgeon may utilize the force sensor to drive the surgical end effector a greater distance toward the tissue and, upon reaching a predefined proximity to the tissue, the limited translation of the portion of the clutchless input control device may be utilized to provide displacement input motions to control the robotic surgical tool.
Referring now to fig. 44-49, an input control device 4000 is shown. Input control device 4000 is a clutchless input control device, as further described herein. The input control device 4000 may be utilized at a surgeon's console or workspace of the robotic surgical system. For example, input control device 4000 may be incorporated into a surgical system, such as surgical system 110 (fig. 1) or surgical system 150 (fig. 3), to provide control signals to a surgical robot and/or a surgical tool coupled thereto. The input control device 4000 includes manual input controls for moving the robotic arm and/or surgical tool in three-dimensional space. For example, a surgical tool controlled by the input control device 4000 may be configured to move in three-dimensional space and rotate or articulate about multiple axes (e.g., roll about a longitudinal tool axis and articulate about one or more articulation axes).
The input control device 4000 includes a multi-dimensional space joint 4006 having a central portion 4002 supported on a base 4004 that is similar in many respects to the multi-dimensional space joint 1006, the central portion 1002, and the base 1004 of the input control device 1000 (fig. 6-11). For example, the base 4004 is structured to rest on a surface, such as a table or work surface at a surgeon's console or workspace, and can remain in a fixed, stationary position relative to the underlying surface upon application of input control motions to the input control device 4000. The space joint 4006 is configured to receive a multi-dimensional input corresponding to a control motion of the surgical tool in a multi-dimensional space. A power line 4032 extends from the base 4004. The input control device 4000 also includes a multi-axis force and/or torque sensor arrangement 4048 (fig. 46) that is similar in many respects to the sensor arrangement 1048 (fig. 8 and 9). For example, the sensor arrangement 4048 is configured to be able to detect forces and moments at the space joint 4006, such as forces applied to the central portion 4002. The multi-dimensional spatial joint and its sensor arrangement will be further described herein.
The central portion 4002 is flexibly supported relative to the base 4004. In such cases, the central portion 4002 may be configured to move or float within a predefined zone upon receiving a force control input. For example, the central portion 4002 may be a floating shaft supported on the base 4004 by one or more elastomeric members (such as springs). The central portion 4002 can be configured to move or float within a predefined three-dimensional volume. For example, the elastomeric couplings may allow the central portion 4002 to move relative to the base 4004; however, limiting plates, pins, and/or other structures can be configured to limit the range of motion of the central portion 4002 relative to the base 4004. In one aspect, movement of the central portion 4002 relative to the base 4004 from a center or "home" position can be permitted in a range of about 1.0 mm to about 5.0 mm in any direction (up, down, left, right, back, and forward). In other cases, movement of the central portion 4002 relative to the base 4004 can be limited to less than 1.0 mm or greater than 5.0 mm. In some cases, the central portion 4002 can move about 2.0 mm in all directions relative to the base 4004, and in other cases, the central portion 4002 can remain stationary or fixed relative to the base 4004.
In various circumstances, the central portion 4002 of the space joint 4006 can be spring biased toward a center or starting position in which the central portion 4002 is aligned with the Z-axis (i.e., a vertical axis passing through the central portion 4002 and the space joint 4006). Driving (e.g., pushing and/or pulling) the center portion 4002 in any direction away from the Z-axis can be configured to "drive" the end effector of the associated surgical tool in a corresponding direction. When the external driving force is removed, the center portion 4002 can be configured to return to a center or starting position and can stop the movement of the end effector. During portions of the surgical procedure, such as during a coarse motion mode, the robotic surgical tool may be allowed to be controlled by applying force to the sensor arrangement 4048 at the spatial joint 4006, as further described herein.
In various instances, the space joint 4006 and the central portion 4002 coupled thereto define a six degree-of-freedom input control. Referring again to the end effector 1052 of the surgical tool 1050 in fig. 12, a force in the X-direction on the central portion 4002 of the input control device 4000 corresponds to displacement of the end effector 1052 along its Xt axis (e.g., longitudinally), a force in the Y-direction on the central portion 4002 corresponds to displacement of the end effector 1052 along its Yt axis (e.g., laterally), and a force in the Z-direction on the central portion 4002 corresponds to displacement of the end effector 1052 along its Zt axis (e.g., vertically/up and down). In addition, a force about the X-axis (moment force R) on the central portion 4002 causes rotation of the end effector 1052 about its Xt axis (e.g., a rolling motion about the longitudinal axis in the direction Rt), a force about the Y-axis (moment force P) on the central portion 4002 causes articulation of the end effector 1052 about its Yt axis (e.g., a pitch motion in the direction Pt), and a force about the Z-axis (moment force T) on the central portion 4002 causes articulation of the end effector 1052 about its Zt axis (e.g., a yaw or twisting motion in the direction Tt). In such cases, the input control device 4000 comprises, for example, a six degree-of-freedom joystick configured to receive and detect forces along the X, Y, and Z axes and moments about the X, Y, and Z axes. These forces may correspond to translational inputs to the end effector 1052 of the associated surgical tool 1050, and these moments may correspond to rotational inputs. A six degree-of-freedom input device is further described herein.
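The six degree-of-freedom mapping described above lends itself to a simple sketch; the deadband and gain values below are hypothetical, and the function merely illustrates how sensed forces and moments could be translated into tool-frame motion commands rather than reproducing the device's actual control law.

```python
# Minimal sketch: mapping a six degree-of-freedom force/torque reading to
# end effector motion commands. Gains, deadband, and key names are assumed.

DEADBAND = 0.5          # ignore readings below this magnitude (sensor noise)
TRANSLATION_GAIN = 2.0  # mm of commanded travel per unit force
ROTATION_GAIN = 1.5     # degrees of commanded rotation per unit moment

def wrench_to_command(fx, fy, fz, rx, ry, rz):
    """Map forces (X, Y, Z) and moments (R, P, T) to tool-frame motions."""
    def scale(value, gain):
        return gain * value if abs(value) > DEADBAND else 0.0
    return {
        "translate_Xt_mm": scale(fx, TRANSLATION_GAIN),
        "translate_Yt_mm": scale(fy, TRANSLATION_GAIN),
        "translate_Zt_mm": scale(fz, TRANSLATION_GAIN),
        "roll_Rt_deg":     scale(rx, ROTATION_GAIN),
        "pitch_Pt_deg":    scale(ry, ROTATION_GAIN),
        "yaw_Tt_deg":      scale(rz, ROTATION_GAIN),
    }

print(wrench_to_command(3.0, 0.2, -1.0, 0.0, 2.0, 0.0))
```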
Referring again to the input control device 4000 in fig. 44-49, the forearm support 4008 is movably coupled to the base 4004. For example, a mechanical joint 4042 incorporated into the center portion 4002 may hold or support the forearm support 4008 such that the forearm support 4008 is movable relative to the base 4004 at the mechanical joint 4042. Referring now primarily to fig. 47, forearm support member 4008 is shown in a first configuration (solid lines) and in a second configuration (dashed lines). The base 4004 of the input control device 4000 remains stationary as an upper portion of the input control device 4000 (e.g., the collective unit 4011 described herein) is displaced between the first configuration and the second configuration along a longitudinal axis S that extends parallel to the longitudinal X-axis. In some cases, the mechanical joint 4042 may allow movement of the forearm support 4008 in multiple directions relative to the base 4004. For example, forearm support 4008 may be movable relative to base 4004 along one, two, or three different axes.
The forearm support 4008 may be movable within a range of motion defined by a travel zone 4050 (fig. 47) about a forearm starting position. For example, the travel zone 4050 may define a one-dimensional path from the forearm starting position, wherein the one-dimensional path extends between 2.0cm and 6.0cm from the forearm starting position along the longitudinal axis. Referring again to fig. 47, in the first configuration (indicated in solid lines as input control device 4000), the input control device 4000 has moved proximally along the longitudinal shaft axis S to the proximal end or limit of the travel zone 4050, and in the second configuration (indicated in dashed lines as input control device 4000'), the input control device 4000 has moved distally along the longitudinal shaft axis S to the distal end or limit of the travel zone 4050. In various instances, the travel zone 4050 can define a two-dimensional space that extends between 2.0cm and 6.0cm from the forearm starting position in two dimensions. In other cases, the travel zone 4050 can define a three-dimensional space that extends between 2.0cm and 6.0cm in three dimensions from the forearm starting position. The type and/or arrangement of joints at mechanical joint 4042 may determine the degree of freedom of forearm support 4008 relative to base 4004. The mechanical joint 4042 (which is supported and/or built on the central portion 4002 of the space joint 4006) may comprise, for example, a resiliently coupled component, a slider, a journaled shaft, a hinge, and/or a rotational bearing.
The degrees of freedom and dimensions of the travel zone 4050 may be selected to provide the surgeon with first-person perspective control of the end effector (i.e., the surgeon's perspective is "positioned" at the jaws of the remotely-located end effector at the surgical site). In various circumstances, movement of the handpiece 4020 on the input control device 4000 may correspond to one-to-one movement of the surgical end effector. For example, moving the handpiece 4020 distally a distance of 1.0 cm along the shaft axis S may correspond to a distal displacement of the end effector a distance of 1.0 cm along the longitudinal shaft axis of the surgical tool. Similarly, rotating the handpiece 4020 five degrees counterclockwise at the wrist or joint 4010 may correspond to the end effector being rotationally displaced five degrees in the counterclockwise direction. In various instances, input control motions to the input control device 4000 may be scaled, as further described herein and in various commonly-owned applications that have been incorporated by reference herein.
The input control device 4000 also includes a shaft 4012 extending distally from the forearm support 4008 and a handpiece 4020 extending distally from the shaft 4012. Forearm support 4008, shaft 4012, and hand piece 4020 form a collective unit 4011 that is movable together as forearm support 4008 moves relative to base 4004 within a travel zone 4050 defined by mechanical joint 4042. The displacement sensor is configured to be able to detect the movement of the collective unit 4011. The handpiece 4020 defines an end effector actuator having at least one jaw, as further described herein. The shaft 4012 includes a linear portion extending along a shaft axis S that is parallel to the X-axis in the configuration shown in fig. 6. The shaft 4012 also includes a contoured "gooseneck" portion 4018 that curves away from the shaft axis S to position the handpiece 4020 in a comfortable position and orientation for the surgeon relative to the forearm support 4008. For example, the contoured portion 4018 defines a curvature of about 90 degrees. In other cases, the curvature may be less than 90 degrees or greater than 90 degrees and may be selected based on, for example, surgeon preference and/or anthropometry.
The shaft 4012 supports a wrist 4010 between the linear portion and the contoured portion 4018. For example, the wrist 4010 can be positioned at a distal end of the linear portion such that the contoured portion 4018 is configured to rotate relative to the linear portion upon application of a manually controlled motion thereto. The wrist 4010 is longitudinally offset from the space joint 4006. The wrist 4010 defines a mechanical joint to facilitate rotational motion. The wrist 4010 may include, for example, resiliently coupled components, sliders, journaled shafts, hinges, and/or rotational bearings. Wrist 4010 may also include a rotation sensor (e.g., sensor 1049 in fig. 25) which may be, for example, a rotational force/torque sensor and/or transducer, a rotational strain gauge, a strain gauge on a spring, a rotational encoder, and/or an optical sensor to detect rotational displacement at the joint.
The wrist 4010 may define an input control motion for at least one degree of freedom. For example, the wrist 4010 may define an input control motion for the rolling motion of the robotic end effector controlled by the input control device 4000. Rotation of the wrist 4010 by the surgeon to roll the end effector can place control of the rolling motion at the surgeon's fingertips and corresponds to first-person perspective control of the end effector (i.e., the surgeon's perspective is "positioned" at the jaws of the remotely-located end effector at the surgical site). As further described herein, such an arrangement and perspective can be utilized to provide precise control motions to the input control device 4000 during portions of a surgical procedure (e.g., during a precision motion mode).
In some cases, the input control device 4000 may include an additional wrist joint. For example, the shaft 4012 can include one or more additional rotational joints along its length, such as at an interface or joint 4014 (fig. 44) along the linear portion of the shaft 4012 and/or at an interface or joint 4016 at the distal end of the contoured portion 4018 of the shaft 4012. For example, a mechanical joint at the joint 4016 may allow articulation of the handpiece 4020 relative to the shaft 4012 about at least one axis. In various instances, the handpiece 4020 can articulate about at least two axes (e.g., a Z1 axis parallel to the Z-axis in fig. 45 and a Y1 axis parallel to the Y-axis in fig. 45). The additional joints may provide additional degrees of freedom to the input control device 4000, which may be detected by the sensor arrangement and converted into rotational input control motions of the end effector, such as yaw or pitch articulation motions of the end effector. Such an arrangement requires, for example, one or more additional sensor arrangements to detect the rotational input at the joint 4016.
As further described herein, the spatial joint 4006 may define input control motions for multiple degrees of freedom. For example, the spatial joint 4006 may define input control motions for translation of the surgical tool in three-dimensional space and rotation of the surgical tool about at least one axis. The scrolling motion may be controlled by input to the space joint 4006 and/or the wrist 4010. Whether a roll control motion is provided by wrist 4010 or space joint 4006 of input control device 4000 may depend on the surgeon's motion and/or the operating mode of input control device 4000, as further described herein. Articulation motions may be controlled by inputs to the space joint 4006 and/or the joint 4016. Whether articulation control motions are provided by the joints 4016 or the space joint 4006 of the input control device 4000 may depend on the actions of the surgeon and/or the mode of operation of the input control device 4000, as further described herein.
The handpiece 4020 includes an end effector actuator having opposing fingers 4022 extending distally from the shaft 4012. The opposing fingers 4022 may be similar in many respects to the fingers 1022 (figs. 6-11). Application of an actuation force to the opposing fingers 4022 constitutes an input control motion to the surgical tool. For example, referring again to fig. 12, application of a pinching force to the opposing fingers 4022 may close and/or clamp the jaws 1054 of the end effector 1052 (see arrow C in fig. 12). In various circumstances, application of a spreading force can open and/or release the jaws 1054 of the end effector 1052, such as for a spreading dissection task. The fingers 4022 also include rings 4030, which are similar in many respects to the rings 1030 (figs. 6-11). The opposing fingers 4022 may be displaced symmetrically or asymmetrically relative to the longitudinal shaft axis S during actuation. The displacement of the opposing fingers 4022 may depend on, for example, the force applied by the surgeon and the desired surgical function. The input control device 4000 includes at least one additional actuator, such as actuation buttons 4026, 4028, which may provide additional control at the surgeon's fingertip (e.g., the surgeon's index finger I) and which are similar in many respects to the actuation buttons 1026, 1028 (figs. 6-11). The reader will appreciate that the actuation buttons 4026, 4028 may have different geometries and/or structures and may include triggers, buttons, switches, levers, toggles, and combinations thereof.
Referring primarily to fig. 48 and 49, during use, a surgeon may position a portion of his or her arm on the forearm support 4008 and may provide a force to the space joint 4006 via an input at the forearm support 4008. The surgeon's forearm may be positioned on a lower portion of forearm support 4008 and a cuff or sleeve of forearm support 4008 may in some cases at least partially surround the surgeon's arm. For example, forearm support member 4008 forms a partial loop having a curvature greater than 180 degrees. In some cases, the curvature may define an arc of a circle, for example, about 270 degrees. In other cases, the cuff or sleeve may form a closed loop through which the surgeon can position his or her arm. Alternative geometries of the forearm support may be envisaged. The surgeon's thumb T is positioned through one of the rings 4030 and the surgeon's middle finger M is positioned through the other ring 4030. In such instances, the surgeon may pinch and/or expand his thumb T and middle finger M to actuate the opposing fingers 4022. Distally extending fingers 4022 (for actuating the jaws) and actuation buttons 4026, 4028 (for actuating surgical functions at the jaws) are distal to the space joint 4006 and wrist 4010. This configuration mirrors that of the surgical tool in which the end effector is distal of the more proximal articulation joint and/or rotatable shaft and thus provides an intuitive arrangement that facilitates surgeon training and adoption of input control device 4000.
In various instances, the input control of the input control device 4000 is segmented between a first control motion and a second control motion, similar in many respects to the operational modes described with respect to the input control device 1000 (figs. 6-11). The control logic of the input control device 4000 may be implemented, for example, by the control circuit 832 (fig. 25), the control circuit 1400 (fig. 11C), the combinational logic 1410 (fig. 11D), and/or the sequential logic 1420 (fig. 11E), where the inputs are provided by inputs to the input control device 4000 and/or by a surgical visualization system or the distance determination subsystem thereof, as further described herein. The inputs from the input control device 4000 include feedback from its various sensors, relating to control inputs at, for example, the space joint 4006, the wrist 4010, and/or the handpiece 4020.
Referring now to fig. 50, the control logic 4068 of the input control device 4000 may activate or maintain the coarse motion mode at block 4082 when the distance (dt) determined by the distance determination subsystem is greater than or equal to a threshold distance (Dcritical), and may deactivate the coarse motion mode at block 4076 when the distance (dt) is less than the threshold distance (Dcritical). More specifically, when a force is initially applied to the forearm support 4008 at block 4070 to move the forearm support 4008 to the end of its limited travel zone (e.g., the boundary of the travel zone 4050 in fig. 47), the robotic surgical tool is controlled at block 4072 to move relative to the tissue at the surgical site. In various instances, the force required to input a control motion via the sensor arrangement 4048 (fig. 46) may be greater than the force required to move the forearm support 4008 to the end of its travel zone. In other words, the surgeon may move the forearm support 4008 to the end of its travel zone before effecting a control motion with the sensor arrangement 4048.
As the robotic surgical tool moves relative to the tissue, the control logic checks the proximity data provided by the tissue proximity detection system to determine, at block 4074, whether the distance (dt) is greater than or equal to the threshold distance (Dcritical). The control logic 4068 may compare the distance (dt) to the threshold distance (Dcritical) periodically and/or continuously during the surgical procedure (e.g., intraoperatively and/or in real time). In some cases, the threshold distance (Dcritical) may be set by the surgeon. In addition, the surgeon can selectively override the default rules and conditions of the control logic 4068, such as the rules related to the comparison at block 4074 and/or the adjustment of the threshold distance (Dcritical).
If the distance (dt) is greater than or equal to the threshold distance (Dcritical), the coarse motion mode may be activated at block 4082. As force continues to be applied to the forearm support 4008 to move the forearm support 4008 to the end of its limited travel zone (block 4070) and to move the tool relative to the tissue (block 4072), the control circuit may continue to monitor the distance (dt) (block 4074) and maintain the coarse motion mode (block 4082) while the distance (dt) remains greater than or equal to the threshold distance (Dcritical).
If the distance (dt) becomes less than the threshold distance (Dcritical), the coarse motion mode may be deactivated at block 4076. With the coarse motion mode deactivated, control motions of the robotic tool can be provided at block 4078 by moving the forearm support 4008 within its travel zone (e.g., the travel zone 4050 in fig. 47) and/or at block 4080 by actuating the wrist (e.g., the wrist 4010 and/or the joint 4016) and/or the handpiece 4020. The control circuit can continue to monitor the distance (dt) (block 4074) and, as long as the distance (dt) remains less than the threshold distance (Dcritical), the coarse motion mode remains deactivated (block 4076).
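The mode selection of blocks 4074, 4076, and 4082 can be summarized in a short Python sketch; the threshold value and the simulated proximity readings are assumptions for illustration, and the override flag stands in for the surgeon's ability to override the default rules.

```python
# Sketch of the fig. 50 mode selection: coarse motion while dt >= Dcritical,
# fine motion once dt < Dcritical. Threshold and readings are made-up values.

D_CRITICAL_MM = 20.0  # threshold distance; could be surgeon-settable

def select_mode(dt_mm, override=False):
    """Blocks 4074/4076/4082: choose coarse or fine motion from proximity."""
    if override:                 # surgeon may override the default rules
        return "coarse"
    return "coarse" if dt_mm >= D_CRITICAL_MM else "fine"

# Simulated tissue-proximity readings as the tool approaches the tissue.
for dt in (85.0, 40.0, 21.0, 19.0, 6.0):
    print(f"dt = {dt:5.1f} mm -> {select_mode(dt)} motion mode")
```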
During the coarse motion mode, the surgical tool and its end effector may be driven in the detected direction by the force applied by the forearm support 4008 at the space joint 4006 until the force is removed and the central portion 4002 is biased back to the starting position. The driving force provided to the end effector also terminates when the force on the space joint 4006 is removed during the coarse motion mode.
Referring again to fig. 44-49, the input control device 4000 has been described as having a mechanical joint 4042 between the space joint 4006 and the forearm support 4008, which allows the forearm support 4008 (and the entire collective unit 4011) to move relative to the base 4004 within the zone of travel 4050. The travel zone 4050 may provide a precise control zone for the surgeon to provide precise control motion to the end effector by moving the handpiece 4020. In other cases, similar to the input control device 1000, for example, the input control device 4000 may not include an additional mechanical joint 4042 between the space joint 4006 and the forearm support 4008. In such cases, precise control motions may be applied to the space joint 4006; however, such control motions may be scaled according to data from the tissue proximity detection system, as further described herein. For example, the scaling algorithm may also be applied to the input control device 4000.
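One possible (assumed) form of such a scaling algorithm is sketched below; the present disclosure states only that control motions may be scaled according to tissue proximity data, so the linear ramp and its bounds here are illustrative choices rather than the disclosed method.

```python
# Hedged sketch of proximity-based input scaling: commanded displacements
# shrink as the tool nears tissue. All constants are assumptions.

MIN_SCALE = 0.1       # fine control very near tissue
MAX_SCALE = 1.0       # full-scale motion far from tissue
RAMP_START_MM = 50.0  # begin attenuating inside this distance

def scaled_motion(input_mm, tissue_distance_mm):
    """Scale a commanded displacement by the tool-to-tissue distance."""
    fraction = min(max(tissue_distance_mm / RAMP_START_MM, 0.0), 1.0)
    scale = MIN_SCALE + (MAX_SCALE - MIN_SCALE) * fraction
    return input_mm * scale

print(scaled_motion(10.0, 100.0))  # 10.0 -> full scale far from tissue
print(scaled_motion(10.0, 25.0))   # 5.5  -> attenuated at mid range
print(scaled_motion(10.0, 0.0))    # 1.0  -> minimum scale at contact
```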
Referring now to fig. 51 and 52, an input control device 4100 is shown. The input control device 4100 is similar in many respects to the input control device 4000 (fig. 44-49). For example, the input control device 4100 includes a base 4104, a central portion 4102 supported relative to the base 4104, and a spatial joint 4106 therebetween. A power cord 4132 extends from the base 4104. Further, the collective unit 4111 is supported by the central portion 4102 and includes a forearm support 4108, a distally extending shaft 4112, and a distally terminating hand piece 4120 including a pair of opposing jaws 4122 and actuation buttons 4126, 4128. The collective unit 4111 may also include at least one wrist joint along the axis 4112, as described herein with respect to the input control device 4000. However, the input control device 4100 does not include a mechanical joint that allows the collective unit 4111 to move within the travel zone with respect to the base 4104. Further, unlike forearm support 4008, forearm support 4108 forms a partial loop having a curvature of less than 180 degrees. The reader will readily appreciate that a wide variety of input control devices incorporating the various features of input control devices 4000 and 4100 can be designed based on formative testing and user preferences. The robotic system may allow a user to choose from a number of different forms to select the style that best suits his/her needs.
Referring now to fig. 53-56, an input control device 4200 is shown. The input control device 4200 is a clutchless input control device, as further described herein. The input control device 4200 may be utilized at a surgeon's console or workspace of the robotic surgical system. For example, the input control device 4200 may be incorporated into a surgical system, such as the surgical system 110 (fig. 1) or the surgical system 150 (fig. 3), to provide control signals to a surgical robot and/or a surgical tool coupled thereto. The input control device 4200 includes manual input controls for moving the robotic arm and/or surgical tool in three-dimensional space. For example, a surgical tool controlled by the input control device 4200 may be configured to move in three-dimensional space and rotate or articulate about multiple axes (e.g., roll about a longitudinal tool axis and articulate about one or more articulation axes).
The input control device 4200 includes a multi-dimensional space joint 4206 having a central portion 4202 supported on a base 4204 that is similar in many respects to the multi-dimensional space joint 1006, central portion 1002, and base 1004 of the input control device 1000 (fig. 6-11). The base 4204 is structured to rest on a surface, such as a table or work surface at a surgeon's console or workspace, and may remain in a fixed, stationary position relative to the underlying surface when input control motions are applied to the input control device 4200. Base 4204 includes a wide curved foot portion 4203 that may provide stability to the balanced and gravity compensated geometry of the input control device. The base 4204 also includes an upright portion 4205 for suspending or hanging the central portion 4202 above the foot portion 4203. The space joint 4206 is configured to receive multi-dimensional inputs corresponding to control motions of the surgical tool in a multi-dimensional space. The input control device 4200 also includes a multi-axis force and/or torque sensor arrangement 4248 (fig. 54) that is similar in many respects to the sensor arrangement 1048 (fig. 8 and 9). For example, the sensor arrangement 4248 is configured to be able to detect forces and moments at the space joint 4206, such as forces applied to the central portion 4202. The multi-dimensional spatial joint and its sensor arrangement will be further described herein.
The central portion 4202 may be flexibly supported relative to the base 4204. In such cases, the central portion 4202 may be configured to move or float within a predefined zone upon receiving a force control input thereto. For example, the central portion 4202 may be a floating shaft supported on the base 4204 by one or more resilient members, such as springs. The central portion 4202 may be configured to move or float within a predefined three-dimensional volume. For example, the resilient coupling may allow the central portion 4202 to move relative to the base 4204; however, limiting plates, pins, and/or other structures may be configured to limit the range of motion of the central portion 4202 relative to the base 4204. In one aspect, movement of the central portion 4202 relative to the base 4204 from a center or "home" position may be allowed in a range of about 1.0mm to about 5.0mm in any direction (up, down, left, right, backward, and forward). In other cases, movement of the central portion 4202 relative to the base 4204 may be limited to less than 1.0mm or greater than 5.0mm. In some cases, the central portion 4202 may move about 2.0mm in all directions relative to the base 4204, and in other cases, the central portion 4202 may remain stationary or fixed relative to the base 4204.
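By way of illustration only, the travel limit described above can be expressed in software. The following minimal Python sketch clamps a sensed displacement of a floating central portion to a per-axis limit; the function name and the 2.0mm value are assumptions drawn from the ranges recited above, not from the source.

```python
# Hypothetical sketch: limit the reported displacement of a floating
# central portion to a predefined travel zone around the home position.
# The 2.0 mm limit is an assumption based on the range described above.

TRAVEL_LIMIT_MM = 2.0  # allowed excursion from home in any direction

def clamp_displacement(dx: float, dy: float, dz: float,
                       limit: float = TRAVEL_LIMIT_MM) -> tuple[float, float, float]:
    """Clamp each axis of the central portion's displacement (in mm)."""
    clamp = lambda v: max(-limit, min(limit, v))
    return clamp(dx), clamp(dy), clamp(dz)

# Example: a 3.4 mm push along X is limited to the 2.0 mm travel zone.
print(clamp_displacement(3.4, -0.5, 1.1))  # (2.0, -0.5, 1.1)
```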
In various instances, the central portion 4202 of the space joint 4206 may be spring biased toward a central or starting position in which the central portion 4202 is aligned with the X-axis (i.e., a horizontal axis passing through the central portion 4202 and the space joint 4206). Driving (e.g., pushing and/or pulling) the central portion 4202 in any direction away from the X-axis can "drive" an end effector of an associated surgical tool in a corresponding direction. When the external driving force is removed, the central portion 4202 can return to the center or starting position, and movement of the end effector can stop.
In each case, the spatial joint 4206 and the central portion 4202 coupled thereto define a six degree-of-freedom input control. Referring again to the end effector 1052 of the surgical tool 1050 in fig. 12, a force in the X-direction on the central portion 4202 of the input control device 4200 corresponds to displacement of the end effector 1052 along its Xt axis (e.g., longitudinally), a force in the Y-direction on the central portion 4202 corresponds to displacement of the end effector 1052 along its Yt axis (e.g., laterally), and a force in the Z-direction on the central portion 4202 corresponds to displacement of the end effector 1052 along its Zt axis (e.g., vertically/up and down). In addition, a force about the X-axis (moment R) on the central portion 4202 causes rotation of the end effector 1052 about its Xt axis (e.g., a rolling motion in the direction Rt about the longitudinal axis), a force about the Y-axis (moment P) on the central portion 4202 causes articulation of the end effector 1052 about its Yt axis (e.g., a pitch motion in the direction Pt), and a force about the Z-axis (moment T) on the central portion 4202 causes articulation of the end effector 1052 about its Zt axis (e.g., a yaw or twisting motion in the direction Tt). In such cases, the input control device 4200 includes, for example, a six degree-of-freedom joystick configured to receive and detect forces along the X, Y, and Z axes and moments about the X, Y, and Z axes. The forces may correspond to translational inputs to the end effector 1052 of the associated surgical tool 1050, and the moments may correspond to rotational inputs. A six degree-of-freedom input device is further described herein.
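As a non-authoritative illustration of the six degree-of-freedom mapping recited above, the following Python sketch converts a hypothetical space joint reading (three forces and three moments) into translation, roll, pitch, and yaw commands for an end effector. All names and gain values are assumptions for illustration only.

```python
# Hypothetical sketch of the six degree-of-freedom mapping described above:
# forces at the space joint command end effector translation, and moments
# command roll (about Xt), pitch (about Yt), and yaw (about Zt).

from dataclasses import dataclass

@dataclass
class SpaceJointReading:
    fx: float  # force along X (N)
    fy: float  # force along Y (N)
    fz: float  # force along Z (N)
    mx: float  # moment about X (N*m)
    my: float  # moment about Y (N*m)
    mz: float  # moment about Z (N*m)

def to_tool_command(r: SpaceJointReading,
                    k_trans: float = 0.01, k_rot: float = 0.5) -> dict:
    """Map a sensor reading to an incremental end effector command."""
    return {
        "translate": (k_trans * r.fx, k_trans * r.fy, k_trans * r.fz),
        "roll":  k_rot * r.mx,   # Rt: roll about the longitudinal tool axis
        "pitch": k_rot * r.my,   # Pt: articulation about the Yt axis
        "yaw":   k_rot * r.mz,   # Tt: articulation about the Zt axis
    }
```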
Referring again to the input control device 4200 in fig. 53-56, the hand piece 4220, including the shaft 4212, is movably coupled to the base 4204 by a mechanical linkage assembly 4208. The mechanical linkage assembly 4208 includes three arms 4208a, 4208b, 4208c, and each arm 4208a, 4208b, 4208c includes a series of links and joints therebetween. The arms 4208a, 4208b, 4208c are rotationally or radially symmetrical about the X-axis. Each arm 4208a, 4208b, and 4208c includes two links and three joints. The reader will appreciate that alternative configurations and geometries of the mechanical linkage assembly 4208 are contemplated. For example, the mechanical linkage assembly 4208 may include fewer than three or more than three arms, and each arm may include a different number of links and joints. In addition, the types of joints may be selected to further limit and/or allow particular types and/or degrees of rotational movement.
Referring now primarily to fig. 54, the mechanical linkage assembly 4208 is shown in a first configuration (solid lines) and in a second configuration (dashed lines). The base 4204 of the input control device 4200 remains stationary as the upper portion (the collective unit 4211) of the input control device 4200 is displaced between the first configuration and the second configuration along a longitudinal axis S extending parallel to the longitudinal X-axis in fig. 54. In some instances, the mechanical linkage assembly 4208 may allow movement of the shaft 4212 and the hand piece 4220 in multiple directions relative to the base 4204. For example, the mechanical linkage assembly 4208 may move relative to the base 4204 along one, two, or three different axes.
For example, the mechanical linkage assembly 4208 may be movable within a range of motion defined by a travel zone around the linkage starting position shown in fig. 53. In some cases, the travel zone may define a one-dimensional path from the linkage starting position, wherein the one-dimensional path extends between 2.0cm and 6.0cm from the linkage starting position along the longitudinal axis X. Referring again to fig. 54, in the first configuration (solid lines), the input control device 4200 has moved proximally along the longitudinal axis to the proximal end or limit of the travel zone, and in the second configuration (dashed lines), the input control device 4200 has moved distally along the longitudinal axis to the distal end or limit of the travel zone. In other cases, the travel zone may define a two-dimensional space that extends between 2.0cm and 6.0cm in two dimensions from the linkage starting position. In still other cases, the travel zone may define a three-dimensional space that extends between 2.0cm and 6.0cm in three dimensions from the linkage starting position. The type and/or arrangement of joints and links in the mechanical linkage assembly 4208 may determine the degrees of freedom of the hand piece 4220 and the shaft 4212 relative to the base 4204. The mechanical linkage assembly 4208 (which is supported and/or built on the central portion 4202 of the space joint 4206) may include, for example, elastically coupled components, sliders, journaled shafts, hinges, and/or rotational bearings.
The degrees of freedom and the dimensions of the travel zone may be selected to provide the surgeon with first-person perspective control of the end effector (i.e., control in which the surgeon's perspective is "positioned" at the jaws of the remotely located end effector at the surgical site). In various circumstances, movement of the shaft 4212 can correspond to one-to-one movement of the surgical end effector. For example, moving the shaft 4212 distally along the shaft axis S a distance of 1.0cm may correspond to a distal displacement of the end effector along the longitudinal shaft axis of the surgical tool by a distance of 1.0cm. Similarly, rotating the hand piece 4220 five degrees counterclockwise at the wrist or joint 4210 in the shaft 4212 may correspond to the end effector being rotationally displaced five degrees in the counterclockwise direction. In various instances, input control motions to the input control device 4200 may be scaled, as further described herein and in various commonly owned applications that have been incorporated by reference herein.
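The scaling concept described above can be illustrated with a minimal sketch: a scale factor of 1.0 reproduces the one-to-one correspondence (1.0cm of shaft travel yields 1.0cm of end effector travel), while other factors scale the control motion. The function name and values below are hypothetical.

```python
# Hypothetical sketch of motion scaling: a scale factor of 1.0 gives the
# one-to-one correspondence described above; other factors scale the
# input control motion up or down.

def scale_input_motion(delta_mm: float, scale: float = 1.0) -> float:
    """Return the commanded end effector displacement for a shaft input."""
    return scale * delta_mm

assert scale_input_motion(10.0) == 10.0             # one-to-one (1.0 cm -> 1.0 cm)
assert scale_input_motion(10.0, scale=0.25) == 2.5  # scaled-down fine control
```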
The shaft 4212 extends proximally from the mechanical linkage assembly 4208, and the hand piece 4220 extends proximally from the shaft 4212. The mechanical linkage assembly 4208, the shaft 4212, and the hand piece 4220 form a collective unit 4211 that moves together as the mechanical linkage assembly 4208 moves within the travel zone relative to the base 4204. A displacement sensor is configured to detect the movement of the collective unit 4211. The hand piece 4220 defines an end effector actuator having at least one jaw, as further described herein.
The shaft 4212 supports a wrist 4210 along its length. The wrist 4210 is longitudinally offset from the space joint 4206 and defines a mechanical joint to facilitate rotational movement. The wrist 4210 may include, for example, elastically coupled components, sliders, journaled shafts, hinges, and/or rotational bearings. The wrist 4210 may also include a rotation sensor (e.g., the sensor 1049 in fig. 25), which may be, for example, a rotational force/torque sensor and/or transducer, a rotational strain gauge, a strain gauge on a spring, and/or an optical sensor to detect rotational displacement at the joint.
The wrist 4210 may define an input control motion for at least one degree of freedom. For example, the wrist 4210 may define an input control motion for a rolling motion of a robotic end effector controlled by the input control device 4200. Rotation of the wrist 4210 by the surgeon to roll the end effector can provide control of the rolling motion at the surgeon's fingertips and correspond to first-person perspective control of the end effector (i.e., control in which the surgeon's perspective is "positioned" at the jaws of the remotely located end effector at the surgical site). As further described herein, such arrangements and perspectives can be utilized to provide precise control motions to the input control device 4200 during portions of a surgical procedure (e.g., in a precise motion mode).
In some cases, the input control device 4200 may include an additional wrist joint, e.g., as described with respect to the input control device 4000 (fig. 44-49), and a sensor arrangement to detect rotational input motion thereto. In other cases, the mechanical linkage assembly 4208 may provide sufficient degrees of freedom to precisely control the robotic surgical tool at the surgical site. A sensor arrangement on the mechanical linkage assembly 4208 may be employed to detect rotational user input motion thereto.
As further described herein, spatial joint 4206 may define input control motions for multiple degrees of freedom. For example, the spatial joint 4206 may define input control motions for translation of a surgical tool in three-dimensional space and rotation of the surgical tool about at least one axis. The rolling motion may be controlled by input to the space joint 4206 and/or the wrist 4210. Whether the roll control motion is provided by the wrist 4210 or the space joint 4206 of the input control device 4200 may depend on the surgeon's motion and/or the operating mode of the input control device 4200, as further described herein. The articulation motion may be controlled by input to the space joint 4206 and/or the mechanical linkage assembly 4208. Whether articulation control motions are provided by the mechanical linkage assembly 4208 or the spatial joint 4206 of the input control device 4200 may depend on the surgeon's actions and/or the mode of operation of the input control device 4200, as further described herein.
The hand piece 4220 comprises an end effector actuator having opposing fingers 4222 extending distally from the shaft 4212. The opposing fingers 4222 may be similar in many respects to the fingers 1022 (fig. 6-11). Applying an actuation force to the opposing fingers 4222 constitutes an input control motion to the surgical tool. For example, referring again to fig. 12, application of a compressive force to the opposing fingers 4222 can close and/or clamp the jaws 1054 of the end effector 1052 (see arrow C in fig. 12). In various circumstances, application of an expansion force can open and/or release the jaws 1054 of the end effector 1052, such as for a tissue spreading or dissection task. The fingers 4222 also include rings 4230, which are similar in many respects to the rings 1030 (fig. 6-11). The opposing fingers 4222 may be displaced symmetrically or asymmetrically relative to the longitudinal shaft axis S during actuation. The displacement of the opposing fingers 4222 may depend on, for example, the force applied by the surgeon and the desired surgical function. The input control device 4200 includes at least one additional actuator, such as the actuation buttons 4226, 4228, which may provide additional control at the surgeon's fingertips and are similar in many respects to the actuation buttons 1026, 1028 (fig. 6-11).
Referring primarily to fig. 55 and 56, during use, a surgeon may position his or her hand relative to the hand piece 4220 to reach the fingers 4222 and the actuation buttons 4226, 4228. The surgeon's thumb T is positioned through one of the rings 4230, and the surgeon's middle finger M is positioned through the other ring 4230. In such instances, the surgeon may pinch together and/or spread apart his or her thumb T and middle finger M to actuate the fingers 4222.
In various instances, the input control of the input control device 4200 is segmented between a first control motion and a second control motion, which is similar in many respects to the operational modes described with respect to the input control device 1000 (fig. 6-11). The control logic of the input control device 4200 may be implemented, for example, by the control circuit 832 (fig. 25), the control circuit 1400 (fig. 11C), the combinational logic 1410 (fig. 11D), and/or the sequential logic 1420 (fig. 11E), where the inputs are provided by inputs to the input control device 4200 and/or a surgical visualization system or distance determination subsystem thereof, as further described herein. Inputs from the input control device 4200 include feedback from its various sensors related to control inputs at, for example, the space joint 4206, the mechanical linkage assembly 4208, the wrist 4210, and/or the hand piece 4220.
In some cases, the input control device 4200 may operate according to the control logic 4068 of fig. 50. In such cases, input control motions in the coarse motion mode (activated at block 4082) may be applied when the mechanical linkage assembly 4208 is positioned in the mode B zone shown in fig. 54. In this position, user input forces may be provided to the space joint 4206 and detected by the multi-axis force and torque sensor arrangement 4248 (fig. 54). When the coarse motion mode is deactivated, fine control input motions may be applied via inputs to the hand piece 4220 while the mechanical linkage assembly 4208 is in the mode A zone shown in fig. 54. For example, the surgeon may utilize one-to-one movement of the hand piece 4220 to control the operation of a surgical end effector positioned in close proximity to tissue.
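One hedged way to picture this segmentation is a dispatch routine that selects the active sensor stream from the position of the linkage: coarse inputs from the force/torque sensor arrangement in the mode B zone, fine one-to-one inputs from the hand piece in the mode A zone. The zone test and threshold below are illustrative assumptions, not the disclosed control logic 4068.

```python
# Hypothetical sketch of zone-based input segmentation: the linkage's
# displacement from its starting position determines which sensor stream
# drives the robotic surgical tool for the current control cycle.

def route_input(linkage_displacement_cm: float,
                mode_b_threshold_cm: float = 2.0) -> str:
    """Select the active input stream based on linkage position."""
    if abs(linkage_displacement_cm) >= mode_b_threshold_cm:
        return "coarse"  # mode B zone: space joint force/torque inputs
    return "fine"        # mode A zone: one-to-one handpiece displacement
```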
When employing certain robotic surgical systems and/or their input controls, the surgeon may need to remain at the surgeon's console or workspace as long as he or she is still involved in the surgery and/or actively controlling the robotic surgical tools. Because the surgeon's console may be fixed at a particular location (e.g., within the operating room but outside the sterile field), the surgeon may not be able to move around in the operating room and/or access the sterile field. When the mobility of the surgeon is limited, the surgeon may not be able to obtain first-hand information regarding the patient and/or the patient's condition and/or other conditions within the operating room (e.g., the status of supplies, the condition of an assistant, etc.).
To increase surgeon mobility, the input control device may be ungrounded or untethered to the surgeon's console. Such an input control device may be a wireless handpiece capable of movement in three dimensions. In such cases, the input control device may not utilize a base or docking station to detect and/or deliver input control motions provided to the handpiece. The motion capture system may detect movement of the handpiece relative to multiple degrees of freedom. For example, the motion capture system may detect input control motions of the handpiece in three-dimensional space by the user, including rotational displacement of the handpiece. In such cases, the input control device may be a clutch-less input control device, as the handpiece does not need to be electronically disengaged from the controls of the robotic surgical tool using a clutch mechanism in order to maintain the handpiece within a working envelope defined by the position and configuration of the input control device and the surgeon at the surgeon's console.
The wireless handpiece of the input control device may also include a rolling input member (e.g., a rotatable shaft and a rotation sensor) and one or more actuation buttons or inputs for receiving control inputs from the surgeon. The rolling input member can provide a roll control input motion that corresponds to a rolling motion of an end effector of a robotic surgical tool controlled by the input control device. Additional actuation buttons may actuate surgical functions of the end effector, such as firing motions, energization functions, cooling functions, irrigation functions, and/or adjustments to clamping pressure and/or tissue gap. Additional surgical functions of the robotic surgical tool are described further herein. In some cases, the handpiece may also include additional input control actuators, such as a pivotable arm or lever, to which the surgeon may provide control motions to move the surgical tool in three-dimensional space. For example, the pivotable arm may define a joystick operably coupled to a multi-axis force sensor, as further described herein.
The control of such an input control device may be segmented in different operating modes. For example, a first input control motion may be used in a first mode, such as a coarse motion mode, and a second input control motion may be used in a second mode, such as a fine motion mode. The control circuit may selectively lock or ignore unused input control motions when they are not being used to control the robotic surgical tool.
In one aspect, the control unit may switch the input control device between the coarse motion mode and the fine motion mode based on intraoperative tissue proximity data from a proximity detection system, as further described herein. In the first mode, control input motions may be provided by, for example, a joystick and a multi-axis force sensor coupled thereto. Movement of the joystick may provide coarse motion control inputs to the robotic surgical tool in the coarse motion mode. For example, the joystick may be movable relative to the X, Y, and/or Z axes, which corresponds to control motions for moving the robotic surgical tool along its X, Y, and/or Z axes. The force applied to the joystick may correspond to a displacement of the robotic surgical tool, which enables the clutchless function of the input control device.
In some cases, the sensitivity of inputs to the joystick may be reduced as the surgical tool approaches tissue, and in some cases the joystick may be deactivated or locked when a tissue proximity less than or equal to a threshold distance is detected. For example, the handpiece may include a mechanical lock and/or an electronic lock for the joystick. The lock may be movable from an unlocked position to a locked position in response to switching from the coarse motion mode to the fine motion mode.
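A minimal sketch of such a lock, assuming a simple two-mode state machine, is shown below; the class and mode names are illustrative only, not from the source.

```python
# Hypothetical sketch of the joystick lock described above: switching from
# the coarse motion mode to the fine motion mode moves an (electronic) lock
# to the locked position so joystick inputs no longer drive the tool.

class JoystickLock:
    def __init__(self) -> None:
        self.locked = False

    def on_mode_change(self, mode: str) -> None:
        # Lock on entry to fine motion; unlock on return to coarse motion.
        self.locked = (mode == "fine")

    def filter(self, joystick_force: float) -> float:
        """Pass the joystick input through only when unlocked."""
        return 0.0 if self.locked else joystick_force
```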
In the precision motion mode, the motion capture system may detect motion of the handpiece and provide precision motion control inputs to the robotic surgical tool based on the detected motion. For example, the handpiece may be movable relative to the X, Y, and/or Z axes and/or rotatable relative to one or more different articulation axes, which correspond to control motions for moving and/or articulating the robotic surgical tool accordingly. Additional features and variations of the untethered, ungrounded input control device are described further herein.
When a surgeon is not required to remain at the surgeon's console or at a predetermined location in order to participate in a surgical procedure (as provided by the untethered, ungrounded input control devices described further herein), the surgeon may, for example, move around the operating room and/or access the sterile field. The increased mobility afforded to the surgeon may allow the surgeon to reposition himself or herself in order to obtain first-hand information related to the patient and/or the state of the operating room, rather than relying on information provided at the surgeon's console (e.g., obtained by a camera and relayed to an imaging system) and/or second-hand information provided by assistants and/or others within the operating room. In addition, the untethered or ungrounded input control device allows the surgeon to select a comfortable ergonomic position from a variety of postures, including sitting, kneeling, and standing postures, without regard to a support surface for the input control device. Ergonomics for the surgeon are improved because the surgeon can select a comfortable posture and adjust his or her posture throughout the surgery. In addition, no clutch mechanism is needed to utilize such an input control device.
Referring now to fig. 57-61, an input control device 5000 is shown. The input control device 5000 may be incorporated into a surgical system, such as the surgical system 110 (fig. 1) or the surgical system 150 (fig. 3), to provide control signals to the surgical robot and/or a surgical tool coupled thereto. The input control device 5000 includes manual input controls for moving the robotic arm and/or surgical tool in three-dimensional space. The input control device 5000 includes sufficient degrees of freedom to control a robotic surgical tool, such as the robotic surgical tool 1050 shown in fig. 12. The input control device 5000 includes a handpiece 5002 configured to be held by a surgeon. For example, the surgeon may hold the handpiece 5002 as shown in fig. 60 and 61. The handpiece 5002 is untethered or ungrounded. Further, the handpiece 5002 is a wireless handpiece. Referring primarily to fig. 59A, the handpiece 5002 may include a power supply 5054 and a control circuit 5048 having wireless communication capabilities with external components. For example, the handpiece 5002 may include one or more communication modules. Because such communication does not require high-power signals, a near-field communication protocol may be utilized in various circumstances.
The handpiece 5002 includes an elongated body 5004. The surgeon may grasp the elongated body 5004 between his or her palm and fingers to hold the hand piece 5002. The elongated body 5004 can be sized and configured for comfortable gripping by a surgeon, and can include, for example, a contoured edge and a generally cylindrical shape.
The handpiece 5002 also includes a joystick 5006 extending proximally from the elongated body 5004. The joystick 5006 can define a ring and/or ring-like structure 5008 that can allow engagement with a finger of the surgeon, such as the surgeon's thumb. The surgeon's thumb (T) is shown engaged with the ring 5008 in fig. 60 and 61. The joystick 5006 is operably coupled to a multi-axis force sensor 5010 (fig. 59A), which may be similar in many respects to the multi-axis force and torque sensor 1048 (fig. 8 and 9). The joystick 5006 can be flexibly supported at a space joint 5012, which can be similar in many respects to the space joint 1006 (fig. 6-11). In various circumstances, the joystick 5006 can be spring biased toward a home position. In other cases, the joystick 5006 can be substantially fixed relative to the handpiece 5002.
The surgeon can provide input control forces to the joystick 5006, which are detected by the multi-axis force sensor 5010. For example, the multi-axis force sensor 5010 can be configured to detect input forces in multiple dimensions, such as forces along the X, Y, and Z axes shown in fig. 57. In some cases, the multi-axis force sensor 5010 may also be configured to detect input moments in multiple dimensions, such as the moments R, P, and T about the X, Y, and Z axes in fig. 57, respectively. However, as further described herein, in at least one aspect of the present disclosure, rotation or articulation of the robotic surgical tool may be controlled by different inputs, and in such cases, the multi-axis force sensor 5010 may not detect and/or utilize torque applied to the joystick 5006. Moreover, in some cases, the sensors of the joystick 5006 may be configured to detect displacements of the joystick 5006 in lieu of and/or in addition to detecting forces applied to the joystick 5006.
The handpiece 5002 also includes a shaft 5016 extending distally from the elongate body 5004. The shaft 5016 is rotatably coupled to the elongate body 5004. For example, a rotational joint 5018 between the elongate body 5004 and the shaft 5016 can allow the shaft 5016 to rotate in the direction Rs shown in fig. 57. The rotational joint 5018 can include, for example, a journaled shaft and/or a rotary bearing. The elongate body 5004 also includes a rotation sensor 5020 (fig. 59A) configured to detect rotational displacement of the shaft 5016 relative to the elongate body 5004. The rotation sensor 5020 can be, for example, a rotational force/torque sensor and/or transducer, a rotational strain gauge, a strain gauge on a spring, a rotary encoder, and/or an optical sensor to detect rotational displacement at the rotational joint 5018.
The shaft 5016 also includes a radial sensor 5022 (fig. 59A) configured to detect a radial input force on the shaft 5016. For example, the radial sensor 5022 may detect a squeezing or pinching force applied to the shaft 5016 in the directions Ss1 and Ss2 depicted in fig. 57. As further described herein, the force detected by the radial sensor 5022 can correspond to input control motions for one or more jaws of a robotic surgical tool controlled by manipulation of the input control device 5000. For example, application of a pinching force in the directions Ss1 and Ss2 can be configured to apply a jaw clamping motion at the jaws of the robotic surgical tool, such as to close the opposing jaws 1054 (fig. 12) of the robotic surgical tool 1050.
Additionally or alternatively, the handpiece 5002 can include an actuation trigger pivotably coupled to the elongate body 5004. The actuation trigger may be movable between an unactuated position and one or more actuated positions. The handpiece 5002 can also include a trigger sensor configured to detect actuation of the trigger. In various instances, actuation of the trigger can correspond to a closing motion of one or more jaws of the robotic surgical tool 1050 (fig. 12), for example. The trigger sensor may be communicatively coupled to the control circuit 5048 of the input control device 5000, and an output control signal may be relayed to the robotic surgical system based on actuation of the trigger as detected by the trigger sensor. For surgeons who are accustomed to using a handheld surgical instrument having a trigger for closing the jaws of the end effector, a pivotable trigger on the input control device 5000 may provide an intuitive mechanism.
The handpiece 5002 also includes a body sensor 5030 (fig. 59A) embedded in the elongated body 5004 and configured to detect movement of the elongated body 5004 in three-dimensional space. The body sensor 5030 is a motion sensor. The body sensor 5030 can be, for example, an inertial sensor. The inertial sensor can include, for example, an accelerometer and/or a gyroscope to detect the effects of gravity on the moving elongated body 5004. In other instances, the body sensor 5030 can be an electromagnetic tracking receiver configured to detect electromagnetic fields emitted from a source near the handpiece 5002, such as a transmitter in an operating room or a surgical suite. The body sensor 5030 can be configured to detect input motion in multiple dimensions, such as displacement along the X, Y, and Z axes in fig. 57, and input rotation in multiple dimensions, such as rotation in the directions R, P, and T about the X, Y, and Z axes in fig. 57, respectively.
The handpiece 5002 also includes a button 5026 on the elongate body 5004. The button 5026 can be positioned and dimensioned such that a surgeon holding the elongate body 5004 can manipulate the button 5026 with one of his or her fingers, such as the middle finger (M), ring finger (R), and/or little finger (L) in the position shown in fig. 60. In various circumstances, the button 5026 can selectively unlock or unlatch certain input control motions of the input control device 5000. For example, upon activation of the button 5026, input motions detected by the body sensor 5030 can be used to control the robotic surgical tool. In some cases, the input control motions detected by the body sensor 5030 may be output to control the robotic surgical tool only when a set of requirements is met, including activation of the button 5026. In other cases, activation of the button 5026 can override the default operating mode conditions and corresponding input control functions of the input control device 5000. For example, activation of the button 5026 may enable the precision motion mode even though the requirements have not been met, such as before a proximity detection system detects that the surgical tool is positioned within a threshold proximity of tissue. The button 5026, which can be spring-loaded away from the elongate body 5004, can be activated by pressing it toward the elongate body 5004. A sensor 5032 (fig. 59A) in the elongate body 5004 can be configured to detect activation and deactivation of the button 5026.
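The default rule and its button override might be expressed as follows; this is a hypothetical sketch, and the rule itself (body sensor input allowed only in the precision motion mode unless the button is held) is inferred from the description above.

```python
# Hypothetical sketch of the override behavior described above: body-sensor
# motion normally drives the tool only in the precision motion mode, but
# holding the button can override the default rule before the proximity
# threshold is reached. Names and the rule encoding are assumptions.

def body_input_enabled(mode: str, button_pressed: bool) -> bool:
    """Return True when body sensor 5030 input should control the tool."""
    if button_pressed:          # explicit surgeon override via button 5026
        return True
    return mode == "precision"  # default rule: precision motion mode only
```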
While one actuation button 5026 is shown on the handpiece 5002, the reader will appreciate that additional actuation buttons can be positioned on the elongate body 5004. Each actuation button may correspond to a different function, such as a different control function for providing input control motions via the input control device 5000 and/or a different surgical function performed by the end effector. Further, the reader will appreciate that the actuation button 5026 and/or additional actuation buttons on the handpiece 5002 can have different geometries and/or configurations and can include triggers, buttons, switches, levers, toggles, and combinations thereof.
The control logic of the input control device 5000 may be implemented by the control circuit 5048, which is schematically depicted in fig. 59A. The reader will appreciate that the control circuit 832 (fig. 25), the control circuit 1400 (fig. 11C), the combinational logic circuit 1410 (fig. 11D), and/or the sequential logic circuit 1420 (fig. 11E), for example, may also be configured to implement the control logic of the input control device 5000 in certain circumstances. Referring again to the control circuit 5048, the processor 5050, memory 5052, and power supply 5054 are contained within the housing of the handpiece 5002. The power supply 5054 may be a battery or battery pack that may be replaceable and/or rechargeable in some circumstances. The power supply 5054 is configured to provide current to various energized components in the handpiece 5002. The memory 5052 stores instructions that are executable by the processor 5050 to implement the control logic of the input control device 5000. For example, the memory 5052 may store threshold proximity values and/or incremental proximity values for implementing the control logic, as further described herein.
The control circuit 5048 also includes a communication module 5056 coupled to the processor 5050. A communication module 5056 may communicatively couple the processor 5050 to a proximity detection system 5060 configured to detect proximity between a robotic surgical tool controlled by the input control device 5000 and tissue. Proximity detection system 5060 includes a structured light source, such as structured light source 852 in fig. 25. In other cases, the proximity detection system 5060 may rely on a lidar and/or a time-of-flight distance determination system to determine the proximity between the robotic surgical tool and the tissue. Alternative proximity detection systems compatible with the control circuit 5048 will be further described herein.
The processor 5050 is configured to receive proximity signals from the proximity detection system 5060 and input control signals from the sensors of the input control device 5000, including the multi-axis force sensor 5010, the rotation sensor 5020, the radial sensor 5022, the body sensor 5030, and the sensor 5032 of the button 5026. In response to receiving a proximity signal indicating that the proximity to tissue has decreased to less than a threshold, the processor 5050 is configured to switch the input control device 5000 from a first mode, which may be a coarse motion mode, to a second mode, which may be a fine motion mode (see, e.g., the control logic 1068 in fig. 11A).
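A minimal sketch of this threshold comparison, with an assumed threshold value standing in for the proximity value stored in memory, could look like the following.

```python
# Hypothetical sketch of the mode switch described above: the processor
# compares the reported tissue proximity against a stored threshold and
# selects the coarse or precision motion mode accordingly.

THRESHOLD_MM = 50.0  # assumed threshold proximity stored in memory 5052

def select_mode(proximity_mm: float) -> str:
    """Coarse motion above the threshold, precision motion at or below it."""
    return "precision" if proximity_mm <= THRESHOLD_MM else "coarse"
```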
In the coarse motion mode, the processor 5050 is configured to provide output control signals to the robotic surgical tool based on the input control force provided to the joystick 5006 and detected by the multi-axis force sensor 5010. For example, the surgeon may apply a force with his or her thumb (T) to joystick 5006 (fig. 60 and 61), which may correspond to an input control motion for moving a robotic surgical tool in three-dimensional space. In some cases, control motions applied to joystick 5006 and detected by multi-axis force sensor 5010 in the coarse motion mode can drive the robotic surgical tool in a desired direction. In such cases, the input control device 5000 may drive the robotic surgical tool toward tissue and across a larger distance without requiring displacement of the input control device 5000 and without requiring a clutch to maintain the input control device within the working envelope. In various instances, the rotational and/or articulation functions of the robotic surgical tool may be selectively locked during the coarse motion mode.
In the precision motion mode, the processor 5050 can provide output control signals to the robotic surgical tool based on input control motions applied to the elongate body 5004 and detected by the body sensor 5030. For example, the surgeon can move the handpiece 5002 in three-dimensional space, and this motion, detected by the body sensor 5030, can correspond to an input control motion for moving the robotic surgical tool in three-dimensional space. In various instances, displacement of the robotic surgical tool in three-dimensional space and articulation of the surgical tool (e.g., pitch and yaw motions of the end effector) may be activated by detection of such movement by the body sensor 5030 during the precision motion mode. In the precision motion mode, the joystick 5006 and/or the multi-axis force sensor 5010 can be deactivated and/or the input control forces detected by the multi-axis force sensor 5010 can be ignored. Furthermore, in the precision motion mode, a rolling motion of the end effector (e.g., rolling about the longitudinal shaft axis of the robotic surgical tool) can be generated by, for example, rolling the shaft 5016 in the direction Rs (fig. 57), and actuation of the end effector (e.g., closure of the jaws) can be generated by, for example, a compressive force applied to the shaft 5016 in the directions Ss1 and Ss2.
In various aspects, the input control motions applied to the input control device 5000 in the precision motion mode may correspond to equal motions of the robotic surgical tool. For example, the displacement of the handpiece 5002 in three-dimensional space may correspond to an equal displacement of the robotic surgical tool. In other words, the control input motions may have a one-to-one correlation with the robotic surgical tool motions, which may provide the surgeon with an intuitive control system. In other cases, the control motion in the precision mode may be scaled.
In some cases, input motions applied to the handpiece 5002 may be selectively ignored and/or not converted by the processor 5050 into output control signals for controlling the robotic surgical tool. For example, until the proximity detected by the proximity detection system 5060 reaches the threshold proximity (i.e., while operating in the coarse motion mode), the body sensor 5030 may be deactivated and/or input control motions detected by the body sensor 5030 may be ignored. To control the robotic surgical tool in the coarse motion mode, the surgeon may apply an input control force to the joystick 5006. In other cases, even in the coarse motion mode and before the threshold proximity is reached, the surgeon may utilize the body sensor 5030 to provide control motions to the robotic surgical tool by selectively activating the button 5026 to override the default rules of the control logic.
Referring now to fig. 62, a hypothetical graphical representation 5070 of the input control sensitivity of the input control device 5000 relative to tissue proximity is shown. The control logic of the input control device 5000 may be configured to scale the output control signals in the coarse motion mode in response to the proximity signal received from the proximity detection system 5060. For example, in the coarse motion mode 5071, the force detected by the multi-axis force sensor 5010 may be scaled in response to the proximity relative to a threshold proximity (Dcritical). More specifically, the coarse motion mode 5071 may enable the robotic surgical tool to selectively move at relatively high speeds in a region 5072 in which the multi-axis force sensor 5010 is highly sensitive to input control forces applied by the surgeon.
At a first distance (D1) relative to the tissue, the sensitivity of the multi-axis force sensor 5010 may begin to decrease, which may reduce the speed of the robotic surgical tool as it approaches the tissue. In some cases, the sensitivity decrease relative to tissue proximity may define one or more curved and/or linear relationships. In other cases, the decrease may define an incremental and/or stepped profile. The sensitivity may decrease in a region 5074 until the threshold proximity (Dcritical) is reached, at which point the input control device 5000 enters the precision motion mode 5075 and the sensitivity is zero. In accordance with the default rules of the control logic, in the precision motion mode 5075, input control forces at the multi-axis force sensor 5010 do not control displacement of the robotic surgical tool.
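Assuming the linear ramp-down option mentioned above, the sensitivity profile of fig. 62 might be approximated as follows; the distance values are placeholders, not values from the source.

```python
# Hypothetical sketch of the sensitivity profile in fig. 62: full joystick
# sensitivity beyond D1, a linear ramp-down between D1 and Dcritical, and
# zero sensitivity (precision motion mode) at or inside Dcritical.

def joystick_sensitivity(d_mm: float, d1_mm: float = 100.0,
                         d_crit_mm: float = 50.0) -> float:
    """Return a 0.0-1.0 scale factor for multi-axis force sensor input."""
    if d_mm >= d1_mm:
        return 1.0                                   # region 5072: full speed
    if d_mm <= d_crit_mm:
        return 0.0                                   # precision mode 5075
    return (d_mm - d_crit_mm) / (d1_mm - d_crit_mm)  # region 5074: ramp-down
```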
In certain instances, the joystick 5006 and multi-axis force sensor 5010 (fig. 59A) can be disabled throughout the surgical procedure, including during the coarse motion mode. In such cases, when the joystick 5006 and/or the multi-axis force sensor 5010 are disabled, the surgeon can selectively operate the input control device 5000 by relying entirely on the body sensor 5030 (fig. 59A).
In various circumstances, the surgeon can provide a roll input control motion to the shaft 5016 in both the coarse motion mode and the fine motion mode. In other cases, the control circuit 5048 can selectively lock rotation of the shaft 5016 during one or more portions of the surgical procedure (such as during the coarse motion mode) and/or can selectively ignore roll input control motions applied to the shaft 5016. Additionally or alternatively, in certain instances, the control circuit 5048 can selectively lock and/or ignore the application of opening and/or closing motions to the jaws of the surgical end effector during one or more portions of the surgical procedure (such as during the coarse motion mode).
In various aspects of the present disclosure, an untethered, ungrounded input control device may utilize a motion capture system to control a robotic surgical tool, such as the robotic surgical tool 1050 (fig. 12), in a plurality of operating modes, including a fine motion mode and a coarse motion mode. Further, in certain aspects of the present disclosure, the input control device may not include dedicated coarse-mode actuators and sensors, such as the joystick 5006 and the multi-axis force sensor 5010. In such cases, the surgeon may selectively operate the input control device by relying entirely on the motion capture system, as further described herein.
Referring now to fig. 63-67, an untethered, ungrounded input control device 5100 is shown. The input control device 5100 may be incorporated into a surgical system, such as the surgical system 110 (fig. 1) or the surgical system 150 (fig. 3), to provide control signals to the surgical robot and/or a surgical tool coupled thereto. The input control device 5100 includes manual input controls for moving the robotic arm and/or surgical tool in three-dimensional space. The input control device 5100 is similar in many respects to the input control device 5000. For example, the input control device 5100 includes sufficient degrees of freedom to control the robotic surgical tool 1050 shown in fig. 12. The input control device 5100 also includes a handpiece 5102, which is similar in many respects to the handpiece 5002. For example, the handpiece 5102 is wireless and includes a power supply and a control circuit having wireless communication capabilities with external components, similar in many respects to the control circuit 5048 (fig. 59A). The handpiece 5102 includes an elongated body 5104 that can be comfortably grasped by a surgeon during use. The input control device 5100 includes a shaft 5016 rotatably coupled to the elongated body 5104 at a rotational joint 5018 and including a rotation sensor (e.g., the rotation sensor 5020 in fig. 59A) configured to detect rotational displacement of the shaft 5016 relative to the elongated body 5104. In addition, the shaft 5016 can include a radial sensor (e.g., the radial sensor 5022 in fig. 59A). In other instances, the input control device 5100 may include an actuation trigger and a corresponding trigger sensor for applying opening and/or closing motions to the jaws of the robotic surgical tool, as further described herein with respect to fig. 57-61.
Unlike the input control device 5000, the input control device 5100 does not include a joystick and a multi-axis force sensor coupled thereto. Instead, control input motions for moving the robotic surgical tool in three-dimensional space are controlled by a motion capture system in both the coarse motion mode and the fine motion mode. The handpiece 5102 includes a body sensor, like the body sensor 5030 in fig. 59A, embedded in the elongated body 5104 and configured to detect movement of the elongated body 5104 in three dimensions. The handpiece 5102 also includes a button 5126 and a button sensor on the elongate body 5104, which are similar in many respects to the button 5026 (fig. 57-61) and the sensor 5032 (fig. 59A).
The control functions of the input control device 5100 may depend on the position of the input control device 5100 relative to a starting position and the zone around the starting position. The position of the input control device 5100 relative to the starting position and the surrounding zone may be determined by its motion capture system. A hypothetical coarse motion zone 5140 is shown in fig. 63. The coarse motion zone 5140 is a three-dimensional volume having a boundary or edge 5142 therearound. The coarse motion zone 5140 may be defined relative to the starting position. The starting position may be set or established by input from the surgeon (e.g., by actuating the button 5126 or another actuator on the handpiece 5102). The starting position of the input control device 5100 may be a center position that is equally spaced from all of the boundaries 5142 of the coarse motion zone 5140. In various instances, the boundary 5142 may be positioned from about 2 inches to about 6 inches from the starting position. In other cases, the boundary 5142 may be positioned less than 2 inches or greater than 6 inches from the starting position in at least one direction. Although the coarse motion zone in fig. 63 defines a cylindrical shape, the reader will appreciate that alternative geometries may be employed.
In various instances, in the coarse motion mode, the surgeon can drive the robotic surgical tool in a particular direction by moving the input control device 5100 in a corresponding direction away from the starting position. As the input control device 5100 approaches the boundary or edge of the coarse motion zone 5140, the speed of the robotic surgical tool driven by the input control device 5100 may increase. For example, the coarse motion mode can be maintained by holding down the button 5126. In various instances, to initiate the coarse motion mode, the surgeon can engage the button 5126, which defines or sets the starting position within the coarse motion zone, and can drive the robotic surgical tool in the coarse motion mode as long as the button 5126 remains engaged and/or actuated.
Referring now to fig. 68, the input control device 5100 may utilize control logic that correlates the distance of the input control device 5100 from the starting position with the speed of the robotic surgical tool driven by the input control device 5100. The control logic of the input control device 5100 may be implemented, for example, by the control circuit 832 (fig. 25), the control circuit 1400 (fig. 11C), the combinational logic circuit 1410 (fig. 11D), and/or the sequential logic circuit 1420 (fig. 11E), where the inputs are provided by inputs to the input control device 5100 and/or a surgical visualization system or distance determination subsystem thereof, as further described herein. As the distance of the input control device 5100 from the starting position increases in fig. 68, the speed of the robotic surgical tool driven by the input control device 5100 may gradually increase to a maximum speed. When the input control device 5100 reaches and/or exceeds the boundary 5142 of the coarse motion zone 5140 (e.g., leaves the coarse motion zone 5140), the high speed of the robotic surgical tool can be maintained in the coarse motion mode. In various circumstances, retracting the input control device 5100 toward the starting position within the coarse motion zone 5140 may correspondingly decrease the speed of the robotic surgical tool.
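A hedged sketch of this distance-to-speed relationship is shown below; the zone radius and speed values are illustrative assumptions, not values from the source.

```python
# Hypothetical sketch of the distance-to-speed relationship of fig. 68:
# tool speed grows with the handpiece's distance from the starting position
# and saturates at a maximum once the boundary of the coarse motion zone
# is reached or exceeded.

ZONE_RADIUS_IN = 4.0   # assumed distance from starting position to boundary 5142
MAX_SPEED = 1.0        # normalized maximum tool speed

def coarse_speed(distance_in: float) -> float:
    """Map distance from the starting position to a commanded tool speed."""
    if distance_in >= ZONE_RADIUS_IN:      # at/beyond boundary: hold max speed
        return MAX_SPEED
    return MAX_SPEED * (distance_in / ZONE_RADIUS_IN)  # gradual increase
```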
The input control device 5100 may switch from the coarse motion mode to the fine motion mode when the proximity of the robotic surgical tool reaches a threshold or critical proximity that is stored in memory and accessible to the control logic. In the precision motion mode, movement of the robotic surgical tool may be controlled by the same input controls described herein with respect to fig. 57-62. For example, input control motions detected by the body sensor of the motion capture system may control three-dimensional displacement of the robotic surgical tool and pivoting of the robotic surgical tool in three-dimensional space, such as pitch and yaw articulation motions of the end effector. In addition, rotation of the shaft 5016 can control the rolling motion of the end effector, and a squeezing force on the shaft 5016 can control the opening and/or closing of one or more jaws of the end effector.
In various aspects, the input control motions applied to the input control device 5100 in the precision motion mode may correspond to equal motions of the robotic surgical tool. For example, the displacement of the handpiece 5102 in three-dimensional space may correspond to equal displacement of the robotic surgical tool. In other words, the control input motions may have a one-to-one correlation with the robotic surgical tool motions, which may provide the surgeon with an intuitive control system. In other cases, the control motion in the precision mode may be scaled.
Examples
Various aspects of the subject matter described herein are set forth in the following numbered examples.
One list of examples is as follows:
Example 1 - A control system for a surgical robot, the control system comprising: a base; a central portion flexibly supported by the base; a wrist longitudinally offset from and rotationally coupled to the central portion; a multi-axis sensor arrangement configured to detect a user input force applied to the central portion; a rotation sensor configured to detect a user input motion applied to the wrist; a memory; and a processor communicatively coupled to the memory. The processor is configured to receive a plurality of first input signals from the multi-axis sensor arrangement, provide a plurality of first output signals to the surgical robot based on the plurality of first input signals, receive a plurality of second input signals from the rotation sensor, and provide a plurality of second output signals to the surgical robot based on the plurality of second input signals.
Example 2 - The control system of Example 1, wherein the plurality of first input signals correspond to forces and moments applied to the central portion in three-dimensional space.
Example 3 - The control system of Examples 1 or 2, wherein the plurality of first output signals correspond to translation and rotation of a surgical tool coupled to the surgical robot.
Example 4 - The control system of any of Examples 1-3, wherein the plurality of second input signals correspond to rotational displacement of the wrist relative to the base in three-dimensional space.
Example 5 - The control system of Examples 3 or 4, wherein the plurality of second output signals correspond to a rolling motion of the surgical tool.
Example 6 - The control system of any of Examples 1-5, wherein the central portion comprises a joystick. A shaft extends between the joystick and the wrist.
Example 7 - The control system of any of Examples 1-6, further comprising a jaw movably supported on the wrist and a jaw sensor configured to detect movement of the jaw. The jaw is movable between an open configuration and a clamped configuration. The processor is further configured to receive a plurality of third input signals from the jaw sensor and provide a plurality of third output signals to the surgical robot indicative of actuation motions of a jaw member of a surgical tool coupled to the surgical robot.
Example 8 - A control system for a surgical robot, the control system comprising: a first control input comprising a flexibly supported joystick; a memory; and a control circuit communicatively coupled to the memory. The memory stores instructions executable by the control circuit to cause the control system to switch between a first mode and a second mode, receive a plurality of first input signals from the first control input, scale the plurality of first input signals by a first multiplier in the first mode, and scale the plurality of first input signals by a second multiplier in the second mode. The second multiplier is different from the first multiplier.
Example 9 - The control system of Example 8, wherein the flexibly supported joystick is operably coupled to a multi-axis force and torque sensor configured to detect forces and moments applied to the flexibly supported joystick. The plurality of first input signals correspond to output signals from the multi-axis force and torque sensor.
Example 10 - The control system of Examples 8 or 9, wherein the first mode corresponds to a coarse motion mode and the second mode corresponds to a fine motion mode.
Example 11 - The control system of any of Examples 8-10, wherein the control circuit is communicatively coupled to a proximity detection system. The control circuit is further configured to receive a proximity signal from the proximity detection system and cause the control system to switch from the first mode to the second mode when the proximity signal corresponds to a proximity less than a threshold.
Example 12 - The control system of any of Examples 8-11, further comprising a second control input comprising a rotatable wrist. The control circuit is further configured to receive a plurality of second input signals from the second control input, scale the plurality of second input signals by a third multiplier in the first mode, and scale the plurality of second input signals by a fourth multiplier in the second mode. The fourth multiplier is different from the third multiplier.
Example 13 - The control system of Example 12, further comprising a rotation sensor configured to detect rotation of the rotatable wrist. The plurality of second input signals correspond to output signals from the rotation sensor.
Example 14 - The control system of Examples 12 or 13, further comprising a shaft extending from the flexibly supported joystick. The rotatable wrist is configured to rotate on the shaft.
Example 15 - The control system of Example 14, further comprising a pair of opposing actuators pivotally supported on the shaft and a sensor configured to detect pivotal movement of the pair of opposing actuators.
Example 16 - A control system for a surgical robot, the control system comprising: a first input comprising a flexibly supported joystick and a multi-axis force and torque sensor arrangement configured to detect a user input force and torque applied to the flexibly supported joystick; a second input comprising a rotary joint and a rotation sensor configured to detect a user input motion applied to the rotary joint; and a control unit. The control unit is configured to provide a first plurality of output signals to the surgical robot based on actuation of the first input and a second plurality of output signals to the surgical robot based on actuation of the second input.
Example 17 - The control system of Example 16, wherein the flexibly supported joystick is spring biased to an upright position.
Example 18 - The control system of Examples 16 or 17, wherein the control unit is communicatively coupled to a proximity detection system. The control unit is further configured to receive a proximity signal from the proximity detection system, scale the user input force and torque detected by the multi-axis force and torque sensor arrangement by a first factor when the proximity signal is greater than a threshold, and scale the user input force and torque detected by the multi-axis force and torque sensor arrangement by a second factor when the proximity signal is equal to or less than the threshold. The second factor is different from the first factor.
Example 19 - The control system of Example 18, wherein the second factor is less than the first factor.
Example 20 - The control system of Examples 18 or 19, wherein the control unit is further configured to ignore the user input motion applied to the rotary joint when the proximity signal is greater than the threshold.
Another list of examples is as follows:
Example 1-a control system comprising a robotic surgical tool; a tissue proximity detection system configured to be capable of intra-operatively detecting a distance between the robotic surgical tool and an anatomical structure; and a user input device. The user input device includes: a base including a force sensor; a forearm support member movably coupled to the base; a shaft extending distally from the forearm support; a handpiece extending distally from the shaft; and a jaw sensor configured to detect pivotal movement of the jaws. The forearm support is movable relative to the base within a travel zone and the handpiece includes jaws. The forearm support member, the shaft and the hand piece are capable of moving together as a collective unit as the forearm support member moves relative to the base within the zone of travel. The user input device also includes a displacement sensor configured to detect movement of the collective unit. The control system also includes a control circuit communicatively coupled to the force sensor, the displacement sensor, and the jaw sensor. The control circuit is configured to receive a first input signal from the force sensor, a second input signal from the displacement sensor, a third input signal from the jaw sensor, and switch the user input device from a first mode to a second mode in response to an input from the tissue proximity detection system indicating that the distance between the robotic surgical tool and the anatomical structure decreases to less than a threshold distance. The first input signal controls movement of the robotic surgical tool in the first mode, and the second input signal and the third input signal control movement of the robotic surgical tool in the second mode.
Embodiment 2-the control system of embodiment 1, wherein the zone of travel comprises a three-dimensional space surrounding the forearm starting position. The forearm support member is spring biased toward the forearm starting position and the three-dimensional space extends in all directions from the forearm starting position by a distance of between 2.0cm and 6.0 cm.
Embodiment 3-the control system of embodiment 1 or 2, wherein the forearm support comprises a curved arc forming a cuff and the cuff is sized to at least partially surround the arm of the surgeon.
Embodiment 4-the control system of any of embodiments 1 to 3, wherein the tissue proximity detection system comprises a structured light source on the robotic surgical tool.
Embodiment 5-a control system comprising a tissue proximity detection system and a user input device. The user input device includes a base; a forearm support member movably coupled to the base; a shaft extending distally from the forearm support; a handpiece extending distally from the shaft; and a plurality of sensors. The forearm support is movable relative to the base within a travel zone and the handpiece includes a jaw configured to pivot relative to the shaft. The plurality of sensors includes a first sensor arrangement configured to be able to detect a user input force to the base; a second sensor arrangement configured to be able to detect a displacement of the forearm support; and a third sensor arrangement configured to detect pivotal movement of the jaw. The control system also includes control circuitry configured to be capable of receiving a proximity data signal from the tissue proximity detection system, receiving a first input signal from the first sensor arrangement, receiving a second input signal from the second sensor arrangement, receiving a third input signal from the third sensor arrangement, and causing the user input device to switch from a first mode to a second mode in response to the proximity data signal from the tissue proximity detection system indicating a predefined tissue proximity. The first input signal controls movement of the robotic surgical tool in the first mode, and the second input signal and the third input signal control movement of the robotic surgical tool in the second mode.
Embodiment 6-a user input device for controlling a robotic surgical tool, the user input device comprising: a base comprising a first sensor arrangement; and a forearm support member movably coupled to the base. The forearm support member is moveable relative to the base within the travel zone and the forearm support member includes a second sensor arrangement. The user input device also includes control circuitry configured to receive a first input signal from the first sensor arrangement, receive a second input signal from the second sensor arrangement, and switch the user input device between a first mode in which the first input signal controls movement of the robotic surgical tool and a second mode in which the second input signal controls movement of the robotic surgical tool.
Embodiment 7-the user input device of embodiment 6, wherein the control circuit is communicatively coupled to a tissue proximity detection system. The control circuit is configured to switch the user input device between the first mode and the second mode in response to an input from the tissue proximity detection system.
Embodiment 8-the user input device of embodiment 7, wherein the first mode comprises a coarse motion mode and the second mode comprises a fine motion mode. The control circuit is configured to be capable of: switching the user input device between the coarse motion mode and the fine motion mode when the tissue proximity detection system provides a proximity signal indicating that the robotic surgical tool is positioned less than a threshold distance from an anatomical structure.
Embodiment 9-the user input device of any of embodiments 6-8, wherein the first sensor arrangement comprises a six degree of freedom force and torque sensor.
Embodiment 10-the user input device of any of embodiments 6 to 9, wherein the first sensor arrangement comprises a joystick movable in a three-dimensional space around an input starting position. The three-dimensional space extends in all directions from the input starting position by a distance of between 1.0mm and 5.0mm, and the joystick is spring-biased towards the input starting position.
Embodiment 11-the user input device of any of embodiments 6 to 10, wherein the second sensor arrangement comprises a displacement sensor.
Embodiment 12-the user input device of any of embodiments 6 to 11, wherein the travel zone comprises a three-dimensional space surrounding a forearm starting position and the forearm support member is spring biased towards the forearm starting position.
Embodiment 13-the user input device of embodiment 12, wherein the three-dimensional space extends from the forearm starting position by a distance of between 2.0cm and 6.0cm in all directions.
Embodiment 14-the user input device of any of embodiments 6-13, wherein the forearm support member comprises a curved arc forming a cuff and the cuff is sized to at least partially surround the surgeon's arm.
Embodiment 15-the user input device of any of embodiments 6 to 14, further comprising a shaft extending distally from the forearm support member; a handpiece extending distally from the shaft and including a first jaw and a second jaw, wherein the first jaw and the second jaw are pivotable relative to the shaft within a range of motion; and a jaw sensor arrangement configured to detect pivotal movement of the first jaw and the second jaw within the range of motion.
Embodiment 16-the user input device of embodiment 15, wherein the jaw sensor arrangement is communicatively coupled to the control circuit. The control circuit is further configured to receive a third input signal from the jaw sensor arrangement and provide an output signal to the robotic surgical tool to control actuation of one or more jaws of an end effector of the robotic surgical tool.
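A minimal sketch of the jaw-control path in embodiment 16 follows, assuming invented input and output ranges of motion; the linear mapping with clamping is only one plausible reading, as the disclosure does not specify how the jaw sensor signal is translated into the output signal.

```python
# Hypothetical mapping from the jaw sensor reading at the user input
# device to an end-effector jaw command; both ranges are assumptions.

INPUT_JAW_RANGE = (0.0, 60.0)   # assumed input range of motion, degrees
OUTPUT_JAW_RANGE = (0.0, 45.0)  # assumed end-effector range, degrees

def jaw_output_signal(input_angle):
    """Map the third input signal (jaw pivot angle) to an output
    signal controlling the end-effector jaws."""
    lo_in, hi_in = INPUT_JAW_RANGE
    lo_out, hi_out = OUTPUT_JAW_RANGE
    clamped = min(hi_in, max(lo_in, input_angle))
    fraction = (clamped - lo_in) / (hi_in - lo_in)
    return lo_out + fraction * (hi_out - lo_out)
```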
Embodiment 17-the user input device of embodiment 15 or 16, further comprising a first finger loop on the first jaw, the first finger loop positioned and sized to receive at least one finger of a surgeon's hand; and a second finger loop on the second jaw, the second finger loop positioned and sized to receive at least one finger of the surgeon's hand.
Embodiment 18-the user input device of any of embodiments 15 to 17, further comprising: a rotary joint between the handpiece and the shaft; and a rotation sensor configured to detect rotational movement of the handpiece relative to the shaft.
Embodiment 19-the user input device of any of embodiments 15-18, wherein the handpiece further comprises an actuator communicatively coupled to the control circuit. The control circuit is further configured to receive an input actuation signal from the actuator and provide an output actuation signal to the robotic surgical tool to actuate a surgical function.
Embodiment 20-the user input device of embodiment 19, wherein the actuator is selected from the group consisting of: triggers, buttons, switches, levers, toggles, and combinations thereof.
Another list of embodiments is as follows:
Embodiment 1-a control system for a robotic surgical tool, the control system comprising: an untethered handpiece, the untethered handpiece comprising a body; a joystick extending from the body; a rotatable shaft extending from the body; and a plurality of sensors including a body sensor embedded in the body and configured to be able to detect a motion of the body in a three-dimensional space; a multi-axis force sensor configured to be able to detect a force applied to the joystick; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body. The control system also includes a control circuit communicatively coupled to the plurality of sensors and the proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system, receive an input control signal from the plurality of sensors, switch between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold, provide a coarse motion control signal to the robotic surgical tool based on the input control signal from the multi-axis force sensor in the coarse motion mode, and provide a fine motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode. The proximity signal indicates a proximity of the robotic surgical tool to tissue.
Embodiment 2-the control system of embodiment 1, wherein the joystick comprises a ring sized and positioned to receive a user's thumb.
Embodiment 3-the control system of embodiment 1 or 2, wherein the body sensor is selected from the group consisting of an inertial sensor and an electromagnetic tracking receiver.
Embodiment 4-the control system of any of embodiments 1 to 3, wherein the shaft sensor is selected from the group consisting of a rotary transducer, a strain gauge, and an optical sensor.
Embodiment 5-a control system for a robotic surgical tool, the control system comprising: an untethered handpiece, the untethered handpiece comprising a body; an actuator extending from the body; a rotatable shaft extending from the body; and a plurality of sensors including a body sensor embedded in the body and configured to be able to detect a motion of the body in a three-dimensional space; a force sensor configured to be able to detect a force applied to the actuator; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body. The control system further includes a proximity detection system configured to detect proximity of the robotic surgical tool to tissue; and a control circuit communicatively coupled to the plurality of sensors and the proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system, receive an input control signal from the plurality of sensors, switch between a first mode and a second mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold value, provide a first motion control signal to the robotic surgical tool based on the input control signal from the force sensor in the first mode, and provide a second motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the second mode. The proximity signal indicates the proximity of the robotic surgical tool to tissue, and the first motion control signal is scaled based on the proximity signal.
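The final sentence of embodiment 5 suggests a continuous attenuation of the first motion control signal rather than the two-level factor of the earlier list. One possible reading, sketched under an assumed reference distance and clamping behavior:

```python
# One possible continuous scaling for embodiment 5; the reference
# distance, the clamping, and the list representation are assumptions.

REFERENCE_PROXIMITY = 50.0  # assumed distance at which gain reaches 1.0

def scale_first_motion(control_signal, proximity):
    """Attenuate the first motion control signal as the robotic
    surgical tool approaches tissue."""
    gain = max(0.0, min(1.0, proximity / REFERENCE_PROXIMITY))
    return [c * gain for c in control_signal]
```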
Embodiment 6-the control system of embodiment 5, wherein the proximity detection system comprises a structured light source and a light receiver.
Embodiment 7-the control system of embodiment 5 or 6, wherein the shaft further comprises a radial sensor configured to be able to detect a radial force applied to the shaft. The control circuit is further configured to receive an input control signal from the radial sensor and provide an output control signal to the robotic surgical tool based on the input control signal from the radial sensor. The output control signal is configured to apply a closing motion to one or more jaws of the robotic surgical tool.
Embodiment 8-a control system for a robotic surgical tool, the control system comprising an untethered handpiece comprising a body; an actuation arm extending proximally from the body; a shaft extending distally from the body; and a plurality of sensors including a body sensor embedded in the body and configured to be able to detect a motion of the body in a three-dimensional space; an arm sensor configured to be capable of detecting a force applied to the actuation arm; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body. The control system also includes a control circuit communicatively coupled to the plurality of sensors and the proximity detection system. The control circuit is configured to receive a proximity signal from the proximity detection system, receive an input control signal from the plurality of sensors, switch between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold, provide an output control signal to the robotic surgical tool based on the input control signal from the arm sensor in the coarse motion mode, and provide an output control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode. The proximity signal indicates a proximity of the robotic surgical tool to tissue.
Embodiment 9-the control system of embodiment 8, wherein the actuation arm comprises a joystick movably coupled to the body at a spatial joint.
Embodiment 10-the control system of embodiment 9, wherein the joystick comprises a ring sized and positioned to receive a user's thumb.
Embodiment 11-the control system of embodiment 9 or 10, wherein the arm sensor comprises a multi-axis force sensor positioned at the spatial joint.
Embodiment 12-the control system of any one of embodiments 8 to 11, wherein the body sensor comprises an inertial sensor.
Embodiment 13-the control system of any of embodiments 8 to 11, wherein the body sensor comprises an electromagnetic tracking receiver.
Embodiment 14-the control system of any of embodiments 8 to 13, wherein the proximity detection system comprises a structured light source.
Embodiment 15-the control system of any one of embodiments 8 to 14, wherein the control circuit is further configured to scale the output control signal, which is based on the input control signal received from the arm sensor, in response to the proximity signal received from the proximity detection system in the coarse motion mode.
Embodiment 16-the control system of embodiment 15, wherein the control circuitry is further configured to decrease the output control signal in the coarse motion mode in response to the proximity signal indicating that the robotic surgical tool is approaching tissue.
Embodiment 17-the control system of any of embodiments 8-16, wherein the untethered handpiece further comprises a lock for the actuator arm. The lock is movable from an unlocked position to a locked position in response to receiving the proximity signal indicating that the proximity decreases to less than the threshold.
Embodiment 18-the control system of any of embodiments 8-17, wherein the body further comprises a spring-loaded actuation button that is movable between an initial position and a depressed position. The output control signal based on the input control signal detected by the body sensor in the coarse motion mode is provided to the robotic surgical tool only when the spring-loaded actuation button is moved to the depressed position.
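Embodiments 17 and 18 describe two safety interlocks: a proximity-driven lock for the actuation arm and a spring-loaded button that gates body-sensor output. A combined sketch follows; the threshold value and the class structure are assumptions, not details of the disclosure.

```python
# Hypothetical sketch of the interlocks in embodiments 17 and 18;
# the threshold and the class layout are illustrative assumptions.

PROXIMITY_THRESHOLD = 5.0  # assumed units

class HandpieceInterlocks:
    def __init__(self):
        self.arm_locked = False  # lock starts in the unlocked position

    def on_proximity_signal(self, proximity):
        # Embodiment 17: move the lock to the locked position when the
        # proximity decreases to less than the threshold.
        if proximity < PROXIMITY_THRESHOLD:
            self.arm_locked = True

    def body_sensor_output(self, body_motion, button_depressed):
        # Embodiment 18: body-sensor motion is provided to the tool
        # only while the spring-loaded actuation button is depressed.
        return body_motion if button_depressed else None
```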
Embodiment 19-the control system of any of embodiments 8 to 18, wherein the shaft further comprises a radial sensor configured to be able to detect a radial force applied to the shaft. The control circuit is further configured to receive an input control signal from the radial sensor and provide an output control signal to the robotic surgical tool based on the input control signal from the radial sensor to apply a closing motion to one or more jaws of the robotic surgical tool.
Embodiment 20-the control system of any of embodiments 8-19, wherein the untethered handpiece further comprises a trigger and a trigger sensor configured to be able to detect actuation of the trigger. The control circuit is further configured to receive an input control signal from the trigger sensor and provide an output control signal to the robotic surgical tool based on the input control signal from the trigger sensor to apply a closing motion to one or more jaws of the robotic surgical tool.
Embodiment 21-a control system for a robotic surgical tool, the control system comprising an untethered handpiece comprising: a coarse motion controller comprising a multi-axis sensor; a precision motion controller comprising an embedded motion sensor; and control circuitry communicatively coupled to the multi-axis sensor, the embedded motion sensor, and the proximity detection system. The control circuitry is configured to receive a proximity signal from the proximity detection system indicative of a proximity of the robotic surgical tool to tissue, and switch between a coarse motion mode in which the robotic surgical tool is controlled with the input control signal from the coarse motion controller and a fine motion mode in which the robotic surgical tool is controlled with the input control signal from the precision motion controller in response to receiving the proximity signal indicating that the proximity decreases to less than a threshold.
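Embodiment 21 reduces the two-controller arrangement to a simple dispatch on the proximity signal. A minimal sketch, assuming a scalar proximity value and interchangeable input control signals:

```python
# Minimal dispatch for embodiment 21; the threshold is an assumption.

PROXIMITY_THRESHOLD = 5.0  # assumed units

def active_control_signal(proximity, coarse_input, fine_input):
    """Select which controller's input control signal drives the tool."""
    if proximity < PROXIMITY_THRESHOLD:
        return fine_input   # fine motion mode: embedded motion sensor
    return coarse_input     # coarse motion mode: multi-axis sensor
```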
While several forms have been illustrated and described, it is not the intention of the applicants to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents of these forms can be made without departing from the scope of the present disclosure, and will occur to those skilled in the art. Further, the structure of each element associated with the described forms may alternatively be described as a means for providing the function performed by the element. In addition, where materials for certain components are disclosed, other materials may also be used. It should be understood, therefore, that the foregoing detailed description and the appended claims are intended to cover all such modifications, combinations and permutations as fall within the scope of the disclosed forms of the invention. It is intended that the following claims cover all such modifications, variations, changes, substitutions, and equivalents.
The foregoing detailed description has set forth various forms of the devices and/or methods via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or hardware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing media used to actually carry out the distribution.
Instructions for programming logic to perform the various disclosed aspects may be stored within a memory in a system, such as a dynamic random access memory (DRAM), a cache, a flash memory, or other storage. Further, the instructions may be distributed via a network or by way of other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
As used in any aspect herein, the term "control circuitry" can refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor that includes one or more separate instruction processing cores, processing units, processors, microcontrollers, microcontroller units, controllers, Digital Signal Processors (DSPs), Programmable Logic Devices (PLDs), Programmable Logic Arrays (PLAs), Field Programmable Gate Arrays (FPGAs)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuitry may be implemented collectively or individually as circuitry that forms part of a larger system, e.g., an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a system on a chip (SoC), a desktop computer, a laptop computer, a tablet computer, a server, a smartphone, etc. Thus, as used herein, "control circuitry" includes, but is not limited to, electronic circuitry having at least one discrete circuit, electronic circuitry having at least one integrated circuit, electronic circuitry having at least one application specific integrated circuit, electronic circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program that implements, at least in part, the methods and/or apparatus described herein, or a microprocessor configured by a computer program that implements, at least in part, the methods and/or apparatus described herein), electronic circuitry forming a memory device (e.g., forming a random access memory), and/or electronic circuitry forming a communication device (e.g., a modem, a communication switch, or an optoelectronic device). Those skilled in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion, or some combination thereof.
As used in any aspect herein, the term "logic" may refer to an application, software, firmware, and/or circuitry configured to be capable of performing any of the foregoing operations. The software may be embodied as a software package, code, instructions, instruction sets, and/or data recorded on a non-transitory computer-readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., non-volatile) in a memory device.
As used in any aspect herein, the terms "component," "system," "module," and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution.
An "algorithm," as used in any aspect herein, is a self-consistent sequence of steps leading to a desired result, wherein "step" refers to the manipulation of physical quantities and/or logical states, which may (but are not necessarily) take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. And are used to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or conditions.
The network may comprise a packet switched network. The communication devices may be capable of communicating with each other using the selected packet switched network communication protocol. One exemplary communication protocol may include an Ethernet communication protocol that may be capable of allowing communication using the Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may conform to or be compatible with the Ethernet standard entitled "IEEE 802.3 Standard" promulgated by the Institute of Electrical and Electronics Engineers (IEEE) in December 2008 and/or later versions of the standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communication protocol. The X.25 communication protocol may conform to or be compatible with standards promulgated by the International Telecommunication Union, Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communication protocol. The frame relay communication protocol may conform to or be compatible with standards promulgated by the International Telegraph and Telephone Consultative Committee (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communication protocol. The ATM communication protocol may conform to or be compatible with the ATM standard entitled "ATM-MPLS Network Interworking 2.0" promulgated by the ATM Forum in August 2001 and/or later versions of that standard. Of course, different and/or later-developed connection-oriented network communication protocols are also contemplated herein.
Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the above disclosure, discussions utilizing terms such as "processing," "computing," "calculating," "determining," "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
One or more components may be referred to herein as "configured to be able," "configurable to be able," "operable/operable," "adapted/adaptable," "capable," "conformable/conformable," or the like. Those skilled in the art will recognize that "configured to be able to" may generally encompass components in an active state and/or components in an inactive state and/or components in a standby state unless the context indicates otherwise.
The terms "proximal" and "distal" are used herein with respect to a clinician manipulating a handle portion of a surgical instrument. The term "proximal" refers to the portion closest to the clinician and the term "distal" refers to the portion located away from the clinician. It will be further appreciated that for simplicity and clarity, spatial terms such as "vertical," "horizontal," "up," and "down" may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.
Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claims. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); this also applies to the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations). Further, in those instances where a convention analogous to "at least one of A, B and C, etc." is used, in general such construction is intended to have a meaning that one of skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include, but not be limited to, systems having a alone, B alone, C, A and B together alone, a and C together, B and C together, and/or A, B and C together, etc.). In those instances where a convention analogous to "A, B or at least one of C, etc." is used, in general such construction is intended to have a meaning that one of skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems having a alone, B alone, C, A and B together alone, a and C together, B and C together, and/or A, B and C together, etc.). It will also be understood by those within the art that, in general, disjunctive words and/or phrases having two or more alternative terms, whether appearing in the detailed description, claims, or drawings, should be understood to encompass the possibility of including one of the terms, either of the terms, or both terms, unless the context indicates otherwise. For example, the phrase "a or B" will generally be understood to include the possibility of "a" or "B" or "a and B".
Those skilled in the art will appreciate from the appended claims that the operations recited therein may generally be performed in any order. Additionally, while the various operational flow diagrams are listed in one or more sequences, it should be understood that the various operations may be performed in other sequences than the illustrated sequences, or may be performed concurrently. Examples of such alternative orderings may include overlapping, interleaved, interrupted, reordered, incremental, preliminary, complementary, simultaneous, reverse, or other altered orderings, unless context dictates otherwise. Furthermore, unless the context dictates otherwise, terms like "responsive," "related," or other past-tense adjectives are generally not intended to exclude such variations.
It is worthy to note that any reference to "an aspect" or "an example" means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, the appearances of the phrases "in one aspect" or "in an example" in various places throughout this specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more aspects.
Any patent applications, patents, non-patent publications or other published materials mentioned in this specification and/or listed in any application data sheet are herein incorporated by reference, to the extent that the incorporated materials are not inconsistent herewith. Thus, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
In summary, a number of benefits resulting from employing the concepts described herein have been described. The foregoing detailed description of one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications or variations are possible in light of the above teachings. The form or forms selected and described are illustrative of the principles and practical applications, to thereby enable one of ordinary skill in the art to utilize the various forms and modifications as are suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.

Claims (61)

1. A control system for a surgical robot, the control system comprising:
a base;
a central portion flexibly supported by the base;
a wrist longitudinally offset from the central portion and rotationally coupled to the central portion;
a multi-axis sensor arrangement configured to be able to detect a user input force applied to the central portion;
a rotation sensor configured to be able to detect a user input motion applied to the wrist;
a memory; and
a processor communicatively coupled to the memory, wherein the processor is configured to:
receiving a plurality of first input signals from the multi-axis sensor arrangement;
providing a plurality of first output signals to the surgical robot based on the plurality of first input signals;
receiving a plurality of second input signals from the rotation sensor; and
providing a plurality of second output signals to the surgical robot based on the plurality of second input signals.
2. The control system of claim 1, wherein the plurality of first input signals correspond to forces and moments applied to the central portion in three-dimensional space.
3. The control system of claim 2, wherein the plurality of first output signals correspond to translation and rotation of a surgical tool coupled to the surgical robot.
4. The control system of claim 3, wherein the plurality of second input signals correspond to rotational displacement of the wrist relative to the base in three-dimensional space.
5. The control system of claim 4, wherein the plurality of second output signals correspond to a rolling motion of the surgical tool.
6. The control system of claim 5, wherein the central portion comprises a joystick, and wherein an axis extends between the joystick and the wrist.
7. The control system of claim 1, further comprising:
a jaw movably supported on the wrist, wherein the jaw is movable between an open configuration and a clamped configuration; and
a jaw sensor configured to detect movement of the jaws,
wherein the processor is further configured to be capable of:
receiving a plurality of third input signals from the jaw sensor; and
providing a plurality of third output signals to the surgical robot indicative of actuation motions of a jaw member of a surgical tool coupled to the surgical robot.
8. A control system for a surgical robot, the control system comprising:
a first control input comprising a flexibly supported joystick;
a memory; and
a control circuit communicatively coupled to the memory, wherein the memory stores instructions executable by the control circuit to:
switching the control system between a first mode and a second mode;
receiving a plurality of first input signals from the first control input;
scaling the plurality of first input signals by a first multiplier in the first mode; and
scaling the plurality of first input signals by a second multiplier in the second mode, wherein the second multiplier is different from the first multiplier.
9. The control system of claim 8, wherein the flexibly supported joystick is operably coupled to a multi-axis force and torque sensor configured to detect forces and moments applied to the flexibly supported joystick, and wherein the plurality of first input signals correspond to output signals from the multi-axis force and torque sensor.
10. The control system of claim 8, wherein the first mode corresponds to a coarse motion mode, and wherein the second mode corresponds to a fine motion mode.
11. The control system of claim 10, wherein the control circuit is communicatively coupled to a proximity detection system, and wherein the control circuit is further configured to:
receiving a proximity signal from the proximity detection system; and
causing the control system to switch from the first mode to the second mode when the proximity signal corresponds to a proximity that is less than a threshold.
12. The control system of claim 8, further comprising:
a second control input comprising a rotatable wrist, wherein the control circuit is further configured to:
receiving a plurality of second input signals from the second control input;
scaling the plurality of second input signals by a third multiplier in the first mode; and
scaling the plurality of second input signals by a fourth multiplier in the second mode, wherein the fourth multiplier is different from the third multiplier.
13. The control system of claim 12, further comprising a rotation sensor configured to be capable of detecting rotation of the rotatable wrist, wherein the plurality of second input signals correspond to output signals from the rotation sensor.
14. The control system of claim 12, further comprising a shaft extending from the flexibly supported joystick, wherein the rotatable wrist is configured to be rotatable on the shaft.
15. The control system of claim 14, further comprising:
a pair of opposing actuators pivotally supported on the shaft; and
a sensor configured to detect pivotal movement of the pair of opposing actuators.
16. A control system for a surgical robot, the control system comprising:
a first input comprising a flexibly supported joystick and a multi-axis force and torque sensor arrangement configured to be able to detect user input forces and torques applied to the flexibly supported joystick;
a second input comprising a rotary joint and a rotation sensor configured to be able to detect a user input motion applied to the rotary joint; and
a control unit configured to be capable of:
providing a first plurality of output signals to the surgical robot based on actuation of the first input; and
providing a second plurality of output signals to the surgical robot based on actuation of the second input.
17. The control system of claim 16, wherein the flexibly supported joystick is spring biased to an upright position.
18. The control system of claim 16, wherein the control unit is communicatively coupled to a proximity detection system, and wherein the control unit is further configured to:
receiving a proximity signal from the proximity detection system;
scaling the user input force and torque detected by the multi-axis force and torque sensor arrangement by a first factor when the proximity signal is greater than a threshold; and
scaling the user input force and torque detected by the multi-axis force and torque sensor arrangement by a second factor when the proximity signal is equal to or less than the threshold value, and wherein the second factor is different from the first factor.
19. The control system of claim 18, wherein the second factor is less than the first factor.
20. The control system of claim 18, wherein the control unit is further configured to ignore the user input motion applied to the rotary joint when the proximity signal is greater than the threshold value.
21. A control system, the control system comprising:
a robotic surgical tool;
a tissue proximity detection system configured to be capable of intra-operatively detecting a distance between the robotic surgical tool and an anatomical structure;
a user input device, the user input device comprising:
a base comprising a force sensor;
a forearm support member movably coupled to the base, wherein the forearm support member is movable relative to the base within a travel zone;
a shaft extending distally from the forearm support;
a handpiece extending distally from the shaft, wherein the handpiece comprises a jaw; and
a jaw sensor configured to detect pivotal movement of the jaws;
wherein the forearm support, the shaft and the handpiece are movable together as a collective unit when the forearm support is moved relative to the base within the zone of travel, and wherein the user input device further comprises a displacement sensor configured to be able to detect movement of the collective unit; and
a control circuit communicatively coupled to the force sensor, the displacement sensor, and the jaw sensor, wherein the control circuit is configured to:
receiving a first input signal from the force sensor;
receiving a second input signal from the displacement sensor;
receiving a third input signal from the jaw sensor; and
causing the user input device to switch from a first mode to a second mode in response to an input from the tissue proximity detection system indicating that the distance between the robotic surgical tool and the anatomical structure decreases to less than a threshold distance, wherein in the first mode the first input signal controls movement of the robotic surgical tool, and wherein in the second mode the second input signal and the third input signal control movement of the robotic surgical tool.
22. The control system of claim 21, wherein the travel zone comprises a three-dimensional space surrounding a forearm starting position, wherein the forearm support is spring biased toward the forearm starting position, and wherein the three-dimensional space extends a distance of between 2.0cm and 6.0cm in all directions from the forearm starting position.
23. The control system of claim 21, wherein the forearm support includes a curved arc forming a cuff, and wherein the cuff is sized to at least partially surround an arm of a surgeon.
24. The control system of claim 21, wherein the tissue proximity detection system comprises a structured light source on the robotic surgical tool.
25. A control system, the control system comprising:
a tissue proximity detection system;
a user input device, the user input device comprising:
a base;
a forearm support member movably coupled to the base, wherein the forearm support member is movable relative to the base within a travel zone;
a shaft extending distally from the forearm support;
a handpiece extending distally from the shaft, wherein the handpiece comprises a jaw configured to pivot relative to the shaft; and
a plurality of sensors, the plurality of sensors comprising:
a first sensor arrangement configured to be able to detect a user input force to the base;
a second sensor arrangement configured to be able to detect displacement of the forearm support; and
a third sensor arrangement configured to detect pivotal movement of the jaws; and
a control circuit configured to be capable of:
receiving a proximity data signal from the tissue proximity detection system;
receiving a first input signal from the first sensor arrangement;
receiving a second input signal from the second sensor arrangement;
receiving a third input signal from the third sensor arrangement; and
causing the user input device to switch from a first mode to a second mode in response to a proximity data signal from the tissue proximity detection system indicating a predefined tissue proximity, wherein in the first mode the first input signal controls movement of the robotic surgical tool, and wherein in the second mode the second input signal and the third input signal control movement of the robotic surgical tool.
26. A user input device for controlling a robotic surgical tool, the user input device comprising:
a base comprising a first sensor arrangement;
a forearm support member movably coupled to the base, wherein the forearm support member is movable relative to the base within a travel zone, and wherein the forearm support member includes a second sensor arrangement; and
a control circuit configured to be capable of:
receiving a first input signal from the first sensor arrangement;
receiving a second input signal from the second sensor arrangement; and
switching the user input device between a first mode in which the first input signal controls movement of the robotic surgical tool and a second mode in which the second input signal controls movement of the robotic surgical tool.
27. The user input device of claim 26, wherein the control circuit is communicatively coupled to a tissue proximity detection system, and wherein the control circuit is configured to switch the user input device between the first mode and the second mode in response to an input from the tissue proximity detection system.
28. The user input device of claim 27, wherein the first mode comprises a coarse motion mode, wherein the second mode comprises a fine motion mode, and wherein the control circuit is configured to: switching the user input device between the coarse motion mode and the fine motion mode when the tissue proximity detection system provides a proximity signal indicating that the robotic surgical tool is positioned less than a threshold distance from an anatomical structure.
29. A user input device as in claim 26 wherein the first sensor arrangement comprises a six degree of freedom force and torque sensor.
30. A user input device according to claim 29 wherein the first sensor arrangement comprises a joystick movable in a three dimensional space around an input start position, wherein the three dimensional space extends in all directions from the input start position by a distance of between 1.0mm and 5.0mm, and wherein the joystick is spring biased towards the input start position.
31. A user input device as in claim 26 wherein the second sensor arrangement comprises a displacement sensor.
32. A user input device as in claim 31 wherein the travel region comprises a three dimensional space surrounding a forearm starting position and wherein the forearm support member is spring biased toward the forearm starting position.
33. A user input device as in claim 32 wherein the three dimensional space extends in all directions from the forearm starting position by a distance of between 2.0cm and 6.0 cm.
34. The user input device of claim 26, wherein the forearm support member comprises a curved arc forming a cuff, and wherein the cuff is sized to at least partially surround a surgeon's arm.
35. The user input device of claim 26, further comprising:
a shaft extending distally from the forearm support;
a handpiece extending distally from the shaft and including a first jaw and a second jaw, wherein the first jaw and the second jaw are pivotable relative to the shaft within a range of motion; and
a jaw sensor arrangement configured to detect pivotal movement of the first jaw and the second jaw within the range of motion.
36. The user input device of claim 35, wherein the jaw sensor arrangement is communicatively coupled to the control circuit, and wherein the control circuit is further configured to:
receiving a third input signal from the jaw sensor arrangement; and
providing output signals to the robotic surgical tool to control actuation of one or more jaws of an end effector of the robotic surgical tool.
37. The user input device of claim 35, further comprising:
a first finger loop on the first jaw, the first finger loop positioned and sized to receive at least one finger of a hand of a surgeon; and
a second finger loop on the second jaw, the second finger loop positioned and sized to receive at least one finger of the surgeon's hand.
38. The user input device of claim 35, further comprising:
a rotary joint interposed between the handpiece and the shaft; and
a rotation sensor configured to detect rotational movement of the handpiece relative to the shaft.
39. The user input device of claim 35, wherein the handpiece further comprises an actuator communicatively coupled to the control circuit, and wherein the control circuit is further configured to:
receiving an input actuation signal from the actuator; and
providing an output actuation signal to the robotic surgical tool to actuate a surgical function.
40. The user input device of claim 39, wherein the actuator is selected from the group consisting of: triggers, buttons, switches, levers, toggles, and combinations thereof.
41. A control system for a robotic surgical tool, the control system comprising:
an untethered handpiece, the untethered handpiece comprising:
a body;
a joystick extending from the body;
a rotatable shaft extending from the body; and
a plurality of sensors, the plurality of sensors comprising: a body sensor embedded in the body and configured to be able to detect motion of the body in three-dimensional space; a multi-axis force sensor configured to be able to detect a force applied to the joystick; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body; and
a control circuit communicatively coupled to the plurality of sensors and the proximity detection system, the control circuit configured to:
receiving a proximity signal from the proximity detection system, wherein the proximity signal indicates a proximity of the robotic surgical tool to tissue;
receiving input control signals from the plurality of sensors;
switching between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold;
providing a coarse motion control signal to the robotic surgical tool based on the input control signal from the multi-axis force sensor in the coarse motion mode; and
providing a fine motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode.
42. The control system of claim 41, wherein the joystick comprises a ring sized and positioned to receive a thumb of a user.
43. The control system of claim 41, wherein the body sensor is selected from the group consisting of an inertial sensor and an electromagnetic tracking receiver.
44. The control system of claim 41, wherein the shaft sensor is selected from the group consisting of a rotary transducer, a strain gauge, and an optical sensor.
45. A control system for a robotic surgical tool, the control system comprising:
an untethered handpiece, the untethered handpiece comprising:
a body;
an actuator extending from the body;
a rotatable shaft extending from the body; and
a plurality of sensors, the plurality of sensors comprising: a body sensor embedded in the body and configured to be able to detect motion of the body in three-dimensional space; a force sensor configured to be able to detect a force applied to the actuator; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body;
a proximity detection system configured to detect proximity of the robotic surgical tool to tissue; and
a control circuit communicatively coupled to the plurality of sensors and the proximity detection system, the control circuit configured to:
receiving a proximity signal from the proximity detection system, wherein the proximity signal is indicative of the proximity of the robotic surgical tool to tissue;
receiving input control signals from the plurality of sensors;
switching between a first mode and a second mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold;
providing a first motion control signal to the robotic surgical tool based on the input control signal from the force sensor in the first mode, wherein the first motion control signal is scaled based on the proximity signal; and
providing a second motion control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the second mode.
46. The control system of claim 45, wherein the proximity detection system comprises a structured light source and a light receiver.
47. The control system of claim 45, wherein the shaft further comprises a radial sensor configured to detect a radial force applied to the shaft, and wherein the control circuitry is further configured to:
receiving an input control signal from the radial sensor; and
providing an output control signal to the robotic surgical tool based on the input control signal from the radial sensor, wherein the output control signal is configured to apply a closing motion to one or more jaws of the robotic surgical tool.
48. A control system for a robotic surgical tool, the control system comprising:
an untethered handpiece, the untethered handpiece comprising:
a body;
an actuation arm extending proximally from the body;
A shaft extending distally from the body; and
a plurality of sensors, the plurality of sensors comprising: a body sensor embedded in the body and configured to be able to detect motion of the body in three-dimensional space; an arm sensor configured to be capable of detecting a force applied to the actuation arm; and a shaft sensor configured to be able to detect rotational displacement of the shaft relative to the body; and
a control circuit communicatively coupled to the plurality of sensors and the proximity detection system, the control circuit configured to:
receiving a proximity signal from the proximity detection system, wherein the proximity signal indicates a proximity of the robotic surgical tool to tissue;
receiving input control signals from the plurality of sensors;
switching between a coarse motion mode and a fine motion mode in response to receiving a proximity signal indicating that the proximity decreases to less than a threshold;
providing an output control signal to the robotic surgical tool based on the input control signal from the arm sensor in the coarse motion mode; and
providing an output control signal to the robotic surgical tool based on the input control signals from the body sensor and the shaft sensor in the fine motion mode.
49. The control system of claim 48, wherein the actuation arm includes a joystick that is movably coupled to the body at a spatial joint.
50. The control system of claim 49, wherein the joystick comprises a ring sized and positioned to receive a user's thumb.
51. The control system of claim 49, wherein the arm sensor comprises a multi-axis force sensor positioned at the spatial joint.
52. The control system of claim 48, wherein the body sensor comprises an inertial sensor.
53. The control system of claim 48, wherein the body sensor comprises an electromagnetic tracking receiver.
54. The control system of claim 48, wherein the proximity detection system comprises a structured light source.
55. The control system of claim 48, wherein the control circuit is further configured to scale the output control signal, which is based on the input control signal received from the arm sensor, in response to the proximity signal received from the proximity detection system in the coarse motion mode.
56. The control system of claim 55, wherein the control circuitry is further configured to decrease the output control signal in the coarse motion mode in response to the proximity signal indicating that the robotic surgical tool is approaching tissue.
57. The control system of claim 48, wherein the untethered handpiece further comprises a lock for the actuation arm, wherein the lock is movable from an unlocked position to a locked position in response to receiving the proximity signal indicating that the proximity has decreased to less than the threshold.
58. The control system of claim 48, wherein the body further comprises a spring-loaded actuation button movable between an initial position and a depressed position, and wherein the output control signal based on the input control signal detected by the body sensor in the coarse motion mode is provided to the robotic surgical tool only when the spring-loaded actuation button is moved to the depressed position.
59. The control system of claim 48, wherein the shaft further comprises a radial sensor configured to detect a radial force applied to the shaft, and wherein the control circuitry is further configured to:
receiving an input control signal from the radial sensor; and
providing an output control signal to the robotic surgical tool to apply a closing motion to one or more jaws of the robotic surgical tool based on the input control signal from the radial sensor.
60. The control system of claim 48, wherein the untethered handpiece further comprises a trigger and a trigger sensor configured to detect actuation of the trigger, and wherein the control circuit is further configured to:
receiving an input control signal from the trigger sensor; and
providing an output control signal to the robotic surgical tool to apply a closing motion to one or more jaws of the robotic surgical tool based on the input control signal from the trigger sensor.
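Claims 59 and 60 recite two alternative jaw-closure inputs, a radial force on the shaft and a trigger actuation, each mapped to a closing motion of the jaws. The sketch below assumes a normalized proportional mapping in which the larger of the two inputs wins; neither the normalization nor the arbitration is recited in the claims.

```python
def jaw_closure_command(radial_force_n: float = 0.0,
                        trigger_travel: float = 0.0,
                        full_scale_force_n: float = 20.0) -> float:
    """Return a normalized jaw-closing command (0.0 open .. 1.0 closed).

    Either the radial sensor (claim 59) or the trigger sensor (claim 60)
    can drive the closing motion; full_scale_force_n and the max()
    arbitration are illustrative assumptions.
    """
    from_force = max(0.0, min(1.0, radial_force_n / full_scale_force_n))
    from_trigger = max(0.0, min(1.0, trigger_travel))
    return max(from_force, from_trigger)
```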
61. A control system for a robotic surgical tool, the control system comprising:
an untethered handpiece, the untethered handpiece comprising:
a coarse motion controller comprising a multi-axis sensor;
a fine motion controller comprising an embedded motion sensor; and
control circuitry communicatively coupled to the multi-axis sensor, the embedded motion sensor, and a proximity detection system, the control circuitry configured to:
receiving a proximity signal from the proximity detection system, the proximity signal indicating a proximity of the robotic surgical tool to tissue; and
switching between a coarse motion mode in which the robotic surgical tool is controlled with input control signals from the coarse motion controller and a fine motion mode in which the robotic surgical tool is controlled with input control signals from the fine motion controller, in response to receiving a proximity signal indicating that the proximity has decreased to less than a threshold.
CN202080034358.XA 2019-03-15 2020-03-04 Input controls for robotic surgery Pending CN113795214A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US16/354,417 US11666401B2 (en) 2019-03-15 2019-03-15 Input controls for robotic surgery
US16/354,420 2019-03-15
US16/354,420 US20200289228A1 (en) 2019-03-15 2019-03-15 Dual mode controls for robotic surgery
US16/354,422 US11992282B2 (en) 2019-03-15 2019-03-15 Motion capture controls for robotic surgery
US16/354,422 2019-03-15
US16/354,417 2019-03-15
PCT/IB2020/051847 WO2020188390A1 (en) 2019-03-15 2020-03-04 Input controls for robotic surgery

Publications (1)

Publication Number Publication Date
CN113795214A true CN113795214A (en) 2021-12-14

Family

ID=69845478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080034358.XA Pending CN113795214A (en) 2019-03-15 2020-03-04 Input controls for robotic surgery

Country Status (3)

Country Link
EP (1) EP3937818A1 (en)
CN (1) CN113795214A (en)
WO (1) WO2020188390A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4255335A1 (en) * 2020-12-01 2023-10-11 Intuitive Surgical Operations, Inc. Interaction between user-interface and master controller
DE102021119618B4 (en) 2021-07-28 2023-02-23 Karl Storz Se & Co. Kg Input unit for a medical instrument and medical system with an input unit
US20240350122A1 (en) * 2021-08-24 2024-10-24 Rmi Oceania Pty Ltd Diagnostic imaging system
CN114323128A (en) * 2021-12-21 2022-04-12 北京罗森博特科技有限公司 Operation measurement analysis handle
WO2024171020A1 (en) * 2023-02-17 2024-08-22 Medical Microinstruments Inc. Master controller device comprising a wearable portion for robotic
WO2024218672A1 (en) * 2023-04-21 2024-10-24 Medical Microinstruments Inc. Unconstrained master control device for a master control station for medical or surgical teleoperation and control method
CN116636934B (en) * 2023-06-28 2023-09-26 敏捷医疗科技(苏州)有限公司 Master-slave delay testing device of surgical robot

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3611337A1 (en) 1986-04-04 1987-10-22 Deutsche Forsch Luft Raumfahrt OPTO-ELECTRONIC ARRANGEMENT HOUSED IN A PLASTIC BALL
US6951535B2 (en) 2002-01-16 2005-10-04 Intuitive Surgical, Inc. Tele-medicine system that transmits an entire state of a subsystem
DE10158775B4 (en) 2001-11-30 2004-05-06 3Dconnexion Gmbh Arrangement for detecting relative movements or relative positions of two objects
EP1585015B1 (en) 2004-03-17 2007-05-02 3DConnexion GmbH User interface device
EP1850210B1 (en) 2006-04-05 2012-02-01 Société Civile "GALILEO 2011" Optoelectronic device for determining relative movements or relative positions of two objects
US7516675B2 (en) 2007-07-03 2009-04-14 Kulite Semiconductor Products, Inc. Joystick sensor apparatus
US8224484B2 (en) 2007-09-30 2012-07-17 Intuitive Surgical Operations, Inc. Methods of user interface with alternate tool mode for robotic surgical tools
US9274047B2 (en) 2013-05-24 2016-03-01 Massachusetts Institute Of Technology Methods and apparatus for imaging of occluded objects
FR3016512B1 (en) * 2014-01-23 2018-03-02 Universite De Strasbourg MASTER INTERFACE DEVICE FOR MOTORIZED ENDOSCOPIC SYSTEM AND INSTALLATION COMPRISING SUCH A DEVICE
EP3228254B1 (en) 2014-02-21 2020-01-01 3DIntegrated ApS A set comprising a surgical instrument
DK178899B1 (en) 2015-10-09 2017-05-08 3Dintegrated Aps A depiction system
GB2596658B (en) * 2016-09-21 2022-05-11 Cmr Surgical Ltd User interface device
US10198086B2 (en) * 2016-10-27 2019-02-05 Fluidity Technologies, Inc. Dynamically balanced, multi-degrees-of-freedom hand controller
GB2606080B (en) * 2017-03-10 2023-02-15 Cmr Surgical Ltd Controlling a surgical instrument

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070144298A1 (en) * 2005-12-27 2007-06-28 Intuitive Surgical Inc. Constraint based control in a minimally invasive surgical apparatus
US20100262162A1 (en) * 2007-12-28 2010-10-14 Terumo Kabushiki Kaisha Medical manipulator and medical robot system
CN102596085A (en) * 2009-11-13 2012-07-18 直观外科手术操作公司 Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
CN103764061A (en) * 2011-06-27 2014-04-30 内布拉斯加大学评议会 On-board tool tracking system and methods of computer assisted surgery
WO2018112227A2 (en) * 2016-12-15 2018-06-21 Intuitive Surgical Operations, Inc. Actuated grips for controller

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11537219B2 (en) 2018-08-07 2022-12-27 The Research Foundation For The State University Of New York Feedback input apparatus and method for use thereof
CN113063538A (en) * 2021-03-22 2021-07-02 马洪文 Distributed multi-dimensional force sensor
CN113063538B (en) * 2021-03-22 2023-01-20 马洪文 Distributed multi-dimensional force sensor

Also Published As

Publication number Publication date
WO2020188390A1 (en) 2020-09-24
EP3937818A1 (en) 2022-01-19

Similar Documents

Publication Publication Date Title
US11666401B2 (en) Input controls for robotic surgery
US11992282B2 (en) Motion capture controls for robotic surgery
US11490981B2 (en) Robotic surgical controls having feedback capabilities
US11284957B2 (en) Robotic surgical controls with force feedback
US11583350B2 (en) Jaw coordination of robotic surgical controls
US11690690B2 (en) Segmented control inputs for surgical robotic systems
US20200289228A1 (en) Dual mode controls for robotic surgery
US11701190B2 (en) Selectable variable response of shaft motion of surgical robotic systems
US11471229B2 (en) Robotic surgical systems with selectively lockable end effectors
CN113795214A (en) Input controls for robotic surgery
US20200289205A1 (en) Robotic surgical systems with mechanisms for scaling camera magnification according to proximity of surgical tool to tissue
US12115029B2 (en) Analyzing surgical trends by a surgical system
JP6689203B2 (en) Medical system integrating eye tracking for stereo viewer
CN115279252A (en) Visualization system using structured light
JP2023508524A (en) dynamic surgical visualization system
CN115605158A (en) System and method for determining, adjusting and managing a resection edge around patient tissue
KR102171873B1 (en) Haptic glove and Surgical robot system
US20210196098A1 (en) Surgical system control based on multiple sensed parameters
CN115551428A (en) Adaptive surgical system control based on surgical smoke particle characteristics
CN115087406A (en) Adaptive surgical system control based on surgical smoke cloud characteristics
KR20140139840A (en) Display apparatus and control method thereof
KR20140112207A (en) Augmented reality imaging display system and surgical robot system comprising the same
KR20140115575A (en) Surgical robot system and method for controlling the same
EP4066771A1 (en) Visualization systems using structured light
WO2020188391A1 (en) Robotic surgical controls having feedback capabilities

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination