US20220031412A1 - Planning a tool path for an end-effector using an environmental map - Google Patents
- Publication number: US20220031412A1
- Authority: US (United States)
- Prior art keywords: effector, bone, tool path, environmental map, tool
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/56—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
- A61B2017/564—Methods for bone or joint treatment
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/104—Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107—Visualisation of planned trajectories or target regions
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
- A61B2034/2055—Optical tracking systems
- A61B2034/2059—Mechanical position encoders
- A61B2034/2068—Surgical navigation systems using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B34/25—User interfaces for surgical systems
- A61B2034/252—User interfaces for surgical systems indicating steps of a surgical procedure
- A61B34/30—Surgical robots
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
- A61B2034/743—Keyboards
- A61B2034/744—Mouse
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
Definitions
- The present invention generally relates to the field of robotic-assisted orthopedic surgery, and more particularly to a system and method for dynamically generating an environmental map for use in robotic-assisted surgery.
- Robotic surgery is an expanding field having applications in orthopedics, neurology, oncology, and soft-tissue procedures.
- Robotic surgical systems generally aid in the planning and execution of a bone procedure to repair or replace a damaged joint.
- The TSOLUTION ONE® Surgical System (THINK Surgical, Inc., Fremont, Calif.) is one such system that aids in the planning and execution of total hip arthroplasty (THA) and total knee arthroplasty (TKA).
- The TSOLUTION ONE® Surgical System includes: a pre-operative planning software program to generate a surgical plan; and an autonomous surgical robot that precisely mills the bone to receive an implant according to the plan.
- The robot only knows the POSE of the registered bone and the tool path 10 on which to control the end-effector tool.
- The robot, without any other sensors (e.g., a laser scanner), lacks information about the environment, including the presence or absence of obstacles (e.g., clinical staff, the patient's anatomy) in the robot's workspace. Therefore, an obstacle may unintentionally enter the end-effector's tool path.
- One particular situation where the end-effector tool may encounter an obstacle is during a tool path recovery procedure. Whenever a procedure is paused for a safety reason, the surgical team needs to investigate the problem to ensure the end-effector tool is cutting as intended and the patient is safe. To obtain a better view of the surgical site, the surgical team manually displaces (e.g., guides, removes) the end-effector tool away from the surgical site to inspect the problem. Once the problem is alleviated or addressed, the user may resume the procedure. To resume the procedure, the position of the end-effector tool needs to be recovered, meaning the end-effector tool needs to be re-positioned back to, or near, the tool's previous cutting position.
- The cut-file may include a plurality of checkpoints ( 20 a , 20 b , 20 c , 20 d ) positioned along the tool path 10 to recover the position of the end-effector tool back to, or near, the tool path 10 .
- As the end-effector tool passes the checkpoints ( 20 a , 20 b , 20 c , 20 d ) in the tool path 10 , the last checkpoint passed when the procedure was paused may be used as a reference location to re-position the cutter. For example, if the end-effector tool was milling between checkpoints 20 c and 20 d when the pause occurred, then checkpoint 20 c is used as the last checkpoint to recover the position of the end-effector tool.
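The checkpoint lookup described above can be sketched as follows. The function name, the storage of checkpoints as sorted cut-file indices, and the pause index parameter are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: recovering the last checkpoint passed before a pause,
# given checkpoints stored as sorted indices into the cut-file.
from bisect import bisect_right

def last_checkpoint_passed(checkpoint_indices, pause_index):
    """Return the cut-file index of the last checkpoint at or before
    pause_index, or None if the pause occurred before any checkpoint.

    checkpoint_indices -- sorted indices of checkpoints along the tool path
    pause_index        -- index of the tool-path point where the pause occurred
    """
    i = bisect_right(checkpoint_indices, pause_index)
    return checkpoint_indices[i - 1] if i > 0 else None

# Example: checkpoints 20a-20d at cut-file indices 0, 150, 300, 450; the tool
# was milling between 20c (300) and 20d (450) when the pause occurred.
print(last_checkpoint_passed([0, 150, 300, 450], 372))  # -> 300
```

Storing checkpoints as indices into the cut-file keeps the lookup a simple binary search over an already-ordered list.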
- The robot has no external information about the environment, including obstacles (e.g., un-cut bone) that may exist between the displaced position of the end-effector tool and the previous cutting position, without external sensors or a model of the environment. Therefore, robot-to-bone collisions while recovering the position of the end-effector tool are possible without manual user assistance, which may pose a risk to the patient and increase the surgical time.
- A method is provided to dynamically generate an environmental map in a robotic assisted surgery system.
- The method includes registering a physical surface contour of a bone to the robotic assisted surgical system, and determining a high point on the surface contour.
- A boundary is defined in the environmental map based on the high point and a plane non-parallel to a longitudinal axis of the bone. Regions starting at the boundary and extending away from the bone are labeled as free space in the environmental map, and regions starting at the boundary and extending towards the bone are labeled as invalid space in the environmental map.
- The method further includes removing material from a workpiece or the bone by manipulating an end-effector tool with a manipulator arm along a tool path of the robotic surgical system, and dynamically generating the environmental map as material is being removed by labeling the removed material as free space and the non-removed material as invalid space in the environmental map.
- A surgical system is also provided for performing the computerized method to dynamically generate an environmental map in a robotic assisted surgery system.
- The system includes a surgical robot with an end-effector tool, a computing system having user-peripherals and a monitor for displaying a graphical user interface (GUI), and at least one of a mechanical digitizer or a non-mechanical tracking system.
- FIG. 1A depicts a planned placement of an implant model relative to a bone model as is known in the prior art.
- FIG. 1B depicts a tool path having checkpoints for removing bone according to the planned placement as is known in the prior art.
- FIG. 2 depicts a surgical system having an environmental map generator software module for dynamically generating an environmental map in accordance with embodiments of the invention.
- FIG. 3 depicts an exposed proximal femur and a boundary used to dynamically generate an environmental map in accordance with embodiments of the invention.
- FIGS. 4A-4C depict the progression of a method for dynamically generating an environmental map and planning a recovery tool path in accordance with embodiments of the invention, where FIG. 4A depicts a tool removing bone, FIG. 4B depicts the position of the tool when a pause in the procedure occurs, and FIG. 4C depicts the tool displaced from the tool path and a recovery tool path for recovering the position of the tool from the displaced position back to, or near, the previous cutting position as shown in FIG. 4B.
- The present invention has utility as a system and method for dynamically generating an environmental map for use in robotic assisted surgery.
- The system and method are particularly advantageous for generating an environmental map to plan a recovery tool path that an end-effector tool can safely and efficiently follow to re-position the tool back to a cutting position following a displacement of the tool from that position.
- The environmental map may also be particularly useful for updating a virtual representation of a bone to provide a user with visual feedback as to the progression of the end-effector tool as the tool removes material from the bone.
- While the systems and methods described herein make reference to the proximal femur, they may be applied to other bones and joints in the body, illustratively including the knee, ankle, elbow, wrist, skull, and spine, as well as revision of an initial repair or replacement of any of the aforementioned bones or joints.
- The systems and methods described herein may also be applied to industrial applications, such as the generation of an environmental map for a computer numerical control (CNC) machine that mills inanimate workpieces (e.g., wood, metal).
- The term “recovery marker” refers to a physical reference marker designed to permit a measurement system, such as a mechanical tracking system, optical tracking system, electro-magnetic tracking system, ultrasound tracking system, and/or an imaging system (e.g., computed tomography (CT), X-ray, fluoroscopy, ultrasound, magnetic resonance imaging (MRI)), to determine at least one of a position or orientation of at least a portion of the reference marker.
- The term “registration” refers to the determination of the spatial relationship between two or more objects or coordinate systems, such as a computer-assist device, a bone, or an image data set of a bone. Illustrative methods of registration known in the art are described in U.S. Pat. Nos. 6,033,415, 8,010,177, 8,036,441, and 8,287,522. “Re-registration” refers to any subsequent registration procedure that occurs after an initial registration and is executed with the use of the recovery markers.
- The term “end-effector tool” refers to an instrument that is manipulated/guided by an external device (e.g., a surgical robot, a CNC machine) and interacts with a workpiece (e.g., bone). Examples of an end-effector tool include a cutter, an end-mill, a burr, a probe, an electrocautery device, a reamer, an impactor, a drill bit, a screw, forceps, scissors, and a saw.
- The term “real-time” refers to the processing of input data within milliseconds such that calculated values are available within 10 seconds of computational initiation.
- FIG. 2 depicts an embodiment of a robotic surgical system 100 shown in the context of an operating room (OR).
- The robotic surgical system 100 is capable of implementing embodiments of the inventive method as described herein.
- The surgical system 100 generally includes a surgical robot 102 , a computing system 104 , and at least one of a mechanical digitizer 118 or a non-mechanical tracking system 106 (e.g., an optical tracking system, an electro-magnetic tracking system).
- The surgical robot 102 may include a movable base 108 , a manipulator arm 110 connected to the base 108 , an end-effector flange 112 located at a distal end of the manipulator arm 110 , and an end-effector assembly 111 removably attached to the flange 112 by way of an end-effector mount/coupler 113 .
- The end-effector assembly 111 holds and/or operates an end-effector tool 115 that interacts with a portion of the patient's anatomy.
- The base 108 includes a set of wheels 117 to maneuver the base 108 , which may be fixed into position using a braking mechanism such as a hydraulic brake.
- The base 108 may further include an actuator to adjust the height of the manipulator arm 110 .
- The manipulator arm 110 includes various joints and links to manipulate the tool 115 in various degrees of freedom. The joints are illustratively prismatic, revolute, spherical, or a combination thereof.
- The computing system 104 generally includes a planning computer 114 ; a device computer 116 ; an optional tracking computer 119 if a tracking system 106 is present; and peripheral devices.
- The planning computer 114 , device computer 116 , and tracking computer 119 may be separate entities, a single collective unit, or combinations thereof depending on the surgical system.
- The peripheral devices allow a user to interface with the robotic surgical system 100 and may include: one or more user-interfaces, such as a display or monitor 120 for displaying a graphical user interface (GUI); and user-input mechanisms, such as a keyboard 121 , mouse 122 , pendent 124 , joystick 126 , foot pedal 128 , or, in some inventive embodiments, a monitor 120 having touchscreen capabilities.
- The planning computer 114 contains hardware (e.g., processors, controllers, and memory), software, data, and utilities that are in some inventive embodiments dedicated to the planning of a surgical procedure, either pre-operatively or intra-operatively. This may include reading medical imaging data, segmenting imaging data, constructing three-dimensional (3D) virtual models, storing computer-aided design (CAD) files, providing various functions or widgets to aid a user in planning the surgical procedure, and generating surgical plan data.
- The final surgical plan includes operational data for modifying a volume of tissue that is defined relative to the anatomy, such as a set of points in a cut-file to autonomously modify the volume of bone, a set of virtual boundaries defined to haptically constrain a tool within the defined boundaries to modify the bone, a set of planes or drill holes to drill pins in the bone, or a graphically navigated set of instructions for modifying the tissue.
- The data generated from the planning computer 114 may be transferred to the device computer 116 and/or tracking computer 119 through a wired or wireless connection in the operating room (OR); or transferred via a non-transient data storage medium (e.g., a compact disc (CD), a portable universal serial bus (USB) drive) if the planning computer 114 is located outside the OR.
- The data (e.g., surgical plan data, image data, cut-files, or other robotic instructions) may also be transferred using actuated LEDs as described in U.S. Pat. Pub. No. 20170245945 assigned to the assignee of the present application.
- The device computer 116 in some inventive embodiments is housed in the moveable base 108 and contains hardware (e.g., controllers), software, data, and utilities that are preferably dedicated to the operation of the surgical robot 102 . This may include surgical device control, robotic manipulator control, the processing of kinematic and inverse kinematic data, the execution of registration algorithms, the execution of calibration routines, the execution of surgical plan data, coordinate transformation processing, providing workflow instructions to a user, utilizing position and orientation (POSE) data from the tracking system 106 , and reading data received from the mechanical digitizer 118 .
- The device computer 116 may further include an environmental map generator software module for generating an environmental map during the procedure as further described below.
- The environmental map generator software module may include a motion planner software module for dynamically planning a recovery path for the end-effector tool 115 in the event a user displaces the tool 115 during the procedure as further described below.
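The motion planner step can be sketched, under the assumption of a voxelized environmental map, as a breadth-first search that travels only through cells labeled free space. The grid representation and function names are illustrative, not the patent's implementation.

```python
# Hypothetical sketch of a recovery-path planner: breadth-first search over a
# grid map where True cells are free space and False cells are invalid space.
from collections import deque

def plan_recovery_path(free, start, goal):
    """Return a list of grid cells from start to goal through free cells only,
    or None if no collision-free recovery path exists."""
    rows, cols = len(free), len(free[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:  # reconstruct the path by walking parents to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and free[nr][nc] \
                    and (nr, nc) not in parent:
                parent[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # goal unreachable through free space
```

Because breadth-first search expands cells in order of distance from the start, the first path found is also a shortest one on the grid, which keeps the recovery motion efficient as well as collision-free.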
- The optional tracking system 106 of the surgical system 100 includes two or more optical receivers 130 to detect the position of fiducial markers (e.g., retroreflective spheres, active light emitting diodes (LEDs)) uniquely arranged on rigid bodies. In specific embodiments, the optical receivers are 3D laser scanners.
- The fiducial markers arranged on a rigid body are collectively referred to as a fiducial marker array 132 , where each fiducial marker array 132 has a unique arrangement of fiducial markers, or a unique transmitting wavelength/frequency if the markers are active LEDs. In some embodiments, the fiducial markers are directly integrated onto or with the surgical device.
- An example of an optical tracking system is described in U.S. Pat. No. 6,061,644.
- The tracking system 106 may be built into a surgical light, located on a boom, a stand 140 , or built into the walls or ceilings of the OR.
- The tracking system computer 136 may include tracking hardware, software, data, and utilities to determine the POSE of objects (e.g., bones B, surgical robot 102 ) in a local or global coordinate frame.
- The POSE of the objects is collectively referred to herein as POSE data, where this POSE data may be communicated to the device computer 116 through a wired or wireless connection.
- Alternatively, the device computer 116 may determine the POSE data directly using the positions of the fiducial markers detected by the optical receivers 130 .
- The POSE data is determined using the position data detected from the optical receivers 130 and operations/processes such as image processing, image filtering, triangulation algorithms, geometric relationship processing, registration algorithms, calibration algorithms, and coordinate transformation processing.
- The POSE of a digitizer probe 138 with an attached probe fiducial marker array 132 b may be calibrated such that the probe tip is continuously known in physical space as described in U.S. Pat. No. 7,043,961.
- The POSE of the tool 115 may be known with respect to a device fiducial marker array 132 a using a calibration method as described in U.S. Prov. Pat. App. 62/128,857.
- The device fiducial marker array 132 a is depicted on the manipulator arm 110 ; however, the marker array 132 a may be positioned on the base 108 or the end-effector assembly 111 .
- Registration algorithms may be executed to determine the POSE and coordinate transforms between a bone B, a fiducial marker array 132 c or 132 d , a surgical plan, and the surgical robot 102 using the aforementioned registration methods.
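The coordinate-transform processing above amounts to composing rigid-body transforms. As a minimal sketch (names and frame labels are assumed, not from the patent), the tool POSE measured by the tracker can be re-expressed in the bone's coordinate frame:

```python
# Illustrative sketch: composing rigid transforms so a tool POSE measured in
# camera coordinates is expressed in the bone's frame. A rigid transform is a
# 3x3 rotation R plus translation t; its inverse is (R^T, -R^T t).

def invert(T):
    """Invert a rigid transform T = (R, t), R a 3x3 tuple-of-rows, t a 3-tuple."""
    R, t = T
    Rt = tuple(tuple(R[r][c] for r in range(3)) for c in range(3))  # transpose
    t_inv = tuple(-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3))
    return (Rt, t_inv)

def compose(A, B):
    """Return the transform equivalent to applying B first, then A (A @ B)."""
    RA, tA = A
    RB, tB = B
    R = tuple(tuple(sum(RA[i][k] * RB[k][j] for k in range(3)) for j in range(3))
              for i in range(3))
    t = tuple(sum(RA[i][k] * tB[k] for k in range(3)) + tA[i] for i in range(3))
    return (R, t)

def tool_in_bone_frame(T_cam_bone, T_cam_tool):
    """POSE of the tool relative to the bone: inv(camera->bone) o (camera->tool)."""
    return compose(invert(T_cam_bone), T_cam_tool)

# Example: bone frame translated (100, 0, 0) mm in camera coordinates and the
# tool at (110, 5, 20) mm -> the tool sits at (10, 5, 20) in the bone frame.
I = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
_, t = tool_in_bone_frame((I, (100.0, 0.0, 0.0)), (I, (110.0, 5.0, 20.0)))
print(t)  # (10.0, 5.0, 20.0)
```

The same composition pattern chains through any number of intermediate frames (tracker, marker array, bone, plan), which is why each registration only needs to supply one link in the chain.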
- While 3D laser scanning has been contemplated as an adjunct to surgery (Harmon, L., et al. “3D laser scanning for image-guided neurosurgery.” Ann Arbor 1001 (1994): 48113-4001), the technique has met with limited success owing to the inadequate resolution of such scans, which are nonetheless rapidly collected. This technique is known synonymously as light detection and ranging (LIDAR).
- Such 3D laser scanning is operative herein to provide a rapid and independent mapping of the surgical field for obstacles about which the surgical robotic system lacks information. The resulting scans are readily communicated and compared to the bone and/or fiducial marker registration to avoid collisions.
- The POSE data is used by the computing system 104 during the procedure to update the POSE and/or coordinate transforms of the bone B, the surgical plan, and the surgical robot 102 as the manipulator arm 110 and/or bone B move during the procedure, such that the surgical robot 102 can accurately execute the surgical plan.
- In some inventive embodiments, the surgical system 100 does not include a tracking system 106 , but instead employs a bone fixation and monitoring system that fixes the bone directly to the surgical robot 102 and monitors bone movement as described in U.S. Pat. No. 5,086,401.
- A particular inventive embodiment of a method for dynamically generating an environmental map is shown in FIG. 3 .
- The surgical team first exposes the bone B via an incision 146 , and the bone B is registered to the surgical robot 102 and the surgical plan (the surgical plan having the tool path 10 ) with techniques known in the art.
- A boundary 148 (e.g., a hyperplane, a meshed surface generated by a camera) is then defined in space (e.g., in a coordinate system associated with the surgical robot 102 or tracking system 106 ).
- The algorithms for such registration, and by extension robotic navigation, are well known as provided in exemplary form in Bobrow, J. E. et al. “Time-optimal control of robotic manipulators along specified paths.” Int. J. Robot. Res., 4(3):3-17, 1985.
- The highest point on the bone may be determined by designating such a point as a landmark on a 3-D virtual bone model of the bone B in the pre-operative planning software, wherein the point, once registered to the actual bone, is known in physical space.
- The ‘high point’ refers to the most proximal end point or the most distal end point on the bone, depending on which end the surgical robot 102 is operating (e.g., if the surgical robot 102 is preparing the proximal femur, then the ‘high point’ is the most proximal end point of the femur).
- The X-Y plane may be defined relative to the bone B using several methods.
- In one method, a coordinate system 152 of the bone B is defined relative to a 3-D virtual bone model of the bone B in the pre-operative planning software.
- A +Z-axis 154 and a −Z-axis 156 of the bone B are defined along a longitudinal axis of the bone B, such as the mechanical axis or anatomical axis of the bone B.
- The X-axis and Y-axis are then defined perpendicular to the Z-axis and perpendicular to one another using anatomical landmarks (e.g., the femoral condyles).
- The +Z-axis 154 is shown pointing in a direction opposing the direction of the end-effector's tool path 10 , and the −Z-axis 156 is shown pointing in a direction towards the end-effector's tool path 10 .
- the highest point 150 is defined as the most +Z point on the bone B as determined from the digitizing and registration process, plus some optional offset.
- the boundary 148 is then defined in space using the highest point 150 and the X-Y plane as defined above.
- the boundary 148 may be defined by the computing system 104 and stored in the environmental map generator software module.
- the environmental map generator then begins mapping the environment by designating anything beyond the boundary 148 in the +Z-axis direction as free space FS and anything below the surface in the ⁇ Z-axis direction as invalid space IS.
- the free space FS is defined as any region in the environment where the end-effector can safely travel
- the invalid space IS is defined as any region in the environment that may be unsafe and/or cause an unintended collision with the end-effector tool.
- FIGS. 4A-4C depicts the end-effector assembly 111 operating a tool 115 having a tool tip 142 at different stages throughout embodiments of the inventive method.
- FIG. 4A depicts the end-effector tool 115 removing material from a bone B.
- the manipulator arm 110 manipulates the end-effector tool 115 according to a tool path 10 registered to the bone B.
- the environmental map generator associated with at least one of the device computer 116 or tracking computer 119 maps the environment in real-time.
- the environmental map generator maps the environment by labeling the volume of removed material by the end-effector tool 115 as free space FS, and all other regions as invalid space IS.
- FIG. 4B depicts the end-effector tool 115 positioned farther along the tool path 10 . As such, the volume of the free space FS has increased as the end-effector tool 115 has removed more bone B. Therefore, the environmental map is generated and updated in real-time based on the movements of the end-effector tool 115 as the tool 115 removes bone B.
- the environmental map generator knows the POSE of the tool 115 as the tool 115 is removing bone B to update the free space FS based on at least one of: the kinematics of the robot manipulator arm, POSE data collected from a tracking system, or a combination thereof.
- the environmental map generator only updates the environmental map (i.e., free space FS and invalid space IS) when the end-effector tool 115 is ‘on’ (e.g., an end-mill actively spinning) and when the end-effector tool 115 has crossed over the boundary 148 from +Z to ⁇ Z. Therefore, the motion planner module does not inadvertently mislabel a region of the environment as free space FS.
- the POSE of the surgical plan having the tool path 10 and the environmental map are updated to maintain their relative POSE to the bone B.
- the POSEs may be updated in real-time based on the POSE data.
- the surgical system utilizes a fixation system, bone motion monitoring system, and recovery markers as described in U.S. Pat. No. 5,086,401 and U.S. Pat. No. 6,430,434, then the recovery markers are used to update the relative POSEs. Therefore, the labeled free space FS and labeled invalid space IS in the environmental map is not compromised in the event of bone motion.
- FIG. 4B depicts the end-effector tool 115 at point in the tool path 10 where the procedure is paused due to one of several reasons, such as a safety reason.
- a user then displaces the end-effector tool 115 to a displaced position away from the tool path 10 .
- FIG. 4C depicts the tool 115 being displaced outside of the volume of removed bone B.
- the motion planner module searches for a recovery tool path 144 that: a) minimizes the path length from the current displaced position of the tool 115 (as shown in FIG. 4C ) back to, or near, the pre-displaced position of the tool 115 on the tool path 10 (as shown in FIG. 4B ); and b) maximizes dexterity and distance away from obstacles (i.e., invalid space IS).
- the motion planner module may utilize one or more algorithms to search for the recovery tool path 144 based on the foregoing constraints (a) and (b).
- the algorithms illustratively include optimization algorithms, probability roadmaps (PRM), rapidly-exploring random trees (RRT), potential field method, or any variants of the mentioned algorithms.
- the motion planner module may further use existing landmarks such as checkpoints 20 C in the tool path 10 to navigate the end-effector tool 115 back to, or near, the pre-displaced position of the end-effector tool 115 . More specifically, if the tool path 10 for the end-effector tool 115 includes checkpoints ( 20 a , 20 b , 20 c , 20 d ), then each checkpoint ( 20 a , 20 b , 20 c , 20 d ) encountered while removing the bone B is checked for reachability.
- A reachable checkpoint is any checkpoint in the robot workspace that can be reached by the robot's end-effector, since not all of the checkpoints in the cut-file may be directly on the end-effector's tool path 10.
- The graphical user interface may display the progress of the end-effector removing bone. For example, the GUI may display the 3-D bone model being milled. The GUI may also display the amount of bone removed based on the labelled free space. Since the software module is recording and labelling the removed bone in real-time as free space, the free space is used as a metric to update the volume of bone removed accordingly. This provides a quick method to determine and display the progression of the milling in real-time relative to a virtual representation of the bone.
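The free-space metric described above can be made concrete with a small sketch. The voxel bookkeeping and all names below are illustrative assumptions, not the patent's actual data model: progress is reported as the share of the planned resection volume whose voxels have been labeled free space.

```python
# Hedged sketch of the progress metric: removed bone is recorded as free
# space, so milling progress is the fraction of the planned resection
# volume already labeled free. Voxels are represented as integer tuples.

def milling_progress(planned_voxels, free_voxels):
    """Fraction of the planned resection volume labeled as free space."""
    if not planned_voxels:
        return 0.0
    removed = planned_voxels & free_voxels  # free voxels inside the plan
    return len(removed) / len(planned_voxels)

# Four planned voxels; two have been cleared by the tool so far.
planned = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)}
free = {(0, 0, 0), (1, 0, 0), (5, 5, 5)}  # includes free space outside the plan

progress = milling_progress(planned, free)
```

A GUI could poll such a value each update cycle to redraw the virtual bone model's removed volume.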
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Robotics (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Abstract
A system and method are provided for planning a tool path for an end-effector of a surgical robot from a first position to a second position using an environmental map. The end-effector may safely and efficiently follow the tool path to re-position the end-effector back to a cutting position following the displacement of the end-effector from the cutting position. Additionally, the environmental map is used to update a virtual representation of a bone to provide a user with visual feedback as to the progression of the end-effector tool as the tool removes material from the bone during a surgical procedure.
Description
- This application is a continuation of U.S. application Ser. No. 16/250,341, filed Jan. 17, 2019, which in turn claims priority benefit of U.S. Provisional Application Ser. No. 62/621,307, filed 24 Jan. 2018; the contents of which are hereby incorporated by reference.
- The present invention generally relates to the field of robotic-assisted orthopedic surgery, and more particularly to a system and method for dynamically generating an environmental mapping for use in robotic assisted surgery.
- Robotic surgery is an expanding field having applications in orthopedics, neurology, oncology, and soft-tissue procedures. In the field of orthopedics, robotic surgical systems generally aid in the planning and execution of a bone procedure to repair and replace a damaged joint. The TSOLUTION ONE® Surgical System (THINK Surgical, Inc., Fremont, Calif.) is one such system that aids in the planning and execution of total hip arthroplasty (THA) and total knee arthroplasty (TKA). The TSOLUTION ONE® Surgical System includes: a pre-operative planning software program to generate a surgical plan; and an autonomous surgical robot that precisely mills the bone to receive an implant according to the plan. In more detail, with reference to
FIG. 1A, a user plans the placement of an implant model IM (e.g., a computer-aided design (CAD) file of the implant) relative to a three-dimensionally generated bone model BM of the patient's bone to designate the best fit, fill, and alignment for the implant in each patient case. Once completed, the plan is saved and transferred to the robot for execution intra-operatively. In the operating room (OR), the plan is registered to the bone and the surgical robot controls an end-effector tool to mill the bone according to the plan. More specifically, as shown in FIG. 1B, the end-effector tool is manipulated based on a cut-file having cutting parameters. The cutting parameters include a tool path 10, among other parameters (e.g., feed-rates, spindle speed), that defines the motion for the end-effector tool to accurately modify the bone B according to the planned implant placement. - However, the robot only knows the POSE of the registered bone and the
tool path 10 on which to control the end-effector tool. The robot, without any other sensors (e.g., laser scanner), lacks information about the environment, including the presence or absence of obstacles (e.g., clinical staff, patient's anatomy) in the robot's workspace. Therefore, an obstacle may unintentionally encounter the end-effector's tool path. - One particular situation where the end-effector tool may encounter an obstacle is during a tool path recovery procedure. Whenever a procedure is paused for a safety reason, the surgical team needs to investigate the problem to ensure the end-effector tool is cutting as intended and the patient is safe. To obtain a better view of the surgical site, the surgical team manually displaces (e.g., guides, removes) the end-effector tool away from the surgical site to inspect the problem. Once the problem is alleviated or addressed, the user may resume the procedure. To resume the procedure, the position of the end-effector tool needs to be recovered, meaning the end-effector tool needs to be re-positioned back to, or near, the tool's previous cutting position. In one method, the cut-file may include a plurality of checkpoints (20 a, 20 b, 20 c, 20 d) positioned along the
tool path 10 to recover the position of the end-effector tool back to, or near, the tool path 10. As the end-effector tool passes the checkpoints (20 a, 20 b, 20 c, 20 d) in the tool path 10, the last checkpoint passed when the procedure was paused may be used as a reference location to re-position the cutter. For example, if the end-effector tool was milling between checkpoints 20 c and 20 d when the procedure was paused, then checkpoint 20 c is used as the last checkpoint to recover the position of the end-effector tool. - However, as previously mentioned, the robot has no external information about the environment, including obstacles (e.g., un-cut bone), which may exist between the displaced position of the end-effector tool and the previous cutting position without having external sensors or a model of the environment. Therefore, robot-to-bone collisions while recovering the position of the end-effector tool are possible without manual user assistance, which may pose a risk to the patient and increase the surgical time.
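The "last checkpoint passed" bookkeeping described above can be sketched in a few lines. The function name and the one-dimensional distance-along-path parameterization are assumptions for illustration, not the patent's actual cut-file format:

```python
# Hypothetical sketch of checkpoint bookkeeping: the cut-file carries
# checkpoints along the tool path, and the last checkpoint passed before
# a pause serves as the reference location for recovering the tool.

def last_checkpoint_passed(checkpoints, paused_at):
    """Return the id of the last checkpoint at or before the paused position.

    checkpoints: list of (checkpoint_id, path_distance) sorted by distance.
    paused_at: distance along the tool path where the procedure paused.
    """
    passed = [cid for cid, dist in checkpoints if dist <= paused_at]
    return passed[-1] if passed else None

# Four checkpoints spaced along a 100 mm tool path (made-up distances).
checkpoints = [("20a", 10.0), ("20b", 35.0), ("20c", 60.0), ("20d", 85.0)]

# Pause occurs while milling between checkpoints 20c and 20d.
reference = last_checkpoint_passed(checkpoints, paused_at=72.5)
```

Here the pause at 72.5 mm falls between checkpoints 20c and 20d, so 20c is returned as the recovery reference.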
- Thus, there is a need in the art for a method to dynamically generate an environmental map for use during robot assisted surgery. There further exists a need to dynamically plan a recovery tool path that an end-effector tool can safely and efficiently follow to re-position the tool back to, or near, a cutting position following the displacement of the tool from the cutting position with the aid of the environmental map.
- A method is provided to dynamically generate an environmental map in a robotic assisted surgery system. The method includes registering a physical surface contour of a bone to the robotic assisted surgical system, and determining a high point on the surface contour. A boundary is defined in the environmental map based on the high point and a plane non-parallel to a longitudinal axis of the bone; regions starting at the boundary and extending away from the bone are labeled as free space in the environmental map, while regions starting at the boundary and extending towards the bone are labeled as invalid space. The method further includes removing material from a workpiece or the bone by manipulating an end-effector tool with a manipulator arm along a tool path of the robotic surgical system, and dynamically generating the environmental map as material is being removed by labeling the removed material as free space in the environmental map and labeling the non-removed material as invalid space in the environmental map.
- A surgical system is provided for performing the computerized method to dynamically generate an environmental map in a robotic assisted surgery system. The system includes a surgical robot with an end effector tool, a computing system having user-peripherals and a monitor for displaying a graphical user interface (GUI), and at least one of a mechanical digitizer or a non-mechanical tracking system.
- The present invention is further detailed with respect to the following drawings that are intended to show certain aspects of the present invention, but should not be construed as a limit on the practice of the invention, wherein:
-
FIG. 1A depicts a planned placement of an implant model relative to a bone model as is known in the prior art;
-
FIG. 1B depicts a tool path having checkpoints for removing bone according to the planned placement as is known in the prior art;
-
FIG. 2 depicts a surgical system having an environmental map generator software module for dynamically generating an environmental map in accordance with embodiments of the invention;
-
FIG. 3 depicts an exposed proximal femur and a boundary used to dynamically generate an environmental map in accordance with embodiments of the invention; and
-
FIGS. 4A-4C depict the progression of a method for dynamically generating an environmental map and planning a recovery tool path in accordance with embodiments of the invention, where FIG. 4A depicts a tool removing bone, FIG. 4B depicts a position of the tool when a pause in the procedure occurs, and FIG. 4C depicts the tool displaced from the tool path and a recovery tool path for recovering the position of the tool from the displaced position back to, or near, the previous cutting position as shown in FIG. 4B. - The present invention has utility as a system and method for dynamically generating an environmental map for use in robotic assisted surgery. The system and method are particularly advantageous for generating an environmental map to plan a recovery tool path that an end-effector tool can safely and efficiently follow to re-position the tool back to a cutting position following the displacement of the tool from the cutting position. Additionally, the environmental map may be particularly useful to update a virtual representation of a bone to provide a user with visual feedback as to the progression of the end-effector tool as the tool removes material from the bone.
- The present invention will now be described with reference to the following embodiments. As is apparent by these descriptions, this invention can be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. For example, features illustrated with respect to one embodiment can be incorporated into other embodiments, and features illustrated with respect to a particular embodiment may be deleted from the embodiment. In addition, numerous variations and additions to the embodiments suggested herein will be apparent to those skilled in the art in light of the instant disclosure, which do not depart from the instant invention. Hence, the following specification is intended to illustrate some particular embodiments of the invention, and not to exhaustively specify all permutations, combinations, and variations thereof.
- Further, it should be appreciated that although the systems and methods described herein make reference to the proximal femur bone, the systems and methods may be applied to other bones and joints in the body illustratively including the knee, ankle, elbow, wrist, skull, and spine, as well as revision of initial repair or replacement of any of the aforementioned bones or joints. It should further be appreciated that the systems and methods described herein may be applied to industrial applications, such as the generation of an environmental map for a computer numerical control (CNC) machine that mills inanimate workpieces (e.g., wood, metal).
- Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
- Unless indicated otherwise, explicitly or by context, the following terms are used herein as set forth below.
- As used in the description of the invention and the appended claims, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
- Also as used herein, “and/or” refers to and encompasses any and all possible combinations of one or more of the associated listed items, as well as the lack of combinations when interpreted in the alternative (“or”).
- As used herein, the term “recovery marker” refers to a physical reference marker designed to permit a measurement system, such as a mechanical tracking system, optical tracking system, electro-magnetic tracking system, ultrasound tracking system, and/or an imaging system (e.g., computed tomography (CT), X-ray, fluoroscopy, ultrasound, magnetic resonance imaging (MRI)), to determine at least one of a position or orientation of at least a portion of the reference marker.
- As used herein, the term “registration” refers to the determination of the spatial relationship between two or more objects or coordinate systems such as a computer-assist device, a bone, or an image data set of a bone. Illustrative methods of registration known in the art are described in U.S. Pat. Nos. 6,033,415, 8,010,177, 8,036,441, and 8,287,522. “Re-registration” refers to any subsequent registration procedure that occurs after an initial registration and is executed with the use of the recovery markers.
- As used herein, the term “end-effector tool” refers to an instrument that is manipulated/guided by an external device (e.g., surgical robot, CNC machine) and interacts with a workpiece (e.g., bone). Illustrative examples of an end-effector tool include a cutter, an end-mill, a burr, a probe, an electrocautery device, a reamer, an impactor, a drill bit, a screw, forceps, scissors, and a saw.
- As used herein, the term “real-time” refers to the processing of input data within milliseconds such that calculated values are available within 10 seconds of computational initiation.
- With reference now to the figures,
FIG. 2 depicts an embodiment of a robotic surgical system 100 shown in the context of an operating room (OR). The robotic surgical system 100 is capable of implementing embodiments of the inventive method as described herein. The surgical system 100 generally includes a surgical robot 102, a computing system 104, and at least one of a mechanical digitizer 118 or a non-mechanical tracking system 106 (e.g., an optical tracking system, an electro-magnetic tracking system). - The
surgical robot 102 may include a movable base 108, a manipulator arm 110 connected to the base 108, an end-effector flange 112 located at a distal end of the manipulator arm 110, and an end-effector assembly 111 removably attached to the flange 112 by way of an end-effector mount/coupler 113. The end-effector assembly 111 holds and/or operates an end-effector tool 115 that interacts with a portion of a patient's anatomy. The base 108 includes a set of wheels 117 to maneuver the base 108, which may be fixed into position using a braking mechanism such as a hydraulic brake. The base 108 may further include an actuator to adjust the height of the manipulator arm 110. The manipulator arm 110 includes various joints and links to manipulate the tool 115 in various degrees of freedom. The joints are illustratively prismatic, revolute, spherical, or a combination thereof. - The
computing system 104 generally includes a planning computer 114; a device computer 116; an optional tracking computer 119 if a tracking system 106 is present; and peripheral devices. The planning computer 114, device computer 116, and tracking computer 119 may be separate entities, a single collective unit, or combinations thereof depending on the surgical system. The peripheral devices allow a user to interface with the robotic surgical system 100 and may include: one or more user-interfaces, such as a display or monitor 120 for displaying a graphical user interface (GUI); and user-input mechanisms, such as a keyboard 121, mouse 122, pendant 124, joystick 126, foot pedal 128, or the monitor 120, which in some inventive embodiments has touchscreen capabilities. - The
planning computer 114 contains hardware (e.g., processors, controllers, and memory), software, data, and utilities that are in some inventive embodiments dedicated to the planning of a surgical procedure, either pre-operatively or intra-operatively. This may include reading medical imaging data, segmenting imaging data, constructing three-dimensional (3D) virtual models, storing computer-aided design (CAD) files, providing various functions or widgets to aid a user in planning the surgical procedure, and generating surgical plan data. The final surgical plan includes operational data for modifying a volume of tissue that is defined relative to the anatomy, such as a set of points in a cut-file to autonomously modify the volume of bone, a set of virtual boundaries defined to haptically constrain a tool within the defined boundaries to modify the bone, a set of planes or drill holes to drill pins in the bone, or a graphically navigated set of instructions for modifying the tissue. The data generated from the planning computer 114 may be transferred to the device computer 116 and/or tracking computer 119 through a wired or wireless connection in the operating room (OR); or transferred via a non-transient data storage medium (e.g., a compact disc (CD), a portable universal serial bus (USB) drive) if the planning computer 114 is located outside the OR. In particular embodiments, data (e.g., surgical plan data, image data, cut-files, or other robotic instructions) is transferred in the OR using actuated LEDs as described in U.S. Pat. Pub. No. 20170245945 assigned to the assignee of the present application. - The
device computer 116 in some inventive embodiments is housed in the movable base 108 and contains hardware (e.g., controllers), software, data, and utilities that are preferably dedicated to the operation of the surgical robot 102. This may include surgical device control, robotic manipulator control, the processing of kinematic and inverse kinematic data, the execution of registration algorithms, the execution of calibration routines, the execution of surgical plan data, coordinate transformation processing, providing workflow instructions to a user, utilizing position and orientation (POSE) data from the tracking system 106, and reading data received from the mechanical digitizer 118. The device computer 116 may further include an environmental map generator software module for generating an environmental map during the procedure as further described below. The environmental map generator software module may include a motion planner software module for dynamically planning a recovery path for the end-effector tool 115 in the event a user displaces the tool 115 during the procedure as further described below. - The
optional tracking system 106 of the surgical system 100 includes two or more optical receivers 130 to detect the position of fiducial markers (e.g., retroreflective spheres, active light emitting diodes (LEDs)) uniquely arranged on rigid bodies. In still other embodiments the optical receivers are 3D laser scanners. The fiducial markers arranged on a rigid body are collectively referred to as a fiducial marker array 132, where each fiducial marker array 132 has a unique arrangement of fiducial markers, or a unique transmitting wavelength/frequency if the markers are active LEDs. In an embodiment, the fiducial markers are directly integrated onto or with the surgical device. An example of an optical tracking system is described in U.S. Pat. No. 6,061,644. The tracking system 106 may be built into a surgical light, located on a boom, a stand 140, or built into the walls or ceilings of the OR. The tracking system computer 136 may include tracking hardware, software, data, and utilities to determine the POSE of objects (e.g., bones B, surgical robot 102) in a local or global coordinate frame. The POSE of the objects is collectively referred to herein as POSE data, where this POSE data may be communicated to the device computer 116 through a wired or wireless connection. Alternatively, the device computer 116 may determine the POSE data using the position of the fiducial markers detected from the optical receivers 130 directly. - The POSE data is determined using the position data detected from the
optical receivers 130 and operations/processes such as image processing, image filtering, triangulation algorithms, geometric relationship processing, registration algorithms, calibration algorithms, and coordinate transformation processing. For example, the POSE of a digitizer probe 138 with an attached probe fiducial marker array 132 b may be calibrated such that the probe tip is continuously known in physical space as described in U.S. Pat. No. 7,043,961. The POSE of the tool 115 may be known with respect to a device fiducial marker array 132 a using a calibration method as described in U.S. Prov. Pat. App. 62/128,857. It should be appreciated that the device fiducial marker 132 a is depicted on the manipulator arm 110; however, the marker 132 a may be positioned on the base 108 or the end-effector assembly 111. Registration algorithms may be executed to determine the POSE and coordinate transforms between a bone B, a fiducial marker array 132, and/or the surgical robot 102 using the aforementioned registration methods.
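To make the coordinate transformation processing concrete, the sketch below applies a 4x4 homogeneous transform mapping a point from the bone coordinate system into the robot coordinate system. This is a generic illustration, not the patent's implementation, and the transform values are made up; plain nested lists are used to keep it dependency-free:

```python
# Minimal sketch of homogeneous-transform usage: registration yields a
# transform from bone coordinates to robot coordinates; applying it to a
# digitized landmark locates that landmark in robot space.

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists) to a 3-D point."""
    x, y, z = p
    ph = [x, y, z, 1.0]  # homogeneous coordinates
    return [sum(T[i][j] * ph[j] for j in range(4)) for i in range(3)]

# Identity rotation with the bone origin offset 100 mm along the robot x-axis.
T_robot_bone = [
    [1.0, 0.0, 0.0, 100.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

# A landmark digitized at the bone origin appears at x = 100 mm in robot space.
landmark_robot = transform_point(T_robot_bone, (0.0, 0.0, 0.0))
```

Chaining such transforms (camera-to-marker, marker-to-bone, and so on) is how POSE data from the receivers is propagated into a common frame.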
- The POSE data is used by the
computing system 104 during the procedure to update the POSE and/or coordinate transforms of the bone B, the surgical plan, and the surgical robot 102 as the manipulator arm 110 and/or bone B move during the procedure, such that the surgical robot 102 can accurately execute the surgical plan. In another embodiment, the surgical system 100 does not include a tracking system 106, but instead employs a bone fixation and monitoring system that fixes the bone directly to the surgical robot 102 and monitors bone movement as described in U.S. Pat. No. 5,086,401. - With reference to
FIG. 3, a particular inventive embodiment of a method for dynamically generating an environmental map is shown. The surgical team first exposes the bone B via an incision 146, and the bone B is registered to the surgical robot 102 and the surgical plan (the surgical plan having the tool path 10) with techniques known in the art. After registration, a boundary 148 (e.g., a hyperplane, a meshed surface generated by a camera) is defined in space (e.g., in a coordinate system associated with the surgical robot 102 or tracking system 106) as: a) the highest point 150 on the bone B, plus some optional offset; and b) parallel to an X-Y plane that is defined perpendicular to a longitudinal axis of the bone B as further described. The algorithms for such registration, and by extension robotic navigation, are well known as provided in exemplary form in Bobrow, J. E. et al. "Time-optimal control of robotic manipulators along specified paths." Int. J. Robot. Res., 4(3):3-17, 1985. The highest point on the bone may be determined by designating such point as a landmark on a 3-D virtual bone model of the bone B in the pre-operative planning software, wherein the point, once the model is registered to the actual bone, is known in physical space. In specific embodiments, the 'high point' refers to the most proximal end point or the most distal end point on the bone depending on which end the surgical robot 102 is operating (e.g., if the surgical robot 102 is preparing the proximal femur, then the 'high point' is the most proximal end point of the femur). The X-Y plane may be defined relative to the bone B using several methods. In a specific embodiment, a coordinate system 152 of the bone B is defined relative to a 3-D virtual bone model of the bone B in the pre-operative planning software. Generally, a +Z-axis 154 and −Z-axis 156 of the bone B are defined along a longitudinal axis of the bone B, such as the mechanical axis or anatomical axis of the bone B.
The X-axis and Y-axis are then defined perpendicular to the Z-axis and perpendicular to one another using anatomical landmarks (e.g., femoral condyles). Here, as illustrated in FIG. 3, the +Z-axis 154 is shown pointing away from the bone B and the −Z-axis is shown pointing towards the bone B. In other words, the +Z-axis 154 is shown pointing in a direction opposing the direction of the end-effector's tool path 10, and the −Z-axis 156 is shown pointing in a direction towards the end-effector's tool path 10. The highest point 150 is defined as the most +Z point on the bone B as determined from the digitizing and registration process, plus some optional offset. The boundary 148 is then defined in space using the highest point 150 and the X-Y plane as defined above. - The
boundary 148 may be defined by the computing system 104 and stored in the environmental map generator software module. The environmental map generator then begins mapping the environment by designating anything beyond the boundary 148 in the +Z-axis direction as free space FS and anything below the surface in the −Z-axis direction as invalid space IS. The free space FS is defined as any region in the environment where the end-effector can safely travel, while the invalid space IS is defined as any region in the environment that may be unsafe and/or cause an unintended collision with the end-effector tool. - With reference to
FIGS. 4A-4C, particular embodiments for furthering the generation of the environmental map as the end-effector tool removes bone are shown. FIGS. 4A-4C depict the end-effector assembly 111 operating a tool 115 having a tool tip 142 at different stages throughout embodiments of the inventive method. FIG. 4A depicts the end-effector tool 115 removing material from a bone B. The manipulator arm 110 manipulates the end-effector tool 115 according to a tool path 10 registered to the bone B. As the end-effector tool 115 removes bone B, the environmental map generator associated with at least one of the device computer 116 or tracking computer 119 maps the environment in real-time. The environmental map generator maps the environment by labeling the volume of material removed by the end-effector tool 115 as free space FS, and all other regions as invalid space IS. FIG. 4B depicts the end-effector tool 115 positioned farther along the tool path 10. As such, the volume of the free space FS has increased as the end-effector tool 115 has removed more bone B. Therefore, the environmental map is generated and updated in real-time based on the movements of the end-effector tool 115 as the tool 115 removes bone B. The environmental map generator knows the POSE of the tool 115 as the tool 115 is removing bone B to update the free space FS based on at least one of: the kinematics of the robot manipulator arm, POSE data collected from a tracking system, or a combination thereof. In particular inventive embodiments, the environmental map generator only updates the environmental map (i.e., free space FS and invalid space IS) when the end-effector tool 115 is 'on' (e.g., an end-mill actively spinning) and when the end-effector tool 115 has crossed over the boundary 148 from +Z to −Z. Therefore, the motion planner module does not inadvertently mislabel a region of the environment as free space FS.
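The labeling rules above can be sketched with a deliberately simplified one-dimensional model of the map (layers along the Z-axis only). The dictionary representation and function names are illustrative assumptions, not the patent's data structures; the point is the two rules: initialize free above the boundary and invalid below, then relabel as free only where an active tool has crossed below the boundary.

```python
# Illustrative sketch of the environmental map generator: regions above the
# boundary start as free space (FS), regions below start as invalid space
# (IS), and layers swept by an actively cutting tool below the boundary
# are relabeled as free space.

FREE, INVALID = "FS", "IS"

def init_map(z_levels, boundary_z):
    """Label each Z layer free above the boundary, invalid at or below it."""
    return {z: (FREE if z > boundary_z else INVALID) for z in z_levels}

def update_map(env_map, tool_z, tool_on, boundary_z):
    """Relabel the layer at the tool tip as free space, but only when the
    tool is 'on' and has crossed the boundary from +Z to -Z."""
    if tool_on and tool_z <= boundary_z and tool_z in env_map:
        env_map[tool_z] = FREE

# Boundary at the bone's highest point (z = 0), no offset.
env = init_map(z_levels=[2, 1, 0, -1, -2], boundary_z=0)

update_map(env, tool_z=-1, tool_on=True, boundary_z=0)   # cutting: relabeled
update_map(env, tool_z=-2, tool_on=False, boundary_z=0)  # spindle off: ignored
```

A real implementation would label a 3-D volume swept by the tool geometry rather than single layers, but the gating on tool state and boundary crossing is the same.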
- In the event of bone motion at any moment during the procedure, the POSE of the surgical plan having the
tool path 10 and the environmental map are updated to maintain their relative POSE to the bone B. If a tracking system 106 is present, the POSEs may be updated in real-time based on the POSE data. If the surgical system utilizes a fixation system, bone motion monitoring system, and recovery markers as described in U.S. Pat. No. 5,086,401 and U.S. Pat. No. 6,430,434, then the recovery markers are used to update the relative POSEs. Therefore, the labeled free space FS and the labeled invalid space IS in the environmental map are not compromised in the event of bone motion. - With reference still to
FIGS. 4A to 4C, particular embodiments of a method for dynamically planning a recovery tool path with the environmental map are also shown. FIG. 4B depicts the end-effector tool 115 at a point in the tool path 10 where the procedure is paused for one of several reasons, such as a safety reason. As shown in FIG. 4C, a user then displaces the end-effector tool 115 to a displaced position away from the tool path 10. FIG. 4C depicts the tool 115 being displaced outside of the volume of removed bone B. Once the user is ready to resume the procedure, the motion planner module searches for a recovery tool path 144 based on the environmental map (i.e., free space FS and invalid space IS). More specifically, the motion planner module searches for a recovery tool path 144 that: a) minimizes the path length from the current displaced position of the tool 115 (as shown in FIG. 4C) back to, or near, the pre-displaced position of the tool 115 on the tool path 10 (as shown in FIG. 4B); and b) maximizes dexterity and distance away from obstacles (i.e., invalid space IS). The motion planner module may utilize one or more algorithms to search for the recovery tool path 144 based on the foregoing constraints (a) and (b). The algorithms illustratively include optimization algorithms, probabilistic roadmaps (PRM), rapidly-exploring random trees (RRT), the potential field method, or any variants of the mentioned algorithms. The motion planner module may further use existing landmarks, such as checkpoints 20C in the tool path 10, to navigate the end-effector tool 115 back to, or near, the pre-displaced position of the end-effector tool 115. More specifically, if the tool path 10 for the end-effector tool 115 includes checkpoints (20 a, 20 b, 20 c, 20 d), then each checkpoint (20 a, 20 b, 20 c, 20 d) encountered while removing the bone B is checked for reachability. 
A reachable checkpoint is any checkpoint in the robot workspace that can be reached by the robot's end-effector, since not all of the checkpoints in the cut-file may be directly on the end-effector's tool path 10. Reachable checkpoints are recorded and labeled as valid points to be used as vertices during recovery path planning. The end-effector tool 115 then travels along the recovery tool path 144 back to, or near, the pre-displaced position to resume the removal of bone B. - During milling, the graphical user interface (GUI) may display the progress of the end-effector removing bone. The GUI may display the 3-D bone model being milled. To better display the progression of the milling, the GUI may also display the amount of bone removed based on the labeled free space. Since the software module is recording and labeling the removed bone in real-time as free space, the free space is used as a metric to update the volume of bone removed accordingly. This provides a quick method to determine and display the progression of the milling in real-time relative to a virtual representation of the bone.
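The recovery-path search described above can be sketched, under simplifying assumptions, as a shortest-path search restricted to the labeled free space. The function name, the 2-D occupancy grid, and the plain A* search below are all hypothetical stand-ins: the patent contemplates PRM, RRT, and potential-field planners over the full 3-D map, and the clearance/dexterity weighting of constraint (b) is omitted for brevity.

```python
import heapq

def recovery_path(free, start, goal):
    """Illustrative A* search for a recovery tool path over a 2-D grid,
    where free[r][c] is truthy for free space FS and falsy for invalid
    space IS. Implements constraint (a): minimize path length from the
    displaced position (start) back to the pre-displaced position (goal)
    while staying entirely inside free space."""
    rows, cols = len(free), len(free[0])

    def h(p):  # admissible Manhattan-distance heuristic toward the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, None)]  # entries: (f, g, cell, parent)
    came, cost = {}, {start: 0}
    while frontier:
        _, g, node, parent = heapq.heappop(frontier)
        if node in came:  # already settled via a shorter path
            continue
        came[node] = parent
        if node == goal:  # rebuild the path by walking parents backward
            path = []
            while node is not None:
                path.append(node)
                node = came[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and free[nr][nc]:
                ng = g + 1
                if ng < cost.get((nr, nc), float("inf")):
                    cost[(nr, nc)] = ng
                    heapq.heappush(
                        frontier, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None  # no recovery path exists through the labeled free space
```

Reachable checkpoints recorded during cutting could be passed in as intermediate goals, chaining one such search per checkpoint pair, which is one way the valid vertices mentioned above could seed the planner.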
- While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the described embodiments in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient roadmap for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes may be made in the function and arrangement of elements without departing from the scope as set forth in the appended claims and the legal equivalents thereof.
Claims (20)
1. A method for planning a tool path for an end-effector attached to a surgical robot comprising:
providing motion planner software on a computer having a processor and an environmental map with physical areas designated in the environmental map as free space or invalid space, wherein the free space is a volume based on removed material and the invalid space is intact material; and
planning the tool path for the end-effector from a first position to a second position, with the motion planner software, using the environmental map.
2. The method of claim 1 wherein the material is a workpiece or a bone.
3. The method of claim 1 further comprising generating the environmental map by labelling the free space and the invalid space relative to the end-effector.
4. The method of claim 3 further comprising labelling the removed material as free space only when the end-effector is in an ‘on’ operating state.
5. The method of claim 3 further comprising labelling the removed material as free space only when the end-effector crosses a boundary in a direction towards the workpiece or the bone.
6. The method of claim 3 further comprising defining a boundary in the environmental map based on a plane non-parallel to a longitudinal axis of the workpiece or the bone.
7. The method of claim 6 wherein the free space is labelled starting at the boundary and away from the workpiece or the bone and the invalid space is labelled starting at the boundary and towards the orthopedic workpiece or the bone.
8. The method of claim 1 wherein the environmental map is defined with reference to a robotic coordinate system or a tracking system coordinate system.
9. The method of claim 1 further comprising registering a bone relative to the surgical robot.
10. The method of claim 1 wherein the first position is a displaced position of the end-effector from a previous tool path and the second position resides along the previous tool path.
11. The method of claim 1 wherein the tool path is limited to the free space.
12. The method of claim 1 wherein the tool path is planned by minimizing a path length from the first position to the second position in the free space.
13. The method of claim 11 wherein the tool path is planned by avoiding obstacles utilizing the invalid space.
14. The method of claim 12 wherein the minimizing of the path length is accomplished with an algorithm including at least one of: optimization algorithms, probabilistic roadmaps (PRM), rapidly-exploring random trees (RRT), or potential field methods.
15. The method of claim 1 further comprising identifying one or more checkpoints between the first position and the second position, wherein the planning of the tool path from the first position to, or near, the second position utilizes one or more of the checkpoints.
16. The method of claim 15 further comprising determining which of the one or more checkpoints is in closest proximity to the second position and planning the tool path based on one or more of said checkpoints.
17. The method of claim 1 further comprising removing the material with the end-effector.
18. A surgical system comprising:
a surgical robot with an end effector; and
a computer comprising a processor and motion planner software to plan the tool path of claim 1.
19. The system of claim 18 further comprising environmental map generator software for generating an environmental map.
20. The system of claim 18 further comprising at least one of a mechanical digitizer or a non-mechanical tracking system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/502,500 US20220031412A1 (en) | 2018-01-24 | 2021-10-15 | Planning a tool path for an end-effector using an environmental map |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862621307P | 2018-01-24 | 2018-01-24 | |
US16/250,341 US11154369B2 (en) | 2018-01-24 | 2019-01-17 | Environmental mapping for robotic assisted surgery |
US17/502,500 US20220031412A1 (en) | 2018-01-24 | 2021-10-15 | Planning a tool path for an end-effector using an environmental map |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/250,341 Continuation US11154369B2 (en) | 2018-01-24 | 2019-01-17 | Environmental mapping for robotic assisted surgery |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220031412A1 (en) | 2022-02-03 |
Family
ID=67297977
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/250,341 Active 2039-10-13 US11154369B2 (en) | 2018-01-24 | 2019-01-17 | Environmental mapping for robotic assisted surgery |
US17/502,500 Pending US20220031412A1 (en) | 2018-01-24 | 2021-10-15 | Planning a tool path for an end-effector using an environmental map |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/250,341 Active 2039-10-13 US11154369B2 (en) | 2018-01-24 | 2019-01-17 | Environmental mapping for robotic assisted surgery |
Country Status (1)
Country | Link |
---|---|
US (2) | US11154369B2 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017151751A1 (en) * | 2016-03-02 | 2017-09-08 | Think Surgical, Inc. | Method for recovering a registration of a bone |
CN110114019B (en) | 2016-12-08 | 2022-04-12 | 安托踏实公司 | Surgical system for cutting an anatomical structure according to at least one target plane |
EP3551099B1 (en) | 2016-12-08 | 2024-03-20 | Orthotaxy | Surgical system for cutting an anatomical structure according to at least one target plane |
US11633233B2 (en) | 2016-12-08 | 2023-04-25 | Orthotaxy S.A.S. | Surgical system for cutting an anatomical structure according to at least one target cutting plane |
US11911111B2 (en) * | 2018-08-24 | 2024-02-27 | University Of Hawaii | Autonomous system and method for planning, tracking, and controlling the operation of steerable surgical devices |
CN110974426A (en) * | 2019-12-24 | 2020-04-10 | 上海龙慧医疗科技有限公司 | Robot system for orthopedic joint replacement surgery |
JP2023519880A (en) * | 2020-03-27 | 2023-05-15 | マコ サージカル コーポレーション | Systems and methods for controlling robotic motion of tools based on virtual boundaries |
AU2021369671A1 (en) | 2020-10-30 | 2023-06-15 | Mako Surgical Corp. | Robotic surgical system with slingshot prevention |
CN113876429B (en) * | 2021-06-23 | 2023-01-20 | 上海极睿医疗科技有限公司 | Path planning system of spine surgery robot and robot system |
USD1044829S1 (en) | 2021-07-29 | 2024-10-01 | Mako Surgical Corp. | Display screen or portion thereof with graphical user interface |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140039517A1 (en) * | 2012-08-03 | 2014-02-06 | Stryker Corporation | Navigation System for use with a Surgical Manipulator Operable in Manual or Semi-Autonomous Modes |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5086401A (en) | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
JP3537229B2 (en) | 1995-07-28 | 2004-06-14 | ファナック株式会社 | Robot control method |
US6061644A (en) | 1997-12-05 | 2000-05-09 | Northern Digital Incorporated | System for determining the spatial position and orientation of a body |
US6033415A (en) | 1998-09-14 | 2000-03-07 | Integrated Surgical Systems | System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system |
US6430434B1 (en) | 1998-12-14 | 2002-08-06 | Integrated Surgical Systems, Inc. | Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers |
EP1364183B1 (en) | 2001-01-30 | 2013-11-06 | Mako Surgical Corp. | Tool calibrator and tracker system |
US7570791B2 (en) | 2003-04-25 | 2009-08-04 | Medtronic Navigation, Inc. | Method and apparatus for performing 2D to 3D registration |
CA2651782C (en) | 2006-05-19 | 2018-03-06 | Mako Surgical Corp. | System and method for verifying calibration of a surgical device |
US8010177B2 (en) | 2007-04-24 | 2011-08-30 | Medtronic, Inc. | Intraoperative image registration |
DE102011106321A1 (en) | 2011-07-01 | 2013-01-03 | Kuka Laboratories Gmbh | Method and control means for controlling a robot |
JP6220877B2 (en) | 2012-08-15 | 2017-10-25 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | System and method for joint motion cancellation using zero space |
KR102379623B1 (en) | 2014-02-05 | 2022-03-29 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | System and method for dynamic virtual collision objects |
WO2016081931A1 (en) | 2014-11-21 | 2016-05-26 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
WO2016141378A1 (en) | 2015-03-05 | 2016-09-09 | Think Surgical, Inc. | Methods for locating and tracking a tool axis |
WO2017115235A1 (en) | 2015-12-29 | 2017-07-06 | Koninklijke Philips N.V. | Image-based adaptive path planning for a robot |
US10499997B2 (en) * | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
- 2019-01-17: US application US16/250,341, published as US11154369B2, status Active
- 2021-10-15: US application US17/502,500, published as US20220031412A1, status Pending
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220265364A1 (en) * | 2019-09-05 | 2022-08-25 | Curexo, Inc. | Device for guiding position of robot, method therefor, and system comprising same |
US11666392B2 (en) * | 2019-09-05 | 2023-06-06 | Curexo, Inc. | Device for guiding position of robot, method therefor, and system including the same |
US12089905B1 (en) * | 2023-05-22 | 2024-09-17 | Ix Innovation Llc | Computerized control and navigation of a robotic surgical apparatus |
Also Published As
Publication number | Publication date |
---|---|
US11154369B2 (en) | 2021-10-26 |
US20190223962A1 (en) | 2019-07-25 |
Similar Documents
Publication | Title | |
---|---|---|
US20220031412A1 (en) | Planning a tool path for an end-effector using an environmental map | |
AU2022203687B2 (en) | Method and system for guiding user positioning of a robot | |
US10772685B2 (en) | System and method for bone re-registration and marker installation | |
US20200297440A1 (en) | Interactive anatomical positioner and a robotic system therewith | |
EP4142610A1 (en) | Collaborative surgical robotic platform for autonomous task execution | |
US11185373B2 (en) | Method for recovering a registration of a bone | |
US12064184B2 (en) | System and method for installing bone hardware outside an end-effectors tool path | |
US20220071713A1 (en) | Method of verifying tracking array positional accuracy | |
WO2019135805A1 (en) | Interactive anatomical positioner and a robotic system therewith | |
US20200197113A1 (en) | Light guided digitization method to register a bone | |
US11986246B2 (en) | Method to determine bone placement in a robot workspace | |
WO2022006029A1 (en) | System and method to align an implant keel punch | |
US20230329813A1 (en) | Systems And Methods For Guided Placement Of A Robotic Manipulator | |
KR20220024055A (en) | Tracking System Field of View Positioning System and Method | |
US20200390506A1 (en) | Workflow control with tracked devices | |
KR20230010216A (en) | Measurement-guided reprocessing during robotic ablation | |
US20240173096A1 (en) | System and method for detecting a potential collision between a bone and an end-effector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: THINK SURGICAL, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ROLDAN, JAY; HANSON, RANDALL; REEL/FRAME: 057817/0094; Effective date: 20181008 |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |