
CN114948199A - Surgical operation auxiliary system and operation path planning method - Google Patents

Surgical operation auxiliary system and operation path planning method

Info

Publication number
CN114948199A
CN114948199A
Authority
CN
China
Prior art keywords
surgical
path
planning
module
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210534426.XA
Other languages
Chinese (zh)
Other versions
CN114948199B (en)
Inventor
顾佩华
陈光耀
韩磊
王慧聪
胡顺顺
王凯峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University
Tianjin University
Original Assignee
Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University, Tianjin University filed Critical Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University
Priority to CN202210534426.XA priority Critical patent/CN114948199B/en
Publication of CN114948199A publication Critical patent/CN114948199A/en
Application granted granted Critical
Publication of CN114948199B publication Critical patent/CN114948199B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108Computer aided selection or customisation of medical implants or cutting guides
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a surgical operation auxiliary system comprising a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system comprises an image processing module for three-dimensional model reconstruction, a database for storing the medical information of a patient, and a surgical planning module for planning the moving path of a surgical instrument. The intraoperative surgical navigation system comprises a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module, and a path indicating device. The surgical planning module comprises a surgical path optimization module and/or a surgical path planning model built on a neural network: the surgical path optimization module solves for an optimal surgical path within the surgically constrained region, while the surgical path planning model outputs an auxiliary surgical path that assists the doctor in decision making. The invention assists the doctor in planning the surgical path and can prompt the doctor to operate along the planned path.

Description

Surgical operation auxiliary system and operation path planning method
Technical Field
The invention relates to the field of medical treatment, in particular to a surgical operation auxiliary system and an operation path planning method.
Background
At present, doctors mainly rely on past experience when operating on patients. Taking cosmetic surgery as an example, doctors generally record the patient's condition by observation and photography, so the actual surgical outcome depends heavily on the doctor's clinical experience and operative skill. Non-standard technique or insufficient experience can cause unnecessary skin and tissue injury and may even lead to surgical failure. A person's facial appearance strongly influences personal development, job hunting and family life, and even a small local defect affects the overall image. Most people are therefore cautious about cosmetic surgery, chiefly because the surgical outcome is difficult to guarantee, so improving surgical quality is a problem in urgent need of a solution.
With the development and improvement of computing, computer-assisted surgery has become a new direction in clinical surgery. From the patient's raw data, a computer can obtain a three-dimensional model through three-dimensional reconstruction, making it convenient for the doctor to formulate a surgical plan, overcoming the surgeon's visual limitations, and making measurement and diagnosis more accurate. However, medical three-dimensional reconstruction is mainly used to improve the doctor's understanding of the patient's condition; an auxiliary tool that helps the doctor plan the surgical path is still lacking. During surgery, many medical assistant robots are available; among them, the da Vinci surgical robot is widely used and can serve abdominal surgery, urological surgery and the like. Although such surgical assistant robots make surgery more convenient, they are generally expensive and require additional training in machine operation. A surgical operation assisting system that can help the doctor with preoperative diagnosis and surgical planning, and that is convenient to use during the operation, is therefore particularly needed.
Disclosure of Invention
The invention provides a surgical operation auxiliary system and a surgical path planning method for solving the technical problems in the prior art.
The technical scheme adopted by the invention to solve the above technical problems is as follows: a surgical operation auxiliary system comprises a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system comprises: an image processing module for three-dimensional model reconstruction, a database for storing patient medical information, and a surgical planning module for planning the moving path of a surgical instrument. The intraoperative surgical navigation system comprises: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module, and a path indicating device. The surgical planning module comprises a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module solves for an optimal surgical path within the surgically constrained region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a set constructed from the preoperative medical information, surgical paths and postoperative outcome data of previously operated patients; it takes as input the medical information and expected postoperative data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision making.
Further, the image processing module comprises a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module. The point cloud data three-dimensional reconstruction module comprises a three-dimensional scanning device that scans the surface of the site to be operated on and acquires point cloud data, and a three-dimensional model reconstruction module that reconstructs the point cloud data with a neural network to obtain a three-dimensional model of the site. The medical image three-dimensional reconstruction module obtains a three-dimensional model of the surgical region and its surrounding nerves and blood vessels from slice images of the site through a three-dimensional reconstruction algorithm.
Further, the positioning module comprises an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module comprises an electromagnetic type position positioner; the optical positioning module comprises a binocular vision positioning system, and the binocular vision positioning system is used for three-dimensional coordinate positioning of the space points.
Further, the surgical navigation module records the spatial position of the surgical instrument in real time and compares the actual surgical path with the preoperatively planned path: on one hand, it sends the next-step path information of the surgical instrument to the path indicating device; on the other hand, when the actual path deviates from the planned path, it issues an early-warning signal whose level corresponds to the degree of deviation.
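The graded early-warning behaviour described above can be sketched as follows; the millimetre thresholds, the point-to-waypoint distance measure and all function names are illustrative assumptions, since the patent states only that the warning level follows the degree of deviation from the planned path:

```python
import math

def point_to_path_deviation(tip, planned_path):
    """Minimum Euclidean distance (mm) from the instrument tip to the
    planned waypoints (a coarse stand-in for distance to the path)."""
    return min(math.dist(tip, p) for p in planned_path)

def warning_level(deviation_mm, thresholds=(1.0, 3.0, 5.0)):
    """Map the deviation to a graded early-warning signal:
    0 = on path; 1, 2, 3 = increasingly severe deviation.
    The threshold values are assumptions, not from the patent."""
    level = 0
    for t in thresholds:
        if deviation_mm > t:
            level += 1
    return level

planned = [(0.0, 0.0, 0.0), (0.0, 0.0, 10.0), (0.0, 0.0, 20.0)]
print(warning_level(point_to_path_deviation((0.5, 0.0, 10.0), planned)))  # 0
print(warning_level(point_to_path_deviation((4.0, 0.0, 10.0), planned)))  # 2
```

In a real system the level would drive the indicator-lamp colour or the augmented-reality overlay described for the path indicating device.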
Further, the path indicating device comprises a light path indicating device or an augmented reality path indicating device. The light path indicating device comprises a laser lamp with an adjustable indication angle; the lamp receives signals from the surgical navigation module, indicates the moving path of the surgical instrument by switching on and off, and indicates the degree of path deviation by its colour. The augmented reality path indicating device comprises wearable augmented reality glasses or a wearable augmented reality helmet; its display screen shows the preoperatively planned path and the actual degree of deviation.
The invention also provides a surgical path planning method, in which a preoperative surgical planning system and an intraoperative surgical navigation system are provided. The preoperative surgical planning system is provided with: an image processing module for three-dimensional model reconstruction, a database for storing patient medical information, and a surgical planning module for planning the moving path of a surgical instrument. The intraoperative surgical navigation system is provided with: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module, and a path indicating device. The surgical planning module is provided with a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module solves for an optimal surgical path within the surgically constrained region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a set constructed from the preoperative medical information, surgical paths and postoperative outcome data of previously operated patients; it takes as input the medical information and expected postoperative data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision making.
Further, the method by which the surgical planning module plans the movement path of the surgical instrument comprises the following steps:
Step one, establish a medical image data set, label the important blood vessels, nerves, key tissues and organs in its images, and train a neural network model for medical image segmentation by deep learning.
Step two, after the image data of the patient's lesion is obtained, automatically identify and segment the important blood vessels, nerves, key tissues and organs in the lesion area with the segmentation model, and construct a three-dimensional model of the patient's lesion area through medical image three-dimensional reconstruction.
Step three, acquire a body surface three-dimensional model of the lesion region with a three-dimensional scanning device, and register it with the three-dimensional model of the lesion area.
Step four, define the surgically constrained region, taking as the constraint that a medically required safe distance is kept from the blood vessels, nerves, key tissues and organs to be avoided.
Step five, obtain the optimal surgical path within the constrained region through a spatial trajectory planning algorithm, realizing automatic planning of the surgical path.
Further, step five comprises the following sub-steps:
Step C1, within the surgically constrained region, create n parallel two-dimensional cross-section planes; select one of them and apply the spatial path planning algorithm on that plane to obtain the ideal in-plane surgical path.
Step C2, obtain an in-plane ideal surgical path for each of the n cross-section planes in turn, and stack the n planar paths along the direction perpendicular to the planes to obtain a three-dimensional curved surface in space.
Step C3, determine the safe movement interval of the surgical instrument according to medical prior conditions and requirements, and within that interval select a smooth three-dimensional curved surface as the optimal surgical path according to the doctor's operating difficulty.
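The slice-and-stack construction of steps C1 and C2 can be sketched as follows; the trivial in-plane planner, the array layout and all names are illustrative assumptions (a real system would run the Q-learning planner of steps A1 to A5 on each cross-section):

```python
import numpy as np

def plan_2d_path(section: np.ndarray) -> np.ndarray:
    """Placeholder in-plane planner: follow the free (zero) cells down
    the middle column of a binary constraint mask.  Stands in for the
    spatial path planning algorithm of step C1."""
    rows, cols = section.shape
    col = cols // 2
    return np.array([(r, col) for r in range(rows) if section[r, col] == 0],
                    dtype=float)

def stack_slice_paths(sections, slice_spacing=1.0):
    """Steps C1-C2: plan an in-plane path on each of the n parallel
    cross-sections, then stack the planar paths along the perpendicular
    (z) axis to form a 3-D surface, returned as an (n, m, 3) array."""
    surface = []
    for k, section in enumerate(sections):
        path2d = plan_2d_path(section)            # step C1: in-plane path
        z = np.full((len(path2d), 1), k * slice_spacing)
        surface.append(np.hstack([path2d, z]))    # lift into 3-D
    return np.stack(surface)                      # step C2: stacked surface

# Three 5x5 cross-sections with no obstacles along the middle column.
sections = [np.zeros((5, 5), dtype=int) for _ in range(3)]
surface = stack_slice_paths(sections, slice_spacing=2.0)
print(surface.shape)  # (3, 5, 3): 3 slices, 5 points each, (row, col, z)
```

Step C3 would then smooth over this family of stacked surfaces and pick the one inside the safe movement interval.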
Further, the spatial trajectory planning algorithm iteratively computes the optimal surgical path using the Q-learning algorithm; the specific steps are as follows:
Step A1, initialize parameters: establish a Q-value table; let the current iteration number be i and the maximum iteration number be I, and initialize i = 0. Denote the current state by s_t and the action taken in that state by a_t, and initialize the Q-value table with Q(s_i, a_i) = 0. Set the initial learning rate α = 0.2 and the discount rate γ = 0.8, and let r_t be the reward obtained after each action. Set the upper boundary point of the surgical region as the start point and the lower boundary point as the end point. Take the surgically constrained region of the three-dimensional model of the patient's lesion area as the environment space E; discretize it into n valid two-dimensional cross-sections, select one cross-section, establish a coordinate system on it, and discretize it to obtain t selectable states; let A be the set of selectable actions.
Step A2, take the start point as the initial state and the end point as the terminal state, and select the optimal action a_t according to the ε-greedy strategy. The probability of selecting action a_t is:

Prob(a_t) = 1 − ε, if a_t = argmax_{a ∈ A} Q(s_t, a); Prob(a_t) = ε / (|A| − 1), otherwise

where ε is the greedy value; 1 − ε is the probability of selecting the optimal action a_t; argmax_{a ∈ A} Q(s_t, a) selects the action with the largest estimated future reward; a denotes an action, s a state, and Prob(a_t) the probability of selecting action a_t.
Step A3, after selecting action a_t, obtain the reward r_t, which is used to determine the next action, and observe the next state s_{t+1}.
Step A4, update the Q-value table and the greedy value ε. The Q-value update formula is:

Q(s_t, a_t) ← Q(s_t, a_t) + α[r_t + γ max_{a ∈ A} Q(s_{t+1}, a) − Q(s_t, a_t)]
The greedy value ε is decayed with the iteration number i:

ε ← ε_0 (1 − i / I)
Step A5, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged. If the conditions are not met, return to step A2; the optimal planned path is obtained once the iteration condition is met.
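Steps A1 to A5 can be sketched as a tabular Q-learning loop on a toy two-dimensional cross-section; the grid size, obstacle, reward values and linear ε decay are illustrative assumptions, since the patent fixes only α = 0.2 and γ = 0.8:

```python
import random

# Toy 5x5 cross-section: start at the upper boundary, goal at the lower
# boundary, one blocked cell standing in for a constrained (avoid) zone.
ROWS, COLS = 5, 5
START, GOAL = (0, 2), (4, 2)
BLOCKED = {(2, 2)}
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

ALPHA, GAMMA = 0.2, 0.8   # learning rate and discount rate (step A1)
MAX_ITERS = 3000          # maximum iteration count I

def step(state, action):
    r, c = state[0] + action[0], state[1] + action[1]
    if not (0 <= r < ROWS and 0 <= c < COLS) or (r, c) in BLOCKED:
        return state, -5.0        # entering a constrained cell is penalised
    if (r, c) == GOAL:
        return (r, c), 10.0       # reaching the end point is rewarded
    return (r, c), -1.0           # each move has a small cost

Q = {((r, c), a): 0.0
     for r in range(ROWS) for c in range(COLS) for a in ACTIONS}

random.seed(0)
for i in range(MAX_ITERS):
    eps = max(0.05, 1.0 - i / MAX_ITERS)   # decaying greedy value (step A4)
    s = START
    for _ in range(50):
        if random.random() < eps:          # epsilon-greedy selection (A2)
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next, r = step(s, a)             # reward and next state (A3)
        best_next = max(Q[(s_next, act)] for act in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])  # (A4)
        s = s_next                         # s_t = s_{t+1} (A5)
        if s == GOAL:
            break

# Greedy rollout of the learned policy from start point to end point.
path, s = [START], START
while s != GOAL and len(path) < 20:
    a = max(ACTIONS, key=lambda act: Q[(s, act)])
    s, _ = step(s, a)
    path.append(s)
print(path)
```

The converged rollout routes from the start point around the blocked cell to the end point, which corresponds to the in-plane ideal path of step C1.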
Further, the path indicating device adopts an augmented reality path indicating device; its display screen shows the preoperatively planned path and the actual degree of deviation. The surgical path prompt information is displayed on the screen as images; the prompt information includes the feeding direction and the feeding depth.
The advantages and positive effects of the invention are as follows: a surgical path planning model built on a neural network is provided; it takes as input the medical information and expected postoperative data of the patient to be operated on and outputs a surgical path that assists the doctor in decision making, providing a reference for planning the surgical path. The spatial trajectory planning module helps the doctor find the optimal surgical path within the surgically constrained region. The path indicating device of the intraoperative surgical navigation system gives the doctor real-time path indications during surgery and issues an early-warning signal when the actual surgical path deviates, improving surgical accuracy. With the path indicating device, the doctor can follow the planned path guided by prompts such as sound and light signals.
Drawings
Fig. 1 is a schematic view of a surgical operation assistance system of the present invention.
Detailed Description
For a further understanding of the contents, features and effects of the present invention, the following embodiment is described in detail with reference to the accompanying drawings:
Referring to fig. 1, a surgical operation assisting system includes a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system comprises: an image processing module for three-dimensional model reconstruction, a database for storing patient medical information, and a surgical planning module for planning the moving path of a surgical instrument. The intraoperative surgical navigation system comprises: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module, and a path indicating device. The surgical planning module comprises a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module solves for an optimal surgical path within the surgically constrained region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a set constructed from the preoperative medical information, surgical paths and postoperative outcome data of previously operated patients; it takes as input the medical information and expected postoperative data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision making.
Preferably, the image processing module may comprise a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module. The point cloud data three-dimensional reconstruction module may comprise a three-dimensional scanning device that scans the surface of the site to be operated on and acquires point cloud data, and a three-dimensional model reconstruction module that reconstructs the point cloud data with a neural network to obtain a three-dimensional model of the site. The medical image three-dimensional reconstruction module may obtain a three-dimensional model of the surgical region and its surrounding nerves and blood vessels from slice images of the site through a three-dimensional reconstruction algorithm.
Preferably, the positioning module may comprise an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module may comprise an electromagnetic position locator; the optical positioning module may include a binocular vision positioning system for three-dimensional coordinate positioning of the spatial points.
Preferably, the surgical navigation module may record the spatial position of the surgical instrument in real time and compare the actual surgical path with the preoperatively planned path: on one hand, it sends the next-step path information of the surgical instrument to the path indicating device; on the other hand, when the actual path deviates from the planned path, it issues an early-warning signal whose level corresponds to the degree of deviation.
Preferably, the path indicating device may comprise a light path indicating device or an augmented reality path indicating device. The light path indicating device may comprise a laser lamp with an adjustable indication angle; the lamp may receive signals from the surgical navigation module, indicate the moving path of the surgical instrument by switching on and off, and indicate the degree of path deviation by its colour. The augmented reality path indicating device may comprise wearable augmented reality glasses or a wearable augmented reality helmet; its display screen may show the preoperatively planned path and the actual degree of deviation.
The invention also provides a surgical path planning method, in which a preoperative surgical planning system and an intraoperative surgical navigation system are provided. The preoperative surgical planning system is provided with: an image processing module for three-dimensional model reconstruction, a database for storing patient medical information, and a surgical planning module for planning the moving path of a surgical instrument. The intraoperative surgical navigation system is provided with: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module, and a path indicating device. The surgical planning module is provided with a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module solves for an optimal surgical path within the surgically constrained region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a set constructed from the preoperative medical information, surgical paths and postoperative outcome data of previously operated patients; it takes as input the medical information and expected postoperative data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision making.
Preferably, the method by which the surgical planning module plans the movement path of the surgical instrument may comprise the following steps:
Step one, a medical image data set may be established, the important blood vessels, nerves, key tissues and organs in its images labelled, and a neural network model for medical image segmentation trained by deep learning.
Step two, after the image data of the patient's lesion is obtained, the important blood vessels, nerves, key tissues and organs in the lesion area may be automatically identified and segmented with the segmentation model, and a three-dimensional model of the patient's lesion area constructed through medical image three-dimensional reconstruction.
Step three, a body surface three-dimensional model of the lesion region may be acquired with a three-dimensional scanning device and registered with the three-dimensional model of the lesion area.
Step four, the surgically constrained region may be defined, taking as the constraint that a medically required safe distance is kept from the blood vessels, nerves, key tissues and organs to be avoided.
Step five, the optimal surgical path may be obtained within the constrained region through a spatial trajectory planning algorithm, realizing automatic planning of the surgical path.
Preferably, step five may comprise the following sub-steps:
Step C1, within the surgically constrained region, n parallel two-dimensional cross-section planes may be created; one of them is selected, and the spatial path planning algorithm is applied on that plane to obtain the ideal in-plane surgical path.
Step C2, an in-plane ideal surgical path is obtained for each of the n cross-section planes in turn, and the n planar paths are stacked along the direction perpendicular to the planes to obtain a three-dimensional curved surface in space.
Step C3, the safe movement interval of the surgical instrument is determined according to medical prior conditions and requirements, and within that interval a smooth three-dimensional curved surface is selected as the optimal surgical path according to the doctor's operating difficulty.
Preferably, the spatial trajectory planning algorithm iteratively computes the optimal surgical path using the Q-learning algorithm. Q-learning is a value-based reinforcement learning algorithm in which Q = Q(s, a) is the expected return obtained by taking action a (a ∈ A) in state s (s ∈ S) at a given moment, where S is the set of states and A the set of actions. The environment feeds back a corresponding benefit r, also called the reward r, according to the agent's action; a Q-value table indexed by state and action stores the Q values, and the action a that can obtain the maximum reward r is then selected according to the Q value. The specific steps may be as follows:
Step A1, initialize parameters: establish a Q-value table; let the current iteration number be i and the maximum iteration number be I, and initialize i = 0. Denote the current state by s_t and the action taken in that state by a_t, and initialize the Q-value table with Q(s_i, a_i) = 0. Set the initial learning rate α = 0.2 and the discount rate γ = 0.8, and let r_t be the reward obtained after each action. Set the upper boundary point of the surgical region as the start point and the lower boundary point as the end point. Take the surgically constrained region of the three-dimensional model of the patient's lesion area as the environment space E; discretize it into n valid two-dimensional cross-sections, select one cross-section, establish a coordinate system on it, and discretize it to obtain t selectable states; let A be the set of selectable actions.
Step A2, take the start point as the initial state and the end point as the terminal state. So that the path search obtains, as far as possible, the maximum reward under different behaviours, the greedy value ε is changed adaptively to prevent the algorithm from falling into a local optimum. Therefore, to ensure that the algorithm converges quickly to the optimal Q value, the optimal action a_t may be selected according to the ε-greedy strategy; the probability of selecting action a_t is:

Prob(a_t) = 1 − ε, if a_t = argmax_{a ∈ A} Q(s_t, a); Prob(a_t) = ε / (|A| − 1), otherwise

where ε is the greedy value; 1 − ε is the probability of selecting the optimal action a_t; argmax_{a ∈ A} Q(s_t, a) selects the action with the largest estimated future reward; a denotes an action, s a state, and Prob(a_t) the probability of selecting action a_t.
Step A3, after selecting action a_t, obtain the reward r_t, which is used to determine the next action, and observe the next state s_{t+1}.
Step A4, update the Q-value table and the greedy value ε. The Q-value update formula is:

Q(s_t, a_t) ← Q(s_t, a_t) + α[r_t + γ max_{a ∈ A} Q(s_{t+1}, a) − Q(s_t, a_t)]
The greedy value ε is decayed with the iteration number i:

ε ← ε_0 (1 − i / I)
Step A5, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged. If the conditions are not met, return to step A2; the optimal planned path is obtained once the iteration condition is met.
The ε-greedy strategy is based on the greedy strategy, which at each step takes the optimal selection in the current state (a locally optimal solution), in the hope of thereby arriving at a globally optimal solution; the ε-greedy variant additionally takes a random action with probability ε to keep exploring.
argmax is an operator that maps a function to the point(s) of its domain at which the function attains its maximum. Given a function y = f(x), x0 = argmax(f(x)) means that f(x) attains the maximum of its range at x = x0; if several points give the same maximum, argmax(f(x)) is the set of those points. In other words, argmax(f(x)) is the variable point x (or set of such x) at which f(x) is maximal. "arg" is short for "argument".
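The point-or-set behaviour of argmax described above can be illustrated with a small hypothetical helper:

```python
def argmax_set(f, domain):
    """Return every x in the finite domain at which f(x) is maximal."""
    best = max(f(x) for x in domain)
    return [x for x in domain if f(x) == best]

f = lambda x: -(x - 2) ** 2          # maximised at the single point x = 2
print(argmax_set(f, range(6)))       # [2]

g = lambda x: x % 3                  # several points share the maximum
print(argmax_set(g, range(6)))       # [2, 5]
```

In step A2 this is exactly the operation argmax_{a ∈ A} Q(s_t, a): the domain is the action set A and the function is the Q value of the current state.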
Preferably, the path indicating device adopts an augmented reality path indicating device; its display screen shows the preoperatively planned path and the actual degree of deviation. The surgical path prompt information is displayed on the screen as images; the prompt information includes the feeding direction and the feeding depth.
The above-mentioned image processing module, database, operation planning module, positioning module, operation navigation module, path indicating device, operation path optimizing module, operation path planning model, point cloud data three-dimensional reconstruction module, medical image three-dimensional reconstruction module, electromagnetic positioning module, optical positioning module, electromagnetic position locator, binocular vision positioning system, light path indicating device, augmented reality path indicating device, laser lamp, wearable augmented reality glasses and wearable augmented reality helmet and other functional modules and devices can all adopt the applicable functional modules and devices in the prior art, or adopt the functional modules and devices in the prior art and adopt the conventional technical means to construct.
The structure and operation of the present invention will be further described with reference to a preferred embodiment of the present invention.
As shown in fig. 1, a surgical assistance system includes a preoperative surgical planning system and an intraoperative surgical navigation system; the preoperative surgical planning system comprises: the system comprises an image processing module for three-dimensional model reconstruction, a database for storing patient medical information and a surgical planning module for planning a moving path of a surgical instrument; the intraoperative surgical navigation system comprises: the surgical instrument comprises a positioning module, a surgical navigation module and a path indicating device, wherein the positioning module is used for measuring the spatial positions of a surgical instrument and a to-be-operated part; the operation planning module comprises an operation path planning model established based on a neural network; the surgical path planning model is trained by adopting a training set constructed by preoperative medical information, surgical paths and postoperative result data of operated patients, inputs medical information and postoperative expected data of patients to be operated, and outputs auxiliary surgical paths for assisting doctors in decision making.
A preoperative surgical planning system, comprising: the system comprises an image processing module for three-dimensional model reconstruction, an acquired surgical information database and a surgical planning module for surgical scheme formulation.
The image processing module comprises a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module; the point cloud data three-dimensional reconstruction module comprises a three-dimensional scanning device for performing three-dimensional scanning on the surface of the part to be operated and acquiring point cloud data, and a three-dimensional model reconstruction module for performing three-dimensional reconstruction on the point cloud data by adopting a neural network to obtain a three-dimensional model of the part to be operated; and the medical image three-dimensional reconstruction module is used for obtaining a three-dimensional model of the to-be-operated part region and peripheral nerves and blood vessels thereof through a three-dimensional reconstruction algorithm based on the to-be-operated part slice image.
The image processing module performs three-dimensional scanning reconstruction of the patient's body surface and three-dimensional reconstruction of medical images. Three-dimensional scanning reconstruction means that a high-precision three-dimensional scanning device scans the surface of the patient's lesion region, and a three-dimensional model of the patient's body surface is obtained through three-dimensional reconstruction based on the point cloud data; three-dimensional reconstruction of medical images means that a three-dimensional model of the patient's lesion region and its peripheral nerves and blood vessels is obtained through a three-dimensional reconstruction algorithm based on slice images of the patient, such as computed tomography (CT) images and magnetic resonance imaging (MRI) images.
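As a greatly simplified sketch of the slice-based reconstruction idea (the function, data layout, and spacing parameter are illustrative assumptions; a real system would apply a surface-extraction algorithm such as marching cubes to the stacked volume):

```python
def reconstruct_voxels(slices, spacing=1.0):
    """Stack binary segmentation masks (one per CT/MRI slice) into a 3-D
    voxel model: returns (x, y, z) coordinates of all lesion voxels.
    'slices' is a list of 2-D 0/1 lists; z is slice index times spacing."""
    points = []
    for z, mask in enumerate(slices):
        for y, row in enumerate(mask):
            for x, v in enumerate(row):
                if v:
                    points.append((x, y, z * spacing))
    return points
```

The resulting point list plays the role of the voxelized lesion model; the slice spacing converts slice indices into physical depth.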
The operation information database stores information (including personal basic information, diagnosis and treatment information and the like) of patients who have completed operations, preoperative operation schemes and operation path planning to be taken, actually completed intraoperative operation schemes and operation paths, postoperative effect diagnosis and other information, and provides reference for formulation of the operation schemes of the existing patients and planning of the operation paths.
The mode of making the operation scheme and the operation path in the operation planning module is computer-aided, a computer can simulate an expected result after the operation according to the condition of a patient, and the path planning is automatically completed through an operation path planning method to assist a doctor to make the operation scheme. It should be noted that the results of the computerized planning of the surgical plan may provide a reference to the physician or may be confirmed by the physician and applied to the clinical procedure, and is not the only way to plan the surgical plan.
The operation planning module comprises an operation path planning model established based on a neural network; a surgical path planning model, which may be trained using a training set constructed from preoperative medical information, surgical paths and postoperative outcome data of a patient who has been operated on, inputs medical information and postoperative expectation data of a patient to be operated on, and outputs an auxiliary surgical path for assisting a physician in making a decision.
The operation planning module also comprises a space trajectory planning algorithm module which is used for solving the optimal operation path in the operation constraint area range.
The surgical path planning can automatically generate a surgical scheme and a surgical path plan through a surgical path planning model established based on a neural network. The operation path planning can also be realized by analyzing the existing patient information, making an ideal operation scheme in the operation constrained region range and solving the optimal operation path planning through a space trajectory planning algorithm. The operation path planning device and the operation path planning method can be used as a part of a surgical operation auxiliary system, and can also be independently used for assisting a doctor to make an operation scheme and plan an operation path.
The intraoperative surgical navigation system comprises a positioning module used for measuring the spatial positions of surgical instruments and a patient, a surgical navigation module used for recording a surgical path and a path indicating device used for prompting a doctor.
The positioning module comprises an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module comprises an electromagnetic type position positioner; the optical positioning module comprises a binocular vision positioning system, and the binocular vision positioning system is used for three-dimensional coordinate positioning of the space points. The electromagnetic position locator emits a magnetic field near a patient operated part by a magnetic field generator, and calculates the spatial position by signal feedback of a positioning sensor and an electromagnetic positioning probe in the magnetic field.
The positioning mode of the positioning module includes but is not limited to electromagnetic positioning and optical positioning. When the interference of the electromagnetic positioning signal in the operation environment is small, the electromagnetic positioning mode is preferably selected; conversely, optical positioning is preferred.
The surgical instrument can be a scalpel, and the positioning mode can be electromagnetic positioning. To reduce the interference of metal materials with the electromagnetic positioning accuracy, the scalpel handle is made of a non-metallic material, such as plastic. An electromagnetic positioning sensor is embedded in the scalpel handle; each surgical instrument has a unique number and can be called up at any time when the doctor plans the operation.
The surgical navigation module can record the spatial position of surgical instruments in real time, and compare and analyze the actual surgical path with the preoperative planned path, on one hand, the path information of the surgical instruments in the next step is sent to the path indicating device, and on the other hand, when the actual surgical path deviates from the preoperative planned path, the early warning signals of corresponding levels are sent according to the deviation degree level.
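The deviation comparison described above can be sketched as follows (the threshold values and warning levels are illustrative assumptions, not values from the patent):

```python
import math

def deviation_level(tool_pos, planned_path, thresholds=(2.0, 5.0)):
    """Distance from the tool tip to the nearest planned-path point,
    mapped to a warning level: 0 = on path, 1 = caution, 2 = alarm.
    Thresholds (e.g. in mm) are illustrative, not from the patent."""
    d = min(math.dist(tool_pos, p) for p in planned_path)
    level = sum(d > t for t in thresholds)
    return d, level
```

In use, the navigation module would call this on every tracked position and forward the level to the path indicating device.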
Implementations of the path indicating device include, but are not limited to, light indication and augmented reality technology. The light-indication-based path indicating device is a laser lamp with an adjustable indication angle that indicates the doctor's surgical path in real time; the degree of path deviation can be indicated by color differences, and the doctor does not need to wear additional auxiliary equipment. The augmented-reality-based path indicating device can be wearable augmented reality glasses or a wearable augmented reality helmet: the doctor wears the augmented reality device during the operation, the preoperative planned path and the actual deviation degree are displayed on an electronic screen, and the picture quality depends on the maturity of the augmented reality device technology.
The path indicating device may be a laser lamp with an adjustable indication angle. After receiving the path planning information, a rotating mechanism steers the lamp; once it is adjusted to the proper angle, the laser lamp emits a low-brightness indicating light so that the doctor can operate according to the position of the light spot, and the degree of path deviation can be indicated by the color of the laser light.
The path indicating device can also be a pair of wearable augmented reality glasses, and a doctor can observe the matching degree of a preset operation scheme model and an actually completed operation scheme model and real-time operation path prompt information in an operation through equipment such as Google Glass, and can complete operation under the condition of not converting a visual angle. The surgical path prompt information can be displayed on the Google Glass in an image mode, and the prompt information can be the feeding direction and the feeding depth, so that accurate surgical guidance is realized. In the operation process, the operation navigation module detects and tracks the pose change of the operated part by using the electromagnetic positioning probe, and when the pose change of the operated part exceeds a set threshold value, the operation navigation module can correct the path planning prompt information in a Google Glass display interface in real time, so that the accuracy of the relative pose of the preoperative planned path and the operated part in the operation process is ensured. When doctors encounter difficult problems in the operation process, the operation pictures can be transmitted to online expert group members in real time through a camera on the Google Glass, and the expert group members can provide real-time operation guidance or suggestion for the doctors in an online audio mode. In addition, the whole operation process can be recorded from the visual angle of a doctor, and postoperative evaluation and doctor operation training are realized by combining a database technology.
To further illustrate the present disclosure, an embodiment of nasal plastic surgery, a form of facial cosmetic surgery, is described.
In this facial cosmetic surgery embodiment, the invention improves the rationality of the surgical plan and the accuracy of the surgical operation. As shown, the preferred surgical planning implementation of the present invention comprises steps S101 to S104.
Step S101, selecting and labeling body surface mark points of the face of the patient. Selecting a certain number of key points of the face of the patient as body surface mark points, and pasting a certain number of markers on the body surface mark points. It should be noted that the marker is affixed to the patient in a position that avoids the surgical area of the patient's nose.
Step S102, scanning the face of the patient through a three-dimensional scanning device, and acquiring the three-dimensional data of the face of the patient before operation.
Step S103, extract the position information of the feature marker points of the three-dimensional model of the patient's face in image space.
Step S104, plan the surgical path with computer assistance. After receiving the patient's image data, the surgical planning module simulates the optimal post-operative result of the nasal plastic surgery according to the individual features of the patient's face and automatically generates a surgical path from the simulation result.
The doctor can complete the surgical path planning by referring to the plan from the surgical planning module and combining his or her own pathology knowledge and surgical experience. The specific implementation steps of the surgical planning are as follows:
Step S201, acquire CT and MRI image data of the nasal regions of n patients (n = 200 in this embodiment) and establish a nasal medical image data set; important blood vessels, nerves and key tissues in the images are manually labeled by experienced doctors to form sample images. A three-dimensional convolutional neural network model for medical image segmentation is then trained by a deep learning method.
Step S202, acquiring image data of the nose of a patient, and automatically identifying and segmenting important blood vessels, nerves and key tissues in the nose plastic region according to a three-dimensional convolution neural network model for medical image segmentation. And constructing three-dimensional models of important blood vessels, nerves, key tissues and the like in the nose of the patient by a medical image three-dimensional reconstruction technology.
Step S203, obtain a body-surface three-dimensional model of the patient's nose with the three-dimensional scanning device, and register the internal nose models reconstructed in the previous step against this nose model as reference, obtaining the transformation matrices between the different models.
Step S204, according to medical prior information and medically acceptable errors, combined with the simulated expected optimal result, set the constrained region of the nasal plastic surgery, taking as the constraint a medically required safe distance from the blood vessels, nerves, key tissues and organs to be avoided.
Step S205, in the range of the operation restriction area, the blood vessels, nerves, tissues and organs to be avoided are taken as obstacles, the safe distance meeting the medical requirements with the obstacles is taken as the restriction condition, and the shortest path or the smallest wound and the like are taken as targets; and obtaining the optimal surgical path through a space trajectory planning algorithm, and realizing automatic planning of the surgical path.
Preferably, the surgical path generation in step S205 may include the steps of:
Step C1, within the constrained region of the nasal plastic surgery, establish n parallel two-dimensional sections; select one of the two-dimensional sections and use an in-plane spatial path planning algorithm to obtain the ideal surgical path within that plane;
Step C2, obtain the in-plane ideal surgical path for each of the n two-dimensional sections in turn, and superpose the n planar paths in the direction perpendicular to the planes to obtain a three-dimensional curved surface in space;
Step C3, determine the safe activity space of the surgical instrument according to medical prior conditions and medical requirements, and select a smooth three-dimensional curved surface within that interval as the optimal surgical path according to the doctor's operating difficulty, i.e., determine the cutting direction and depth of the scalpel in the nasal plastic surgery, thereby realizing the surgical path planning.
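Steps C1 and C2 can be sketched as follows (the axis-aligned slice orientation and the uniform slice gap are assumptions for illustration):

```python
def stack_planar_paths(plane_paths, slice_gap=1.0):
    """Lift each in-plane 2-D path (a list of (u, v) points) into 3-D by
    assigning its slice's out-of-plane coordinate, producing the point
    grid of a spatial surface. Planes are assumed parallel and equally
    spaced by slice_gap."""
    surface = []
    for k, path in enumerate(plane_paths):
        w = k * slice_gap  # out-of-plane coordinate of slice k
        surface.append([(u, v, w) for (u, v) in path])
    return surface
```

The returned grid of 3-D points is the raw material for the smooth surface selected in step C3.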
Preferably, the surgical path generation in step S205 may also be a path planning method based on path learning, including the following steps:
Step D1, record the actual scalpel paths of experienced doctors performing nasal plastic surgery, establish a nasal plastic surgery path sample database, extract path-information features, and construct training and test data sets; train a three-dimensional convolutional neural network model for path planning by a deep learning method;
Step D2, acquire real-time image data of the patient's nose, and obtain the correspondence between the patient's nose and the surgical path from the three-dimensional convolutional neural network model for path planning, thereby obtaining the planned surgical path for reshaping the patient's nose.
Step S206, three-dimensionally simulate the nasal plastic surgery process, verify the safety of the planned surgical path, and check whether the surgical instruments would damage important blood vessels, nerves or other structures. After the safety check passes, the surgical path planning is complete.
The space trajectory planning algorithm adopts a reinforcement learning algorithm to iteratively calculate an optimal surgical path, and a Q-learning algorithm is selected for path planning, and the method specifically comprises the following steps:
Step S301, initialize the parameters: establish a Q-value table; let the current iteration count be i and the maximum iteration count be I, and initialize i = 0; define the current state as s_t and the action in this state as a_t; set Q(s_i, a_i) = 0 in the Q-value table; set the initial learning rate α = 0.2 and the discount rate γ = 0.8, and denote the reward obtained after each action by r_t; set the upper boundary point of the surgical region as the starting point and the lower boundary point as the end point; take the surgical constrained region of the three-dimensional model of the patient's lesion region as the environment space E, discretize it into n effective two-dimensional sections, select one two-dimensional section, establish a coordinate system on it, and discretize it to obtain t selectable states; set the selectable action set as A, which in this example contains four actions: front, back, left and right;
Step S302, take the starting point as the initial state s_0 and the end point as the terminal state, and select the optimal action a_t according to the ε-greedy strategy; the probability of selecting action a_t is:
prob(a_t) = 1 − ε, if a_t = argmax_{a∈A} Q(s_t, a); otherwise a random action is selected with probability ε;
wherein 1 − ε is the probability of selecting the optimal action a_t;
Step S303, after selecting action a_t, a reward r_t is obtained and used to determine the next action, and the next state s_{t+1} is obtained.
Step S304, update Q-value table and greedy value epsilon.
The Q-value update formula is:
Q(s_t, a_t) ← Q(s_t, a_t) + α[r_t + γ·max_{a∈A} Q(s_{t+1}, a) − Q(s_t, a_t)]
the greedy value ε is updated by a decay formula that gradually decreases ε as the iteration count grows, so that the strategy shifts from exploration toward exploitation.
Step S305, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged; if the conditions are not met, return to step S302 until the iteration condition is satisfied, and the optimal planned path is obtained after T iterations of training.
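A minimal, self-contained sketch of steps S301 to S305 on one discretized two-dimensional section (the grid layout, reward values and ε decay schedule are illustrative assumptions, not the patent's parameters):

```python
import random

def plan_path(grid, start, goal, episodes=500, alpha=0.2, gamma=0.8):
    """Q-learning path planning on one discretized 2-D section.
    grid[y][x] == 1 marks an obstacle (a vessel/nerve to avoid); the rewards
    (+10 goal, -10 obstacle, -1 per move) and the epsilon decay are illustrative."""
    actions = [(0, -1), (0, 1), (-1, 0), (1, 0)]       # front, back, left, right
    h, w = len(grid), len(grid[0])
    Q, eps = {}, 1.0
    for _ in range(episodes):
        s = start
        while s != goal:
            if random.random() < eps:                  # explore
                a = random.choice(actions)
            else:                                      # exploit (greedy)
                a = max(actions, key=lambda b: Q.get((s, b), 0.0))
            nx, ny = s[0] + a[0], s[1] + a[1]
            if not (0 <= nx < w and 0 <= ny < h) or grid[ny][nx]:
                r, s2 = -10.0, s                       # blocked: stay and penalize
            elif (nx, ny) == goal:
                r, s2 = 10.0, (nx, ny)
            else:
                r, s2 = -1.0, (nx, ny)
            best = max(Q.get((s2, b), 0.0) for b in actions)
            Q[(s, a)] = Q.get((s, a), 0.0) + alpha * (r + gamma * best - Q.get((s, a), 0.0))
            s = s2
        eps = max(0.05, eps * 0.99)                    # decay the greedy value
    path, s = [start], start                           # greedy rollout of the learned policy
    while s != goal and len(path) < w * h:
        valid = [b for b in actions
                 if 0 <= s[0] + b[0] < w and 0 <= s[1] + b[1] < h
                 and not grid[s[1] + b[1]][s[0] + b[0]]]
        a = max(valid, key=lambda b: Q.get((s, b), 0.0))
        s = (s[0] + a[0], s[1] + a[1])
        path.append(s)
    return path
```

On a small grid with a single obstacle, the learned greedy policy routes the path around the obstacle from start to goal; in the patent's setting one such planner would be run per two-dimensional section.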
The preferred surgical navigation implementation method of the invention comprises the following steps: and registering the three-dimensional image of the face of the patient with the actual face image to realize the conversion between the image space and the real space. The magnetic field generator in the positioning module is arranged near the operated part of the patient to emit a magnetic field, and the electromagnetic positioning probe measures the position information of the mark points on the face of the patient to complete the registration process in the operation.
In this embodiment, the mathematical model of the registration algorithm can be expressed as follows: a set of patient facial marker points is known, with coordinate set T in real space and coordinate set M in image space; the two are related by a rigid spatial transformation, which may be written in homogeneous coordinates as M = H·T.
The transformation matrix H from T to M is obtained through computation, which gives the position conversion relation between the two spaces and realizes the registration.
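The marker-based registration can be illustrated in two dimensions with a closed-form least-squares rigid fit (a 2-D sketch only; the patent's registration is three-dimensional, and the correspondence between marker points is assumed known):

```python
import cmath

def register_2d(T_pts, M_pts):
    """Estimate the rigid transform (rotation angle + translation) that maps
    real-space marker points T onto image-space points M by least squares.
    Points are (x, y) pairs represented as complex numbers internally."""
    T = [complex(x, y) for x, y in T_pts]
    M = [complex(x, y) for x, y in M_pts]
    tc = sum(T) / len(T)              # centroid of real-space points
    mc = sum(M) / len(M)              # centroid of image-space points
    s = sum((m - mc) * (t - tc).conjugate() for t, m in zip(T, M))
    theta = cmath.phase(s)            # optimal rotation angle
    rot = cmath.exp(1j * theta)
    trans = mc - rot * tc             # translation: maps tc onto mc after rotation
    return theta, trans               # m ≈ rot * t + trans
```

The same centroid-plus-rotation structure carries over to 3-D, where the rotation is usually recovered with an SVD (the Kabsch algorithm).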
And calibrating a set of surgical instruments with a positioning function. In this embodiment, an optical positioning method is adopted for calibration, and the calibration method includes the following steps:
Step E1, detect the displacement of the tip of the surgical instrument simultaneously in the optical positioning mode and in the electromagnetic positioning mode, and record the displacement data. The amount of data acquired and the acquisition method can be adjusted according to the required calibration accuracy; preferably, multiple groups of acquired data are compared.
Step E2, calculate the coordinate calibration coefficients x_c, y_c and z_c using the calibration-coefficient formula, and then obtain the compensated position coordinates, thereby improving the positioning accuracy of the surgical instrument.
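Step E2 can be sketched with a simple per-axis least-squares model (the single scale coefficient per axis is an assumed model for illustration; the patent's calibration-coefficient formula is not reproduced here):

```python
def axis_calibration(optical, electromagnetic):
    """Least-squares scale coefficients (x_c, y_c, z_c) relating displacements
    reported by the electromagnetic tracker to the optical reference, one
    coefficient per axis: optical ≈ c * electromagnetic (assumed model)."""
    coeffs = []
    for axis in range(3):
        num = sum(o[axis] * e[axis] for o, e in zip(optical, electromagnetic))
        den = sum(e[axis] ** 2 for e in electromagnetic)
        coeffs.append(num / den)
    return tuple(coeffs)
```

The compensated position is then obtained by multiplying each electromagnetic coordinate by its coefficient.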
The doctor performs the operation along the planned path according to prompt information such as sound signals and optical signals. The prompt information is transmitted to the doctor through the path indicating device; the transmission mode includes, but is not limited to, voice prompts and track-light prompts.
The above-mentioned embodiments are only for illustrating the technical ideas and features of the present invention, and the purpose thereof is to enable those skilled in the art to understand the contents of the present invention and to carry out the same, and the present invention shall not be limited to the embodiments, i.e. the equivalent changes or modifications made within the spirit of the present invention shall fall within the scope of the present invention.

Claims (10)

1. A surgical operation auxiliary system is characterized by comprising a preoperative operation planning system and an intraoperative operation navigation system; the preoperative surgical planning system comprises: the system comprises an image processing module for three-dimensional model reconstruction, a database for storing patient medical information and a surgical planning module for planning a moving path of a surgical instrument; the intraoperative surgical navigation system comprises: the surgical instrument comprises a positioning module, a surgical navigation module and a path indicating device, wherein the positioning module is used for measuring the spatial positions of a surgical instrument and a to-be-operated part; the operation planning module comprises an operation path optimization module and/or an operation path planning model established based on a neural network; the surgical path optimization module is used for solving an optimal surgical path in the surgical constrained region range based on a spatial trajectory planning algorithm; the surgical path planning model is trained by adopting a training set constructed by preoperative medical information, surgical paths and postoperative result data of operated patients, inputs medical information and postoperative expected data of patients to be operated, and outputs auxiliary surgical paths for assisting doctors in decision making.
2. The surgical assistance system according to claim 1, wherein the image processing module includes a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module; the point cloud data three-dimensional reconstruction module comprises a three-dimensional scanning device for performing three-dimensional scanning on the surface of the part to be operated and acquiring point cloud data, and a three-dimensional model reconstruction module for performing three-dimensional reconstruction on the point cloud data by adopting a neural network to obtain a three-dimensional model of the part to be operated; and the medical image three-dimensional reconstruction module is used for obtaining a three-dimensional model of the to-be-operated part region and peripheral nerves and blood vessels thereof through a three-dimensional reconstruction algorithm based on the to-be-operated part slice image.
3. A surgical assistance system according to claim 1 wherein the positioning module comprises an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module comprises an electromagnetic type position positioner; the optical positioning module comprises a binocular vision positioning system, and the binocular vision positioning system is used for three-dimensional coordinate positioning of the space points.
4. The surgical operation assisting system as claimed in claim 1, wherein the surgical navigation module records the spatial position of the surgical instrument in real time, and compares and analyzes the actual surgical path with the preoperative planned path, on one hand, sends the next step of surgical instrument path information to the path indicating device, and on the other hand, sends the early warning signal of the corresponding level according to the level of the degree of deviation when the actual surgical path deviates from the preoperative planned path.
5. A surgical assistance system according to claim 4 wherein the path indicating means comprises a light path indicating means or an augmented reality path indicating means; the light path indicating device comprises a laser lamp with an adjustable indicating angle, the laser lamp receives signals from the surgical navigation module, the moving path of the surgical instrument is indicated by the on and off of the indicating lamp, and the path deviation degree is indicated by the color difference of the indicating lamp; the augmented reality path indicating device comprises wearable augmented reality glasses or a wearable augmented reality helmet; the display screen of the augmented reality path indicating device is used for displaying the planned path before the operation and the actual deviation degree.
6. A surgical path planning method is characterized in that the method is provided with a preoperative surgical planning system and an intraoperative surgical navigation system; setting a preoperative surgical planning system: the system comprises an image processing module for three-dimensional model reconstruction, a database for storing medical information of a patient and a surgical planning module for planning a moving path of a surgical instrument; setting an intraoperative surgical navigation system: the surgical instrument comprises a positioning module, a surgical navigation module and a path indicating device, wherein the positioning module is used for measuring the spatial positions of a surgical instrument and a to-be-operated part; the operation planning module is provided with an operation path optimization module and/or an operation path planning model established based on a neural network; the surgical path optimization module is used for solving an optimal surgical path in the surgical constrained region range based on a spatial trajectory planning algorithm; the surgical path planning model is trained by adopting a training set constructed by preoperative medical information, surgical paths and postoperative result data of operated patients, inputs medical information and postoperative expected data of patients to be operated, and outputs auxiliary surgical paths for assisting doctors in decision making.
7. The surgical path planning method according to claim 6, wherein the method for planning the movement path of the surgical instrument by the surgical planning module comprises the following steps:
establishing a medical image data set, labeling important blood vessels, nerves, key tissues and organs in images of the data set, and training a neural network model for medical image segmentation by a deep learning method;
step two, after image data of a focus of a patient are obtained, important blood vessels, nerves, key tissues and organs in a focus area are automatically identified and segmented according to a neural network model for medical image segmentation, and a three-dimensional model of the focus area of the patient is constructed through a medical image three-dimensional reconstruction technology;
acquiring a body surface three-dimensional model of the focus region of the patient through a three-dimensional scanning device, and registering the three-dimensional model of the focus region of the patient and the body surface three-dimensional model;
step four, defining the range of the operation constraint area by taking the safe distance which is kept in accordance with the medical requirements with the blood vessels, nerves, key tissues and organs to be avoided as the constraint condition;
and step five, obtaining the optimal surgical path through a space trajectory planning algorithm in the surgical constraint area range, and realizing automatic planning of the surgical path.
8. The surgical path planning method according to claim 7, wherein step five includes the sub-steps of:
step C1, in the operation constrained region scope, creating n parallel two-dimensional section surfaces, selecting one of the two-dimensional section surfaces, and using the space path planning algorithm on the plane to obtain the ideal operation path in the plane;
step C2, sequentially obtaining an ideal operation path based on the plane for the n two-dimensional section surfaces, and superposing the obtained n plane paths in the direction perpendicular to the plane to obtain a three-dimensional curved surface of the space;
and step C3, determining the safe activity interval of the surgical instrument according to the medical prior condition and the medical requirement, and selecting a smooth three-dimensional curved surface in the interval as the optimal surgical path according to the operation difficulty of the doctor.
9. The surgical path planning method according to claim 6, wherein the spatial trajectory planning algorithm iteratively calculates the optimal surgical path by using a Q-learning algorithm, which comprises the following specific steps:
Step A1, initialize the parameters: establish a Q-value table; let the current iteration count be i and the maximum iteration count be I, and initialize i = 0; define the current state as s_t and the action in this state as a_t; set Q(s_i, a_i) = 0 in the Q-value table; set the initial learning rate α = 0.2 and the discount rate γ = 0.8, and denote the reward obtained after each action by r_t; set the upper boundary point of the surgical region as the starting point and the lower boundary point as the end point; take the surgical constrained region of the three-dimensional model of the patient's lesion region as the environment space E, discretize it into n effective two-dimensional sections, select one two-dimensional section, establish a coordinate system on it, and discretize it to obtain t selectable states; set the selectable action set as A;
step A2, setting the starting point as the initial state and the end point as the terminal state, and selecting the optimal action a_t according to the ε-greedy strategy, wherein the probability of selecting action a_t is:
Prob(a_t) = 1 − ε,  if a_t = argmax_{a∈A} Q(s_t, a)
Prob(a_t) = ε,      otherwise
wherein ε is the greedy value; 1 − ε is the probability of selecting the optimal action a_t; max_{a} Q(s_{t+1}, a) estimates the maximum reward obtainable in the future; a denotes an action; s denotes a state; and Prob(a_t) is the probability of selecting action a_t;
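In plain code, the ε-greedy selection rule of step A2 can be sketched as follows. This is a minimal illustration; the action names, Q-values, and random seed are hypothetical, not taken from the claim.

```python
import random

def epsilon_greedy(q_row, epsilon, rng=random):
    """Pick the greedy action with probability 1 - epsilon, otherwise a
    uniformly random action; q_row maps each action to its Q-value."""
    if rng.random() < 1.0 - epsilon:
        return max(q_row, key=q_row.get)   # optimal action, probability 1 - ε
    return rng.choice(list(q_row))         # exploratory action, probability ε

rng = random.Random(0)
q_row = {"up": 0.0, "down": 0.5, "left": 0.1, "right": 0.2}
greedy_pick = epsilon_greedy(q_row, epsilon=0.0, rng=rng)  # ε = 0: always optimal
random_pick = epsilon_greedy(q_row, epsilon=1.0, rng=rng)  # ε = 1: always random
```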
step A3, after selecting action a_t, obtaining the reward r_t, which is used to determine the next action, and obtaining the next state s_{t+1};
step A4, updating the Q-value table and the greedy value ε;
the Q-value update formula is:
Q(s_t, a_t) ← Q(s_t, a_t) + α[ r_t + γ max_{a} Q(s_{t+1}, a) − Q(s_t, a_t) ]
the greedy value epsilon update formula is:
Figure FDA0003647088910000034
step A5, letting s_t = s_{t+1}, and judging whether the terminal state is reached and whether Q(s_t, a_t) has converged; if the conditions are not met, returning to step A2 until the iteration conditions are satisfied; the optimal planned path is obtained after T rounds of iterative training.
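Steps A1 to A5 can be sketched end to end on one discretized 2D section as follows. This is a hypothetical illustration: only the learning rate α = 0.2 and discount rate γ = 0.8 come from step A1; the grid size, reward values, episode count, and ε-decay schedule are assumptions (the claim's actual ε update formula is given only as an image).

```python
import random

ALPHA, GAMMA = 0.2, 0.8          # learning rate and discount rate (step A1)
ACTIONS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}
W, H = 5, 5                      # toy discretized cross-section
START, GOAL = (2, 0), (2, 4)     # upper boundary point -> lower boundary point

def step(state, action):
    """Apply an action, clamped to the grid; reward the goal, penalize steps."""
    dx, dy = ACTIONS[action]
    nxt = (min(max(state[0] + dx, 0), W - 1), min(max(state[1] + dy, 0), H - 1))
    reward = 10.0 if nxt == GOAL else -1.0
    return nxt, reward

rng = random.Random(0)
Q = {(x, y): {a: 0.0 for a in ACTIONS} for x in range(W) for y in range(H)}
epsilon = 0.9
for episode in range(500):
    s = START
    while s != GOAL:
        # step A2: epsilon-greedy action selection
        if rng.random() < 1.0 - epsilon:
            a = max(Q[s], key=Q[s].get)
        else:
            a = rng.choice(list(ACTIONS))
        s_next, r = step(s, a)           # step A3: reward and next state
        # step A4: Q(s,a) <- Q(s,a) + alpha*(r + gamma*max_a' Q(s',a') - Q(s,a))
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s_next].values()) - Q[s][a])
        s = s_next                       # step A5: s_t = s_{t+1}
    epsilon = max(0.05, epsilon * 0.99)  # decay the greedy value (assumed schedule)

# Greedy rollout of the learned policy yields the planned planar path.
s, path = START, [START]
while s != GOAL and len(path) < W * H:
    s, _ = step(s, max(Q[s], key=Q[s].get))
    path.append(s)
```

Repeating this per cross-section and stacking the resulting paths gives the three-dimensional candidate surface described in steps C1 to C3 of claim 8.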
10. The surgical path planning method according to claim 6, wherein the path indicating device is an augmented reality path indicating device; the display screen of the augmented reality path indicating device is used to display the preoperatively planned path and the actual degree of deviation from it; the surgical path prompt information is displayed on the display screen as images, the prompt information comprising the feed direction and the feed depth.
CN202210534426.XA 2022-05-17 2022-05-17 Surgical operation auxiliary system and operation path planning method Active CN114948199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210534426.XA CN114948199B (en) 2022-05-17 2022-05-17 Surgical operation auxiliary system and operation path planning method


Publications (2)

Publication Number Publication Date
CN114948199A true CN114948199A (en) 2022-08-30
CN114948199B CN114948199B (en) 2023-08-18

Family

ID=82983069



Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030220557A1 (en) * 2002-03-01 2003-11-27 Kevin Cleary Image guided liver interventions based on magnetic tracking of internal organ motion
JP2004223128A (en) * 2003-01-27 2004-08-12 Hitachi Ltd Medical practice supporting apparatus and method
US20070238981A1 (en) * 2006-03-13 2007-10-11 Bracco Imaging Spa Methods and apparatuses for recording and reviewing surgical navigation processes
EP2277441A1 (en) * 2009-07-22 2011-01-26 Surgica Robotica S.p.A. Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures
CN103479430A (en) * 2013-09-22 2014-01-01 江苏美伦影像系统有限公司 Image guiding intervention operation navigation system
US20140270441A1 (en) * 2013-03-15 2014-09-18 Covidien Lp Pathway planning system and method
US20160070436A1 (en) * 2013-03-15 2016-03-10 Monroe M. Thomas Planning, navigation and simulation systems and methods for minimally invasive therapy
CN106890025A (en) * 2017-03-03 2017-06-27 浙江大学 A kind of minimally invasive operation navigating system and air navigation aid
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 The preoperative planning of minimally invasive cardiac surgery and operation virtual reality simulation method
CN110464459A (en) * 2019-07-10 2019-11-19 丽水市中心医院 Intervention plan navigation system and its air navigation aid based on CT-MRI fusion
CN112155729A (en) * 2020-10-15 2021-01-01 中国科学院合肥物质科学研究院 Intelligent automatic planning method and system for surgical puncture path and medical system
WO2021114226A1 (en) * 2019-12-12 2021-06-17 珠海横乐医学科技有限公司 Surgical navigation system employing intrahepatic blood vessel registration
CN113081257A (en) * 2019-12-23 2021-07-09 四川医枢科技股份有限公司 Automatic planning method for operation path
CN113693725A (en) * 2021-10-22 2021-11-26 杭州维纳安可医疗科技有限责任公司 Needle insertion path planning method, device, equipment and storage medium
CN113940755A (en) * 2021-09-30 2022-01-18 南开大学 Surgical operation planning and navigation method integrating operation and image
US20220110682A1 (en) * 2020-10-08 2022-04-14 National Central University Computer-Implemented Method, Computer-Assisted Processing Device and Non-Transitory Computer-Readable Medium for Computer-Assisted Planning of Surgical Path


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115359896A (en) * 2022-10-20 2022-11-18 山东曲阜康尔健医疗科技有限公司 Operation and monitoring analysis system based on data analysis and remote control
CN116473673A (en) * 2023-06-20 2023-07-25 浙江华诺康科技有限公司 Path planning method, device, system and storage medium for endoscope
CN116473673B (en) * 2023-06-20 2024-02-27 浙江华诺康科技有限公司 Path planning method, device, system and storage medium for endoscope
CN116919599A (en) * 2023-09-19 2023-10-24 中南大学 Haptic visual operation navigation system based on augmented reality
CN116935009A (en) * 2023-09-19 2023-10-24 中南大学 Operation navigation system for prediction based on historical data analysis
CN116935009B (en) * 2023-09-19 2023-12-22 中南大学 Operation navigation system for prediction based on historical data analysis
CN116919599B (en) * 2023-09-19 2024-01-09 中南大学 Haptic visual operation navigation system based on augmented reality
CN117274506A (en) * 2023-11-20 2023-12-22 华中科技大学同济医学院附属协和医院 Three-dimensional reconstruction method and system for interventional target scene under catheter
CN117274506B (en) * 2023-11-20 2024-02-02 华中科技大学同济医学院附属协和医院 Three-dimensional reconstruction method and system for interventional target scene under catheter
CN117393107A (en) * 2023-12-12 2024-01-12 北京唯迈医疗设备有限公司 Iterative learning method and system for automatic surgical intervention robot and storage medium
CN117393107B (en) * 2023-12-12 2024-03-15 北京唯迈医疗设备有限公司 Iterative learning method and system for automatic surgical intervention robot and storage medium
CN118453115A (en) * 2024-05-20 2024-08-09 南通市传染病防治院(南通市第三人民医院) Real-time image guidance system based on surgery



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant