
CN113349931A - Focus registration method of high-precision surgical navigation system - Google Patents

Focus registration method of high-precision surgical navigation system Download PDF

Info

Publication number
CN113349931A
CN113349931A
Authority
CN
China
Prior art keywords
model
origin
positioning
calculating
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110678224.8A
Other languages
Chinese (zh)
Other versions
CN113349931B (en)
Inventor
李石
梁禧
姬红
夏曦煜
朱思仰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Weile Digital Medical Technology Co ltd
Original Assignee
Yunnan Weile Digital Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Weile Digital Medical Technology Co ltd filed Critical Yunnan Weile Digital Medical Technology Co ltd
Priority to CN202110678224.8A priority Critical patent/CN113349931B/en
Publication of CN113349931A publication Critical patent/CN113349931A/en
Application granted granted Critical
Publication of CN113349931B publication Critical patent/CN113349931B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to the technical field of medical treatment and specifically discloses a lesion registration method for a high-precision surgical navigation system, comprising lesion three-dimensional reconstruction, tracking unit setup, model combination, positioning-ball installation, model origin matching, and the like. By adjusting the origin of the complete model carrying the positioning balls until it matches the origin of the optical positioning device software, the tracking object in the complete model is made to coincide with the physical tracking object, completing the registration of the patient's lesion.

Description

Focus registration method of high-precision surgical navigation system
Technical Field
The invention relates to the technical field of medical treatment, and in particular to a lesion registration method for a high-precision surgical navigation system.
Background
Currently, an optical navigation device is used to acquire six-dimensional information (position and rotation about the three coordinate axes x, y and z) of a target object in real space in real time, so that the corresponding model object in virtual space acquires the same information and the real-space coordinates are matched to the virtual-space coordinate system. If the optical navigation device is used to locate several objects in space at the same time, the matching accuracy between those objects becomes a problem to be solved. Suppose real space contains two elongated objects A and B and virtual space contains two models xA and xB of the same shape, with A and B held in a fixed relative posture. The real positions of A and B are acquired through the optical navigation device and the data are then mapped onto the virtual objects xA and xB; the error between the relative posture of xA and xB in virtual space and the relative posture of A and B in real space is the accuracy difference.
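The following Python sketch illustrates one way to quantify this accuracy difference; the function names and the use of 4x4 homogeneous transforms are illustrative assumptions rather than part of the original disclosure.

```python
# Illustrative sketch (not from the patent): quantifying the accuracy difference
# between the relative posture of real objects A, B and virtual models xA, xB.
# Poses are assumed to be 4x4 homogeneous transforms; all names are hypothetical.
import numpy as np


def relative_pose(T_first, T_second):
    """Pose of the second object expressed in the first object's frame."""
    return np.linalg.inv(T_first) @ T_second


def pose_error(T_rel_real, T_rel_virtual):
    """Translation error (input units) and rotation error (degrees) between two relative poses."""
    delta = np.linalg.inv(T_rel_real) @ T_rel_virtual
    translation_error = np.linalg.norm(delta[:3, 3])
    cos_angle = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rotation_error = np.degrees(np.arccos(cos_angle))
    return translation_error, rotation_error


# accuracy difference = pose_error(relative_pose(T_A, T_B), relative_pose(T_xA, T_xB))
```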
The optical navigation device can be applied in many fields: design, construction, industry, medical treatment, and so on. In use, information from several objects must be collected and the relative position and rotation between them calculated, and this precision has a particularly large influence on the final calculated result. During surgery, it is often necessary to register the actual surgical site with a three-dimensional image of the surgical site obtained from a pre-operative scan in order to guide the operation. In existing registration methods for surgical navigation systems, the doctor manually selects and confirms matching feature points during registration, but the error introduced by this manual operation is large, so the registration accuracy is low and can hardly meet the requirements of minimally invasive surgery.
Disclosure of Invention
The invention aims to provide a lesion registration method for a high-precision surgical navigation system that can solve the problems of large precision errors and low registration accuracy in existing registration methods.
In order to solve the above technical problems, the invention adopts the following technical scheme:
A lesion registration method for a high-precision surgical navigation system comprises the following steps:
Three-dimensional reconstruction of the lesion: acquiring image data of the patient's lesion with a medical image scanner, and then performing three-dimensional reconstruction of the patient's lesion with a medical three-dimensional model reconstruction system;
Setting up the tracking unit: designing a tracking object A with modeling software, setting the installation positions of the positioning balls on tracking object A, and then manufacturing the physical structure of tracking object A, where the three-dimensional model coordinate system of tracking object A is C1;
Combining the models: combining the three-dimensional model of tracking object A and the three-dimensional model reconstructed from the patient's lesion in the medical three-dimensional model reconstruction system to form model B. The specific combination method is as follows: in the three-dimensional model software, the two models are imported into the same project, placed in the planned relative position by translation and rotation, and then exported by the three-dimensional model software as a single combined model file;
Installing the positioning balls: providing positioning balls, installing them at the positions designed in advance on tracking object A, identifying the relative positions of the positioning balls with the optical positioning system, and matching the coordinate origin of the positioning balls with the origin of coordinate system C1 of tracking object A;
Matching the origin of the model: matching the origin of the complete model carrying the positioning balls with the origin of the optical positioning system software to complete the accurate positioning of the patient's lesion. Because the patient's lesion image data carries its own coordinate system, whose origin is generally located at a lower corner of the CT machine and differs between CT machines of different brands, model reconstruction in the medical three-dimensional model reconstruction system is performed relative to the actual origin of the CT machine, so the exported model coordinate system also has its origin at the CT machine origin. This model origin differs from the origin recognized in the matching software of the optical positioning system, so positioning cannot be performed directly, and the model origin therefore has to be adjusted until the two match.
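The model-combination step described above can be sketched as follows, assuming the open-source trimesh library plays the role of the three-dimensional model software; the file names and the placement transform are placeholders rather than values from this disclosure.

```python
# A minimal sketch of combining the tracking object A model and the lesion
# reconstruction into one file (model B). Assumes trimesh; names are placeholders.
import numpy as np
import trimesh

tracker = trimesh.load("tracked_object_A.stl")        # tracking object A (coordinate system C1)
lesion = trimesh.load("lesion_reconstruction.stl")    # model reconstructed from the lesion scan

# Translate and rotate the tracker into its planned position relative to the lesion model.
placement = trimesh.transformations.rotation_matrix(np.radians(30), [0, 0, 1])
placement[:3, 3] = [50.0, 0.0, 20.0]                  # planned offset in mm (placeholder values)
tracker.apply_transform(placement)

# Export the two parts as one combined model file (model B).
model_b = trimesh.util.concatenate([tracker, lesion])
model_b.export("model_B.stl")
```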
Further, the model origin matching comprises the following steps,
(1) placing the tracking object A model at the origin and finding three model vertices with characteristic shapes on the tracking object A model, whose world coordinates are pa, pb and pc respectively;
(2) placing model B at the origin and finding, on model B, the feature points pa', pb' and pc' corresponding to the three features pa, pb and pc of tracking object A;
(3) according to the deviation calculated from the three pairs of feature points, translating and rotating model B so that the tracking object A model contained in model B coincides with the tracking object A model.
Further, the calculation of the deviation of the three feature points comprises the following steps (one possible implementation is sketched in code after the list):
(1) calculating the center centerA of the plane formed by the three points pa, pb and pc;
(2) calculating the center centerB of the plane formed by the three points pa', pb' and pc';
(3) calculating the vector V2 from the origin to centerB;
(4) calculating the normal Normal1 of the plane formed by the points pa, pb and pc;
(5) calculating the normal Normal2 of the plane formed by the points pa', pb' and pc';
(6) calculating the quaternion Q1 of the rotation between Normal1 and Normal2;
(7) calculating the vectors between centerB and the points pa', pb' and pc', then rotating these vectors by Q1 to obtain vectors s1, s2 and s3;
(8) calculating the vectors va, vb and vc between centerA and the points pa, pb and pc;
(9) calculating the average angle aveAngle1 between the angles of s1, s2, s3 and the angles of va, vb, vc;
(10) rotating V2 using quaternion Q1;
(11) calculating the quaternion Q2 of a rotation by the angle aveAngle1 about the vector Normal1;
(12) rotating model B, using Q2 as the rotation value of model B;
(13) setting the position of model B to centerA plus V2;
(14) the tracking object A model contained in the final model B then coincides completely with the tracking object A model, with no error.
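One possible reading of steps (1) to (14) is sketched below in Python with numpy and scipy. The function names are illustrative, the quaternion handling relies on scipy.spatial.transform.Rotation, and conventions the steps leave open (for example the direction in which Q1 rotates the normals, and how the rotated V2 enters the final position) are assumptions; the sketch simply realizes the stated goal of making the two feature triangles coincide.

```python
# Illustrative sketch of the deviation calculation; not verbatim from the patent.
import numpy as np
from scipy.spatial.transform import Rotation


def rotation_between(n_from, n_to):
    """Rotation (quaternion) that turns vector n_from onto vector n_to (step 6)."""
    n_from = n_from / np.linalg.norm(n_from)
    n_to = n_to / np.linalg.norm(n_to)
    axis = np.cross(n_from, n_to)
    s = np.linalg.norm(axis)
    if s < 1e-12:                      # normals already parallel: treat as identity here
        return Rotation.identity()
    angle = np.arctan2(s, float(np.dot(n_from, n_to)))
    return Rotation.from_rotvec(axis / s * angle)


def signed_angle(u, v, axis):
    """Angle that rotates u onto v about 'axis' (used for aveAngle1, step 9)."""
    return np.arctan2(float(np.dot(np.cross(u, v), axis)), float(np.dot(u, v)))


def align_model_b(pa, pb, pc, pa_, pb_, pc_):
    """Rotation and translation that move model B's features (pa', pb', pc') onto pa, pb, pc."""
    dst = np.array([pa, pb, pc], dtype=float)        # features on the tracking object A model
    src = np.array([pa_, pb_, pc_], dtype=float)     # corresponding features on model B
    center_a = dst.mean(axis=0)                      # step 1: centerA
    center_b = src.mean(axis=0)                      # step 2: centerB
    n1 = np.cross(dst[1] - dst[0], dst[2] - dst[0])  # step 4: Normal1
    n2 = np.cross(src[1] - src[0], src[2] - src[0])  # step 5: Normal2
    n1_unit = n1 / np.linalg.norm(n1)
    q1 = rotation_between(n2, n1)                    # step 6: Q1, assumed to rotate Normal2 onto Normal1
    s_vecs = q1.apply(src - center_b)                # step 7: s1, s2, s3
    v_vecs = dst - center_a                          # step 8: va, vb, vc
    ave_angle = np.mean([signed_angle(s, v, n1_unit)            # step 9: aveAngle1
                         for s, v in zip(s_vecs, v_vecs)])
    q2 = Rotation.from_rotvec(n1_unit * ave_angle)               # step 11: Q2, a spin about Normal1
    rotation = q2 * q1                                           # step 12: total rotation of model B
    translation = center_a - rotation.apply(center_b)            # steps 3, 10, 13 folded together
    return rotation, translation                                 # step 14: feature triangles coincide
```

Under this reading, steps (3), (10) and (13) reduce to placing the rotated centerB onto centerA, which is what the translation line computes.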
Further, the image data of the patient's lesion comprises CT or nuclear magnetic resonance data in DICOM format; the medical three-dimensional model reconstruction system uses the Mimics reconstruction tool, although other software capable of completing medical model reconstruction may be selected according to user requirements.
Further, tracking object A is modeled with 3D printing modeling software, and its physical body is then printed with a high-precision 3D printer using photosensitive resin as the printing material, with the printing precision controlled within 0.1 mm.
Furthermore, there are at least 4 positioning balls, which are fluorescent balls 1 cm in diameter.
Compared with the prior art, the invention has the following beneficial effect:
By adjusting the origin of the complete model carrying the positioning balls until it matches the origin of the optical positioning device software, the tracking object in the complete model is made to coincide with the physical tracking object, completing the registration of the patient's lesion.
Description of the drawings:
Fig. 1 is a flow chart of the registration steps of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Examples
A lesion registration method for a high-precision surgical navigation system. First, image data of the patient's lesion is acquired with a medical image scanner, the raw image data being CT or nuclear magnetic resonance data in DICOM format; the patient's lesion is then reconstructed in three dimensions with a medical three-dimensional model reconstruction system, the reconstruction tool being any software capable of completing medical model reconstruction (in this embodiment, Mimics software is used). Because image data in DICOM format carries its own coordinate system, whose origin is generally located at a lower corner of the CT machine and differs between CT machines of different brands, the coordinate system of the image data is determined by the CT machine actually used; model reconstruction in the three-dimensional reconstruction software is performed relative to the actual origin of the CT machine, so the exported model coordinate system shares the origin of the CT machine coordinates;
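For illustration, the scanner-defined coordinate system mentioned above can be read directly from the image data; the sketch below assumes the pydicom library and a placeholder file name, neither of which is specified in this embodiment.

```python
# Reading the DICOM-defined origin and axes of one slice (illustrative, pydicom assumed).
import pydicom

ds = pydicom.dcmread("lesion_slice_0001.dcm")          # one slice of the lesion scan (placeholder name)
origin = ds.ImagePositionPatient                        # x, y, z of the first voxel, in mm
row_dir = ds.ImageOrientationPatient[:3]                # direction cosines of the image rows
col_dir = ds.ImageOrientationPatient[3:]                # direction cosines of the image columns
spacing = ds.PixelSpacing                               # in-plane voxel size, in mm
print(origin, row_dir, col_dir, spacing)
```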
Tracking object A is designed with 3D modeling software, and at least four installation positions for the positioning balls are set on it; the physical structure of tracking object A is printed with a high-precision 3D printer using photosensitive resin, with the printing precision controlled within 0.1 mm; the model coordinate system of tracking object A is designed to be C1;
The three-dimensional model of tracking object A and the three-dimensional model reconstructed from the patient's lesion are combined in the Mimics software system to form model B: the two models are imported into the same project in the three-dimensional model software, placed in the planned relative position by translation and rotation, and then exported by the three-dimensional model software as a single combined model file;
Fluorescent positioning balls with a diameter of 1 cm are then installed at the pre-designed positions on tracking object A; at least 4 positioning balls must be installed for them to be identified and located. The relative positions of the positioning balls are then identified by the optical positioning equipment, and the coordinate origin of the positioning balls is matched with the origin of coordinate system C1 of tracking object A; in this embodiment, the optical positioning equipment is a PST Iris positioning device;
Because the origin of the model to be positioned differs from the origin recognized in the PST Iris matching software, positioning cannot be performed directly, and the origin must be adjusted until the two match. The adjustment comprises the following steps:
(1) placing the tracking object A model at the origin and finding three model vertices with characteristic shapes on the tracking object A model, whose world coordinates are pa, pb and pc respectively;
(2) placing model B at the origin and finding, on model B, the feature points pa', pb' and pc' corresponding to the three features pa, pb and pc of tracking object A;
(3) according to the deviation calculated from the three pairs of feature points, translating and rotating model B so that the tracking object A model contained in model B coincides with the tracking object A model, where the deviation calculation steps are as follows (a brief numerical check follows the list):
(1) calculating the center centerA of the plane formed by the three points pa, pb and pc;
(2) calculating the center centerB of the plane formed by the three points pa', pb' and pc';
(3) calculating the vector V2 from the origin to centerB;
(4) calculating the normal Normal1 of the plane formed by the points pa, pb and pc;
(5) calculating the normal Normal2 of the plane formed by the points pa', pb' and pc';
(6) calculating the quaternion Q1 of the rotation between Normal1 and Normal2;
(7) calculating the vector between centerB and the point pa' and rotating it with quaternion Q1 to generate a new vector s1; likewise calculating the vector between centerB and the point pb' and rotating it with Q1 to generate a new vector s2, and calculating the vector between centerB and the point pc' and rotating it with Q1 to generate a new vector s3;
(8) calculating the vectors va, vb and vc between centerA and the points pa, pb and pc;
(9) calculating the average angle aveAngle1 between the angles of s1, s2, s3 and the angles of va, vb, vc;
(10) rotating V2 using quaternion Q1;
(11) calculating the quaternion Q2 of a rotation by the angle aveAngle1 about the vector Normal1;
(12) rotating model B, using Q2 as the rotation value of model B;
(13) setting the position of model B to centerA plus V2;
(14) the tracking object A model contained in the final model B then coincides completely with the tracking object A model, with no error.
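As referenced above, a brief numerical check of the adjustment can look as follows, reusing the align_model_b sketch given after the corresponding step list in the disclosure; the feature coordinates and the offset pose are made-up test values, not measurements.

```python
# Sanity check of the illustrative alignment sketch (align_model_b defined earlier).
import numpy as np
from scipy.spatial.transform import Rotation

pa, pb, pc = np.array([10.0, 0, 0]), np.array([0, 8.0, 0]), np.array([0, 0, 6.0])
offset = Rotation.from_rotvec([0.2, -0.1, 0.3])        # hypothetical misalignment of model B
pa_, pb_, pc_ = (offset.apply(p) + np.array([5.0, -2.0, 1.0]) for p in (pa, pb, pc))

rot, trans = align_model_b(pa, pb, pc, pa_, pb_, pc_)
residual = max(np.linalg.norm(rot.apply(p_) + trans - p)
               for p, p_ in zip((pa, pb, pc), (pa_, pb_, pc_)))
print(f"max residual after alignment: {residual:.2e}")  # expected to be close to zero
```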
The registration method adopted in this embodiment performs accurate adjustment through the deviation calculation, which reduces the precision error, effectively improves the registration accuracy, and better meets the requirements of minimally invasive surgery.
Reference throughout this specification to "one embodiment," "another embodiment," "an embodiment," "a preferred embodiment," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment described generally in this application. The appearances of the same phrase in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the scope of the invention to effect such feature, structure, or characteristic in connection with other embodiments.
Although the invention has been described herein with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More specifically, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, other uses will also be apparent to those skilled in the art.

Claims (6)

1. A lesion registration method of a high-precision surgical navigation system, characterized by comprising the following steps:
three-dimensional reconstruction of the lesion: acquiring image data of the patient's lesion with a medical image scanner, and then performing three-dimensional reconstruction of the patient's lesion with a medical three-dimensional model reconstruction system;
setting up the tracking unit: designing a tracking object A with modeling software, setting the installation positions of the positioning balls on tracking object A, and then manufacturing the physical structure of tracking object A, where the three-dimensional model coordinate system of tracking object A is C1;
combining the models: combining the three-dimensional model of tracking object A and the three-dimensional model reconstructed from the patient's lesion in the medical three-dimensional model reconstruction system to form model B;
installing the positioning balls: providing positioning balls, installing them at the positions designed in advance on tracking object A, identifying the relative positions of the positioning balls with the optical positioning system, and matching the coordinate origin of the positioning balls with the origin of coordinate system C1 of tracking object A;
matching the origin of the model: matching the origin of the complete model carrying the positioning balls with the origin of the optical positioning system software to complete the accurate positioning of the patient's lesion; because the patient's lesion image data carries its own coordinate system, whose origin is generally located at a lower corner of the CT machine and differs between CT machines of different brands, model reconstruction in the medical three-dimensional model reconstruction system is performed relative to the actual origin of the CT machine, so the exported model coordinate system also has its origin at the CT machine origin; this model origin differs from the origin recognized in the matching software of the optical positioning system, so positioning cannot be performed directly, and the model origin therefore has to be adjusted until the two match.
2. The lesion registration method of a high-precision surgical navigation system according to claim 1, wherein: the model origin matching comprises the following steps,
(1) placing the tracking object A model at the origin and finding three model vertices with characteristic shapes on the tracking object A model, whose world coordinates are pa, pb and pc respectively;
(2) placing model B at the origin and finding, on model B, the feature points pa', pb' and pc' corresponding to the three features pa, pb and pc of tracking object A;
(3) according to the deviation calculated from the three pairs of feature points, translating and rotating model B so that the tracking object A model contained in model B coincides with the tracking object A model.
3. The lesion registration method of a high-precision surgical navigation system according to claim 2, wherein:
the calculation of the deviation of the three feature points comprises the following steps:
(1) calculating the center centerA of the plane formed by the three points pa, pb and pc;
(2) calculating the center centerB of the plane formed by the three points pa', pb' and pc';
(3) calculating the vector V2 from the origin to centerB;
(4) calculating the normal Normal1 of the plane formed by the points pa, pb and pc;
(5) calculating the normal Normal2 of the plane formed by the points pa', pb' and pc';
(6) calculating the quaternion Q1 of the rotation between Normal1 and Normal2;
(7) calculating the vectors between centerB and the points pa', pb' and pc', then rotating these vectors by Q1 to obtain vectors s1, s2 and s3;
(8) calculating the vectors va, vb and vc between centerA and the points pa, pb and pc;
(9) calculating the average angle aveAngle1 between the angles of s1, s2, s3 and the angles of va, vb, vc;
(10) rotating V2 using quaternion Q1;
(11) calculating the quaternion Q2 of a rotation by the angle aveAngle1 about the vector Normal1;
(12) rotating model B, using Q2 as the rotation value of model B;
(13) setting the position of model B to centerA plus V2;
(14) the tracking object A model contained in the final model B then coincides completely with the tracking object A model, with no error.
4. The lesion registration method of a high-precision surgical navigation system according to claim 1, wherein:
the image data of the patient's lesion includes CT or nuclear magnetic data in dicom format; the medical three-dimensional model reconstruction system adopts a mimics reconstruction tool, and other software capable of completing medical reconstruction can be actually selected according to user requirements.
5. The lesion registration method of a high-precision surgical navigation system according to claim 1, wherein:
the tracking object A is subjected to model design by adopting 3D printing modeling software, then an entity of the tracking object A is printed out by a high-precision 3D printer, the printing material is printed by adopting photosensitive resin, and the printing precision is controlled within 0.1 mm.
6. The lesion registration method of a high-precision surgical navigation system according to claim 1, wherein: there are at least 4 positioning balls, which are fluorescent balls 1 cm in diameter.
CN202110678224.8A 2021-06-18 2021-06-18 Focus registration method for high-precision operation navigation system Active CN113349931B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110678224.8A CN113349931B (en) 2021-06-18 2021-06-18 Focus registration method for high-precision operation navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110678224.8A CN113349931B (en) 2021-06-18 2021-06-18 Focus registration method for high-precision operation navigation system

Publications (2)

Publication Number Publication Date
CN113349931A (en) 2021-09-07
CN113349931B CN113349931B (en) 2024-06-04

Family

ID=77535041

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110678224.8A Active CN113349931B (en) 2021-06-18 2021-06-18 Focus registration method for high-precision operation navigation system

Country Status (1)

Country Link
CN (1) CN113349931B (en)

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020077543A1 (en) * 2000-06-27 2002-06-20 Robert Grzeszczuk Method and apparatus for tracking a medical instrument based on image registration
US20050245820A1 (en) * 2004-04-28 2005-11-03 Sarin Vineet K Method and apparatus for verifying and correcting tracking of an anatomical structure during surgery
CN101474075A (en) * 2009-01-15 2009-07-08 复旦大学附属中山医院 Navigation system of minimal invasive surgery
CN101697869A (en) * 2009-10-19 2010-04-28 沈国芳 Fixing scaffold for surgical guidance
CN102999902A (en) * 2012-11-13 2013-03-27 上海交通大学医学院附属瑞金医院 Optical navigation positioning system based on CT (computed tomography) registration result and navigation method thereof
CN104359464A (en) * 2014-11-02 2015-02-18 天津理工大学 Mobile robot positioning method based on stereoscopic vision
CN106384554A (en) * 2016-10-08 2017-02-08 上海光韵达数字医疗科技有限公司 Operation training model and manufacturing method thereof and operation navigation system
CN107392995A (en) * 2017-07-05 2017-11-24 天津大学 Human body lower limbs method for registering in mechanical axis navigation system
CN109498156A (en) * 2017-09-14 2019-03-22 北京大华旺达科技有限公司 A kind of head operation air navigation aid based on 3-D scanning
CN110025378A (en) * 2018-01-12 2019-07-19 中国科学院沈阳自动化研究所 A kind of operation auxiliary navigation method based on optical alignment method
CN108759826A (en) * 2018-04-12 2018-11-06 浙江工业大学 A kind of unmanned plane motion tracking method based on mobile phone and the more parameter sensing fusions of unmanned plane
CN109453505A (en) * 2018-12-03 2019-03-12 浙江大学 A kind of multi-joint method for tracing based on wearable device
CN109498106A (en) * 2018-12-26 2019-03-22 哈尔滨工程大学 A kind of positioning and air navigation aid of the intramedullary needle nail hole based on 3-D image
CN109925057A (en) * 2019-04-29 2019-06-25 苏州大学 A kind of minimally invasive spine surgical navigation methods and systems based on augmented reality
CN110215281A (en) * 2019-06-11 2019-09-10 北京和华瑞博科技有限公司 A kind of femur or shin bone method for registering and device based on total knee replacement
CN111803212A (en) * 2019-11-14 2020-10-23 苏州铸正机器人有限公司 Titanium nail registration system and method for cochlear implant navigation surgery
CN111388092A (en) * 2020-03-17 2020-07-10 京东方科技集团股份有限公司 Positioning tracking piece, registration method, storage medium and electronic equipment
CN111494009A (en) * 2020-04-27 2020-08-07 上海霖晏医疗科技有限公司 Image registration method and device for surgical navigation and surgical navigation system
CN112451092A (en) * 2020-12-01 2021-03-09 杭州柳叶刀机器人有限公司 Joint replacement registration device and method

Also Published As

Publication number Publication date
CN113349931B (en) 2024-06-04

Similar Documents

Publication Publication Date Title
EP3254621B1 (en) 3d image special calibrator, surgical localizing system and method
Sun et al. Automated dental implantation using image-guided robotics: registration results
CN104936556B (en) System and method for navigating and controlling implant positioner
CN111494009B (en) Image registration method and device for surgical navigation and surgical navigation system
CN110101452A (en) A kind of optomagnetic integrated positioning navigation method for surgical operation
CN113940755A (en) Surgical operation planning and navigation method integrating operation and image
CN115068110A (en) Image registration method and system for femoral neck fracture surgery navigation
CN113524201B (en) Active adjusting method and device for pose of mechanical arm, mechanical arm and readable storage medium
Wen et al. In situ spatial AR surgical planning using projector-Kinect system
Cash et al. Incorporation of a laser range scanner into an image-guided surgical system
CN112190328A (en) Holographic perspective positioning system and positioning method
CN109124835A (en) The localization method and system of late-segmental collapse point
Lee et al. Simultaneous optimization of patient–image registration and hand–eye calibration for accurate augmented reality in surgery
CN114821031A (en) Intraoperative image matching method, device and system based on C-arm machine
CN115500940A (en) Positioning display method of surgical needle and related device
Li et al. A vision-based navigation system with markerless image registration and position-sensing localization for oral and maxillofacial surgery
CN108420531B (en) Surgical tool adjusting method, electronic device and clamping device
CN113349931A (en) Focus registration method of high-precision surgical navigation system
Kaushik et al. Robot-based autonomous neuroregistration and neuronavigation: implementation and case studies
JP7577378B2 (en) Method for 2D/3D image alignment and surgical robot system for performing same
WO2022133049A1 (en) Systems and methods for registering a 3d representation of a patient with a medical device for patient alignment
CN112754664A (en) Method for finding hip joint center and knee joint implant
CN117549328B (en) Positioning system and method of surgical robot and surgical robot system
EP3794554A1 (en) An alignment system for liver surgery
Knoerlein et al. Comparison of tracker-based to tracker-less haptic device calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant