CN115414120A - Endoscope navigation system - Google Patents

- Publication number: CN115414120A
- Application number: CN202211381785.2A
- Authority: CN (China)
- Prior art keywords: target, positioning chip, coordinate system, processor, display
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B1/00154 — Endoscope holding or positioning arrangements using guiding arrangements for insertion
- A61B34/25 — User interfaces for surgical systems
- A61B90/39 — Markers, e.g. radio-opaque or breast lesion markers
- A61G13/06 — Adjustable operating tables; controls for raising or lowering the whole table surface
- A61G13/10 — Operating tables; parts, details or accessories
- A61B2034/2065 — Tracking using image or pattern recognition
- A61B2034/2068 — Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B2034/2072 — Reference field transducer attached to an instrument or patient
- A61B2090/3983 — Reference marker arrangements for use with image-guided surgery
Abstract
The application belongs to the technical field of medical assistance and provides an endoscope navigation system, comprising: an endoscope device with a target positioning chip arranged in its head end; an operating table for carrying a target object, where body surface labels are arranged at a plurality of anatomical positions of the target object, each body surface label containing an auxiliary positioning chip, and detection devices are arranged on the operating table; a display; and a processor connected to the endoscope device, the detection devices and the display respectively. The processor is used for: constructing a coordinate system with an arbitrary point in the operating room as the origin; determining the positions of the auxiliary positioning chips in the coordinate system; converting a three-dimensional reconstruction model of the target object into the coordinate system; determining the position of the target positioning chip in the coordinate system, and displaying the coordinate system, the three-dimensional reconstruction model and the position of the target positioning chip in the coordinate system through the display; and determining the traveling direction of the endoscope device at a bifurcation and displaying it through the display. The system can thus navigate the advancing direction of the endoscope at a bifurcation.
Description
Technical Field
The application belongs to the technical field of medical assistance, and particularly relates to an endoscope navigation system.
Background
Human body duct structures such as the biliary tract, the bronchi, and the ureter and renal pelvis generally have tree-like bifurcation structures, and lesions are usually located in some small branch of these bifurcations. The development of endoscope technology has made it possible to diagnose and treat diseases of natural-cavity organs such as the biliary tract, the bronchi, and the ureter and renal pelvis without cutting human tissue. Correctly maneuvering the endoscopic instrument through multiple bifurcation intersections to reach the lesion is a very important step. However, under the endoscopic view, the bifurcations of the different tree-like structures carry no specific directional information by which they can be identified. Therefore, the correct bifurcation route by which the endoscope approaches the target lesion is often determined empirically, with repeated route attempts. The tissues of the biliary tract, bronchi, ureter and renal pelvis are very soft and sensitive; repeated endoscopic path attempts easily damage their mucous membranes, and in severe cases the duct tissue can tear and break, causing bleeding or even an internal fistula and bringing serious complications.
There is therefore a great need for a system that can navigate the direction of travel of an endoscope at a bifurcation.
Disclosure of Invention
The embodiment of the application provides an endoscope navigation system, which can solve the problem that tissue and organs are easily damaged due to repeated path attempts of an endoscope.
The embodiment of the application provides an endoscope navigation system, including:
an endoscope device, in the head end of which a target positioning chip is arranged;
an operating table for carrying a target object, where body surface labels are arranged at a plurality of anatomical positions of the target object, an auxiliary positioning chip is arranged in each body surface label, and a plurality of detection devices are arranged on the operating table;
a display;
a processor connected to the endoscope device, the detection devices and the display respectively, the processor being used for:
constructing a coordinate system with an arbitrary point in the operating room as the origin;
determining the positions of the plurality of auxiliary positioning chips in the coordinate system according to first signals detected by the detection devices, the first signals being the signals transmitted by the plurality of auxiliary positioning chips;
converting a pre-constructed three-dimensional reconstruction model of the target object into the coordinate system, so that each position in the three-dimensional reconstruction model corresponding to an anatomical position coincides with the position of that anatomical position in the coordinate system;
determining the position of the target positioning chip in the coordinate system according to a second signal detected by the detection devices, and displaying the coordinate system, the three-dimensional reconstruction model and the position of the target positioning chip in the coordinate system through the display, the second signal being the signal transmitted by the target positioning chip;
when the head end of the endoscope device has entered the body of the target object and reached a bifurcation of the target part, determining the traveling direction of the endoscope device at the bifurcation according to a real image acquired at the bifurcation by the endoscope device and a virtual image acquired by a virtual camera at the position corresponding to the bifurcation in the three-dimensional reconstruction model, and displaying it through the display.
Optionally, a gyroscope is further arranged in the head end of the endoscope device, and the gyroscope is connected with the processor.
Optionally, the target positioning chip and the auxiliary positioning chip each include a radio frequency chip and an antenna, the radio frequency chip being connected with the antenna.
Optionally, the diameter of the cross section of the rf chip is 2 to 4 mm, and the height of the rf chip is 9 to 10 mm.
Optionally, the operating table comprises:
a liftable bed column;
a bed plate for carrying the target object, arranged at the top of the lifting end of the liftable bed column;
an L-shaped support frame, the bottom end of which is connected with the base of the liftable bed column, the horizontal frame at the top end of the L-shaped support frame being located above the bed plate; a detection device is arranged at each vertex of the top horizontal frame, and the mutually parallel first side edge and second side edge of the top horizontal frame are guide rails;
a first moving pulley arranged on the first side edge, and a second moving pulley arranged on the second side edge;
a slidable guide rail, one end of which is connected with the longitudinal rod of the first moving pulley and the other end of which is connected with the longitudinal rod of the second moving pulley;
a third moving pulley arranged on the slidable guide rail, a laser being arranged on the longitudinal rod of the third moving pulley;
wherein the driving motors of the first, second and third moving pulleys are all connected with the processor, and the processor is further used for: controlling the amounts of motion of the driving motors of the first, second and third moving pulleys according to the position of the target positioning chip in the coordinate system, so that the laser beam of the laser points vertically at the target positioning chip in the body of the target object.
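The pulley-and-rail control described above can be sketched as follows. This is a minimal illustration under assumed geometry (the patent gives no rail dimensions or motor interface): the first and second pulleys travel in lockstep along the two parallel side rails, carrying the slidable guide rail to the chip's Y coordinate, while the third pulley travels along that rail to the chip's X coordinate, leaving the downward-pointing laser directly above the target positioning chip.

```python
def pulley_targets(chip_x, chip_y, rail_length):
    """Map the target chip's (x, y) in the frame coordinate system to
    pulley displacements along the rails (hypothetical geometry: the
    first/second pulleys run in the Y direction on the side rails,
    the third pulley runs in the X direction on the slidable rail)."""
    # clamp to the physical travel of the rails
    y = min(max(chip_y, 0.0), rail_length)
    x = min(max(chip_x, 0.0), rail_length)
    # first and second pulleys must move together so the slidable
    # guide rail stays perpendicular to the side rails
    return {"first_pulley": y, "second_pulley": y, "third_pulley": x}
```

Because the laser is fixed pointing straight down, no angular computation is needed: the two linear displacements alone place the beam vertically over the chip's horizontal position.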
Optionally, the body surface label includes a second housing and a sticker arranged on the bottom surface of the second housing, and the auxiliary positioning chip is arranged in the second housing through a chip fixing anchor.
Optionally, the detecting device includes a base, a spherical shell disposed on the base, a signal transceiver disposed in the spherical shell, and a transceiver antenna disposed on the base, and the transceiver antenna is connected to the signal transceiver and the processor respectively.
Optionally, the processor is further configured to plan a navigation path of the endoscopic device to the target to be examined in the three-dimensional reconstruction model according to the position of the target to be examined at the target portion.
Optionally, when the processor determines and displays the traveling direction of the endoscope apparatus at the bifurcation through the display according to the real image and the virtual image, the processor is specifically configured to:
performing image registration on the virtual image and the real image to obtain a registered image;
taking the traveling direction of the navigation path in the registered image as the traveling direction of the endoscope device at the bifurcation, and displaying the registered image through the display.
Optionally, when displaying the position of the target positioning chip in the coordinate system, the processor is specifically configured to highlight that position, so that the display shows the position of the target positioning chip in the coordinate system in a highlighted manner.
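The claims leave the registration algorithm between the virtual and real bifurcation images unspecified. As one illustrative possibility (not the patent's stated method), phase correlation recovers the pure translation between two same-sized grayscale frames; a minimal NumPy sketch:

```python
import numpy as np

def phase_correlation_shift(real_img, virtual_img):
    """Estimate the integer (row, col) shift that, applied to
    virtual_img with np.roll, best aligns it with real_img.
    A translation-only stand-in for the unspecified
    image-registration step."""
    f_real = np.fft.fft2(real_img)
    f_virt = np.fft.fft2(virtual_img)
    cross = f_real * np.conj(f_virt)
    cross /= np.abs(cross) + 1e-12  # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks past the midpoint correspond to negative shifts
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

A full system would also need rotation/scale handling and a similarity check before trusting the result; this sketch only shows the translational core of registering the virtual camera's rendering against the endoscope frame.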
The above scheme of the application has the following beneficial effects:
In the embodiment of the application, the processor of the endoscope navigation system constructs a coordinate system for the operating table, converts the pre-constructed three-dimensional reconstruction model into the coordinate system, and displays the three-dimensional reconstruction model and the position of the endoscope head end within it on the display. When the endoscope device reaches a bifurcation, the processor determines the traveling direction of the endoscope device at the bifurcation and shows it on the display, thereby realizing navigation of the endoscope device at the bifurcation.
Other advantages of the present application will be described in detail in the detailed description that follows.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of an endoscopic navigation system according to an embodiment of the present application;
fig. 2 is a first structural schematic diagram of a body surface label according to an embodiment of the present disclosure;
FIG. 3 is a second schematic structural view of a body surface label provided in accordance with an embodiment of the present application;
fig. 4 is a first schematic structural diagram of a positioning chip according to an embodiment of the present disclosure;
fig. 5 is a second schematic structural diagram of a positioning chip according to an embodiment of the present application;
fig. 6 is a first schematic structural diagram of a detection device according to an embodiment of the present disclosure;
fig. 7 is a second schematic structural diagram of a detection device according to an embodiment of the present application;
fig. 8 is a schematic diagram illustrating a positional relationship between an auxiliary positioning chip and each of the detecting devices according to an embodiment of the present disclosure;
FIG. 9 is a schematic view of an endoscopic device according to an embodiment of the present application capturing an actual image of a target object;
FIG. 10 is a schematic diagram of a virtual camera for acquiring a virtual image in a three-dimensional reconstruction model according to an embodiment of the present application;
fig. 11 is a schematic layout diagram of an endoscopic device, an internal gyroscope and a target positioning chip according to an embodiment of the present application;
FIG. 12 is an enlarged view of a portion of FIG. 1 at A;
FIG. 13 is an enlarged view of a portion of FIG. 1 at B;
fig. 14 is a first partial view of an operating table according to an embodiment of the present application;
fig. 15 is a second partial view of an operating table according to an embodiment of the present application.
[Description of reference numerals]
10. An endoscope device; 101. a gyroscope; 102. a target positioning chip; 20. an operating table; 201. a liftable bed column; 202. a bed plate; 203. an L-shaped support frame; 204. a first side edge; 205. a second side edge; 206. a first moving pulley; 2061. a longitudinal rod; 207. a slidable guide rail; 208. a laser; 209. a third moving pulley; 2091. a driving motor; 2092. a transmission gear; 30. a display; 40. a processor; 50. a detection device; 501. a base; 502. a spherical shell; 503. a signal transceiver; 504. a transceiver antenna; 60. a body surface label; 601. a second housing; 602. an auxiliary positioning chip; 603. a chip fixing anchor; 604. a sticker; 70. a positioning chip; 701. a first housing; 702. a radio frequency chip; 703. an antenna; 80. a target part; 81. an object to be inspected; 82. a navigation path.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
Currently, endoscopes require repeated routing attempts to reach the target lesion, and therefore tissue and organs are easily damaged during the procedure.
In view of the above problems, an embodiment of the present application provides an endoscope navigation system in which a processor constructs a coordinate system for the operating table, converts a pre-constructed three-dimensional reconstruction model into that coordinate system, and displays the three-dimensional reconstruction model and the position of the endoscope head end within it on a display. When the endoscope device reaches a bifurcation, the processor determines the traveling direction of the endoscope device at the bifurcation and shows it on the display, thereby realizing navigation of the endoscope device at the bifurcation.
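Converting the pre-constructed three-dimensional reconstruction model into the operating-room coordinate system requires that corresponding anatomical landmarks coincide. The patent does not name the alignment method; one standard choice, sketched here under that assumption, is a rigid (Kabsch) fit between the landmark coordinates in the CT model and the same landmarks as located via their auxiliary positioning chips:

```python
import numpy as np

def rigid_align(model_pts, room_pts):
    """Kabsch fit: rotation R and translation t such that
    room_pts ~= model_pts @ R.T + t, mapping landmark coordinates
    from the CT-model frame into the operating-room frame."""
    mc, rc = model_pts.mean(axis=0), room_pts.mean(axis=0)
    h = (model_pts - mc).T @ (room_pts - rc)   # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, rc - r @ mc
```

With three or more non-collinear body-surface labels, the same transform applied to every vertex of the reconstruction model places it in the coordinate system where the endoscope's target positioning chip is tracked.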
The endoscope navigation system provided by the application is exemplarily described below with reference to specific embodiments.
As shown in fig. 1, an embodiment of the present application provides an endoscope navigation system, including: an endoscopic device 10, an operating bed 20 for carrying a target object, a display 30 and a processor 40 (in fig. 1 the processor is located in a host device connected to the display).
A target positioning chip is arranged in the head end of the endoscope device 10, a plurality of detection devices 50 are arranged on the operating bed 20, and a body surface label containing an auxiliary positioning chip is arranged at each of a plurality of anatomical positions of the target object on the operating bed 20. The processor 40 is connected to the endoscope device 10, each detection device 50 and the display 30 respectively (the figure only shows the connection between the processor and one detection device).
In some embodiments of the present application, the target object may be a person whose relevant portion needs to be examined through an endoscope, and the plurality of anatomical positions may be related to the portion to be examined; for example, when the portion to be examined is a bronchus, the plurality of anatomical positions may be body-surface positions such as the "xiphoid process" and the "anterior superior iliac spine".
In some embodiments of the present application, in order to facilitate the placement of the body surface label at the anatomical location of the target object, as shown in fig. 2 to 3, the body surface label includes a second housing 601 and a sticker 604 disposed on the bottom surface of the second housing 601, and the auxiliary positioning chip 602 is disposed in the second housing 601 through a chip anchor 603.
Illustratively, in some embodiments of the present application, the second housing may be a polyvinyl chloride (PVC) medical housing, and the sticker may be a double-sided tape made of an anti-allergic bio-adhesive.
In some embodiments of the present application, the target positioning chip is mainly configured to transmit a signal at a certain frequency; after a detection device receives the signal, it forwards the measured signal strength to the processor, which calculates and determines the position of the target positioning chip.
Similarly, each auxiliary positioning chip is also mainly used to transmit a signal at a certain frequency, whose strength the detection device forwards to the processor so that the processor can calculate and determine the position of that auxiliary positioning chip. It should be noted that the target positioning chip and the auxiliary positioning chips transmit at different frequencies, so that the processor can distinguish the received signal strengths by frequency and determine the positions of the target positioning chip and of each auxiliary positioning chip separately.
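The frequency-based separation described above amounts to demultiplexing the detector readings by transmit frequency before any position is computed. A small sketch (the frequency plan and chip names are invented for illustration; the patent does not specify them):

```python
# Hypothetical frequency plan: each chip transmits on its own frequency,
# so the processor can sort detector readings by frequency before
# computing positions.
CHIP_BY_FREQ_HZ = {
    915_000_000: "target",       # endoscope head-end chip (assumed band)
    916_000_000: "aux_xiphoid",  # body-surface label chips
    917_000_000: "aux_iliac_l",
    918_000_000: "aux_iliac_r",
}

def demultiplex(readings):
    """Group (freq_hz, rssi_dbm) detector readings by chip identity;
    readings on unknown frequencies are discarded."""
    grouped = {}
    for freq, rssi in readings:
        chip = CHIP_BY_FREQ_HZ.get(freq)
        if chip is not None:
            grouped.setdefault(chip, []).append(rssi)
    return grouped
```

Each chip's per-detector strength list can then be fed into the distance and position calculation independently.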
In some embodiments of the present application, the target positioning chip and the auxiliary positioning chip are identical in structure, and only the transmitted signal frequencies are different. For the convenience of describing the structures of the target positioning chip and the auxiliary positioning chip in conjunction with the drawings, the target positioning chip and the auxiliary positioning chip are collectively referred to as a positioning chip. Specifically, as shown in fig. 4 to 5, the positioning chip 70 includes: the first housing 701, and the rf chip 702 and the antenna 703 disposed in the first housing 701, wherein the rf chip 702 is connected to the antenna 703. The antenna 703 may be specifically a metal antenna plate.
The first shell can be a PVC medical shell, and the radio frequency chip is mainly used for transmitting signals and transmitting the signals to the detection device through the antenna. As an alternative example, the diameter of the cross section of the rf chip may be 2 to 4 mm, and the height of the rf chip may be 9 to 10 mm. It will be understood, of course, that the frequencies of the signals transmitted by the target positioning chip and the auxiliary positioning chips are different from each other.
In some embodiments of the present application, the detection devices are configured to detect the signals and signal strengths sent by the target positioning chip and the auxiliary positioning chips, so that the processor determines the positions of the target positioning chip and the auxiliary positioning chips according to the corresponding signal strengths.
Specifically, as shown in fig. 6 to fig. 7, the detecting device 50 includes a base 501, a spherical housing 502 disposed on the base 501, a signal transceiver 503 disposed in the spherical housing 502, and a transceiver antenna 504 disposed on the base 501, wherein the transceiver antenna 504 is connected to the signal transceiver 503 and the processor, respectively.
The signal transceiver is mainly used for receiving, through the transceiver antenna, the signals and signal strengths transmitted by the target positioning chip and each auxiliary positioning chip, and for forwarding the received signals and signal strengths to the processor through the transceiver antenna.
The processor is used for executing the following actions (namely, the steps one to five) to realize navigation:
step one, a coordinate system is constructed by taking any point in an operating room as an origin.
In some embodiments of the present application, as shown in fig. 1, the surgical bed 20 described above includes: a liftable column 201, a bed plate 202 for bearing a target object, and an L-shaped support frame 203. The bed board 202 is arranged at the top of the lifting end of the lifting bed column 201, the bottom end of the L-shaped support frame 203 is connected with the base of the lifting bed column 201, the top horizontal frame of the L-shaped support frame 203 is located above the bed board 202, and each vertex of the top horizontal frame is provided with a detection device 50. In practical applications, the lifting of the liftable column 201 can be controlled by a control end (e.g., a control switch) of the liftable column 201, so that the bed board 202 is at a proper height.
In some embodiments of the present application, the processor may construct the coordinate system with the position of any of the detection devices in the operating room as the origin when constructing the coordinate system.
Step two, the positions of the plurality of auxiliary positioning chips in the coordinate system are determined according to the first signals detected by the detection devices (the first signals being the signals transmitted by the plurality of auxiliary positioning chips).
In some embodiments of the present application, for convenience of description, the four detection devices on the operating table are denoted P_m-a, P_m-b, P_m-c and P_m-d, the three axes of the coordinate system constructed with the position of P_m-a as the origin O are denoted X, Y and Z, the coordinates of an auxiliary positioning chip in this coordinate system are assumed to be (x1, y1, z1), and the horizontal frame at the top end of the L-shaped support frame is a square frame. Based on this, for each of the plurality of auxiliary positioning chips, the position of the auxiliary positioning chip in the coordinate system may be determined as follows. First, as shown in FIG. 8, the distances d_a, d_b, d_c and d_d between the auxiliary positioning chip and P_m-a, P_m-b, P_m-c and P_m-d are calculated. Then, taking the remaining detection devices at the vertices (L, 0, 0), (L, L, 0) and (0, L, 0) of the square frame, the following four equations are solved jointly for the coordinate values (x1, y1, z1) of the auxiliary positioning chip:

x1^2 + y1^2 + z1^2 = d_a^2
(x1 − L)^2 + y1^2 + z1^2 = d_b^2
(x1 − L)^2 + (y1 − L)^2 + z1^2 = d_c^2
x1^2 + (y1 − L)^2 + z1^2 = d_d^2

where L is the distance between two adjacent detection devices. The distances d_b, d_c and d_d are calculated in the same way as d_a; to avoid repetition, only the calculation of the distance d_a between the auxiliary positioning chip and the detection device P_m-a is described as an example:

d_a = 10^((A − RSSI) / (10 · n))

where RSSI is the signal strength received by P_m-a (the signal being transmitted by the auxiliary positioning chip), A is the signal strength between the auxiliary positioning chip and P_m-a at a distance of 1 m, and n is the ambient attenuation factor.
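The ranging and trilateration above can be sketched as follows. This is purely an illustration, not part of the claimed system: the 2 m frame, the 1 m reference strength of −45 dBm and the attenuation factor n = 2 are assumed values. Because the four detectors sit coplanar on the top frame, the linearized system only determines (x, y); z is recovered from one sphere equation, taking the root below the frame, since the chip lies inside the patient under the frame.

```python
import numpy as np

def rssi_to_distance(rssi, a=-45.0, n=2.0):
    """Log-distance path-loss model: RSSI = A - 10*n*log10(d), so
    d = 10**((A - RSSI) / (10*n)). `a` is the strength at 1 m and
    `n` the ambient attenuation factor (both hypothetical values)."""
    return 10.0 ** ((a - rssi) / (10.0 * n))

def trilaterate(detectors, distances):
    """Solve the four sphere equations for the chip position."""
    dets = np.asarray(detectors, float)
    d = np.asarray(distances, float)
    p0 = dets[0]
    # Subtracting the first sphere equation from the others removes the
    # quadratic terms (and z, since all detectors share one height).
    A = 2.0 * (dets[1:, :2] - p0[:2])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(dets[1:, :2] ** 2, axis=1) - np.sum(p0[:2] ** 2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Coplanar detectors leave a mirror ambiguity in z; take the root
    # below the frame, where the chip actually is.
    h2 = d[0] ** 2 - np.sum((xy - p0[:2]) ** 2)
    z = p0[2] - np.sqrt(max(h2, 0.0))
    return np.array([xy[0], xy[1], z])

# Detectors at the vertices of a 2 m square frame, 1 m above the origin.
P = [(0, 0, 1), (2, 0, 1), (2, 2, 1), (0, 2, 1)]
chip = np.array([0.5, 1.2, 0.0])
dists = [np.linalg.norm(chip - np.array(p)) for p in P]
estimate = trilaterate(P, dists)
```

With exact distances the estimate reproduces the chip position; with noisy RSSI-derived distances the least-squares step averages the error over the four detectors.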
Step three: convert the pre-constructed three-dimensional reconstruction model of the target object into the coordinate system.
In the coordinate system, for each of the plurality of anatomical positions, the position corresponding to that anatomical position in the three-dimensional reconstruction model coincides with the position of the anatomical position in the coordinate system. As a result, the relative position of the three-dimensional reconstruction model and the operating table in the coordinate system matches the relative position of the target object and the operating table in reality, which ensures the accuracy of subsequent navigation.
In some embodiments of the present application, the three-dimensional reconstruction model of the target object may be constructed in advance from computed tomography (CT) data of the target object. When transferring the model into the coordinate system, the model is translated and rotated until each position corresponding to an anatomical position in the model coincides with the position of that anatomical position in the coordinate system.
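The translate-and-rotate alignment described above is a rigid point-set registration between the landmark positions in the model and the same landmarks located in the table coordinate system. A minimal sketch using the Kabsch algorithm follows; the landmark coordinates are hypothetical, and a real system would use the measured auxiliary-chip positions:

```python
import numpy as np

def rigid_align(model_pts, world_pts):
    """Kabsch rigid registration: find rotation R and translation t
    minimizing sum of ||(R @ p + t) - q||^2 over paired landmarks."""
    A = np.asarray(model_pts, float)
    B = np.asarray(world_pts, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Hypothetical landmark coordinates (cm): model frame vs. table frame.
model = np.array([[0, 0, 0], [0, -20, 0], [8, -15, 2], [-8, -15, 2.0]])
th = np.deg2rad(25)
R_true = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1.0]])
world = model @ R_true.T + np.array([30.0, 40.0, 5.0])
R, t = rigid_align(model, world)
aligned = model @ R.T + t
```

With at least three non-collinear landmarks the rotation and translation are recovered exactly in the noise-free case; with measurement noise the solution is the least-squares best fit.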
Step four: determine the position of the target positioning chip in the coordinate system according to the second signal detected by the detection devices (the second signal being the signal transmitted by the target positioning chip), and display the coordinate system, the three-dimensional reconstruction model and the position of the target positioning chip in the coordinate system through the display.
In some embodiments of the present application, the position of the target positioning chip in the coordinate system is determined in the same way as the position of an auxiliary positioning chip; to avoid repetition, the process is not described again here.
In some embodiments of the present application, after determining the position of the target positioning chip in the coordinate system, the three-dimensional reconstruction model, and the position of the target positioning chip may be displayed on a display. It should be noted that, because the three-dimensional reconstruction model and the position of the target positioning chip are in the same coordinate system, the display can display the position of the target positioning chip in the three-dimensional reconstruction model in real time, where the position is the position of the head end of the endoscope device in the target object, so that the operator can clearly see the specific position of the endoscope device in the target object through the display.
As a preferred example, after determining the position of the target positioning chip in the coordinate system, the processor may highlight that position (for example, by rendering it in a highlight color) on the display, so that the operator can see the specific position of the endoscopic device in the target object more easily.
Step five: when the head end of the endoscopic device has entered the body of the target object and reached a bifurcation of the target portion, determine and display, through the display, the traveling direction of the endoscopic device at the bifurcation, according to a real image obtained by imaging the bifurcation with the endoscopic device and a virtual image obtained by imaging the corresponding position in the three-dimensional reconstruction model with a virtual camera.
The target portion is the part of the target object to be inspected through the endoscope (such as a bronchus); it has a dendritic bifurcation structure, and the object to be inspected (such as a lesion) is located in a small branch of that structure.
It should be noted that the processor can determine the position of the object to be inspected within the target portion from the CT data. According to that position, a navigation path from the endoscopic device to the object to be inspected can be planned in the three-dimensional reconstruction model using a common path-planning method, and the navigation path is then marked in the three-dimensional reconstruction model.
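As one illustration of such a "common path-planning method", a breadth-first search over the dendritic bifurcation structure yields the sequence of branches leading to the lesion. The sketch below is not the patent's implementation, and the airway names are hypothetical:

```python
from collections import deque

def plan_path(tree, start, target):
    """Breadth-first search on a bifurcation tree given as an adjacency
    dict; returns the branch sequence from `start` to `target`, or
    None if the target is unreachable."""
    prev = {start: None}
    q = deque([start])
    while q:
        node = q.popleft()
        if node == target:
            path = []
            while node is not None:       # walk predecessors back to start
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in tree.get(node, []):
            if nxt not in prev:
                prev[nxt] = node
                q.append(nxt)
    return None

# Hypothetical bronchial tree: trachea -> main bronchi -> sub-branches.
airway = {
    "trachea": ["left_main", "right_main"],
    "left_main": ["LB1", "LB2"],
    "right_main": ["RB1", "RB2"],
    "RB2": ["RB2a", "RB2b"],
}
route = plan_path(airway, "trachea", "RB2b")
```

In a tree there is a unique simple path to each branch, so the direction to display at each bifurcation is simply the next node on `route`.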
In some embodiments of the present application, when the head end of the endoscopic device enters the body of the target object and reaches a bifurcation of the target portion, the processor may control the endoscopic device to capture an image of the center of the bifurcation and obtain the resulting real image from the endoscopic device. Meanwhile, the processor calls the virtual camera at the current position of the target positioning chip in the three-dimensional reconstruction model (i.e., at the bifurcation of the target portion) and images the bifurcation with the virtual camera's field of view aligned to the horizontal direction of the endoscopic device's field of view, obtaining a virtual image.
After obtaining the virtual image and the real image, the processor can register the two images to obtain a registered image, take the traveling direction of the navigation path in the registered image as the traveling direction of the endoscopic device at the bifurcation, and display the registered image through the display to realize navigation.
It should be noted that, because the three-dimensional reconstruction model is labeled with the navigation path, the virtual image acquired by the virtual camera carries the traveling direction of the navigation path at the bifurcation.
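When the main discrepancy between the two frames is the roll angle, one simple way to realize such a registration is a brute-force search over rotations of the virtual image for the best normalized cross-correlation against the real image. This is a sketch under that assumption, not the patent's method; a practical system would use a full intensity- or feature-based registration:

```python
import numpy as np

def rotate_image(img, angle_deg):
    """Nearest-neighbour rotation about the image centre (pure numpy)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    a = np.deg2rad(angle_deg)
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse-map each output pixel back into the source image.
    sx = np.cos(a) * (xs - cx) + np.sin(a) * (ys - cy) + cx
    sy = -np.sin(a) * (xs - cx) + np.cos(a) * (ys - cy) + cy
    sx, sy = np.rint(sx).astype(int), np.rint(sy).astype(int)
    ok = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out[ok] = img[sy[ok], sx[ok]]
    return out

def best_roll(real, virtual, step=2):
    """Brute-force the roll angle (degrees) that best registers the
    virtual frame to the real frame by normalized cross-correlation."""
    def ncc(a, b):
        a, b = a - a.mean(), b - b.mean()
        return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    angles = np.arange(0, 360, step)
    scores = [ncc(real, rotate_image(virtual, ang)) for ang in angles]
    return angles[int(np.argmax(scores))]

# Toy frames: an off-centre bright patch, "observed" at a 40-degree roll.
base = np.zeros((64, 64))
base[10:20, 28:36] = 1.0
real = rotate_image(base, 40.0)
```

Once the best roll angle is found, the virtual image (and the navigation-path direction it carries) is rotated by that angle before being overlaid on the real image.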
Illustratively, as shown in fig. 9 and 10, assuming that the target portion 80 is a bronchus and the target 81 to be inspected is on a small branch of the bronchus, when the endoscope apparatus 10 enters the target object and reaches the bifurcation of the bronchus, the real image acquired by the endoscope apparatus is shown as part C in fig. 9, and at the same time, the virtual image acquired by the virtual camera is shown as part D in fig. 10, and as can be seen from parts C and D, the three-dimensional reconstruction model is marked with a navigation path 82, and the virtual image carries the traveling direction of the navigation path at the bifurcation.
Since the angles of the endoscopic device and the virtual camera during image acquisition may well differ (even by 180 degrees), the images they acquire may be mismatched, in which case the navigation path cannot navigate accurately. To avoid this, in some embodiments of the present application, when the processor calls the virtual camera to acquire the virtual image, it controls the virtual camera according to the horizontal direction of the endoscopic device's field of view, so that the deflection angle of the virtual camera always matches the actual deflection angle of the endoscopic device. This keeps the rotation of the virtual camera's frame consistent with reality and improves navigation accuracy.
In some embodiments of the present application, to ensure that the processor knows the angle of the endoscopic device accurately, a gyroscope may be disposed in the head end of the endoscopic device and connected to the processor. The gyroscope measures the deflection angle between the endoscopic device and the direction of gravity and transmits that value to the processor, which uses it to control the deflection angle of the virtual camera. While the endoscopic device advances in a straight line (i.e., when it is not adjusting its position at a bifurcation), the field of view can be kept level according to the gyroscope reading, without left-right spiral changes of the view.
In some embodiments of the present application, as shown in fig. 11, a separate compartment in the center of the interior of the head end of the endoscopic device can be opened by a cover at the front end of the lens, and the gyroscope 101 and the target positioning chip 102 can be arranged in a longitudinal row at the center of the lens.
It should be noted that, in some embodiments of the present application, to guard against gyroscope error, the acquired real image and virtual image are compared by image registration, confirming that the two have been rotated to the same angle and thereby ensuring the accuracy of navigation. This registration is especially useful when three, four or even more diverging channels lie ahead: the sensitivity to image angle is then high, and a slight rotation may lead into the wrong channel.
In some embodiments of the present application, to let the operator grasp the position of the endoscopic device in the target object more intuitively, a laser emitting a low-power visible beam may also be disposed on the operating bed. The laser beam is controlled to point vertically at the target positioning chip inside the target object, so that it forms a light spot on the body surface directly above the head end of the endoscope; by observing the position of this spot on the body surface, the operator knows where the head end of the endoscopic device currently is.
To mount the laser, as shown in FIG. 1 and FIGS. 12 to 15, the mutually parallel first side 204 and second side 205 of the top horizontal frame of the L-shaped support frame 203 are both guide rails, and the operating table 20 further includes: a first moving pulley 206 disposed on the first side 204; a second moving pulley disposed on the second side 205; a slidable rail 207, one end of which is connected to the longitudinal bar 2061 of the first moving pulley 206 and the other end to the longitudinal bar of the second moving pulley; a third moving pulley 209 disposed on the slidable rail 207; and a laser 208 disposed on the longitudinal bar of the third moving pulley 209. It should be noted that the first moving pulley 206, the second moving pulley and the third moving pulley 209 have the same structure, each comprising a driving motor, a cross bar, a longitudinal bar, a transmission gear and pulleys. Only the structure of the third moving pulley 209 is shown in the drawing: the longitudinal bar is cross-connected to the cross bar, one side of the transmission gear 2092 is connected to the driving motor 2091 and the other side to the cross bar, and pulleys are provided at both ends of the cross bar.
And the driving motor of the first moving pulley, the driving motor of the second moving pulley and the driving motor of the third moving pulley are connected with the processor. Correspondingly, the processor is further configured to: and controlling the motion amount of a driving motor of the first motion pulley, the motion amount of a driving motor of the second motion pulley and the motion amount of a driving motor of the third motion pulley according to the position of the target positioning chip in the coordinate system, so that the laser beam of the laser is vertically directed to the target positioning chip in the target object body.
It should be noted that, the driving motor of the first moving pulley, the driving motor of the second moving pulley, and the driving motor of the third moving pulley may be servo motors, and the operating states of the three are controlled by the processor, the slidable guide rail may move along the length direction of the first side edge under the driving of the driving motors of the first moving pulley and the second moving pulley and the transmission gear, and the laser may move along the length direction of the slidable guide rail under the driving of the third moving pulley.
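At its simplest, the control of the two rail axes reduces to converting the chip's (x, y) coordinate in the table frame into step commands for the servo drives, with targets outside the frame clamped to the rail ends. The 2 m rail length and 2000 steps-per-metre calibration below are hypothetical, for illustration only:

```python
def move_laser(current, target, rail_len=2.0, steps_per_m=2000):
    """Step deltas for the X drive (slidable rail along the first side)
    and Y drive (laser carriage along the rail) that bring the laser
    spot vertically over the target chip. `current` and `target` are
    (x, y) positions in metres; out-of-range targets are clamped."""
    deltas = []
    for cur, tgt in zip(current, target):
        tgt = min(max(tgt, 0.0), rail_len)   # stay on the rails
        deltas.append(round((tgt - cur) * steps_per_m))
    return tuple(deltas)
```

Re-running this on every new chip position closes the loop: as the endoscope head advances, the spot tracks it across the body surface.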
Illustratively, the operating table may be a standard operating table, 200mm long and 90mm wide, and the processor may be a Central Processing Unit (CPU).
It should be noted that, in the endoscope navigation system provided in the embodiments of the present application, the processor constructs a coordinate system for the operating table, converts the pre-constructed three-dimensional reconstruction model into that coordinate system, and displays on the display the model together with the position of the endoscope head end within it. When the endoscopic device reaches a bifurcation, the processor additionally determines the traveling direction of the device at the bifurcation and displays it on the display, thereby realizing navigation of the endoscopic device at the bifurcation.
That is, the endoscope navigation system can at least improve intraoperative visualization, operating precision and path navigation compared with existing endoscopy, reduce the lesion damage caused by repeated trial-and-error path attempts when path selection under the endoscope is difficult, and lower the difficulty of endoscopic operation.
In addition, it should be noted that the endoscope navigation system is suitable for endoscope navigation of various dendritic bifurcation structures, such as bronchoscopes, cholangioscopes, nephroscopes, and the like.
While the foregoing is directed to the preferred embodiment of the present application, it will be appreciated by those skilled in the art that various changes and modifications may be made therein without departing from the principles of the application, and it is intended that such changes and modifications be covered by the scope of the application.
Claims (10)
1. An endoscopic navigation system, comprising:
the endoscope equipment is internally provided with a target positioning chip at the head end;
the operation table is used for bearing a target object, body surface labels are arranged at a plurality of anatomical positions of the target object, an auxiliary positioning chip is arranged in the body surface labels, and a plurality of detection devices are arranged on the operation table;
a display;
a processor, the processor being connected to the endoscopic device, the detection device and the display, respectively, the processor being configured to:
constructing a coordinate system by taking any point in the operating room as an origin;
determining the positions of a plurality of auxiliary positioning chips in the coordinate system according to the first signals detected by the detection device; the first signal is a signal transmitted by a plurality of auxiliary positioning chips;
converting a pre-constructed three-dimensional reconstruction model of the target object into the coordinate system; the position corresponding to the anatomical position in the three-dimensional reconstruction model is superposed with the position of the anatomical position in the coordinate system;
according to the second signal detected by the detection device, determining the position of the target positioning chip in the coordinate system, and displaying the coordinate system, the three-dimensional reconstruction model and the position of the target positioning chip in the coordinate system through the display; the second signal is a signal transmitted by the target positioning chip;
and determining and displaying, through the display, a traveling direction of the endoscopic device at the bifurcation according to a real image obtained by image acquisition of the bifurcation when the head end of the endoscopic device enters the target object and reaches a bifurcation of a target portion, and a virtual image obtained by image acquisition, through a virtual camera, of a position corresponding to the bifurcation in the three-dimensional reconstruction model.
2. An endoscopic navigation system according to claim 1, wherein a gyroscope is further disposed within said head end of said endoscopic device, said gyroscope being connected to said processor.
3. An endoscopic navigation system according to claim 1, wherein said target positioning chip and said auxiliary positioning chip each comprise: the radio frequency chip is connected with the antenna.
4. An endoscopic navigation system according to claim 3, wherein said radiofrequency chip has a cross-sectional diameter of 2 to 4 mm and a height of 9 to 10 mm.
5. An endoscopic navigation system according to claim 1, wherein said operating bed comprises:
a liftable bed column;
the bed plate is used for bearing the target object and is arranged at the top of the lifting end of the liftable bed column;
the bottom end of the L-shaped support frame is connected with the base of the liftable bedpost, the horizontal frame at the top end of the L-shaped support frame is positioned above the bed plate, each vertex of the horizontal frame at the top end is provided with one detection device, and the first side edge and the second side edge, which are parallel to each other, of the horizontal frame at the top end are guide rails;
a first moving pulley disposed on the first side; a second moving pulley disposed on the second side;
one end of the slidable guide rail is connected with the longitudinal rod of the first moving pulley, and the other end of the slidable guide rail is connected with the longitudinal rod of the second moving pulley;
a third moving pulley arranged on the sliding guide rail, wherein a laser is arranged on a longitudinal rod of the third moving pulley;
the driving motor of the first moving pulley, the driving motor of the second moving pulley and the driving motor of the third moving pulley are all connected with the processor, and the processor is further used for: and controlling the motion amount of a driving motor of the first motion pulley, the motion amount of a driving motor of the second motion pulley and the motion amount of a driving motor of the third motion pulley according to the position of the target positioning chip in the coordinate system, so that the laser beam of the laser is vertically directed to the target positioning chip in the target object.
6. An endoscopic navigation system according to claim 1, wherein said body surface label comprises a second housing and a sticker disposed on a bottom surface of said second housing, said auxiliary positioning chip is disposed in said second housing via a chip anchor.
7. An endoscopic navigation system according to claim 1, wherein said detecting means comprises a base, a spherical shell disposed on said base, a signal transceiver disposed in said spherical shell, and a transceiver antenna disposed on said base, said transceiver antenna being connected to said signal transceiver and said processor, respectively.
8. An endoscopic navigation system according to claim 1, wherein said processor is further configured to plan a navigation path of said endoscopic device to said target to be inspected in said three-dimensional reconstructed model based on a position of said target to be inspected at said target site.
9. An endoscopic navigation system according to claim 8, wherein said processor, when determining and displaying via said display a direction of travel of said endoscopic device at said bifurcation from said real image and said virtual image, is specifically configured to:
carrying out image registration on the virtual image and the real image to obtain a registration image;
and taking the traveling direction of the navigation path in the registered image as the traveling direction of the endoscope equipment at the bifurcation, and displaying the registered image through the display.
10. An endoscopic navigation system according to claim 1, wherein said processor is specifically configured to highlight the position of said target positioning chip in said coordinate system when that position is displayed by said display.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211381785.2A CN115414120A (en) | 2022-11-07 | 2022-11-07 | Endoscope navigation system |
PCT/CN2023/104730 WO2024098804A1 (en) | 2022-11-07 | 2023-06-30 | Endoscope navigation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211381785.2A CN115414120A (en) | 2022-11-07 | 2022-11-07 | Endoscope navigation system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115414120A true CN115414120A (en) | 2022-12-02 |
Family
ID=84208252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211381785.2A Pending CN115414120A (en) | 2022-11-07 | 2022-11-07 | Endoscope navigation system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115414120A (en) |
WO (1) | WO2024098804A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024098804A1 (en) * | 2022-11-07 | 2024-05-16 | 中南大学 | Endoscope navigation system |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102470014A (en) * | 2009-06-29 | 2012-05-23 | 皇家飞利浦电子股份有限公司 | Method and apparatus for tracking in a medical procedure |
CN102885650A (en) * | 2012-10-12 | 2013-01-23 | 杭州三坛医疗科技有限公司 | Surgical location and navigation device attached to C-arm X ray machine |
CN204364123U (en) * | 2014-11-07 | 2015-06-03 | 刘弘毅 | Medical treatment navigation system |
CN104755009A (en) * | 2013-04-15 | 2015-07-01 | 奥林巴斯医疗株式会社 | Endoscope system |
CN105288865A (en) * | 2015-11-10 | 2016-02-03 | 康健 | Skin laser treatment auxiliary robot and auxiliary method thereof |
CN105939647A (en) * | 2013-10-24 | 2016-09-14 | 奥瑞斯外科手术机器人公司 | Robotically-assisted endoluminal surgical systems and related methods |
CN109922752A (en) * | 2016-10-28 | 2019-06-21 | 柯惠有限合伙公司 | Electromagnetic navigation antenna module and electromagnetic navigation system including the component |
CN110368089A (en) * | 2019-08-07 | 2019-10-25 | 湖南省华芯医疗器械有限公司 | A kind of bronchial endoscope three-dimensional navigation method |
CN110831481A (en) * | 2018-05-31 | 2020-02-21 | 奥瑞斯健康公司 | Path-based navigation of tubular networks |
CN112741692A (en) * | 2020-12-18 | 2021-05-04 | 上海卓昕医疗科技有限公司 | Rapid navigation method and system for realizing device navigation to target tissue position |
CN113616333A (en) * | 2021-09-13 | 2021-11-09 | 上海微创医疗机器人(集团)股份有限公司 | Catheter movement assistance method, catheter movement assistance system, and readable storage medium |
CN113749767A (en) * | 2020-06-03 | 2021-12-07 | 柯惠有限合伙公司 | Surgical tool navigation using sensor fusion |
CN114577100A (en) * | 2022-02-21 | 2022-06-03 | 成都思瑞定生命科技有限公司 | Magnetic field target positioning calculation method |
CN115120346A (en) * | 2022-08-30 | 2022-09-30 | 中国科学院自动化研究所 | Target point positioning method and device, electronic equipment and bronchoscope system |
CN115252992A (en) * | 2022-07-28 | 2022-11-01 | 北京大学第三医院(北京大学第三临床医学院) | Trachea cannula navigation system based on structured light stereoscopic vision |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101797182A (en) * | 2010-05-20 | 2010-08-11 | 北京理工大学 | Nasal endoscope minimally invasive operation navigating system based on augmented reality technique |
CN110141360A (en) * | 2018-02-11 | 2019-08-20 | 四川英捷达医疗科技有限公司 | Digital technology air navigation aid |
WO2021253943A1 (en) * | 2020-06-15 | 2021-12-23 | 湖南卓世创思科技有限公司 | Laser locating frame system |
CN113855239B (en) * | 2021-09-24 | 2023-10-20 | 深圳高性能医疗器械国家研究院有限公司 | Guide wire navigation system and method in vascular intervention operation |
CN113940755B (en) * | 2021-09-30 | 2023-05-02 | 南开大学 | Surgical planning and navigation method integrating surgical operation and image |
CN114652443A (en) * | 2022-03-08 | 2022-06-24 | 深圳高性能医疗器械国家研究院有限公司 | Ultrasonic operation navigation system and method, storage medium and device |
CN115414116B (en) * | 2022-11-07 | 2023-03-24 | 中南大学 | Simulation positioning selection system for optimal site of laparoscope stab card |
CN115414120A (en) * | 2022-11-07 | 2022-12-02 | 中南大学 | Endoscope navigation system |
- 2022-11-07: CN application CN202211381785.2A (published as CN115414120A), status Pending
- 2023-06-30: WO application PCT/CN2023/104730 (published as WO2024098804A1)
Also Published As
Publication number | Publication date |
---|---|
WO2024098804A1 (en) | 2024-05-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110891469B (en) | System and method for registration of positioning sensors | |
US11529197B2 (en) | Device and method for tracking the position of an endoscope within a patient's body | |
CN106890025B (en) | Minimally invasive surgery navigation system and navigation method | |
EP1103229B1 (en) | System and method for use with imaging devices to facilitate planning of interventional procedures | |
US7585273B2 (en) | Wireless determination of endoscope orientation | |
US8414476B2 (en) | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems | |
JP4265698B2 (en) | X-ray guided surgical positioning system using extended mapping space | |
JP3589505B2 (en) | 3D image processing and display device | |
US7671887B2 (en) | System and method of navigating a medical instrument | |
CN101474075B (en) | Navigation system of minimal invasive surgery | |
US20070276234A1 (en) | Systems and Methods for Intraoperative Targeting | |
US20050054895A1 (en) | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems | |
WO2014076931A1 (en) | Image-processing apparatus, image-processing method, and program | |
US20210393338A1 (en) | Medical instrument driving | |
CN101219061A (en) | Coloring electroanatomical maps to indicate ultrasound data acquisiton | |
JP2009531113A (en) | Image guided surgery system | |
JP2001061861A (en) | System having image photographing means and medical work station | |
JP2008018172A (en) | Surgery supporting system | |
US20210393344A1 (en) | Control scheme calibration for medical instruments | |
CN107019513A (en) | Intravascular virtual endoscope imaging system and its method of work based on electromagnetic location composite conduit | |
CN115500868B (en) | B-ultrasonic positioning system capable of interactively confirming position information with detected target | |
CN115414120A (en) | Endoscope navigation system | |
CN102008283B (en) | Electronic bronchoscope system with color Doppler ultrasonic scanning function | |
CN115414116B (en) | Simulation positioning selection system for optimal site of laparoscope stab card | |
Duan | Magnetic tracking and positioning in endoscopy |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20221202 |