WO2023085253A1 - Image processing device, image processing method, program, and image processing system - Google Patents
Image processing device, image processing method, program, and image processing system
- Publication number: WO2023085253A1 (PCT/JP2022/041503)
- Authority: WO — WIPO (PCT)
- Prior art keywords: image, interest, image processing, region, processing apparatus
Classifications
- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B6/00—Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
        - A61B6/12—Arrangements for detecting or locating foreign bodies
        - A61B6/46—Arrangements for interfacing with the operator or the patient
          - A61B6/467—Arrangements for interfacing with the operator or the patient characterised by special input means
            - A61B6/469—Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
        - A61B6/50—Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
          - A61B6/504—Apparatus or devices for radiation diagnosis specially adapted for specific clinical applications for diagnosis of blood vessels, e.g. by angiography
Definitions
- the present invention relates to an image processing device, an image processing method, a program, and an image processing system, and particularly to an image processing technique for use in blood vessel examination or treatment.
- When examining or treating a cerebral blood vessel, a medical professional advances a catheter from the subject's groin or brachial artery through the aortic arch and carotid artery to the cerebral artery. Because considerable experience and training are required for medical personnel to perform these procedures properly, a technique for supporting such procedures is disclosed in, for example, US Pat.
- The present invention has been made in view of these points, and aims to provide a technique that allows a medical worker to concentrate on work in a region of interest and that supports judgment of the region of interest during catheter examination or treatment of a blood vessel.
- a first aspect of the present invention is an image processing device.
- This apparatus includes an image acquisition unit that acquires an image including at least an intravascular inspection or treatment device as a subject, a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least part of the device included in the image, and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for that region of interest.
- This image processing apparatus may further include a tracking unit that tracks each of the regions of interest in the image.
- When at least one of the regions of interest satisfies a condition defined for that region of interest, the notification unit notifies the user of the image processing device of that fact, and an output unit may output the corresponding information to a device connected to the image processing device.
- Devices connected to the image processing apparatus can be connected wirelessly or by wire. They can be, for example, surgical robots, VR/AR/MR devices, smart glasses, metaverse devices, and the like.
- the surgical robot may sound a notification, issue a warning, automatically stop the operation, or automatically pull the device.
- With AR or smart glasses, it is possible to superimpose the region of interest that satisfies the conditions on the image being viewed and notify the user. This is just one example, and the operation of the device that receives the output is not limited to this.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- When a region including the tip of a catheter such as a guiding catheter, or the tip of a guide wire, is set as the region of interest, the notification unit or the output unit may notify the user, or output the information to the device, on the condition that the region of interest disappears from the image.
- The notification unit or the output unit may notify the user, or output the information to the device, on the condition that the distance between the region of interest and the edge of the image is less than a predetermined threshold.
- The notification unit or the output unit may notify the user, or output the information to the device, on the condition that at least one of the moving distance, moving speed, and acceleration of the region of interest in the image exceeds a predetermined threshold.
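The motion conditions above can be sketched as follows. This is a minimal illustration, assuming the tracked region of interest is summarized by a per-frame centroid and a fixed frame interval; all function and variable names are hypothetical, not taken from the patent.

```python
import math

def motion_metrics(centroids, dt):
    """Per-frame motion of an ROI centroid.

    centroids: list of (x, y) positions, one per frame.
    dt: time between frames in seconds.
    Returns (total_distance, latest_speed, latest_acceleration).
    """
    dists = [math.dist(a, b) for a, b in zip(centroids, centroids[1:])]
    speeds = [d / dt for d in dists]
    accels = [(s2 - s1) / dt for s1, s2 in zip(speeds, speeds[1:])]
    return (sum(dists),
            speeds[-1] if speeds else 0.0,
            accels[-1] if accels else 0.0)

def should_notify(centroids, dt, d_max, v_max, a_max):
    """True when any of the three motion quantities exceeds its threshold."""
    d, v, a = motion_metrics(centroids, dt)
    return d > d_max or v > v_max or a > a_max
```

In practice the centroids would come from the tracking unit, and the thresholds would be chosen per region of interest as the text describes.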
- the notification unit may cause a display device that displays the image to display the distance between the region of interest and the edge of the image.
- The notification unit may change the display mode of the distance on the display device according to the magnitude of the distance between the region of interest and the edge of the image.
- The notification unit or the output unit may notify the user, or output the information to the device, on the condition that the value obtained by dividing the distance between the region of interest and the edge of the image by the moving speed of the region of interest in the image is less than a predetermined threshold.
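The distance-over-speed condition is effectively a time-to-edge estimate. A minimal sketch, assuming the ROI position and image size are in pixels and speed in pixels per second; names are illustrative only:

```python
def distance_to_edge(roi_center, image_size):
    """Shortest distance from an ROI centre to any image edge (pixels)."""
    x, y = roi_center
    w, h = image_size
    return min(x, y, w - x, h - y)

def time_to_edge_alert(roi_center, image_size, speed, threshold_s):
    """Alert when (distance to edge) / (speed) drops below threshold_s
    seconds, i.e. the ROI is about to leave the displayed image."""
    if speed <= 0:
        return False  # a static ROI never reaches the edge
    return distance_to_edge(roi_center, image_size) / speed < threshold_s
```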
- The image processing device may include a marker detection unit that detects a marker provided on the delivery wire of an embolization coil as it approaches a region of interest set on part of a guiding catheter that guides the delivery wire. The tracking unit may further track the detected marker, and when the marker and the region of interest overlap, the notification unit or the output unit may notify the user of the timing at which the embolization coil may be disconnected from the delivery wire, or output the information to the device.
- the notification unit may cause the display device to display the distance that the marker should move until the embolization coil is cut off from the delivery wire.
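The overlap test and the remaining-distance display described above can be sketched as follows, assuming the marker and the region of interest are tracked as axis-aligned bounding boxes with known centres; all names are hypothetical:

```python
import math

def boxes_overlap(a, b):
    """Axis-aligned bounding-box overlap test; boxes are (x1, y1, x2, y2)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def remaining_distance(marker_center, roi_center):
    """Straight-line distance the marker still has to travel to reach the
    region of interest; this is the value a display could show before the
    coil may be detached."""
    return math.dist(marker_center, roi_center)
```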
- the notification unit may notify the user of the image processing apparatus when the feature amount indicating the shape of the device included in the region of interest satisfies a predetermined condition.
- The feature amount may be a curvature.
- The notification unit or the output unit may notify the user, or output the information to the device, on the condition that the curvature of the device included in the region of interest exceeds a predetermined threshold curvature, or that the curvature is changing while the tip does not move.
- The feature quantity may be the length of the guidewire, or its segmentation area, within a region including the tip of the guidewire; a notification is issued when this exceeds a certain threshold (Fig. 43). A condition that the movement of the tip stays within a certain range may be added. The extent of the region may be changed according to the magnification of the screen, the content of the surgery, and the location of the blood vessel.
- The threshold value may be determined in advance, adjusted according to the operator's preference, or derived from values accumulated over time during the operation; for example, a notification may be given when the distance exceeds two standard deviations above the accumulated values. However, this is only an example, and the method of setting the threshold is not limited to this.
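The accumulate-during-the-operation variant can be sketched as a running statistic. A minimal illustration, assuming a simple mean-plus-two-sigma rule and a warm-up period before alerts are allowed; the class name and parameters are invented for this sketch:

```python
import math

class AdaptiveThreshold:
    """Accumulates a metric (e.g. tip travel per frame) during the procedure
    and flags values above mean + n_sigma * standard deviation."""

    def __init__(self, n_sigma=2.0, warmup=10):
        self.values = []
        self.n_sigma = n_sigma
        self.warmup = warmup  # minimum samples before alerting

    def update(self, value):
        """Record one observation; return True if it should trigger an alert."""
        alert = False
        if len(self.values) >= self.warmup:
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            alert = value > mean + self.n_sigma * math.sqrt(var)
        self.values.append(value)
        return alert
```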
- The notification unit or the output unit may notify the user, or output the information to the device, on the condition that the value obtained by subtracting the length of the center line of the blood vessel included in the image or region of interest from the length of the device included in the image or region of interest exceeds a predetermined threshold length.
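This excess-length condition compares two polyline lengths. A minimal sketch, assuming both the device and the vessel centre line are available as 2-D point lists from segmentation; names are illustrative:

```python
import math

def polyline_length(points):
    """Total length of a 2-D polyline given as a list of (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def excess_length_alert(device_points, centerline_points, threshold):
    """Flag when the device is longer than the vessel centre line by more
    than `threshold`, which can indicate the device is buckling or coiling
    inside the vessel rather than advancing along it."""
    excess = polyline_length(device_points) - polyline_length(centerline_points)
    return excess > threshold
```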
- The notification unit has functions such as displaying the region of interest in a different color from the rest of the image, changing the font, size, or color of displayed characters, changing the color of the entire screen or part of the display device, displaying graphics on the entire screen, outside the frame, or on part of the screen, magnifying the region of interest, and notifying the user by changing the color or size of the mark attached to the region of interest.
- the notification unit may use sound or vibration for notification.
- a second aspect of the present invention is an image processing method.
- In this method, a processor of an image processing apparatus executes the steps of: acquiring an image including at least an intravascular examination or treatment device as a subject; acquiring, as regions of interest, one or more regions each including at least part of the device; tracking each of the regions of interest in the image; and notifying a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for that region of interest.
- the methods of the present disclosure may be computer-implemented methods.
- a third aspect of the present invention is a program.
- This program causes a computer to implement a function of acquiring an image including at least an intravascular examination or treatment device as a subject, and a function of acquiring, as regions of interest, one or more regions each including at least part of the device included in the image.
- a program may be a program containing instructions that cause a computer to perform certain steps when the program is executed by a computer.
- a computer-readable recording medium recording this program may be provided, or this program may be transmitted via a communication line.
- The third aspect of the invention also encompasses a computer-implemented method, which may be performed by a computer including a memory and a processor, the memory storing instructions for performing the method. Instructions or programs for executing this method on a computer may be recorded on a non-transitory computer-readable medium.
- a fourth aspect of the present invention is an image processing system.
- This system includes the image processing device described above and an imaging device that captures an image (an image of the surgical field) of a person into whose blood vessel a device for examination or treatment has been inserted, and transmits the image to the image processing device.
- A further aspect of the present invention is an image processing apparatus comprising a storage device such as a memory and a processor connected to the storage device, wherein the processor performs processing such as acquiring an image including at least an intravascular examination or treatment device as a subject.
- According to the present invention, it is possible to provide a technique that allows medical staff to concentrate on work in the region of interest while preventing oversights and delays in judgment in areas other than the region of interest during catheter examination or treatment of blood vessels.
- FIG. 1 is a diagram schematically showing the appearance of an image processing system according to an embodiment.
- FIG. 2 is a diagram schematically showing the functional configuration of an image processing apparatus according to an embodiment.
- A diagram for explaining a region of interest.
- A diagram for explaining an example of conditions set for a region of interest.
- A diagram showing an example of a message notified by the notification unit.
- A diagram for explaining another example of conditions set for the region of interest.
- A diagram for explaining another example of conditions set for the region of interest.
- A diagram for explaining the timing of disconnection of the embolization coil.
- A diagram for explaining conditions related to the shape of the device set in the region of interest.
- A flowchart for explaining the flow of image analysis processing executed by the image processing apparatus according to the embodiment.
- A schematic diagram of a typical neural network.
- A diagram schematically showing the functional configuration of an image processing apparatus having a replay function according to an embodiment.
- A diagram for explaining an example of replay display when a notification occurs.
- A diagram for explaining an example of displaying a replay playback window on a real-time display screen.
- A diagram schematically showing the functional configuration of an image processing device provided with a state estimating unit according to an embodiment.
- A diagram for explaining an example of display of an estimation result of the state estimation unit according to the embodiment.
- A diagram for explaining an example of estimating a catheter tip position using a 2nd marker according to an embodiment.
- A diagram describing an embodiment in which image analysis results are displayed on two screens of different sizes.
- A diagram for explaining an embodiment in which the screen to be notified is indicated by highlighting its frame.
- A diagram for explaining display of a product list of intravascular examination or treatment devices (for example, various catheters and coils) according to the embodiment.
- A diagram for explaining detection of a device that has passed through a designated boundary line according to the embodiment.
- A diagram for explaining detection of a device that has passed through a designated boundary line according to the embodiment.
- A diagram for explaining display of device positions on a sub-screen according to the embodiment.
- A diagram for explaining an example of recognition of a surgical situation according to the embodiment.
- A diagram showing an example configuration of a computer according to the present disclosure.
- A diagram showing a specific example of blood vessel extraction; the figure on the left shows the recognized result.
- A diagram schematically showing the functional configuration of an image processing apparatus according to an embodiment.
- A diagram for explaining notification by sound (words) according to the embodiment.
- A diagram for explaining automatic setting of a specific boundary line according to the embodiment.
- A diagram for explaining notification when a liquid embolic substance (Onyx, NBCA, etc.) crosses a specified boundary.
- A diagram for explaining variations of methods for setting a specific area according to the embodiment. A specific area (boundary) can be drawn in various ways (four examples, A to D, are shown). B: specification by an ellipse. C: designation by a horizontal straight line. D: specification by a vertical straight line.
- A diagram showing a specific example of blood vessel recognition; the figure on the right shows the recognized result.
- A diagram for explaining notification when an extravascular area (outside a recognized blood vessel) is designated as a specific area.
- A diagram for explaining notification when a filter moves.
- A diagram illustrating automatic detection of the edge of the X-ray portion and setting of the boundary accordingly.
- A diagram for explaining an embodiment in which an area moves or disappears in response to screen enlargement/reduction, movement, and scene changes; the area may be set automatically, or suggested so that the user confirms with OK or NO.
- A diagram illustrating notification when the tip of the guide wire is sharply bent; for example, notification can be made when the length of the guide wire within a boundary including the tip reaches a certain distance or longer.
- FIG. 1 is a diagram schematically showing the appearance of an image processing system S according to an embodiment.
- The image processing system S includes an image processing device 1, a display device 2, and an X-ray imaging device 3. An outline of the embodiment will be described below with reference to FIG. 1.
- The X-ray imaging device 3 captures an X-ray image of a human subject P into whom a blood vessel inspection or treatment device (hereinafter sometimes simply referred to as a "device") has been inserted, and transmits the X-ray image to the image processing device 1. The X-ray imaging device 3 therefore includes an X-ray irradiator 30 (a first X-ray irradiator 30a and a second X-ray irradiator 30b) for irradiating the subject P with X-rays, an X-ray detector 31 for detecting the X-rays emitted by the irradiator 30, and a bed 32 for supporting the subject P.
- the first X-ray irradiator 30a and the second X-ray irradiator 30b can irradiate the subject's P head with X-rays at different incident angles.
- the X-rays emitted by the first X-ray irradiator 30a are detected by the first X-ray detector 31a and converted into an X-ray image based on the X-ray absorption rate.
- the X-rays emitted by the second X-ray irradiator 30b are detected by a second X-ray detector (not shown) and converted into an X-ray image.
- These X-ray images are displayed on the display device 2 .
- the positions of the first and second X-ray irradiators are fixed during treatment of the cerebrovascular area, and the image area displayed on the display device 2 is fixed.
- The image generated from the X-rays detected by the X-ray detector 31 includes the blood vessels of the subject P (which become visible when a contrast agent is injected), tissues such as bones, and various devices used for examination and treatment of blood vessels, such as a catheter (e.g., a guiding catheter), a guide wire, an embolization coil, and a delivery wire for carrying the embolization coil to a site of interest.
- The image processing apparatus 1 is a device for assisting a user who performs vascular examination or treatment by catheter (hereinafter simply referred to as "catheterization" except when the two must be distinguished).
- The image processing device 1 recognizes and/or tracks one or more preset regions of interest in an X-ray image generated based on X-rays detected by the X-ray imaging device 3. By analyzing the X-ray image, the image processing apparatus 1 notifies the medical staff using the apparatus (hereinafter simply referred to as the "user") when the state of any region of interest satisfies the condition defined for that region of interest.
- This allows the user to concentrate on the work at the region of interest (for example, guiding a microcatheter into the aneurysm, inserting the coil into the aneurysm, inflating the balloon, and placing the stent).
- FIG. 2 is a diagram schematically showing the functional configuration of the image processing device 1 according to the embodiment.
- the image processing device 1 includes a storage unit 10 and a control unit 11.
- Each functional block represents a unit of function rather than a unit of hardware (apparatus). Therefore, the functional blocks shown in FIG. 2 may be implemented within a single device, or may be implemented separately across a plurality of devices. Data exchange between functional blocks may be performed via any means, such as a data bus, a network, or a portable storage medium.
- FIG. 33 is a diagram schematically showing the functional configuration of an image processing device 1 according to another embodiment.
- FIG. 33 shows that the output unit 116 outputs to the external device 4.
- FIG. 31 is a diagram showing an example of the configuration of a computer according to the present disclosure.
- The computer includes a CPU 20, which is a processor that performs arithmetic processing; a ROM 21, which is a memory connected to the processor and capable of storing the BIOS and the like; a RAM 22, which may serve as a work area; and a storage 23, which can store programs and the like. The computer may further include an input unit 25, an output unit 26, and a storage medium 27 connected via an input/output interface 24.
- the input unit 25 may include an input device such as a keyboard.
- The output unit 26 may include an output device such as a display. Data transmission and reception may be performed via the input unit 25 and the output unit 26.
- The storage unit 10 includes a ROM (Read Only Memory) that stores the BIOS (Basic Input Output System) of the computer implementing the image processing device 1, a RAM (Random Access Memory) that serves as a work area for the image processing device 1, and a large-capacity storage device (also called storage), such as an HDD (Hard Disk Drive) or SSD (Solid State Drive), that stores the OS (Operating System), application programs, and various information referenced when the application programs are executed.
- The control unit 11 is a processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) of the image processing device 1, and by executing a program stored in the storage unit 10, it functions as an image acquisition unit 110, a region-of-interest acquisition unit 111, a tracking unit 112, a notification unit 113, a marker detection unit 114, a distance measurement unit 115, and the like.
- FIG. 2 shows an example in which the image processing device 1 is composed of a single device.
- the image processing device 1 may be implemented by computational resources such as multiple processors and memories, like a cloud computing system, for example.
- each unit constituting the control unit 11 is implemented by executing a program by at least one of the plurality of different processors.
- the image acquisition unit 110 acquires an X-ray image created based on the X-ray absorptance, including at least the blood vessel of the subject P and the device for examination or treatment in the blood vessel as the subject.
- the X-ray image may include an aneurysm that has occurred in the blood vessel of the subject P, and a stenotic or infarcted portion of the blood vessel that the medical staff focuses on.
- the region-of-interest acquiring unit 111 acquires one or more regions including at least part of the device included in the X-ray image as regions of interest.
- As the region of interest, for example, the tip of the guide wire (GW), the tip of the guiding catheter (GC), a catheter marker, a coil, or the like can be set. Multiple regions of interest may be set at the same time, such as the GW tip alone, or the GW tip together with a coil.
- the region of interest may include blood vessels (vascular lesions such as cerebral aneurysms and stenoses) and bones.
- FIG. 3(a) is a diagram schematically showing an example of an X-ray image of a blood vessel V.
- In FIG. 3(a), a guide wire, which is a type of device D, is present in a blood vessel V.
- FIG. 3(b) is a diagram showing an example of candidate region C, which is a candidate region of interest.
- the region-of-interest acquisition unit 111 may include a candidate region detector generated using a known machine learning method such as a neural network (FIG. 11).
- FIG. 3(b) shows a first candidate region C1 (the tip of the guiding catheter) and a second candidate region C2 (the tip of the guide wire).
- This candidate region C is the result obtained by the region-of-interest acquisition unit 111 inputting the frame image of the X-ray image to the candidate region detector.
- the candidate region detector is trained to detect guidewire tips, guiding catheter tips, catheter markers, and the like.
- Detection generally means identifying the position and shape of an object in a still image of one frame in a video. For example, in detecting the tip of a guide wire, since the guide wire is thin, it may be detected as point coordinates (x, y) on the screen. The detection of the entire guidewire can detect a curved thin thread-like shape as a one-dimensional curve or as a two-dimensional segmentation.
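As an illustration of detecting a thin device tip as point coordinates (x, y), the following sketch (a simplification for explanation, not the patent's actual detector) assumes the guide wire has already been segmented to a one-pixel-wide curve in a binary grid, and finds its endpoints as foreground pixels with exactly one 8-connected foreground neighbor:

```python
# Illustrative sketch: locating the tips of a thin, thread-like device in a
# binarized frame. Assumes a prior segmentation step has reduced the device
# to a 1-pixel-wide curve; an endpoint has exactly one 8-connected neighbor.

def find_tips(mask):
    """Return (x, y) coordinates of curve endpoints in a 2D 0/1 grid."""
    h, w = len(mask), len(mask[0])
    tips = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            neighbors = sum(
                mask[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
                if (ny, nx) != (y, x)
            )
            if neighbors == 1:
                tips.append((x, y))
    return tips

# A short horizontal "wire" from (1, 2) to (4, 2): both ends are tips.
frame = [[0] * 6 for _ in range(5)]
for x in range(1, 5):
    frame[2][x] = 1
print(find_tips(frame))  # -> [(1, 2), (4, 2)]
```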
- Object detection algorithms that can be used in the candidate area detector include, but are not limited to, algorithms such as Faster R-CNN, YOLO, SSD, U-Net, and ResNet. Note that an image processing library such as OpenCV (Open Source Computer Vision Library) may be used to implement the object recognition algorithm.
- The detection method is not limited to machine learning. In the case of machine learning, the input data for training may include, for example, actual surgery/examination videos, videos from angiography equipment using phantoms, and virtual surgical videos. Still images may be used instead of moving images. Virtual surgical videos can be created manually, images resembling surgery can be generated by algorithms, or Generative Adversarial Networks (GANs) can be used.
- Videos of endovascular surgery and examination of any body part (for example, brain, heart, extremities, abdomen, pelvis) can be used.
- Any medical image such as an X-ray image, CT image, or MRI image may be used.
- a video may be appropriately divided into still images, and one or more still images may be used as input images for learning.
- Input images may be subjected to arbitrary transformations (eg, normalization, denoising, etc.).
- a plurality of images may be input as time-series images.
- Front and side images may be input at the same time, or mask and live images may be input.
- a combination of these may be used as an input image.
- time-series live front, live side, front mask, and side mask images may be input. An improvement in accuracy is expected by including a plurality of these related images.
- Annotation can cover any endovascular treatment device; blood vessels, bones, and lesion sites (aneurysm, stenosis, vascular malformation, etc.); vessel anatomy (internal carotid artery, Sylvian vein, etc.); magnification; body site (head, chest, etc.); image quality (degree of noise); image type (mask image, live image, contrast imaging, 3D imaging, etc.); surgical procedure; devices used in the surgery; patient information (age, gender, disease, etc.); and so on. For endovascular treatment devices, blood vessels, bones, lesions, and vessel anatomy, annotations such as points, straight lines, circles, ellipses, curves, closed curves, segmentations, and bounding boxes are drawn on the image, and an ID may be assigned to each instance.
- Examples of annotations include the position and shape of the device (e.g., guidewire curve/segmentation, guidewire tip/tip circumference, balloon inflation, stent stump, stent shape), vessel location (e.g., stenosis), image quality (e.g., noisy), and image type (e.g., contrast image, mask image).
- This model can be used for object detection in images, and can also be used for inference on any of the input data described above, such as scene classification of still images and videos.
- the image processing apparatus 1 can recognize not only the region of interest including the device, but also blood vessels.
- the method is the same as the device recognition described above.
- Machine learning/deep learning (AI) and rule-based methods are used, but the methods are not limited to these.
- a threshold value may be set to extract only the white and black portions.
- A blood vessel may also be extracted by contour extraction. For a moving image, an addition average (averaging over frames) may be taken. Blood vessels may be extracted by a combination of these methods.
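The rule-based ideas above can be sketched as follows; this is an illustrative simplification using plain Python lists, not the actual implementation:

```python
# Sketch of rule-based vessel extraction: frame averaging ("addition
# average") suppresses noise in a moving image, then thresholding keeps
# only dark pixels, since contrast-filled vessels appear dark in angiography.

def addition_average(frames):
    """Pixel-wise mean of a list of equally sized grayscale frames."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

def threshold_dark(image, limit):
    """Binary mask of pixels darker than `limit`."""
    return [[1 if v < limit else 0 for v in row] for row in image]

# Two noisy 2x3 frames; the left column is consistently dark (a "vessel").
f1 = [[10, 200, 210], [20, 190, 220]]
f2 = [[30, 210, 190], [10, 200, 200]]
avg = addition_average([f1, f2])
print(threshold_dark(avg, 100))  # -> [[1, 0, 0], [1, 0, 0]]
```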
- a blood vessel may be recognized from one or multiple mask images over time, or may be recognized using one or multiple images during contrast enhancement. Furthermore, the front image and the side image may be recognized together. Recognition using images from two directions is expected to improve recognition accuracy.
- Intraoperative DSA images, mask images, 3D images (3DRA), preoperative images such as CTA, MRA, and 3DRA, and other images may be used.
- Recognition results for blood vessels may be expressed as painting (segmentation), segmentation boundaries, collections of curves, tree structures of curves, and the like, but are not limited to these.
- anatomical information can also be inferred at the same time. For example, it may be recognized by anatomical classification such as identification of right and left, internal carotid artery/external carotid artery, middle cerebral artery, posterior cerebral artery, posterior communicating artery, and anterior communicating artery.
- the internal carotid artery can be divided into C1 to C5, and the middle cerebral artery into M1, M2 (superior trunk, inferior trunk, anterior temporal artery, etc.), M3, etc.
- blood vessels may be converted into a tree shape (line shape, 1D) by not only segmentation (painting, 2D), but also taking a center line from there.
- the recognition method and expression method are the same as those described above.
- a lesion may be recognized when recognizing a blood vessel. For example, cerebral aneurysms, stenoses, cerebral arteriovenous malformations, occlusions, etc. may be recognized.
- the recognition method and expression method are the same as those described above.
- Blood vessels important to the surgery may be extracted manually or automatically and highlighted, or only those blood vessels may be displayed.
- Automatic extraction may, for example, select blood vessels whose diameter exceeds a threshold, extract main vessels such as the internal carotid artery using the vascular-anatomy recognition described above, or identify aneurysms and stenoses using the lesion recognition described above; however, the present invention is not limited to these.
- A specific example of blood vessel extraction is shown in the figure. Such blood vessel extraction can be performed automatically.
- Endovascular treatment is characterized by viewing X-ray images on up to 4 screens (2 directions of frontal and lateral images x mask and live images). Therefore, as an input image, 2 to 4 screens out of 4 screens may be used for learning instead of 1 screen.
- an image generated by appropriately combining four screens may be used as an input image (for example, taking a difference).
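One simple way of "taking a difference" between screens, subtracting a pre-device mask image from a live image so that mainly the device and contrast agent remain (as in digital subtraction angiography), can be sketched as follows; treating the frames as 8-bit grayscale and clipping is an assumption:

```python
# Hedged sketch of combining two of the four screens by taking a difference:
# |live - mask| removes static anatomy, leaving mainly the device and
# contrast agent, as in digital subtraction angiography (DSA).

def subtract_frames(live, mask):
    """Pixel-wise |live - mask|, clipped to the 8-bit range."""
    return [[min(255, abs(lv - mv)) for lv, mv in zip(lrow, mrow)]
            for lrow, mrow in zip(live, mask)]

mask_img = [[100, 100], [100, 100]]   # background anatomy only
live_img = [[100, 30], [100, 100]]    # a dark device pixel has appeared
print(subtract_frames(live_img, mask_img))  # -> [[0, 70], [0, 0]]
```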
- the present disclosure relates to a method for creating a machine learning model, characterized by performing training using cerebrovascular images acquired from multiple directions as input.
- Faster R-CNN is a CNN (Convolutional Neural Network) that simultaneously performs region segmentation and recognition.
- A convolutional neural network (CNN) is a deep neural network built by stacking layers with characteristic functions, such as "convolutional layers" and "pooling layers"; it has demonstrated excellent performance particularly in the field of image recognition.
- Faster R-CNN can extract and recognize a region of interest from an input image almost in real time (about 10 to 20 frames per second).
- Faster R-CNN enables end-to-end learning from image input to object detection.
- YOLO (You Only Look Once) is also a CNN that simultaneously performs region segmentation and recognition. YOLO divides the entire image into grids and finds a bounding box for each region. YOLO's CNN architecture enables fast object detection.
- SSD (Single Shot MultiBox Detector) is also a CNN-based object detector.
- the SSD can output multi-scale detection frames from various hierarchical output layers.
- the SSD puts about 9000 rectangular frames called default boxes with different sizes and shapes on the image, and calculates the prediction value for each frame.
- speeding up is achieved by reducing the filter size.
- U-Net is a CNN that recognizes an object called segmentation pixel by pixel. It consists of convolutional layers and has a nearly symmetrical Encoder-Decoder structure.
- The Decoder up-samples the feature maps that were down-sampled by pooling in the Encoder.
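The Encoder-Decoder resolution flow can be illustrated with a toy example (this is not a real U-Net; real networks use learned convolutions and skip connections, which are omitted here):

```python
# Toy illustration of the Encoder-Decoder resolution flow: 2x2 max pooling
# halves the feature-map resolution on the encoder side, and nearest-neighbor
# up-sampling restores it on the decoder side.

def max_pool_2x2(fm):
    return [[max(fm[y][x], fm[y][x + 1], fm[y + 1][x], fm[y + 1][x + 1])
             for x in range(0, len(fm[0]), 2)]
            for y in range(0, len(fm), 2)]

def upsample_2x(fm):
    out = []
    for row in fm:
        doubled = [v for v in row for _ in range(2)]
        out.append(doubled)
        out.append(list(doubled))
    return out

fm = [[1, 2, 3, 4],
      [5, 6, 7, 8],
      [9, 10, 11, 12],
      [13, 14, 15, 16]]
pooled = max_pool_2x2(fm)       # [[6, 8], [14, 16]]
print(upsample_2x(pooled)[0])   # -> [6, 6, 8, 8]
```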
- FIG. 3(c) is a diagram showing an example of a method for setting the region of interest R.
- the user selects the second candidate region C2 as the region of interest R.
- the user selects the region of interest R by moving the mouse cursor M to the second candidate region C2 using a pointing device (not shown) such as a mouse.
- the candidate region C may be presented by the region-of-interest acquiring unit 111 arbitrarily, and the user may directly set the region of interest R in the X-ray image in which the candidate region C is not presented.
- the region of interest R can be set by drawing a rectangle in the image and specifying the region using a pointing device such as a mouse.
- The region-of-interest acquisition unit 111 may acquire, as the region of interest R, the output of a candidate region detector generated using a known machine learning method or the like.
- FIG. 3(d) is a diagram showing the region of interest R set by the user.
- the region of interest R is indicated by a black pentagon and is the region that includes the tip portion of the device D.
- the user can set two or more regions of interest R.
- Tracking unit 112 tracks each of one or more regions of interest R in an X-ray image.
- the tracking unit 112 can track the region of interest R by using known image tracking technology.
- Known image tracking techniques that can be used include, but are not limited to, those using algorithms such as Boosting, MIL, TLD, MedianFlow, KCF, GOTURN, MOSSE, CSRT. Tracking algorithms may be used in combination with the object detection algorithms described above. For implementation of algorithms, libraries such as OpenCV (Open Source Computer Vision Library) may be used. It should be noted that in the context of the present invention, tracking the region of interest R also includes intermittently detecting the region of interest R to identify its state.
- tracking the region of interest R may be performed using a tracking algorithm or an object detection algorithm, or a combination thereof.
- the BOOSTING tracker is a tracker based on the online version of AdaBoost (the algorithm used internally by HAAR cascade-based face detectors).
- This classifier is trained at runtime using correct and incorrect examples of objects.
- the first bounding box specified by the user (or another object detection algorithm) is treated as the correct object instance, and the image outside the bounding box is treated as the background.
- Given a new frame, the classifier is applied to every pixel in the neighborhood of the previous location, and each score is recorded. The new position of the object is the one with the maximum score, which in turn provides another correct example for the classifier. As additional frames are input, the classifier is updated with this additional data.
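The search loop described above can be sketched as follows; as an assumed simplification, candidate positions are scored with a sum-of-squared-differences template match rather than a boosted classifier:

```python
# Minimal sketch of the per-frame search loop: a score is evaluated at every
# position in a neighborhood of the previous location, and the new position
# is the one with the best (here: lowest SSD) score.

def track_step(frame, template, prev, radius):
    th, tw = len(template), len(template[0])
    h, w = len(frame), len(frame[0])
    best, best_pos = None, prev
    for y in range(max(0, prev[1] - radius), min(h - th, prev[1] + radius) + 1):
        for x in range(max(0, prev[0] - radius), min(w - tw, prev[0] + radius) + 1):
            ssd = sum((frame[y + dy][x + dx] - template[dy][dx]) ** 2
                      for dy in range(th) for dx in range(tw))
            if best is None or ssd < best:
                best, best_pos = ssd, (x, y)
    return best_pos

template = [[9, 9], [9, 9]]
frame = [[0, 0, 0, 0],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 0, 0]]
print(track_step(frame, template, prev=(1, 1), radius=1))  # -> (2, 1)
```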
- The MIL tracker is based on the same concept as the BOOSTING tracker above. The big difference is that instead of considering only the object's current position as the correct answer, it looks at a small neighborhood around the current position to generate several potential correct answers. Rather than specifying correct and incorrect examples individually, MIL designates "bags" of correct and incorrect answers. Not all of the images in the correct-answer bag need to be correct examples; only one image in the bag needs to be. The correct-answer bag contains an image centered on the current position of the object together with small neighboring images around it. Even if the current position of the tracked object is not exact, it is highly likely that the bag contains at least one image in which the object is properly centered. The MIL tracker performs well, does not drift as much as the BOOSTING tracker, and performs reasonably well even under partial occlusion.
- KCF stands for Kernelized Correlation Filter.
- This tracker builds on the concepts proposed in the above two trackers. This tracker takes advantage of the fact that the multiple correct samples used in the MIL tracker have large overlapping regions. Such duplicated data provides some excellent mathematical properties while making tracking faster and more accurate. It outperforms MIL in both accuracy and speed, and is superior in tracking failure reporting.
- TLD stands for tracking, learning, and detecting.
- this tracker decomposes the long-term tracking task into three components: (short-term) tracking, learning and detection.
- This tracker tracks an object frame by frame.
- The detector localizes all appearances of the object observed so far and corrects the tracker if necessary. Through learning, it estimates the detector's errors and updates the detector to avoid those errors in the future.
- The output of this tracker tends to be somewhat unstable: for example, when tracking a pedestrian while other pedestrians are in the scene, this tracker may temporarily track a different pedestrian from the intended one. On the positive side, it works best under occlusion over multiple frames. A disadvantage is that there are many false detections.
- the MEDIANFLOW tracker tracks objects both forward and backward in time and measures the discrepancy between these two trajectories. By minimizing this forward/backward error, it is possible to reliably detect tracking failures and select reliable trajectories in videos. This tracker performs best when motion is predictable and small, and when there is no occlusion. Unlike other trackers that continue even if tracking clearly fails, this tracker is able to recognize that tracking has failed.
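The forward/backward consistency check can be sketched as follows (an illustration of the idea; the trajectories and the rejection threshold are made-up values): a point is tracked forward through the frames, the result is tracked backward, and the distance between the start point and the back-tracked point is the forward/backward error:

```python
# Sketch of the forward/backward error: a large distance between the
# original point and the point obtained by tracking the forward result
# back to the first frame signals a tracking failure.

import math

def fb_error(forward_track, backward_track):
    """forward_track[0] is the original point; backward_track[-1] is the
    point obtained by tracking the forward result back to frame 0."""
    x0, y0 = forward_track[0]
    x1, y1 = backward_track[-1]
    return math.hypot(x1 - x0, y1 - y0)

fwd = [(10, 10), (12, 11), (14, 12)]   # frames 0 -> 2
bwd = [(14, 12), (12, 11), (13, 14)]   # frames 2 -> 0
err = fb_error(fwd, bwd)
print(err > 2.0)  # -> True: the trajectories disagree, so reject the track
```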
- the GOTURN tracker is an algorithm based on convolutional neural networks (CNN).
- MOSSE stands for Minimum Output Sum of Squared Error.
- the MOSSE tracker is robust to lighting, scale, pose changes, and non-rigid deformations.
- the tracker also detects occlusion based on the peak-to-sidelobe ratio and can pick up where it left off when the object reappears.
- the MOSSE tracker works even at high frame rates (450fps and above). On the positive side, it's very easy to implement, as accurate as other complex trackers, and much faster.
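The peak-to-sidelobe ratio (PSR) occlusion test mentioned above can be sketched as follows (an illustration of the idea, not the MOSSE implementation; real implementations usually also exclude a small window around the peak from the sidelobe):

```python
# Sketch of the peak-to-sidelobe ratio: the correlation peak is compared
# with the mean and standard deviation of the remaining "sidelobe" values.
# A low PSR suggests the target is occluded or lost.

import math

def peak_to_sidelobe_ratio(response):
    """response: a 2D correlation response map; here the sidelobe is simply
    every value except the single peak (a simplifying assumption)."""
    flat = sorted((v for row in response for v in row), reverse=True)
    peak, sidelobe = flat[0], flat[1:]
    mean = sum(sidelobe) / len(sidelobe)
    var = sum((v - mean) ** 2 for v in sidelobe) / len(sidelobe)
    return (peak - mean) / math.sqrt(var) if var else float("inf")

sharp = [[0.1, 0.1, 0.1], [0.1, 9.0, 0.2], [0.1, 0.1, 0.1]]   # confident peak
blurry = [[1.0, 1.1, 1.0], [1.1, 1.2, 1.1], [1.0, 1.1, 1.0]]  # possible occlusion
print(peak_to_sidelobe_ratio(sharp) > peak_to_sidelobe_ratio(blurry))  # -> True
```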
- the CSRT tracker uses a spatial reliability map for tracking.
- the CSRT tracker operates at a relatively low frame rate (25fps) but gives higher accuracy of object tracking.
- When at least one of the regions of interest R satisfies the condition defined for that region of interest, the notification unit 113 notifies the user of that fact. Specifically, for example, the notification unit 113 causes the display device 2 to display a message to that effect, or plays a notification sound through a speaker (not shown) of the image processing device 1. Alternatively, if the user is wearing a device with a vibrating member, such as a smartphone, the user may be notified by vibrating the vibrating member. Note that the type, amplitude, and frequency of the notification sound may be changed according to the condition.
- the notification sound may be combined with voice (Fig. 34).
- The operator or assistant is concentrating on the procedure and may be unable to see the notification screen, or may see it late. If the notification content can be understood from the voice alone, this problem can be solved.
- Examples of voice notifications (FIG. 34): "Beep: the coil is approaching in the front mask image." "Beep: the guide wire has left the screen in the side live image." "Beep: the guiding catheter has left the screen in the front live image." "Chime: the filter has moved on the front (A side) image."
- the notification unit 113 may display the region of interest in a color different from the color of the acquired image (X-ray image).
- Angiography images are monochrome, so it is difficult to recognize the devices included in them; displaying a region of interest in a different color can provide better visibility.
- the regions of interest may be colored using different colors.
- In vascular catheterization, the user first passes the guiding catheter through the subject P's blood vessel to the vicinity of the target site. Subsequently, the user passes another device such as a guide wire, a delivery wire, or a balloon catheter through the guiding catheter to observe (diagnose) or treat the site of interest, for example by coil embolization of a cerebral aneurysm.
- a catheter is one of the devices and is a long and narrow tube with a lumen. This catheter is passed through a blood vessel, for example, guided into an aneurysm, a coil is inserted through the lumen of the catheter, and the coil is inserted into the aneurysm to prevent rupture.
- the guiding catheter has a slightly thick diameter of about 2-4 mm, and plays a role in connecting the puncture site to the front of the target site.
- a smaller device is inserted into this guiding catheter and guided to the target site.
- The advantage of a guiding catheter is that the various devices passed through it can travel from the groin or arm puncture site to just proximal of the target site without having to be navigated along that path. Thereby, for example, treatment can be performed efficiently without moving a fixed screen.
- Some guiding catheters have a balloon at the tip (guiding catheter with balloon), which can stop blood vessel flow and stabilize the guiding catheter.
- In order to send a thinner device to the target site, an intermediate catheter may be placed inside the guiding catheter and advanced closer to the target site, where the blood vessels are narrower. Multiple intermediate catheters may be used (tapering step by step).
- Microcatheters are the thinnest catheters, soft and can be inserted into thin blood vessels. A coil or stent is placed in this microcatheter and carried to the target site. Embolic material may also be flushed through the microcatheter.
- a balloon catheter has a balloon attached near the tip of a microcatheter, which is inflated near the target site. For example, it prevents the coil from exiting an aneurysm, dilates a stenotic site, or stops the flow of blood when a blood vessel is perforated to achieve hemostasis.
- the term "balloon" usually refers only to the balloon portion of the balloon catheter.
- An appropriately sized guide wire is often used inside to guide the catheter. In rare cases, the catheter can be guided by riding the flow of blood.
- a guide wire is an elongated wire.
- guidewires are used to guide soft catheters to selected vessel bifurcations and to target sites.
- Other devices include delivery wires for carrying stents, and devices with balloons attached to guide wires to stop blood flow.
- In this specification, "guide wire" is used in a broad sense that includes these devices. The reason is that the tips of these wires can perforate blood vessels and cause serious complications. Since one purpose of the present invention is to assist in preventing perforation of blood vessels by such tips, any wire, including a fine wire whose tip may perforate a blood vessel, is referred to as a guide wire.
- The X-ray imaging device 3 used in catheter surgery images a specific region in the body of the subject P, including a site of interest (for example, an aneurysm, a stenosis, or an infarcted region requiring treatment).
- the entire device D is not always imaged in the X-ray image that can be viewed by the user.
- The tip of a guide wire or guiding catheter may leave the angle of view of the X-ray image and become unobservable by the user. In such cases, it may go unnoticed that the tip of the device has perforated a blood vessel. Vascular perforation is a serious, life-threatening complication, and it is most likely to occur at the tip of a device, especially the tips of guidewires and catheters. Therefore, the present invention pays particular attention to the tip.
- an example of a condition set for the region of interest R used in the present invention is a condition regarding the distance between the tip of the guiding catheter or the tip of the guide wire and the edge of the X-ray image.
- the condition may be that the region of interest R exceeds a specific range specified on the X-ray image (for example, a boundary line specified by the user with a pointing device such as a mouse) or simply moves. That is, in some aspects of the present invention, not only the edge of the image, but any area within the image may be processed as the same as the edge.
- the boundary line can be a straight line, a curved line, a circle, a rectangle, other polygons, or the like.
- a boundary line may be movable, deformable, enlarged, or shrunk by a user's operation.
- the distance between the region of interest and the edge of the specified area can be displayed.
- the display mode of the distance may be changed according to the magnitude of the distance between the region of interest and the edge of the specific range.
- The notification unit may also notify the user on the condition that the value obtained by dividing the distance between the region of interest and the edge of the specific range by the speed of movement of the region of interest in the image, i.e., the estimated time for the region of interest to reach the edge, falls below a predetermined threshold.
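This condition amounts to an estimated time-to-edge check, which can be sketched as follows (the distances, speed, and threshold values are illustrative):

```python
# Sketch of the time-to-edge notification condition: a notification fires
# when (distance to edge) / (speed of the region of interest) drops below
# a threshold, i.e., the ROI is about to leave the specific range soon.

def should_notify(distance_px, speed_px_per_frame, threshold_frames):
    if speed_px_per_frame <= 0:   # not moving toward the edge
        return False
    return distance_px / speed_px_per_frame < threshold_frames

print(should_notify(distance_px=40, speed_px_per_frame=5, threshold_frames=10))  # -> True
print(should_notify(distance_px=40, speed_px_per_frame=2, threshold_frames=10))  # -> False
```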
- Some aspects of the present invention also include a function of facilitating the operator's judgment by making a specific range easier to see by superimposing it or the like. As for the method of displaying the range, not only superimposed display but also arrows or the like may be used, but the method is not limited to these.
- the distance may be determined either by a straight line distance or by a distance along a vessel.
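The two distance definitions can be sketched as follows, with the vessel represented as an assumed polyline centerline:

```python
# Sketch of the two distance definitions: straight-line distance between two
# points versus distance measured along a vessel centerline (a polyline).

import math

def straight_distance(a, b):
    return math.hypot(b[0] - a[0], b[1] - a[1])

def along_vessel_distance(polyline):
    """Sum of segment lengths along a centerline given as [(x, y), ...]."""
    return sum(straight_distance(polyline[i], polyline[i + 1])
               for i in range(len(polyline) - 1))

# A curved vessel from (0, 0) to (6, 0) via (3, 4):
centerline = [(0, 0), (3, 4), (6, 0)]
print(straight_distance((0, 0), (6, 0)))  # -> 6.0
print(along_vessel_distance(centerline))  # -> 10.0
```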
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- The position of an intravascular examination or treatment device (also referred to as the device of interest) at a certain point in time can be stored, either automatically or by having the user (operator/assistant) specify it at that point. All or part of the detected device may then be superimposed on subsequent real-time images. As a result, the user (operator/assistant) can recognize movement (deviation) of the device from that point in time through the superimposed display on the real-time image (FIG. 21).
- the shape detection of the device may be automatically obtained by image analysis, or may be performed by the operator marking with a mouse, touch panel, or the like.
- Specific examples of the superimposed display device include, but are not limited to, the tip of a guide wire, a marker, a stent, a guiding catheter, and the like.
- In the left diagram of FIG. 21, the tip of the guide wire (the black part) is recognized. As shown in the middle diagram of FIG. 21, the stored tip of the guide wire is superimposed on the live image. As shown in the right diagram of FIG. 21, how far the tip of the guide wire has moved since then can be visualized and easily grasped.
- the image processing apparatus includes a storage unit that acquires and stores the position and/or shape of the device for examination or treatment in the blood vessel at an arbitrary point of time.
- the position and/or shape of the acquired device is superimposed on the acquired image.
- a notification can be issued when they or the tips of the guide wires attached to them go out of a certain range specified by the operator or automatically.
- In cerebral aneurysm embolization, if the tip of the guide wire inside the balloon catheter goes out of the specified range, the balloon may slip or become unretrievable, or the distal blood vessel may be perforated; a notification can therefore be issued.
- When inserting the coil, a notification can be issued if the coil strays out of certain areas, since it could occlude important blood vessels. Even when the coil deviates from the aneurysm mask image (area), a notification can be issued, because the coil, guide wire, catheter, or the like may have perforated the aneurysm. While it is important for the guiding catheter to remain stable in all endovascular procedures, there are times when it should be corrected before it goes out of bounds, and a notification can be issued if it appears to be moving out of a certain area.
- In tumor embolization, cerebral arteriovenous malformation embolization, and dural arteriovenous fistula embolization, embolic substances such as liquids and particles are used for embolization. Because an embolic substance straying to an unintended site can cause a cerebral infarction there, a notification can be issued.
- devices also include liquid embolic agents, particulate embolic agents, and the like. Also, and not limited to the above examples, endovascular procedures are careful to keep any devices, embolic material, etc. within a designated area and can assist in doing so.
- FIG. 4(a)-(b) are diagrams for explaining an example of the conditions set for the region of interest R.
- FIG. 4(a) a region of interest R is set at the tip of the device D.
- In addition, information W indicating the velocity and acceleration of the region of interest R and the distance (number of pixels) between the region of interest R and the edge F of the X-ray image is superimposed and displayed on the X-ray image.
- FIG. 4(a) shows an example in which the device D is a guide wire.
- Note that, instead of the edge F of the X-ray image, the boundary of a region designated within the X-ray image may be used in the same manner.
- the notification unit 113 notifies the user on the condition that the region of interest R disappears from the X-ray image when the region including the tip of the guidewire is set as the region of interest R.
- Here, the X-ray image is an image showing a fixed region containing the site of interest; normally, the area displayed as an X-ray image is fixed during treatment. Further, when a region including the tip of the guide wire is set as the region of interest R, the notification unit 113 notifies the user on the condition that the distance between the region of interest R and the edge F of the X-ray image falls below a predetermined threshold distance.
- The "predetermined threshold distance" is a "protrusion determination reference distance" provided for the notification unit 113 to determine whether or not there is a high probability that the tip of the device will deviate from the angle of view of the X-ray image.
- A specific value of the predetermined threshold distance may be determined by experiments in consideration of the frequency of notification by the notification unit 113, usability, and the like; for example, it is 5% of the number of pixels in either the vertical or horizontal direction of the X-ray image. As a result, when the region of interest R, which is the region including the tip of the guiding catheter or the tip of the guide wire, approaches the edge F of the X-ray image, the user can be notified before the region of interest R deviates from the edge F.
- the magnitude of movement of the device D in the blood vessel V can be a useful index for the user to predict the time until the tip of the device D reaches the edge F of the X-ray image.
- The greater the speed or acceleration of the tip of the device D, the shorter the time it takes for the tip to reach the edge F of the X-ray image. Therefore, when a region including the tip of the guiding catheter or the tip of the guide wire is set as the region of interest R, the notification unit 113 may notify the user on the condition that at least one of the moving speed and the acceleration of the region of interest R in the X-ray image exceeds a predetermined threshold. As a result, when the moving speed or acceleration of the region of interest R is greater than the threshold, the user's attention can be drawn before the region of interest R approaches the edge F of the X-ray image.
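As a concrete illustration of the conditions above, the following sketch (an assumption of this edit, not code from the patent; all function names and threshold values are illustrative) checks both the edge-distance condition and the speed/acceleration condition:

```python
# Illustrative sketch of the notification conditions: distance to the image
# edge below a threshold, or speed/acceleration above a threshold.
# The 5% ratio mirrors the "5% of the number of pixels" example; the speed and
# acceleration thresholds are placeholders.

def edge_distance(roi_xy, image_size):
    """Shortest pixel distance from the ROI center to any image edge."""
    x, y = roi_xy
    w, h = image_size
    return min(x, y, w - 1 - x, h - 1 - y)

def should_notify(roi_xy, image_size, speed, accel,
                  dist_thresh_ratio=0.05, speed_thresh=50.0, accel_thresh=100.0):
    """Return True when any of the example notification conditions holds."""
    dist_thresh = dist_thresh_ratio * min(image_size)
    return (edge_distance(roi_xy, image_size) < dist_thresh
            or speed > speed_thresh
            or accel > accel_thresh)

print(should_notify((500, 500), (1024, 1024), speed=5.0, accel=1.0))  # False: centered, slow
print(should_notify((20, 500), (1024, 1024), speed=5.0, accel=1.0))   # True: near left edge
```

The same predicate could be evaluated on every frame in which the tracking unit updates the region of interest.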
- the notification unit 113 may display the distance between the region of interest R and the edge F of the X-ray image on the display device 2 that displays the X-ray image.
- The distance between the region of interest R and the edge F of the X-ray image may be the distance traveled when moving from the region of interest R to the edge F along the blood vessel V into which the device D having the region of interest R is inserted.
- This can be realized by the distance measurement unit 115 extracting the blood vessel V using a blood vessel recognition engine generated in advance by a known machine learning method or the like and measuring the distance from the region of interest R to the edge F along the blood vessel V. Alternatively, the distance measurement unit 115 may measure the above-described distance based on the trajectory of the device D traveling through the blood vessel V.
- For example, the tracking unit 112 tracks the tip of the guiding catheter and stores its trajectory in the storage unit 10. The distance measurement unit 115 may then use, as the distance described above, the length of the trajectory included in the region of interest R among the trajectories stored in the storage unit 10.
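The trajectory-based measurement can be sketched as follows (illustrative only; the trajectory format and function names are assumptions, not the patent's specification):

```python
# Hedged sketch of a distance measured along a stored trajectory, as the
# distance measurement unit 115 might do. The trajectory is assumed to be a
# list of (x, y) pixel positions recorded by the tracking unit.
import math

def polyline_length(points):
    """Total length of a piecewise-linear trajectory in pixels."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Example: an L-shaped trajectory of the catheter tip.
trajectory = [(0, 0), (30, 40), (30, 100)]
print(polyline_length(trajectory))  # 50 + 60 = 110.0
```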
- Since the notification unit 113 causes the display device 2 to display the distance between the tip of the device D and the edge F of the X-ray image, the user can objectively grasp at a glance how far the device D can move before reaching the edge F of the X-ray image. Furthermore, the notification unit 113 may change the display mode of the distance according to the magnitude of the distance between the region of interest R and the edge F: for example, increasing the font size as the distance becomes shorter, changing the color according to the distance (blue → yellow → red), enlarging the region of interest according to the distance, or changing the color or size of the mark attached to the region of interest according to the distance. By devising the display mode in this way, it is possible to make it easier for the user to notice changes in the distance.
- In FIG. 4(b), the region of interest R set at the tip of the device D is closer to the edge F than in the example shown in FIG. 4(a). For this reason, the font of the information W indicating the velocity and acceleration of the region of interest R and the distance (number of pixels) between the region of interest R and the edge F of the X-ray image is larger than in the example shown in FIG. 4(a).
- Note that the "distance between the region of interest R and the edge F of the X-ray image" may instead be the shortest straight-line distance between the region of interest R and the edge F, rather than the length measured along the blood vessel to the edge F of the image. In this case, since the blood vessel extraction process by the distance measurement unit 115 can be omitted, this is advantageous in that the processing can be sped up.
- The notification unit 113 may notify the user on the condition that the value obtained by dividing the distance between the region of interest R and the edge F of the X-ray image by the moving speed of the region of interest R in the X-ray image is less than a predetermined threshold.
- The value obtained by dividing the distance between the region of interest R and the edge F of the X-ray image by the moving speed of the region of interest R within the X-ray image is, so to speak, the grace time until the region of interest R reaches the edge F of the X-ray image.
- The "predetermined threshold" here is a "frame-out determination grace period" provided for the notification unit 113 to determine whether or not there is a high probability that the region of interest R will deviate from the angle of view of the X-ray image.
- a specific value of the grace period may be determined by experiments in consideration of the frequency of notification by the notification unit 113, usability, etc., and is, for example, 3 seconds.
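The grace-time condition can be sketched as follows (the 3-second default mirrors the example above; function names are assumptions of this edit):

```python
# Hedged sketch: notify when distance-to-edge divided by speed falls below a
# grace period (e.g., 3 seconds). A non-positive speed means the ROI is not
# approaching the edge, so the grace time is treated as infinite.

def frame_out_grace_time(distance_px, speed_px_per_s):
    """Estimated time (s) until the ROI reaches the image edge."""
    if speed_px_per_s <= 0:
        return float("inf")  # not approaching the edge
    return distance_px / speed_px_per_s

def should_warn(distance_px, speed_px_per_s, grace_threshold_s=3.0):
    return frame_out_grace_time(distance_px, speed_px_per_s) < grace_threshold_s

print(should_warn(60.0, 30.0))   # True: 2 s until frame-out
print(should_warn(300.0, 30.0))  # False: 10 s until frame-out
```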
- FIG. 5 is a diagram showing an example of the message Ms notified by the notification unit 113.
- In the example shown in FIG. 5, the notification unit 113 displays, superimposed on the X-ray image, a message Ms indicating that the region of interest R will disappear from the X-ray image (so-called frame-out) after 3 seconds.
- In addition to or instead of displaying the message Ms, the notification unit 113 may change the shape of the region of interest R (to a circle in FIG. 5), increase the size of the region of interest R, or change the color of the region of interest R. Thereby, the notification unit 113 can make it easier for the operator to recognize that the region of interest R is about to go out of the frame.
- FIG. 6 is a diagram for explaining another example of conditions set for the region of interest R.
- FIG. 6 shows an example in which the device D is a guiding catheter.
- Compared with the guide wire, the guiding catheter differs in that not only its tip but also the entire device D can disappear from the X-ray image.
- However, whether the device D is a guiding catheter or a guide wire, the point that the tip portion can disappear from the X-ray image is the same. Therefore, in the case of the guiding catheter as well, the region of interest R is set at the tip portion, as with the guide wire.
- The above description determines the threshold based on the distance between the region of interest R and the edge F of the X-ray image as the "predetermined threshold distance", mainly from the viewpoint of preventing the surgical instrument from going out of the frame. Alternatively, focusing on the movement distance of the region of interest R itself, the apparatus may be configured to issue a notification to the user when the movement distance of the region of interest R becomes equal to or greater than a predetermined threshold. For example, when attempting to deploy a stent while the tip of the catheter is inside the aneurysm, care must be taken not to let the tip of the catheter protrude from the aneurysm; however, since the operator is watching the stent, the movement of the catheter tip cannot be followed by eye.
- The movement distance threshold may be set after the region of interest (for example, the tip of the catheter) reaches a predetermined position (for example, inside the aneurysm), using that arrival position as the reference.
- Embolization is known as catheter treatment for cerebral aneurysms, in which embolic coils are filled into cerebral aneurysms.
- a guiding catheter is passed to the vicinity of the cerebral aneurysm, which is the site of interest, and a delivery wire for an embolic coil is passed through the guiding catheter.
- the purpose of this surgery is to block the flow of blood into the aneurysm.
- FIGS. 7(a)-(b) are diagrams schematically showing an example of an X-ray image in which a guiding catheter for guiding a delivery wire of an embolization coil is imaged, and are diagrams for explaining another example of the conditions set for the region of interest R.
- FIGS. 7(a) and 7(b) illustrate X-ray images captured by irradiating the head of the subject P with X-rays at different incident angles. The user performs embolization while viewing the two images shown in FIGS. 7(a) and 7(b).
- the aneurysm is indicated by symbol A.
- the user embolizes the aneurysm A by placing a plurality of embolization coils E in the aneurysm A.
- Here, since the aneurysm A can be observed in the X-ray image shown in FIG. 7(b), the user pays attention mainly to that one of the two X-ray images.
- The delivery wire for carrying the embolic coil E to the aneurysm A is connected to the embolic coil E at its tip, and after the coil is placed, the user performs the work of separating the coil from the wire. As shown in FIGS. 7(a)-(b), when the embolic coil E reaches the aneurysm A, the position of the embolic coil E being transported becomes unclear in the X-ray image because of the other embolic coils E already placed in the aneurysm A. Therefore, the delivery wire of the embolization coil E is provided in advance with a marker L for timing the detachment of the embolization coil E. The user determines the timing of detaching the embolic coil E by using the marker L provided on the delivery wire as a guide, rather than the embolic coil E itself in the X-ray image.
- The user of the image processing apparatus 1 first sets a region of interest R in a portion of the guiding catheter that guides the delivery wire of the embolization coil E. Specifically, the user sets the region of interest R at the position where the marker L will overlap the region of interest R at the timing of detaching the embolization coil E from the delivery wire.
- the region-of-interest acquiring unit 111 receives a region-of-interest R that is set in a part of the guiding catheter that guides the delivery wire.
- the region of interest R can be specified using, for example, a pointing device such as a mouse, a touch panel, or the like.
- the region of interest R is indicated by a white star.
- the marker detection unit 114 detects a marker L that approaches the region of interest R among the markers L provided on the delivery wire of the embolization coil E for embolizing an aneurysm. Also, the tracking unit 112 tracks the marker L detected by the marker detection unit 114.
- The notification unit 113 notifies the user of the timing at which to detach the embolization coil E from the delivery wire, based on the positional relationship between the marker L and the region of interest R.
- FIGS. 8(a)-(d) are diagrams for explaining the timing of disconnection of the embolization coil E.
- The device D indicated by a dashed line in FIGS. 8(a)-(d) is the delivery wire.
- the delivery wire has a marker L attached at a certain section on the wire.
- FIGS. 8(a)-(d) show typical examples of the marker L.
- For the marker L, a one-dot chain line or a long straight line, for example, may be used.
- the marker L in the X-ray image also moves in conjunction with the movement of the delivery wire.
- the guiding catheter for guiding the delivery wire may move slightly due to friction or the like in accordance with the movement of the delivery wire, but the amount of movement is small compared to the movement of the delivery wire. Therefore, the user sets the region of interest R in advance at the position of the guiding catheter corresponding to the position where the marker L should be when the embolization coil E reaches the aneurysm A.
- The marker L extends over a certain section of the wire.
- As the delivery wire advances, the tip of the marker L comes into contact with the region of interest R, as shown in FIG. 8(b). This indicates that the embolic coil E has approached the aneurysm A.
- the notification unit 113 starts an operation of notifying the user of the timing of disconnecting the embolization coil E from the delivery wire.
- Specifically, triggered by the overlap of the marker L and the region of interest R, the notification unit 113 causes the display device 2 to display information W2 indicating the distance that the marker L should still move before the embolization coil E is detached from the delivery wire.
- Here, the user sets the region of interest R in advance so that the terminal end of the marker L just passes through the region of interest R when the embolization coil E reaches the aneurysm A.
- Therefore, the information W2 displayed on the display device 2 by the notification unit 113 indicates the distance remaining until the terminal end of the marker L passes through the region of interest R.
- When the terminal end of the marker L passes through the region of interest R, the notification unit 113 further notifies the user of that fact. As a result, even when the user is concentrating on the captured image of the aneurysm A as shown in FIG. 7(b), the user can know the timing for detaching the embolization coil E.
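The marker/region-of-interest logic above can be sketched in one dimension along the wire (a simplification assumed for illustration; the patent does not specify this representation, and all names are hypothetical):

```python
# Hedged sketch of the information W2 computation: the remaining distance the
# marker L must travel until its terminal (tail) end passes the region of
# interest R. Positions are arc-length coordinates in millimetres.

def remaining_distance(marker_tail_pos, roi_pos):
    """Distance until the marker's tail end passes the ROI.

    Returns 0 when the tail has already passed the ROI, i.e., the
    detachment timing has been reached.
    """
    return max(0.0, roi_pos - marker_tail_pos)

print(remaining_distance(marker_tail_pos=12.0, roi_pos=30.0))  # 18.0 mm to go
print(remaining_distance(marker_tail_pos=31.0, roi_pos=30.0))  # 0.0: detachment timing
```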
- (Shape of device D) Next, as yet another example of the conditions set for the region of interest R, conditions regarding the shape of the device D (e.g., a guide wire) will be described.
- When the feature amount indicating the shape of the device D included in the region of interest R satisfies a predetermined condition, the notification unit 113 notifies the user of the image processing apparatus 1 of that fact. Specifically, the notification unit 113 notifies the user based on a feature amount indicating the curvature of the device D included in the region of interest R or the "deflection" of the device D included in the region of interest.
- When the user tries to advance the device D while its tip is caught on the blood vessel wall or the like, the tip of the device D bends. In general, when the tip of the device D is bent, elastic energy is accumulated in that portion. This elastic energy accumulates more as the tip portion of the device D bends further, that is, as the curvature of the device D increases (its radius of curvature decreases). When the amount of stored elastic energy increases, the elastic force of the device D may release the caught tip, and the tip may move at high speed. As a result, the tip of the device D may suddenly disappear from the X-ray image.
- Therefore, the notification unit 113 notifies the user on the condition that the curvature of the device D included in the region of interest exceeds a predetermined threshold curvature, or that the tip does not move even though the curvature changes. An additional condition may be that the tip of the device D is immobile or that its movement distance is below a certain threshold.
- the "predetermined threshold curvature” is a "reference curvature for notification determination” provided for the notification unit 113 to determine whether or not there is a high probability that the tip of the device D will move at high speed.
- a specific value of the predetermined threshold curvature may be determined through experiments in consideration of the frequency of notification by the notification unit 113, usability, the material and size of the device D, the elastic modulus, and the like.
- FIGS. 9(a) and 9(b) are diagrams for explaining the conditions regarding the shape of the device D set for the region of interest R; specifically, they are diagrams for explaining notification based on the curvature of the device D.
- FIG. 9(a) the region of interest R is indicated by a dashed circle.
- the region of interest R in FIG. 9(a) is a circle centered at the tip of device D and having a radius of 1 centimeter.
- FIG. 9(b) is a histogram showing the curvature distribution of the device D included in the region of interest R. Specifically, the device D included in the region of interest R is divided into a plurality of small regions, the radius of curvature of the device D in each small region is obtained, and the distribution of those radii of curvature is shown in FIG. 9(b).
- The notification unit 113 notifies the user when, in the histogram indicating the curvature distribution of the device D, a feature amount calculated from the curvature distribution (for example, a statistic such as the mean, mode, or median of the curvature) exceeds the predetermined threshold curvature, or when the tip does not move even though the curvature is changing. Thereby, the image processing apparatus 1 can provide an opportunity for the user to notice that the device D is in a state where elastic force is accumulated.
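A minimal sketch of the curvature-based condition, assuming the device is available as a 2-D polyline (the three-point circumcircle estimate and the median statistic are illustrative choices of this edit, not the patent's method):

```python
# Hedged sketch: split the device polyline in the ROI into small segments,
# estimate the curvature of each from three consecutive points (curvature of
# the circumscribed circle, k = 4*Area / (a*b*c)), and compare a summary
# statistic to a threshold.
import math
from statistics import median

def curvature_3pt(p, q, r):
    """Curvature (1/radius) of the circle through three 2-D points."""
    a, b, c = math.dist(q, r), math.dist(p, r), math.dist(p, q)
    area2 = abs((q[0]-p[0])*(r[1]-p[1]) - (r[0]-p[0])*(q[1]-p[1]))  # 2x triangle area
    if area2 == 0:
        return 0.0  # collinear points: straight segment
    return 2.0 * area2 / (a * b * c)

def curvature_exceeds(points, threshold):
    curvatures = [curvature_3pt(points[i-1], points[i], points[i+1])
                  for i in range(1, len(points) - 1)]
    return median(curvatures) > threshold

straight = [(i, 0) for i in range(5)]
bent = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]  # zig-zag: high curvature
print(curvature_exceeds(straight, 0.5))  # False
print(curvature_exceeds(bent, 0.5))      # True
```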
- FIG. 9(c) is a schematic diagram showing the relationship between the length of the device D included in the region of interest R and the length of the centerline of the blood vessel V included in the region of interest R.
- Although the device D and the blood vessel V are actually curved inside the body of the subject P, for convenience of explanation they are displayed as straight lines in FIG. 9(c).
- the center line of the blood vessel V is indicated by a dashed line.
- The distance measurement unit 115 extracts the blood vessel V in the region of interest R using the blood vessel recognition engine, traces its center line, and obtains the length D1. Similarly, the distance measurement unit 115 extracts the device D using a device recognition engine generated by a known machine learning method or the like, and obtains its length D2.
- The device D advances along the wall of the blood vessel V while meandering. Therefore, the length D2 of the device D in the blood vessel V becomes longer than the length D1 of the blood vessel V (the length of the center line of the blood vessel V), which means that the device D is bent inside the blood vessel V and elastic energy is stored in the device D. If the amount of deflection increases, the elastic energy may be released for some reason and the device D may move significantly. As a result, the tip of the device D may disappear from the X-ray image.
- The differential length B, which is the length obtained by subtracting the length D1 from the length D2 calculated by the distance measurement unit 115, can therefore be an index indicating the amount of deflection of the device D in the blood vessel V. The notification unit 113 displays this length B and notifies the user when it exceeds a predetermined threshold length. Thereby, the image processing apparatus 1 can provide an opportunity for the user to notice that the device D is in a state where elastic force is accumulated.
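The deflection index B = D2 - D1 can be sketched as follows (polyline representations of the device and the vessel center line are assumed; names are illustrative):

```python
# Hedged sketch: the device length D2 in the ROI minus the vessel centerline
# length D1, both measured as piecewise-linear curves in pixels.
import math

def polyline_length(points):
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def deflection_index(device_points, centerline_points):
    """B = D2 - D1: positive when the device meanders inside the vessel."""
    return polyline_length(device_points) - polyline_length(centerline_points)

centerline = [(0, 0), (100, 0)]                             # D1 = 100
device = [(0, 0), (25, 10), (50, 0), (75, 10), (100, 0)]    # meandering: D2 > 100
b = deflection_index(device, centerline)
print(b > 0)  # True: the device is slack inside the vessel
```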
- The aneurysm embolization catheter has a tip marker (1st marker) and a 2nd marker (usually 3 cm from the tip). It is important for the operator to know where the tip of the catheter is within the aneurysm in order to perform the operation safely; when the tip position is unknown, the procedure is less safe (see FIG. 17). For example, if the tip marker moves deep into the aneurysm, the tip or the coil coming out of that tip may perforate the aneurysm, leading to the serious complication of subarachnoid hemorrhage.
- Conversely, if the tip moves back too far, the catheter or coil will protrude outside the aneurysm and must be reinserted into the aneurysm, which poses the risk of perforating the aneurysm wall.
- In FIG. 17, the diagram on the left shows the state in which the microcatheter is inserted into the aneurysm.
- the aneurysm size can be, for example, 10 mm.
- The distance between the 1st marker and the 2nd marker attached to the microcatheter is constant (usually 30 mm). For example, suppose that the positions of the 1st marker and the 2nd marker are each recorded in this state (stored positions).
- the central figure in FIG. 17 shows a state in which the position of the microcatheter has moved. When the coil enters the aneurysm, the position of the 1st marker becomes invisible or difficult to see.
- the position of the 1st marker is estimated from the difference between the position of the 2nd marker at this time and the previously stored position of the 2nd marker.
- the 1st marker is predicted to be approximately 0 mm from the aneurysm neck (predicted position A).
- the figure on the right side also shows a state in which the position of the microcatheter has moved.
- the position of the 1st marker is estimated based on the movement distance of the 2nd marker. In this case, it is estimated to be 8 mm from the aneurysm neck.
- Some aspects of the present invention relate to a method of estimating the tip position from the movement of the 2nd marker when the position of the catheter tip is unknown because it is inside the coil mass (aneurysm). More specifically, the method for estimating a catheter tip position comprises: storing the positional relationship (for example, 3 cm apart) between the tip of the catheter and the 2nd marker; storing the distance a between the aneurysm neck line and the 1st marker, and the position of the 2nd marker, at time t1; calculating the movement distance b from the position of the 2nd marker at time t2; estimating the distance a - b between the tip of the catheter and the aneurysm neck line; and notifying the user of the estimated distance.
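The estimation steps above can be sketched as follows (arc-length coordinates along the catheter path and the sign convention for b are assumptions made for illustration):

```python
# Hedged sketch of the a - b tip estimation. Positions are arc-length
# coordinates (mm) along the catheter path; variable names are illustrative.

def estimate_tip_distance(a_mm, marker2_pos_t1, marker2_pos_t2):
    """Estimate the catheter-tip distance from the aneurysm neck line.

    a_mm: stored distance a between the neck line and the 1st (tip) marker at t1.
    marker2_pos_t1 / marker2_pos_t2: 2nd-marker positions at times t1 and t2.
    b is taken as positive when the catheter is pulled back; because the
    1st-to-2nd marker spacing (e.g., 30 mm) is fixed, the tip moves by the
    same amount, giving the estimate a - b.
    """
    b = marker2_pos_t1 - marker2_pos_t2  # positive = pulled back
    return a_mm - b

# Example: a = 8 mm stored at t1; the 2nd marker is then pulled back 8 mm,
# so the tip is estimated to sit right at the neck line (0 mm).
print(estimate_tip_distance(8.0, 100.0, 92.0))   # 0.0
print(estimate_tip_distance(8.0, 100.0, 100.0))  # 8.0: no movement
```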
- the tip of the catheter, the second marker, and the neckline of the aneurysm can be automatically detected by computer image recognition. Alternatively, it may be specified manually using a pointing device such as a mouse or a touch panel.
- The estimated distance a - b can be displayed on the display device. Further, the position of the catheter tip estimated based on the distance a - b may be displayed on the display device. An arbitrary threshold can be set so that a notification is issued when the estimate deviates from it; alternatively, not only the distance but also the speed and acceleration may be calculated and used as the basis for notification (rapid or large movements indicate a high possibility of danger).
- This movement distance can be a straight-line distance or a distance along the curve of the catheter. Since the curve shape of the catheter may vary, the length along the curve may be used.
- The distances a, b, and a - b can also be treated as probability distributions. For example, if the positions are stored at multiple timings, a distribution of the distance a can be created; the location can then be estimated using statistics such as the mean or variance, predicted as the most likely point, or displayed as a probability distribution in the form of a heat map or the like. Accordingly, the estimated distance may be represented by a probability distribution, and the estimated position of the tip of the catheter may be colored like a heat map and displayed based on that distribution.
- The position of the 2nd marker may be specified by the operator or recognized by the computer, and may be indicated on the display device by a translucent superimposed display or an arrow. From this fixed display, the operator or assistant can visually recognize whether the tip marker has been advanced or pulled back by seeing how far the current 2nd marker has deviated from it.
- Such an image processing device includes, for example: a positional relationship storage unit that stores the positional relationship (for example, 3 cm apart) between the tip of the catheter and the 2nd marker; a position storage unit that stores the distance a between the neck line of the aneurysm and the 1st marker, and the position of the 2nd marker, at time t1; a distance estimation unit that calculates the movement distance b from the position of the 2nd marker at time t2 and estimates the distance a - b between the tip of the catheter and the aneurysm neck line; and a notification unit that notifies the user of the estimated distance.
- The present invention also relates to a system for assisting cerebral aneurysm coil embolization, comprising the above-described image processing device and an imaging device that captures an X-ray image of a patient (subject P) in a state in which a guiding catheter and a delivery wire for an embolization coil are inserted into a blood vessel and transmits the image to the image processing device.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- FIG. 22 is a flowchart for explaining the flow of a catheter tip position estimation process using a second marker executed by the image processing apparatus according to the embodiment.
- the processing in this flow chart is started, for example, when the image processing device 1 is activated, or when the user or the image processing device 1 determines that it is necessary to start the processing.
- The aneurysm neck line is automatically determined or specified by the user (operator) (S104). Then, the distance A between the aneurysm neck line and the 1st marker (usually within the aneurysm) is calculated as a straight-line or along-the-curve distance (S106).
- This movement distance may have directionality (plus/minus).
- For the tip of the guide wire, it is desirable to obtain detailed information on the movement of the tip. For example, rapid large movements increase the risk of vessel perforation.
- It is also desirable to have information such as the position and elapsed time at the moment the catheter exits the image, and deviations from the optimum point.
- the image processing device has a function (that is, a replay function) that allows the operator to look back when necessary by storing video and recognition information and providing information as necessary.
- FIG. 12 shows a configuration in which the image processing apparatus further comprises a video recording unit (which can be integrated with the storage unit 10) that saves the images obtained from the image acquisition unit 110, and a video extraction unit 118 that extracts part of the video before and after the time a notification is generated.
- FIG. 13 shows an example in which the replay video when the notification occurs is cropped around the region of interest and displayed enlarged.
- FIG. 14 shows an example of displaying the (enlarged) replay window on the real-time display screen. In this example, the replay is magnified and displayed as far as possible from the region of interest, making it possible to determine, within the limited space of one screen, what happened in the area where the warning was issued.
- the image processing device further includes an image storage unit 117 that temporally (continuously) saves images (including moving images) obtained from the image acquisition unit.
- the image processing apparatus further includes a video extraction unit 118 that extracts video from the video storage unit 117 for a certain period of time before and after the notification is issued by the notification unit.
- the video extraction period and playback speed may be automatically determined based on at least one of the moving distance, moving speed, and acceleration of the region of interest when the notification occurs.
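One possible mapping from the ROI motion at notification time to an extraction period and playback speed is sketched below (the mapping, constants, and names are entirely illustrative assumptions; the patent only states that such a determination may be automatic):

```python
# Hedged sketch: derive the replay extraction period and playback speed from
# the ROI's movement distance, speed, and acceleration at notification time.
# Larger/faster motion -> longer clip and slower playback (an assumed policy).

def replay_parameters(move_dist_px, speed_px_s, accel_px_s2):
    severity = max(move_dist_px / 50.0, speed_px_s / 100.0, accel_px_s2 / 200.0)
    period_s = min(10.0, 2.0 + 2.0 * severity)   # seconds before/after the notification
    playback = 0.5 if severity > 1.0 else 1.0    # slow-play severe events
    return period_s, playback

period, speed = replay_parameters(move_dist_px=120, speed_px_s=40, accel_px_s2=10)
print(period, speed)  # large movement: longer clip, slow playback
```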
- Information obtained by the region-of-interest acquisition unit, tracking unit, notification unit, marker detection unit, and/or distance measurement unit can also be used for extraction.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing device can display the extracted video on the display device.
- the extracted video may be automatically displayed repeatedly a predetermined number of times.
- the extracted video may be displayed based on arbitrary operations including play, stop, fast-forward, rewind, frame-by-frame, slow play, and double speed play. This allows the user to easily check the video.
- The display device may further display, superimposed on the extracted video, the elapsed time from when the notification was generated, a comparison of the position of the region of interest between the notification time and after an arbitrary elapsed time (the comparative display being, for example, the difference of the detected positions of the relevant region, or alignment of the images themselves), or the trajectory of the region of interest acquired by the tracking unit.
- the image processing device can cut out and display a partial region in the vicinity of the region of interest from the extracted video.
- the extracted image can be displayed at a position that does not interfere with the display of the region of interest.
- the extracted video may be enlarged and displayed.
- the image processing device can display the extracted video at the same time as the notification is generated or after a predetermined period of time has passed since the notification was generated. In some aspects of the present invention, the image processing device can simultaneously display images shot from multiple directions.
- the above replay display may be used at times other than when the notification occurs.
- the image extracting unit 118 may extract from the image storage unit 117 not only the image of a certain period before and after the notification is issued by the notification unit, but also the image of an arbitrary time or period. For example, when the user (surgeon) feels it necessary, by specifying an arbitrary region of interest, the previous scene can be viewed in replay display. This allows the user to grasp and compare what happened in the region of interest while watching the display in real time.
- Some aspects of the present invention relate to programs for executing the above methods on a computer. Some aspects of the invention also relate to an image processing apparatus and method of operation for performing the above method.
- FIG. 23 is a flowchart for explaining the processing flow of the replay function executed by the image processing device according to the embodiment. The processing in this flowchart starts, for example, when the image processing apparatus 1 is activated.
- the image acquisition unit 110, the region-of-interest acquisition unit 111, and possibly the tracking unit 112 function to detect and track the inspection or treatment device D in the blood vessel V by video analysis (S202).
- it is determined whether a notification condition (e.g., the moved distance exceeds a threshold value) is satisfied (S204).
- Replay videos before and after satisfying the conditions are displayed (S206).
- the real-time image may be displayed normally, and the replay video may be displayed several times so as not to overlap the notification condition area, or may be repeatedly displayed for as long as the user (operator/assistant) desires (S208).
- when the repetition ends, the replay video screen is closed. After the end, the processing in this flowchart may be started again.
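The replay flow above (S202-S208) can be sketched with a small ring buffer of recent frames; all names here (ReplayBuffer, notification_condition) and the frame counts are illustrative assumptions, not part of the disclosure.

```python
from collections import deque

class ReplayBuffer:
    """Ring buffer holding the most recent frames so that video from a
    period before the notification can be replayed (S206)."""
    def __init__(self, max_frames):
        self.frames = deque(maxlen=max_frames)  # old frames drop out automatically

    def push(self, frame):
        self.frames.append(frame)

    def extract(self):
        # Frames around the moment the notification condition was met.
        return list(self.frames)

def notification_condition(moved_distance, threshold=20.0):
    # S204: e.g. the tracked region of interest moved more than a threshold.
    return moved_distance > threshold

buf = ReplayBuffer(max_frames=120)   # roughly 4 s at 30 fps
for t in range(150):
    buf.push(f"frame{t}")
replay = buf.extract()               # only the last 120 frames remain
```

A real system would store decoded frames (e.g. NumPy arrays) instead of strings, but the buffering logic is the same.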
- when the region of interest, such as the tip of the guide wire or the tip of the guiding catheter, moves out of the frame, the degree of risk varies depending on the amount of movement. If the tip of the guide wire protrudes only slightly outside the frame (e.g., within 5 mm), the possibility of vascular perforation is low, but if it protrudes significantly (e.g., 20 mm or more), the risk of vascular perforation is high.
- some aspects of the present invention relate to apparatus for estimating and displaying the state of position, velocity, and acceleration of a framed-out region of interest.
- Such estimation of the position of an out-of-frame region of interest is performed, for example, by an image processing apparatus comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions including at least part of the included device as regions of interest; a tracking unit that tracks each of the regions of interest in the image; a notification unit that notifies a user of the image processing apparatus when a condition defined for each region of interest is satisfied, the notification unit notifying the user on the condition that the region of interest disappears from the image when a region including the tip of a catheter or the tip of a guide wire is set as the region of interest; and a state estimation unit 119 that estimates the current position and/or velocity of the catheter tip or guide wire tip that has disappeared from the image, based on the position, velocity, and/or acceleration of the tip immediately before the region of interest disappeared from the image (Fig. 15).
- the current position and/or velocity of the region of interest estimated by the state estimator 119 can be displayed on the display device.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the state estimation unit 119 chronologically stores the outputs from the region-of-interest acquisition unit, the marker detection unit, and the tracking unit before the frame-out, and can use the stored outputs to compute the state of the region of interest, such as its position, velocity, and acceleration. Then, when the region of interest is out of the frame and tracking on the screen becomes impossible, the position, speed, and so on of the region of interest are estimated from the state before the frame-out and notified to the user.
- estimation methods at that time include, but are not limited to, learning-based methods using deep learning (CNN, RNN, WaveNet, etc.) and Bayesian estimation methods (Kalman filter, extended Kalman filter, ensemble Kalman filter, particle filter, etc.).
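As one of the Bayesian methods mentioned above, a constant-velocity Kalman filter can keep predicting the tip position once measurements stop at frame-out. The following is a minimal sketch under assumed noise parameters and matrix choices; it is not the disclosed implementation.

```python
import numpy as np

class CVKalman:
    """Constant-velocity Kalman filter over state [px, py, vx, vy]."""
    def __init__(self, x0, v0, dt=1.0):
        self.x = np.array([x0[0], x0[1], v0[0], v0[1]], float)
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt          # position += velocity * dt
        self.Q = 0.01 * np.eye(4)                 # process noise (assumed)
        self.H = np.eye(2, 4)                     # we observe position only
        self.R = 1.0 * np.eye(2)                  # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2], self.x[2:]             # estimated position, velocity

    def update(self, z):
        # Standard Kalman update with a position measurement z.
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

kf = CVKalman(x0=(100.0, 50.0), v0=(2.0, 0.0))
# While the tip is visible: predict + update. After frame-out: predict only.
for _ in range(5):
    pos, vel = kf.predict()                       # tip has left the image
```

With velocity (2, 0), five prediction-only steps carry the estimate from x = 100 to x = 110 while the tip is outside the frame.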
- some aspects of the present invention also relate to a device that calculates the degree of risk from the estimated state and issues a notification to the user according to the degree of risk.
- the position, velocity, and degree of risk of the region of interest estimated by state estimation section 119 can be displayed on the display device.
- the display method can be display by points, arrows, heat maps, etc., but is not limited to these. For example, if the estimated position of the region of interest is separated from the edge of the image by a predetermined distance or more, it can be determined that the risk is high (FIG. 16). For example, in FIG. 16, the estimated positions of the regions of interest are indicated by circles, and it is determined that the farther the position is estimated from the edge of the screen, the higher the risk.
- the degree of risk may be indicated by the color of the circle (for example, green indicates low risk and red indicates high risk).
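The distance-based risk grading described above might be sketched as follows; the thresholds and color names are assumptions for illustration (cf. FIG. 16).

```python
def distance_outside(pos, width, height):
    """Distance (pixels) of pos from the image boundary; 0 if inside."""
    px, py = pos
    dx = max(0 - px, px - width, 0)   # horizontal overshoot beyond either edge
    dy = max(0 - py, py - height, 0)  # vertical overshoot beyond either edge
    return (dx ** 2 + dy ** 2) ** 0.5

def risk_color(dist, low=10.0, high=40.0):
    """Map distance beyond the edge to a display color (green = low risk)."""
    if dist <= 0:
        return "none"
    return "green" if dist < low else ("yellow" if dist < high else "red")
```

A display layer would then draw the estimated position as a circle in the returned color, as in the green/red scheme mentioned above.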
- an alert can be displayed on the screen of the display device.
- an audio notification may be provided.
- one aspect of the present invention is an image processing apparatus comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions including at least a part of the device as regions of interest; a notification unit that notifies the user of the apparatus, the notification unit notifying the user on the condition that the region of interest disappears from the image when a region including the tip of a catheter or the tip of a guide wire is set as the region of interest; and a state estimation unit 119 that estimates the current position and/or velocity of the catheter tip or guide wire tip that has disappeared from the image, based on the position, velocity, and/or acceleration immediately before the region of interest disappeared from the image. The invention also relates to an image processing apparatus in which the user is alerted when the current position and/or velocity of the region of interest estimated by the state estimation unit 119 exceeds a predetermined threshold. This image processing apparatus may further include a tracking unit that tracks each of the regions of interest in the image. Note that in some aspects of the present invention, an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- Some aspects of the present invention relate to programs for executing the above methods on a computer. Some aspects of the present invention also relate to an image processing apparatus and method of operation for performing the above method.
- FIG. 24 is a flowchart for explaining the flow of a process of estimating the position of an out-of-frame region of interest performed by the image processing apparatus according to the embodiment. The processing in this flowchart starts, for example, when the image processing apparatus 1 is activated.
- the image acquisition unit 110, the region-of-interest acquisition unit 111, and possibly the tracking unit 112 function to detect and track the inspection or treatment device D in the blood vessel V by video analysis (S302).
- the position of the device outside the frame is estimated (S306).
- the position of device D outside the frame is estimated from the visible portions connected to it within the screen, such as the distal end of the radiopaque (black) part of the guide wire, filters, balloons, and the like. Differences in distance, machine learning, or the like can be used for the estimation.
- display the estimated location of the device (S308).
- the estimated position of device D that has moved out of the frame is displayed on the display screen, either as the estimated position itself or as the distance from the frame. The further the device is out of the frame, the stronger the display or notification may be.
- the results of image analysis can be displayed on two screens of different sizes.
- surgery is generally performed while viewing multiple screens (for example, 4 screens), and the operator has at least 2 screens (generally front (AP or F direction: Anterior-Posterior, Frontal) and side (RL direction or LAT direction: Right to Left, Lateral)) to grasp three-dimensional information. Therefore, it is important to display front and side images, and it is very important to display them easily on a monitor with a physically limited size.
- the monitor is often positioned over the patient's bed at a distance of 1 m or more, and the operator often asks for the monitor to be brought even 1 cm closer.
- the angle between the front and side views is adjusted in three dimensions so that, even when the 3D anatomy is projected onto 2D, each projection is viewed at an angle at which the anatomy can be seen clearly. Furthermore, in the case of 4 screens, they are Live and Mask on the front and Live and Mask on the side. Live is a normal fluoroscopy image, similar to a typical X-ray, viewed in real time. Mask takes the difference (subtraction) with a past Live image selected by the operator. As a result, the bones that were visible in Live disappear and, for example, only the blood vessels and devices visualized with contrast agent remain, providing an image that is easy for the operator to understand.
- the two screens of different sizes can be switched automatically or by the user's (operator's) selection (see Fig. 18). Since there is a physical limit to screen size, and the operator may want to see one screen larger than the other, visibility can be improved by displaying the two screens at different sizes, as shown in Fig. 18.
- the display device may have a function of drawing the user's attention by lighting, changing the color of, or highlighting the frame portion of one of the two screens.
- the region of interest in the screen is output as a probability distribution, so it is possible to express the existence of the region of interest in terms of probability.
- the probability distribution of the existing area may be displayed using numerical values, colors, bars, or the like.
- the probability distribution may be converted and displayed for easy understanding.
- 0% to 30% may be displayed as low, 30% to 70% as middle, and 70% to 100% as high, or may be displayed in three colors.
- the area below 70% may be slightly darkened so that the area of interest or the area to be notified appears bright like a spotlight.
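The binning of probabilities into low/middle/high and the spotlight-style dimming can be sketched as below; the 30%/70% bin edges come from the text, while the dimming factor is an assumption.

```python
def probability_label(p):
    """Convert a detection probability (0..1) to the low/middle/high label."""
    if p < 0.30:
        return "low"
    if p < 0.70:
        return "middle"
    return "high"

def spotlight_alpha(p, dim=0.6):
    """Darken regions below 70% so confident regions appear spotlit."""
    return 1.0 if p >= 0.70 else dim
```

A renderer could also map the label to three colors, as the text suggests, instead of showing the raw percentage.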
- the notification unit can display, on the display device that displays the image, a numerical value, color, bar, or heat map according to the probability that at least one of the regions of interest satisfies a condition defined for each region of interest, or according to a value obtained by arbitrarily converting the probability distribution.
- the notification unit may color the region of interest on the display device using a color or heat map according to the probability that at least one of the regions of interest satisfies a condition defined for each region of interest, or based on a value obtained by arbitrarily converting the probability distribution; alternatively, the probability of satisfying the condition may be replaced with a numerical value or color and displayed on the display device.
- Device selection display/recording: There are various types of devices used in endovascular treatment, and many variants of each. Examples of devices include various catheters, balloon catheters, guide wires, stents, flow diverter stents (fine-mesh stents), coils, embolic agents (liquids, particles, and other substances), and other embolic devices (such as the WEB); each device comes in many variants. For catheters, for example, there are specifications for tip shape, length, lumen, outer diameter, hardness, and the like. There are hundreds of types of coils, with specifications for manufacturer, thickness, length, diameter, hardness, and so on.
- the coil lineup (length, diameter, hardness, shape, etc.) is selected according to the size of the aneurysm and the behavior of the previously inserted coil.
- a proposal may be made by image analysis, according to the aneurysm and the winding behavior of the coil.
- the operator selects a desired device from such a list and performs surgery. By recording this information and combining it with snapshots of treatment videos, a surgical record can be automatically created.
- the display device that displays images can display a product list of intravascular examination or treatment devices (for example, various catheters and coils) (see FIG. 20).
- the display may also display product listings filtered by size or inventory.
- the display device may display a list of recommended products based on image analysis results, facility information, or user preference information.
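A filtered recommendation list as described above might look like the following sketch; the product records, field names, and filter criteria are illustrative assumptions.

```python
# Toy product list; a real system would load this from the storage unit.
COILS = [
    {"name": "coil A", "diameter_mm": 2.0, "length_cm": 4.0, "in_stock": True},
    {"name": "coil B", "diameter_mm": 3.0, "length_cm": 6.0, "in_stock": False},
    {"name": "coil C", "diameter_mm": 2.5, "length_cm": 4.0, "in_stock": True},
]

def recommend(products, max_diameter_mm, in_stock_only=True):
    """Filter by size (e.g. from aneurysm measurements) and inventory."""
    hits = [p for p in products
            if p["diameter_mm"] <= max_diameter_mm
            and (p["in_stock"] or not in_stock_only)]
    return sorted(hits, key=lambda p: p["diameter_mm"])
```

In practice the size limit would come from the image analysis results (e.g. the measured aneurysm diameter) and the preference ordering from facility or user information.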
- the image processing apparatus can create, automatically or based on user selection, a surgical record including information on the devices used, information on the images acquired, and the results of the image analysis.
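Assembling such a surgical record could be sketched as below; the record fields and example values are purely illustrative assumptions.

```python
def build_surgical_record(devices, snapshots, analysis):
    """Combine device info, video snapshots, and analysis results
    into one record structure for the automatic surgical record."""
    return {"devices": list(devices),
            "snapshots": list(snapshots),
            "analysis": dict(analysis)}

record = build_surgical_record(
    ["microcatheter A", "coil 2mm x 4cm"],        # devices selected by the operator
    ["snap_t120.png"],                            # snapshots of the treatment video
    {"aneurysm_size_mm": 5.2},                    # image analysis results
)
```

Serializing this record (e.g. to JSON) would give the automatically created surgical record mentioned in the text.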
- one aspect of the present invention is an endovascular surgery support system comprising: a storage unit storing a product list of intravascular examination or treatment devices (for example, various catheters and coils); a recommendation unit that recommends products to use based on image analysis results, facility information, or user preference information; and a display unit that displays the recommended products.
- the image processing apparatus can notify the user (operator) on the condition that a device newly appearing in the image, for example at an edge, within a certain distance from an edge, or within a particular region of the image, is newly detected as a region of interest (Fig. 25). More specifically, some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image for intravascular inspection or treatment; a region-of-interest acquisition unit that, when the image includes an intravascular inspection or treatment device, acquires one or more regions including at least part of the device as regions of interest; and a notification unit that notifies the user when at least one of the regions of interest satisfies a condition defined for each region of interest, the notification unit notifying the user on the condition that a newly appearing device is newly detected as a region of interest. This can reduce the burden on the operator when performing endovascular treatment.
- the device appears on a maximum of 4 screens: the front live image, the side live image, the front mask image, and the side mask image.
- the timing at which the device enters from the bottom of the screen often differs between the front image and the side image. Therefore, notification may be given when the device becomes visible on one of the screens, after which no notification need be given even if the same device appears on another screen.
- the front or side live image and the corresponding mask image are often the same size, but the mask image may be magnified. In this case, since the device first appears in the live image, the live image may be used for notification; after that, even if the device appears in the mask image, no notification need be performed.
- the edge may be changed by narrowing (collimating) the X-ray field; for example, pressing a button may automatically set the boundary to the X-ray irradiation range (Fig. 41). If the scene changes temporarily due to imaging or the like, this may be detected and the notification disabled. Notification may also be suppressed according to image quality. For example, if the mask image is out of alignment and difficult to see, or if the patient's body motion causes the entire screen to move, the notification may be temporarily suppressed.
- the image processing device may further include a tracking unit that tracks the region of interest in the image.
- the notification unit can notify the user on the condition that a device newly appearing at an edge of the image, within a certain distance from the edge, or within a specific area is newly detected as the region of interest.
- "at an edge of the image or within a certain distance from the edge" refers, for example, to the area within 10%, 8%, 5%, or 3% of the total length in the vertical or horizontal direction, as measured from the edge of the image, but is not limited to this.
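The edge-margin condition quoted above (e.g. within 10% of the image dimension from the edge) can be sketched as a simple band test; the function and parameter names are assumptions.

```python
def in_edge_band(pos, width, height, frac=0.10):
    """True if pos lies within a margin of `frac` of the width/height
    from any edge of the image (the 10%/8%/5%/3% ranges in the text)."""
    px, py = pos
    mx, my = width * frac, height * frac
    return (px < mx or px > width - mx or
            py < my or py > height - my)
```

A newly detected device whose position passes this test would trigger the "newly appearing at the edge" notification.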
- the specific area may be designated by the user or automatically designated by the image processing apparatus.
- the designation by the user can be performed using a pointing device such as a mouse or a touch panel.
- the specific area may be an area automatically determined by the image processing apparatus as an area to be confirmed. Areas to be confirmed include, for example: whether the tip of the guide wire is within a certain range; whether the tip of the guiding catheter is within a certain range; whether the tip of the stent delivery wire is within a certain range; whether the coil marker is within a certain range; whether the 2nd marker is within a certain range; the range in which the catheter's 2nd marker moves, used to judge when the coil can be cut; whether the filter is within a certain range; whether the catheter tip is within a certain range; whether the embolic material is within a certain range; and whether a device remains within large vessels, aneurysms, or critical vessels.
- the automatic determination of the area to be checked can be made based on a model created by machine learning, but is not limited to this.
- the area may be specified manually by the user, or may be specified automatically or semi-automatically.
- for example, when the 2nd marker of the catheter is recognized and the coil enters the aneurysm or the coil marker is detected, the area surrounding the 2nd marker is automatically set or proposed, and the region is defined when the user approves it (Fig. 35).
- alternatively, for example, at the tip of the guide wire, a rectangular area of constant a × constant b is automatically specified so that the guide wire tip is centered.
- the constants a and b may be changed according to the case, the magnification ratio, and the operator's preference.
- the area is not limited to a rectangle, and may be a circle, an ellipse, an arbitrary closed curve, a straight line, or an arbitrary curve. In the case of a straight line or an arbitrary curve, the area is considered to be divided along it; a straight line can be extended to divide the screen into two areas. The area may also be divided along the blood vessel by a straight line or an arbitrary curve; that is, along the blood vessel, regions may be divided according to whether they are distal or proximal, rather than by regions drawn with straight or curved lines. These are specific examples, and the area is not limited to them.
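Dividing the screen by a straight line, as described above, reduces to a side-of-line test between two regions; this is a standard geometric sketch, with illustrative names.

```python
def side_of_line(p, a, b):
    """Sign of the cross product: > 0 if p is left of line a->b,
    < 0 if right, 0 if p lies on the line."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def same_region(p, q, a, b):
    """True if points p and q fall on the same side of the dividing line."""
    return side_of_line(p, a, b) * side_of_line(q, a, b) > 0
```

A notification could then fire when the tracked tip and a reference point stop being in the same region, i.e. when the tip crosses to the other side of the dividing line.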
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit. Specifically, for example, the output unit may output to a catheter surgery robot and notify the operator through the robot, automatically stop advancing the device once, or automatically feed an inserted device until it appears on the screen and then, when the output unit reports that the device has appeared, stop the feeding or reduce the speed.
- AR/VR/MR: as another example, if the apparatus is connected to AR/VR/MR or the like and receives the output indicating the appearance of the device from the above output unit, it may display or notify it on the AR/VR/MR device. Of course, the output is not limited to this example and includes output to any other device.
- the image processing apparatus can notify the user (operator) on the condition that a device newly appearing in a specific region in the image is newly detected as the region of interest (Fig. 37). More specifically, some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image for inspection or treatment in a blood vessel; a region-of-interest acquisition unit that, when the image includes an intravascular inspection or treatment device, acquires one or more regions including at least part of the device as regions of interest; and a notification unit that notifies the user of the apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest, the notification unit notifying the user on the condition that a newly appearing device, or part thereof, in the image or in a specific area is newly detected as a region of interest. This can reduce the burden on the operator when performing endovascular treatment.
- the image processing device may further include a tracking unit that tracks the region of interest in the image.
- the notification unit can notify the user on the condition that a device newly appearing at an edge of the image, within a certain distance from the edge, or within a specific area is newly detected as the region of interest.
- the specific area may be specified by the user or automatically by the image processing apparatus. Designation by the user can be performed using a pointing device such as a mouse or a touch panel. The specific area may also be an area automatically determined by the image processing apparatus as an area to be confirmed. Areas to be confirmed include, for example: whether the tip of the guide wire is within a certain range; whether the tip of the guiding catheter is within a certain range; whether the tip of the stent delivery wire is within a certain range; whether the coil marker is within a certain range; whether the 2nd marker is within a certain range; the range in which the catheter's 2nd marker moves, used to judge when the coil can be cut; and whether the filter is within a certain range.
- the automatic determination of the area to be checked can be made based on a model created by machine learning, but is not limited to this.
- the area may be specified manually by the user, or may be specified automatically or semi-automatically. For example, when the 2nd marker of the catheter is recognized and the coil enters the aneurysm or the coil marker is detected, the area surrounding the 2nd marker is automatically set or proposed, and the region is defined when the user approves it (Fig. 35). Alternatively, for example, at the tip of the guide wire, a rectangular area of constant a × constant b is automatically specified so that the guide wire tip is centered.
- the constants a and b may be changed according to the case, the magnification ratio, and the operator's preference.
- the area is not limited to a rectangle, and may be a circle, an ellipse, an arbitrary closed curve, a straight line, or an arbitrary curve. In the case of a straight line or an arbitrary curve, the area is considered to be divided along it; a straight line can be extended to divide the screen into two areas. The area may also be divided along the blood vessel by a straight line or an arbitrary curve; that is, along the blood vessel, regions may be divided according to whether they are distal or proximal, rather than by regions drawn with straight or curved lines.
- the specific area may be intravascular or extravascular.
- the specific area may be extravascular (Fig. 39).
- a notification is issued when the tip of the guide wire enters the specified area, that is, when it goes outside the blood vessel.
- a guide wire may stray into a small blood vessel (perforator); when looking at only the front or only the side view, the small vessel can be damaged before the operator notices. Early notification therefore helps prevent such damage.
- a blood vessel may also be perforated; even in such cases, early notification can minimize complications.
- Thin blood vessels such as perforators are too thin to be seen in X-ray images.
- automatic segmentation of the blood vessels does not include perforators, so by setting the outside of this segmentation as the specific region, it is possible to notify, for example, when the guide wire enters a perforator. If perforating branches and relatively thin blood vessels (ophthalmic arteries, thin posterior communicating arteries, etc.) are visible in the X-ray image, a threshold can be set and vessels thinner than the threshold excluded from the segmentation, making it possible to notify when the guide wire strays into such a thin vessel.
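Using the vessel segmentation as described above, "outside the mask" can trigger the perforator notification; the following is a toy sketch with a hand-made 0/1 mask, not the disclosed segmentation model.

```python
def outside_vessel(mask, pos):
    """True if the guide-wire tip at pos=(row, col) is outside the
    segmented vasculature. mask is a 2D list of 0/1 (1 = vessel);
    thin perforators are absent from the mask, so entering one
    means leaving the mask."""
    r, c = pos
    if r < 0 or c < 0 or r >= len(mask) or c >= len(mask[0]):
        return True                      # off the image counts as outside
    return mask[r][c] == 0

# Toy 3x3 segmentation: the middle column is the segmented vessel.
vessel = [[0, 1, 0],
          [0, 1, 0],
          [0, 1, 1]]
```

In a real pipeline the mask would come from the blood vessel recognition unit, and the check would run on the tracked tip position every frame.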
- the specific region may be a region containing a lesion such as an aneurysm or stenosis, or a region behind it (inverted region).
- a notification sounds when the guide wire enters the aneurysm. This informs the user that the wire has entered the aneurysm and allows the operator to be cautious in subsequent operations.
- a notification can thus inform the user that the guide wire has entered a dangerous (undesired) location.
- among the front live image, the side live image, the front mask image, and the side mask image, it is possible to enclose all the specific areas automatically, or to select which ones to enclose.
- the tip of the guiding catheter can be seen in either the front image or the side image, so the guiding catheter whose tip is at the higher position of the two screens may be selected and a specific area set there.
- the specific region may be selected only from the front or the side, whichever makes the coil markers more visible.
- regarding the visibility of the coil marker: when detecting, for example, the longer coil marker may be selected, or the coil marker with a higher degree of certainty from deep learning may be selected, but the method is not limited to these.
- Notification may be suppressed according to image quality. For example, if the mask image is out of alignment and difficult to see, or if the patient's body motion causes the entire screen to move, the notification may be temporarily suppressed.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- specifically, for example, the output unit may output to a catheter surgery robot and notify the operator through the robot, automatically stop advancing the device once, or automatically feed an inserted device until it appears on the screen and then, when the output unit reports that the device has appeared, stop the feeding or reduce the speed.
- if the apparatus is connected to AR/VR/MR or the like and receives the output indicating the appearance of the device from the above output unit, it may display or notify it on the AR/VR/MR device.
- when the user sets a specific region, if there is only one device, setting it as the region of interest automatically establishes the relationship between the specific region and the device's region of interest. If multiple devices are included, the user may specify which, or the region of interest of a specific device may be set automatically according to the importance of the device, the position of the specific region on the screen, the surgical scene, and so on. For example, in the case of a guide wire and a guiding catheter, the guide wire may be selected because it is more important. Alternatively, when a guiding catheter and a guide wire enter a specific area at the same time, the guiding catheter may be selected because the guide wire is often advanced further.
- the guiding catheter may be set as the region of interest, or the guiding catheter may be specified as the region of interest from the surgical scene. These are examples, and the selection can be determined by device type, the location of the particular region, the surgical scene, user preferences, or the user's previous methods.
- this image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- the region of interest can be established as a region including at least a portion of an intravascular examination or treatment device selected from the group consisting of guiding catheters, guide wires, intermediate catheters, microcatheters, thrombectomy aspiration catheters, markers, coils, stents, filters, embolic materials, aneurysm embolization devices, and balloons.
- the region including at least part of the device may be the tip of an intravascular examination or treatment device.
- FIG. 40 shows an example in which a notification is issued when the filter enters a specific area (outside the box) in carotid artery stenting. In other words, notification is given when the filter moves by a certain amount.
- an image including at least an intravascular examination or treatment device as a subject is acquired, and one or more regions including at least part of the device included in the image are acquired as regions of interest.
- FIG. 26 is a flowchart for explaining the flow of processing for recognizing a device newly entered on the screen, which is executed by the image processing apparatus according to the embodiment.
- the processing in this flowchart starts, for example, when the image processing apparatus is activated.
- the image acquisition unit and the region-of-interest acquisition unit function to detect an examination or treatment device in the blood vessel through image analysis.
- it is determined whether a notification condition or an output condition is satisfied, namely that a newly appearing device at the edge of the image has been newly detected as a region of interest.
- Some aspects of the present invention are an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires images for intravascular examination or treatment; when an image includes an intravascular examination or treatment device, acquires one or more regions including at least a portion of the device as regions of interest; and, if at least one of the regions of interest satisfies a condition defined for each region of interest, notifies the user of the image processing apparatus of that fact, the notification being performed on the condition that a newly appearing device, or part thereof, in the image or within a specific region is newly detected as a region of interest.
- Similarly, some aspects of the present invention are an image processing apparatus including a storage device and a processor coupled to the storage device, wherein the processor acquires images for intravascular examination or treatment; when the image includes an intravascular examination or treatment device, acquires one or more regions including at least a portion of the device as regions of interest; and, if at least one of the regions of interest satisfies a condition defined for each region of interest, outputs that fact to an external device, the output being performed to the external device on the condition that the newly appearing device or part thereof in the image or within a particular region is newly detected as a region of interest.
- the image processing apparatus can notify the user (operator) on the condition that the region of interest passes through, or is expected to pass through, a specific boundary line specified on the image. More specifically, some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image including at least an intravascular inspection or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions including at least a part of the device as regions of interest; and a notification unit that notifies the user on the condition that a region of interest passes through, or is expected to pass through, a specific boundary line specified on the image.
- This image processing apparatus may further include a tracking unit that tracks each of the regions of interest in the image. Note that in some aspects of the present invention, an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- This image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of the blood vessels in the image.
- In some aspects, a notification or output may be provided when the region of interest enters a vessel whose diameter is below a threshold, or a vessel specified manually or automatically.
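As a non-limiting illustration, such a diameter-based notification could be sketched as follows, assuming a binary vessel segmentation mask is already available; the brute-force diameter estimate (twice the distance from the query pixel to the nearest non-vessel pixel) and the helper names are assumptions for illustration only:

```python
import numpy as np

def local_vessel_diameter(vessel_mask: np.ndarray, y: int, x: int) -> float:
    """Approximate the vessel diameter at (y, x) as twice the distance
    from that pixel to the nearest non-vessel pixel (brute force)."""
    if not vessel_mask[y, x]:
        return 0.0
    bg_ys, bg_xs = np.nonzero(~vessel_mask)
    dists = np.hypot(bg_ys - y, bg_xs - x)
    return 2.0 * float(dists.min())

def should_notify(vessel_mask, roi_center, diameter_threshold):
    """Notify when the region of interest lies in a vessel narrower than
    the threshold (or outside any recognized vessel)."""
    y, x = roi_center
    return local_vessel_diameter(vessel_mask, y, x) < diameter_threshold

# Toy mask: a wide trunk (rows 2-7) narrowing to a one-pixel branch (row 4).
mask = np.zeros((10, 20), dtype=bool)
mask[2:8, 0:10] = True    # wide trunk, about 6 px across
mask[4:5, 10:20] = True   # thin branch, about 1 px across
```

In practice the distance transform of the segmentation mask would be precomputed once per frame rather than scanned per query as above.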
- the specific boundary line may be specified by the user or automatically specified by the image processing device.
- When the user designates the boundary line, various pointing devices such as a touch panel or a mouse can be used.
- the conditions and positions for such designation can be set in advance.
- A particular boundary line can be represented by a straight line, a curved line, a circle, a rectangle, or any polygon or closed curve. For example, if the boundary is a closed curve, "passing through" can mean either entering or leaving the area enclosed by the closed curve.
- Such a boundary line can be used, for example, to detect entry into blood vessels where entry of the device is undesirable. For example, by setting a boundary line at the entrance of a branch leading to a particular blood vessel, entry of the device into that vessel can be detected.
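For a closed boundary, "passing through" could be detected, as a non-limiting sketch, by comparing the inside/outside status of the tracked device point between consecutive frames; the ray-casting point-in-polygon test and the "enter"/"leave" labels below are illustrative assumptions, not the claimed method:

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def crossing_event(prev_pt, cur_pt, boundary_poly):
    """Return 'enter', 'leave', or None for the tracked device point
    moving from prev_pt to cur_pt relative to a closed boundary."""
    was_in = point_in_polygon(prev_pt, boundary_poly)
    is_in = point_in_polygon(cur_pt, boundary_poly)
    if is_in and not was_in:
        return "enter"
    if was_in and not is_in:
        return "leave"
    return None

# A square boundary set, for example, at the entrance of a branch vessel.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
```

Distinguishing "enter" from "leave" here also supports the direction-dependent notifications described later.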
- a specific boundary line may be automatically specified when the region of interest has not moved for a certain period of time.
- a particular boundary line may be automatically designated according to the surgical situation, or may be automatically suggested and allowed to be accepted or rejected by the user.
- a particular boundary line may be automatically canceled depending on the surgical situation, or may be automatically proposed and the user may accept or reject it.
- the specific boundary lines specified on one screen may be automatically specified at corresponding positions on other screens.
- In some aspects, the notification unit notifies the user, for each of the regions of interest, only when the condition is met for the first time. In other words, when device A and device B are set as regions of interest, a notification is made when device A passes the boundary line for the first time and when device B passes the boundary line for the first time. Also, in some aspects, the notification unit notifies the user, for each of the regions of interest, only when the condition is met for the first time after a specific point in time. Note that, in some aspects of the present invention, an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- In some aspects, the notification unit notifies the user only when the condition is satisfied for the first time across the entire set of regions of interest. In other words, if device A and device B are set as regions of interest, a notification occurs only when either device A or device B crosses the boundary line for the first time, and no notification occurs when the other crosses the boundary line. Also, in some aspects, the notification unit notifies the user only when the condition is met for the first time after a specific point in time, across the entire set of regions of interest. Note that, in some aspects of the present invention, an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- In some aspects, the notification unit notifies the user only when the conditions are satisfied for one or more specific regions of interest designated by the user. In other words, if device A and device B are set as regions of interest and the user specifies notification only for device A, a notification is made only when device A crosses the boundary line for the first time, and no notification is made when device B crosses the boundary line.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- In some aspects, the notification unit notifies the user only when the conditions are satisfied for one or more automatically identified regions of interest.
- For example, if the image processing apparatus automatically specifies notification only for device A, a notification is made only when device A passes the boundary line for the first time, and no notification is made when device B passes the boundary line.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the boundary line is superimposed on the X-ray image.
- The boundary line may be redrawn in response to changes in the extent or magnification of the X-ray image. This may be done automatically, or it may be suggested to the user: if the user accepts, the boundary line is redrawn; if the user declines, it remains as it was. For example, when the range of the X-ray image moves up, down, left, or right, or rotates, the boundary line is redrawn so as to move or rotate accordingly. Also, for example, when the range of the X-ray image is enlarged, reduced, or moved, the boundary line is redrawn so as to be enlarged, reduced, or moved accordingly (Fig. 42).
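The redrawing described above can be modeled, as a non-limiting sketch, by applying the same scale, rotation, and translation that the X-ray view underwent to the vertices of the boundary line; the function and parameter names are illustrative assumptions:

```python
import math

def transform_boundary(points, scale=1.0, angle_deg=0.0, shift=(0.0, 0.0),
                       center=(0.0, 0.0)):
    """Redraw a boundary line after the X-ray view is scaled, rotated about
    `center`, and translated, by applying the same transform to its vertices."""
    cx, cy = center
    dx, dy = shift
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for x, y in points:
        x0, y0 = x - cx, y - cy
        xr = scale * (x0 * cos_a - y0 * sin_a) + cx + dx
        yr = scale * (x0 * sin_a + y0 * cos_a) + cy + dy
        out.append((xr, yr))
    return out
```

The same transform could equally be expressed as a single 2x3 affine matrix applied to all vertices at once.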
- certain boundaries are redrawn after interruption and reacquisition of the X-ray image.
- the notification may be performed by generating a warning sound or changing the display style of the boundary line.
- Different notification methods may be used depending on the direction in which the region of interest passes through a specific boundary line designated on the image. For example, if a boundary line is set at the entrance of a branch leading to a particular blood vessel in order to detect entry into a vessel where device entry is undesirable, one notification may be used upon detecting entry of the device into that vessel, and a different notification (such as a different sound or color) may be used when the device is withdrawn from that vessel.
- The direction in which the region of interest passes through a specific boundary line designated on the image may be automatically recognized, or automatically suggested according to the surgical situation, in which case the user may accept or reject the suggestion.
- different notification methods may be used depending on the speed with which the region of interest crosses the boundary.
- The boundary line may be automatically erased when the scene changes, or its erasure may be suggested to the user; if a suggestion is made, the boundary line is erased when the user accepts. For example, if the working angle is changed, the shape of the blood vessel changes on the 2D screen of the X-ray image, so maintaining the boundary line may no longer be meaningful, and the boundary line is therefore erased. Alternatively, if the table is moved so that the screen shifts greatly and the boundary line falls outside the X-ray screen, the boundary line is erased. These are examples; erasure of boundary lines may be suggested for scene changes, high noise, large patient movement, and the like.
- In some aspects, the notification unit causes a display device that displays the image to display the distance between the region of interest and the specific boundary line, and the display mode of the distance on the display device may be changed according to the magnitude of that distance.
- the notification unit notifies the user on the condition that a value obtained by dividing the distance between the region of interest and the specific boundary line by the moving speed of the region of interest in the image is less than a predetermined threshold.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
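The distance-divided-by-speed condition described above could be sketched as follows; using the straight-line distance to a single boundary point is a simplifying assumption (a real implementation would use the distance to the boundary line itself), and the function names are illustrative:

```python
import math

def time_to_boundary(roi_pos, roi_velocity, boundary_pos):
    """Estimated seconds until the region of interest reaches the boundary:
    straight-line distance divided by the current speed (infinite when the
    region of interest is stationary)."""
    distance = math.dist(roi_pos, boundary_pos)
    speed = math.hypot(roi_velocity[0], roi_velocity[1])
    return float("inf") if speed == 0 else distance / speed

def approaching_boundary(roi_pos, roi_velocity, boundary_pos, threshold_s):
    """Notify when the estimated time to the boundary is below the threshold."""
    return time_to_boundary(roi_pos, roi_velocity, boundary_pos) < threshold_s
```

A stationary region of interest yields an infinite estimate, so no notification is triggered, which matches the intent of predicting imminent passage.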
- Some aspects of the present invention relate to a method in which an image including at least an intravascular examination or treatment device as a subject is acquired; one or more regions each including at least part of the device included in the image are acquired as regions of interest; each of the regions of interest is tracked in the image; and the user is notified, or output is made to an external device, on the condition that a region of interest passes, or is expected to pass, a specific boundary line designated on the image; as well as to a program for executing the above method on a computer. Prediction of passage may be based on the distance between the region of interest and the boundary line, or on that distance divided by the speed of the region of interest. Some aspects of the present invention also relate to an image processing apparatus and a method of operation for performing the above method.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least a portion of the device included in the image as regions of interest, tracks each of the regions of interest in the image, and notifies the user on the condition that a region of interest passes, or is predicted to pass, a specific boundary line designated on the image.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least a portion of the device included in the image as regions of interest, tracks each of the regions of interest in the image, and outputs to an external device on the condition that a region of interest passes, or is predicted to pass, a specific boundary line designated on the image.
- the image processing apparatus can obtain, as the region of interest, a region of interest included within a region specified by the user on the image.
- Some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions each including at least a part of the device included in the image as regions of interest; and a notification unit that notifies the user, wherein the apparatus acquires, as the region of interest, a region of interest included in a region specified by the user on the image.
- This image processing apparatus may further include a tracking unit that tracks each of the regions of interest in the image.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- In some aspects, the area specified by the user on the image is represented by a rectangle, a circle, or any closed curve. Candidate areas that can be specified by the user may be displayed automatically. Also, in order to facilitate designation by the user on the screen, the image may temporarily be shown as a still image while the user designates an area. Alternatively, when the user designates an area, the image may be temporarily played back with a delay (slow playback). Furthermore, when the user designates an area, part of the image may be enlarged for display. In some aspects, candidate areas that can be specified by the user may be displayed automatically according to the surgical situation.
- the region of interest to be confirmed may be automatically selected. In other words, even if the designated region includes multiple regions of interest, unimportant regions of interest are excluded from the designation.
- Conditions for determining the region of interest to be confirmed may be set in advance by the user, for example. Alternatively, this condition may be set automatically by machine learning. For example, the region of interest to be confirmed can be determined according to the surgical situation, which can itself be automatically recognized by the image processing apparatus.
- Examples of the regions of interest to be confirmed include: the tip of the guide wire being within a certain range; the tip of the stent delivery wire being within a certain range; the second marker of the catheter (used to determine when the coil can be detached) being within a certain range of movement; the filter being within a certain range; the tip of the catheter being within a certain range; embolic material being within a certain range; and a device remaining inside a large blood vessel, an aneurysm, or a critical vessel.
- Some aspects of the present invention relate to a method in which an image including at least an intravascular examination or treatment device as a subject is acquired; one or more regions each including at least part of the device included in the image are acquired as regions of interest; each of the regions of interest is tracked in the image; and the user of the image processing apparatus is notified when at least one of the regions of interest satisfies a condition defined for that region of interest, wherein a region of interest included in a region specified by the user on the image is acquired as the region of interest; as well as to a program for executing the above method on a computer.
- Some aspects of the present invention also relate to an image processing apparatus and method of operation for performing the above method.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least a portion of the device included in the image as regions of interest, tracks each of the regions of interest in the image, notifies the user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for that region of interest, and acquires, as the region of interest, a region of interest included in a region designated by the user on the image.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least a portion of the device included in the image as regions of interest, tracks each of the regions of interest in the image, and outputs to an external device when at least one of the regions of interest satisfies a condition defined for that region of interest, wherein a region of interest included in a region specified by the user on the image is acquired as the region of interest.
- Some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; and a situation recognition unit that recognizes a specific surgical situation based on the image.
- The specific surgical situation described above can be at least one situation selected from the group consisting of: surgical details; disease name; vascular leakage; vascular perforation; occurrence of thrombus; disappearance of peripheral blood vessels; guidance of a guide wire, catheter, or balloon; suction by a catheter; stent placement; balloon expansion; insertion of a coil; insertion of an aneurysm embolization device (e.g., the bag-shaped embolization device WEB (Terumo), the self-expanding implant PulseRider (Cerenovus), etc.); timing of coil detachment; guidance, placement, or retrieval of a filter; injection of a liquid embolic substance; injection of contrast medium; continuation of a static image for a certain period or longer; discrimination between a mask image and a live image; discrimination of imaging region and angle; image switching; and noise generation.
- The specific surgical situation may be recognized by pattern matching, image recognition, time-series image recognition, time-series differencing, or object detection algorithms. For example, by taking the time-series difference between frames, it is possible to recognize that the scene (surgical situation) has switched when the difference exceeds a certain threshold.
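The time-series difference test could be sketched, in a non-limiting way, as a mean absolute frame difference compared against a threshold; the threshold value and function name are illustrative assumptions:

```python
import numpy as np

def scene_changed(prev_frame, cur_frame, threshold=20.0):
    """Flag a scene (surgical situation) switch when the mean absolute
    pixel difference between consecutive frames exceeds a threshold."""
    diff = np.abs(cur_frame.astype(np.float64) - prev_frame.astype(np.float64))
    return float(diff.mean()) > threshold

# Toy frames: an unchanged scene vs. a large intensity jump.
frame_a = np.zeros((4, 4), dtype=np.uint8)
frame_b = np.full((4, 4), 100, dtype=np.uint8)
```

Casting to float before subtracting avoids unsigned-integer wraparound when the current frame is darker than the previous one.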
- Noise can be recognized by classifying still images into classes such as no noise, little noise, and noisy, using machine learning such as a CNN.
- The expansion of the balloon can be determined by recognizing that an expanded balloon is present, through object recognition such as machine-learning-based SSD, or through learning by U-Net segmentation (Fig. 30).
- FIG. 30 U-Net segmentation
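Given per-frame segmentation masks from such a model, the inflated/deflated decision and the grouping of frames into inflation episodes could be sketched as follows; the minimum-area criterion, frame rate, and function names are illustrative assumptions:

```python
import numpy as np

def balloon_inflated(seg_mask, min_area=50):
    """An inflated balloon is deemed present when the pixel count of the
    'balloon' class in the segmentation mask reaches a minimum area."""
    return int(np.asarray(seg_mask).sum()) >= min_area

def inflation_episodes(masks, fps=15.0, min_area=50):
    """Turn per-frame masks into (start_s, duration_s) inflation episodes."""
    episodes, start = [], None
    for i, mask in enumerate(masks):
        if balloon_inflated(mask, min_area) and start is None:
            start = i
        elif not balloon_inflated(mask, min_area) and start is not None:
            episodes.append((start / fps, (i - start) / fps))
            start = None
    if start is not None:  # balloon still inflated at the end of the clip
        episodes.append((start / fps, (len(masks) - start) / fps))
    return episodes

# Toy masks: no balloon, then two inflated frames, then deflated again.
empty = np.zeros((10, 10), dtype=np.uint8)
balloon = np.ones((10, 10), dtype=np.uint8)  # 100 'balloon' pixels
episodes = inflation_episodes([empty, balloon, balloon, empty], fps=1.0)
```

The episode list directly supports the inflation counts and durations discussed below.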
- FIG. 30 shows an example of recognizing, displaying, and recording an X-ray image of a specific operation such as puncture, balloon guidance, or coil insertion, or a moving image of the operator's hands. It is also possible to display events such as angiography (contrast agent injection), events such as the guiding catheter or guide wire going out of the screen, and the time period during which the balloon is inflated. Clicking on an event, for example, jumps to that event, allowing the surgery to be reviewed quickly. Two or four screens are displayed as thumbnails, etc., as necessary. When an event occurs, the screen on which it occurred is indicated by enclosing that screen in a frame of a different color.
- Time-course recordings and recordings of event types and counts make it possible to document important aspects of the surgery (e.g., time to raise the guiding catheter, number of times the guiding catheter has fallen, number of balloon inflations and mean inflation time). These serve as feedback to the operator and assistant, and are expected to improve subsequent technique. Also, as shown in Fig. 30, changes in the speed of the guide wire and the speed of the coil may be recorded together (such display is possible if the guide wire and coil can be recognized and tracked by deep learning or the like). From these records, or by converting them into a histogram, mean and variance, or the like, it is possible to characterize the surgery or the operator's technique.
- For example, if the guide wire is moved slowly near the aneurysm but relatively quickly at other sites, this suggests a good technique.
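The recording of speed changes and their summary statistics could be sketched, as a non-limiting illustration, from a tracked sequence of device positions; the frame rate and the use of population variance are assumptions:

```python
import math

def speeds_from_track(positions, fps=15.0):
    """Per-frame speeds (pixels per second) along a tracked device path."""
    return [math.dist(p, q) * fps for p, q in zip(positions, positions[1:])]

def speed_summary(speeds):
    """Mean and (population) variance, usable to characterize the
    operator's technique or to build a speed histogram."""
    n = len(speeds)
    mean = sum(speeds) / n
    variance = sum((s - mean) ** 2 for s in speeds) / n
    return mean, variance

# Toy track: the device advances 1 px, then 2 px, between frames.
track = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)]
mean_speed, speed_var = speed_summary(speeds_from_track(track, fps=1.0))
```

Comparing such summaries computed near the aneurysm versus elsewhere would quantify the slow-near-the-aneurysm pattern described above.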
- The operator's hands are sometimes photographed with a surgical camera, and by recording this on the same timeline it is possible to see, for example, how the hands were moving when a certain event occurred.
- For example, it becomes easy to look back on the hand movements of a skilled operator when inserting a catheter into an aneurysm (with conventional technology, the timelines of the X-ray images and the surgical camera do not match, so it is difficult to confirm how the hands were actually moving in response to an operation that can be confirmed in the X-ray image).
- In some aspects, the image processing apparatus may further include a region-of-interest acquisition unit that acquires one or more regions each including at least part of the device included in the image as regions of interest, and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for that region of interest.
- In some aspects, a recognized device can be identified according to the recognized specific surgical situation. For example, if carotid artery stenting is being performed, a coil does not usually appear, so even if an object is erroneously recognized as a coil, recognition accuracy is improved by ignoring it.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- In some aspects, the specific boundary line used to notify the user on the condition that the region of interest passes through a boundary line designated on the image may be automatically designated, or suggested, around the region of interest according to the recognized specific surgical situation.
- When the situation recognition unit recognizes the occurrence of noise, acquisition of the region of interest can be stopped.
- In some aspects, acquisition of the region of interest is stopped when injection of contrast agent is recognized, when a 3D image is being acquired, or when cone-beam CT (CBCT) is being acquired; alternatively, the region of interest may be acquired but no notification made.
- the image processing device further includes a recording unit that records the recognized specific surgical situation.
- Some aspects of the present invention relate to a method of acquiring an image including at least an intravascular examination or treatment device as a subject and recognizing a specific surgical situation based on the image, as well as to a program for executing the above method on a computer. Some aspects of the present invention also relate to an image processing apparatus and a method of operation for performing the above method.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject and recognizes a specific surgical situation based on the image.
- Surgical videos are important for documentation, education of medical personnel, and explanation to patients and their families.
- Some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions each including at least part of the device as regions of interest; and a recording unit that, when at least one of the regions of interest satisfies a condition defined for that region of interest, records the surgical situation corresponding to that condition.
- Conditions for recording include, for example, at least one procedure selected from the group consisting of: puncture; arrival of the guiding catheter at the treatment site; balloon guidance; balloon expansion; stent deployment; catheter guidance; insertion of the first coil; insertion of the second and subsequent coils; localization of the coil within the aneurysm; deviation of the coil into the parent vessel; catheter removal; coil removal; and end of treatment.
- The surgical situation recorded may include still images, moving images, and/or textual information describing the surgical situation.
- the moving image is, for example, a moving image of a certain period before and after the time when the condition is satisfied.
- the recorded information may be modifiable by the user.
- the image processing device further includes a report creation unit that creates a report based on the recorded information.
- the report creator can create a summary of the surgery based on, for example, recorded still images, moving images, and/or textual information describing the circumstances of the surgery.
- the image processing apparatus further includes a notification unit that notifies the user when a specific surgical situation has been recorded. In some aspects, the image processing apparatus further includes a notification unit that notifies the user according to the lapse of time after the recording of the specific surgical situation.
- The elapsed time after a particular surgical situation is recorded may be determined based on balloon inflation time, guide wire travel time, speed, or acceleration, or coil travel time. For example, if X-rays are being emitted but nothing is moving in the screen, i.e., the image is static, unnecessary X-ray exposure is being added to the patient and healthcare workers. Notifying this in real time, or recording it postoperatively, leads to a reduction in the radiation exposure dose over the entire surgery.
- Each procedure time is also important. For example, while the balloon is inflated, the blood vessel is occluded; if the occlusion continues for a long time, the interruption of blood flow can lead to cerebral infarction. Graphing, in real time or postoperatively, the number of balloon inflations and the duration of each inflation provides feedback that the balloon should be deflated sooner.
- As another example of per-procedure times, in aneurysm coil embolization, by measuring the respective times for balloon guidance, catheter guidance to the aneurysm, and coil insertion, it is possible to visualize where time was spent in that case.
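The per-procedure time measurement could be sketched, as a non-limiting illustration, by differencing a time-ordered list of recorded phase events; the phase names and record format below are examples only:

```python
def phase_durations(events):
    """Total seconds spent in each surgical phase, from time-ordered
    (timestamp_s, phase_name) records; the last record marks the end."""
    totals = {}
    for (t0, phase), (t1, _) in zip(events, events[1:]):
        totals[phase] = totals.get(phase, 0.0) + (t1 - t0)
    return totals

# Toy timeline for an aneurysm coil embolization case.
timeline = [
    (0.0, "balloon guidance"),
    (60.0, "catheter guidance"),
    (180.0, "coil insertion"),
    (300.0, "end of treatment"),
]
durations = phase_durations(timeline)
```

Summing repeated occurrences of the same phase also handles procedures where, for example, the balloon is guided more than once.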
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus further includes a notification unit that notifies the user according to the number of times a particular surgical situation has been recorded.
- a particular surgical situation is, for example, the insertion of a coil.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the images are obtained from moving images taken in real time or from previously recorded moving images.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least part of the device included in the image as regions of interest, and, when at least one of the regions of interest satisfies a condition defined for that region of interest, records the surgical situation corresponding to that condition.
- In some aspects, the presence or absence, or the type, of recognition or notification may be changed according to the recognized scene. For example, when a scene classification indicates that contrast imaging is in progress, recognition or notification may be stopped. This is because the purpose of contrast imaging is generally to see the blood vessels, and the device rarely moves at that time, so there is little need for notification. Conversely, notification may instead be given, because the imaging may cause the device to move, although this is rare.
- Conventionally, the operator or assistant needs to select a frame with an appropriate time phase after imaging, but this is a time-consuming and troublesome task.
- With the scene classification function, in the case of angiography, a frame deemed appropriate for creating the mask image can be automatically selected: for example, the frame in which the artery is seen best, any frame before or after it, the frame in which the lesion is seen most clearly, or a frame whose timing is close to that of the mask image used during surgery.
- the surgeon, assistants, and medical staff involved in the surgery may designate this as the base of the mask image, or may select an appropriate frame by moving forward or backward.
- It is not necessary to limit the selection to the arterial phase; a frame of a time phase considered appropriate for the operation at that time, such as the capillary phase or the venous phase, may be taken.
- Some aspects of the present invention relate to a method in which an image including at least an intravascular examination or treatment device as a subject is acquired; one or more regions each including at least part of the device included in the image are acquired as regions of interest; and, when at least one of the regions of interest satisfies a condition defined for that region of interest, the surgical situation corresponding to the condition is recorded; as well as to a program for executing the above method on a computer. Some aspects of the invention also relate to an image processing apparatus and a method of operation for performing the above method.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device, wherein the processor acquires an image including at least an intravascular examination or treatment device as a subject, acquires one or more regions each including at least part of the device included in the image as regions of interest, and, when at least one of the regions of interest satisfies a condition defined for that region of interest, records the surgical situation corresponding to that condition.
- Some aspects of the present invention relate to an image processing apparatus including: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions each including at least a part of the device included in the image as regions of interest; a notification unit that, when at least one of the regions of interest satisfies a condition defined for that region of interest, notifies the user of the image processing apparatus of that fact; and a display unit that displays, on a display device connected to the image processing apparatus, the image including the region of interest and, beside the image, marks corresponding only to the vertical coordinates of the regions of interest.
- This image processing apparatus may further include a tracking unit that tracks each of the regions of interest in the image. Note that in some aspects of the present invention, an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- the trajectory of the mark may be displayed for a certain period of time.
- the mark display format may be different for each of the plurality of regions of interest.
- the display style can be color, shape, pattern, symbol, character, caption, or animation.
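The vertical-coordinate-only marks could be sketched, in a non-limiting way, as a narrow sub-screen with one column per region of interest and a distinct symbol per column; the text rendering and symbol set are illustrative assumptions:

```python
def side_strip(rois, height):
    """Text rendering of a sub-screen of the given height: one column per
    region of interest, a distinct symbol per column, and each mark placed
    according to the ROI's vertical (y) coordinate only."""
    symbols = "ox+*#"
    rows = [[" "] * len(rois) for _ in range(height)]
    for col, (_name, (_x, y)) in enumerate(rois):
        row = min(max(int(y), 0), height - 1)  # clamp to the strip
        rows[row][col] = symbols[col % len(symbols)]
    return ["".join(r) for r in rows]

# Two tracked devices at different heights on a 5-row strip.
strip = side_strip([("guide wire tip", (12, 1)), ("coil marker", (40, 3))], 5)
```

Redrawing the strip every frame, and retaining the last few frames, would yield the mark trajectory display described above.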
- the image processing device can automatically recognize and highlight regions of interest to be confirmed.
- the conditions for determining the region of interest to be confirmed may be set in advance by the user, for example. Alternatively, this condition may be set automatically by machine learning. For example, the region of interest to be confirmed can be determined depending on the surgical situation. Surgical conditions can also be automatically recognized by the image processor.
- only the region of interest specified by the user is displayed on the sub-screen. Also, in some aspects, only the automatically selected region of interest is displayed on the sub-screen.
- the image processing device can notify the user when the mark exceeds the boundary value specified by the user.
- the user can specify the boundary value by setting a specific range on the sub-screen using a pointing device such as a mouse or a touch panel.
- an image including at least an intravascular examination or treatment device as a subject is acquired, and one or more regions including at least part of the device included in the image are obtained. obtained as regions of interest, tracking each of the regions of interest in the image, and notifying a user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest.
- some aspects relate to a method of displaying, on a display device connected to the image processing device, the image including the region of interest together with a mark beside the image that corresponds only to the vertical coordinate of the region of interest, and to a program for executing the above method on a computer.
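The side-mark display described above (a mark beside the image reflecting only the vertical coordinate of each region of interest) can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the function name, the bounding-box layout, and the `sidebar_x` parameter are all assumptions.

```python
def side_mark_positions(rois, sidebar_x=0):
    """Map each region of interest to a mark placed beside the image.

    The mark reflects only the vertical (y) coordinate of the region;
    the horizontal coordinate is fixed to the sidebar column.
    `rois` maps a region name to its (x, y, w, h) bounding box.
    All names here are hypothetical, for illustration only.
    """
    marks = {}
    for name, (x, y, w, h) in rois.items():
        center_y = y + h / 2.0               # vertical center of the region
        marks[name] = (sidebar_x, center_y)  # the region's x is ignored by design
    return marks
```

For example, a catheter-tip region with bounding box (120, 40, 16, 16) yields a mark at vertical position 48 in the sidebar, regardless of how the tip moves horizontally.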
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device. The processor acquires an image including at least an intravascular examination or treatment device as a subject, obtains one or more regions each including at least part of the device in the image as regions of interest, tracks each region of interest in the image, notifies the user of the image processing apparatus when at least one region of interest satisfies a condition defined for that region, and displays the image including the region of interest together with a mark that corresponds only to the vertical coordinate of the region of interest.
- Some aspects of the present invention include: an image acquisition unit that acquires an image including at least an intravascular inspection or treatment device as a subject; a region-of-interest acquisition unit that acquires one or more regions, each including at least part of the device in the image, as regions of interest; a tracking unit that tracks each of the regions of interest in the image; a notification unit that notifies the user of the image processing apparatus when at least one region of interest satisfies a condition defined for that region; and a display unit that, when a plurality of regions of interest are tracked, displays the plurality of regions of interest in different display formats.
- the display style is, for example, colors, shapes, patterns, symbols, characters, captions, or animation.
- the region of interest to be confirmed is automatically recognized and highlighted. The conditions for determining the region of interest to be confirmed may be set in advance by the user, or may be set automatically by machine learning; for example, the region of interest to be confirmed can be determined according to the surgical situation, which the image processing device can also recognize automatically.
- the image processing device may display only the region of interest specified by the user. Also, the image processing apparatus may display only the automatically selected region of interest. In the case of automatic selection, the conditions for such selection can be set in advance.
- Some aspects of the present invention relate to an image processing apparatus including a storage device and a processor connected to the storage device. The processor acquires an image including at least an intravascular examination or treatment device as a subject, obtains one or more regions each including at least part of the device in the image as regions of interest, tracks each region of interest in the image, and notifies the user of the image processing apparatus when at least one region of interest satisfies a condition defined for that region; here, when a plurality of regions of interest are tracked, the plurality of regions of interest are displayed in different display formats.
- some aspects relate to vascular catheterization support systems, for example for cerebral, cardiac, peripheral, or abdominal blood vessels, and in particular to cerebrovascular catheterization support systems.
- a system comprises an image processing device, and an image capturing device that captures an X-ray image of a patient with one or more devices inserted into a blood vessel and transmits the image to the image processing device.
- the image processing device includes an image acquisition unit that acquires X-ray images over time of a region (for example, a fixed region) that includes at least a region of interest for achieving the purpose of surgery and a device inserted into a blood vessel; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device in the image; and a notification unit that notifies the user of the image processing device when at least one region of interest satisfies a condition defined for that region. Here, the one or more devices are a catheter, a guide wire, a stent, and/or a balloon, and the region of interest is a region including the tip of a catheter or the tip of a guide wire.
- when such a region is set, the system can notify the user on condition that the region of interest disappears from the image, that the distance between the region of interest and the edge of the image is less than a predetermined threshold, or that the region of interest has moved a certain distance.
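The notification conditions listed above (the region disappears, approaches the image edge, or moves more than a set distance) can be sketched as a single check. This is a sketch only; the function name, the (x, y) center representation, and the threshold parameters are assumptions, not the patent's interface.

```python
def check_roi_conditions(roi, prev_roi, image_size, edge_threshold, move_threshold):
    """Return the reasons (if any) a tracked region of interest should
    trigger a notification.

    roi / prev_roi: (x, y) center of the region, or None if tracking lost.
    image_size: (width, height) of the X-ray image.
    """
    if roi is None:
        # The region of interest disappeared from the image.
        return ["region of interest disappeared from the image"]
    reasons = []
    x, y = roi
    w, h = image_size
    # Straight-line distance to the nearest image edge.
    edge_distance = min(x, y, w - x, h - y)
    if edge_distance < edge_threshold:
        reasons.append("region of interest near image edge")
    if prev_roi is not None:
        dx, dy = x - prev_roi[0], y - prev_roi[1]
        if (dx * dx + dy * dy) ** 0.5 > move_threshold:
            reasons.append("region of interest moved more than the set distance")
    return reasons
```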
- the system may further comprise a tracker for tracking each of the regions of interest in the image.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- the notification unit may cause the display device that displays the image to show the distance between the region of interest and the edge of the image, or the distance between the marker and the region of interest, and may change the display mode of that distance on the display device according to its magnitude: changing the color of all or part of the screen, displaying a figure on the whole screen, outside the frame, or on part of the screen, enlarging the region of interest according to the distance, or changing the color or size of the mark placed on the region of interest according to the distance. The notification unit may also generate a notification sound according to the magnitude of the distance. The distance may be determined either as a straight-line distance or as a distance along a vessel.
- some aspects relate to an aneurysm coil embolization assist system, particularly a cerebral aneurysm coil embolization assist system.
- a system includes an image processing device, and an image capturing device that captures an X-ray image of a patient in a state in which a guiding catheter and a delivery wire for an embolization coil are inserted into a blood vessel, and transmits the image to the image processing device.
- the image processing device includes an image acquisition unit that acquires X-ray images over time of a fixed region including at least an aneurysm in a blood vessel of the patient, a catheter inserted into the blood vessel, and a delivery wire for an embolization coil; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the guiding catheter in the image; a marker detection unit that detects a marker provided on the delivery wire as it approaches one or more regions of interest set on the portion of the catheter that guides the delivery wire; and a notification unit that, triggered by the overlap of the marker and the region of interest, notifies the user of the timing at which the embolization coil may be detached from the delivery wire.
- the system may further comprise a tracker for tracking each of the regions of interest and the markers in the image.
- an output unit that outputs information to a device connected to the image processing apparatus may exist instead of or in addition to the notification unit.
- the image processing apparatus may further include a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- a notification or output may be provided when the region of interest enters a vessel with a diameter below a threshold, or a manually or automatically specified vessel.
- the notification unit may cause the display device that displays the image to show the distance between the region of interest and the edge of the image, or the distance between the marker and the region of interest, and may change the display mode of that distance on the display device according to its magnitude: changing the color of all or part of the screen, displaying a figure on the whole screen, outside the frame, or on part of the screen, enlarging the region of interest according to the distance, or changing the color or size of the mark placed on the region of interest according to the distance. The notification unit may also generate a notification sound according to the magnitude of the distance. The distance may be determined either as a straight-line distance or as a distance along a vessel.
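The overlap trigger described for this system (the delivery-wire marker overlapping the region of interest set on the catheter) can be sketched with a simple axis-aligned bounding-box test. Names and the (x, y, w, h) box layout are assumptions, not the patent's interface.

```python
def boxes_overlap(a, b):
    """Axis-aligned overlap test for two (x, y, w, h) bounding boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def detachment_ready(marker_box, roi_box):
    """True when the delivery-wire marker overlaps the region of interest
    set on the guiding catheter -- the trigger, per the passage above, for
    notifying that the embolization coil may be detached. A sketch only."""
    return boxes_overlap(marker_box, roi_box)
```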
- FIG. 10 is a flowchart for explaining the flow of image analysis processing executed by the image processing device 1 according to the embodiment. The processing in this flowchart starts, for example, when the image processing apparatus 1 is activated.
- the image acquisition unit 110 acquires an X-ray image created based on the X-ray absorptance, including at least the blood vessel V and the examination or treatment device D in the blood vessel V (S2).
- the region-of-interest acquiring unit 111 acquires one or more regions including at least part of the device D included in the X-ray image as regions of interest R (S4).
- the tracking unit 112 tracks each region of interest R in the X-ray image (S6). If at least one of the regions of interest R satisfies the condition defined for each region of interest R (Yes in S8), the notification unit 113 notifies the user of the image processing device 1 of that fact (S10). If none of the regions of interest R satisfy the defined conditions (No in S8), the notification unit 113 skips the notification process.
- if the image processing has not ended (No in S12), the image processing device 1 returns to step S6 and repeats the processing from step S6 to step S10. If the image processing ends (Yes in S12), the processing in this flowchart ends.
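The flow S2 through S12 above can be sketched as a loop. The four callables stand in for the image acquisition, region-of-interest acquisition, tracking, and notification units; all names are hypothetical stand-ins, not the patent's API.

```python
def run_analysis(frames, acquire_rois, track, condition, notify):
    """Minimal sketch of the flow S2..S12: acquire an image, obtain
    regions of interest, track them, and notify the user whenever a
    region's per-region condition holds. Returns the notifications."""
    notifications = []
    first = frames[0]                      # S2: acquire the X-ray image
    rois = acquire_rois(first)             # S4: regions including the device
    for frame in frames[1:]:               # repeat S6..S10 until imaging ends (S12)
        rois = track(frame, rois)          # S6: track each region of interest
        for name, roi in rois.items():
            if condition(name, roi):       # S8: per-region condition check
                notifications.append(notify(name, roi))  # S10: notify the user
    return notifications
```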
- blood vessels are photographed using angiography to diagnose lesions.
- such lesions can be difficult to judge. Therefore, some aspects of the present invention use deep learning or the like in angiography with a contrast agent to detect cerebral aneurysms, stenosis, occlusion, thrombus formation, vascular perforation (outflow of contrast agent), shunt disease, tumor vessels, and the like.
- the image processing device may detect, in the image, an aneurysm, stenosis, vasospasm, dissection, occlusion, recanalization, thrombus formation, the location and both ends of a thrombus, vascular perforation, or extravasation of contrast medium.
- the image processing device may further include an image recognition unit that compares an angiogram in the image with a previously acquired and stored angiogram and notifies the user of changes.
- <Third modification> The case where the X-ray imaging device 3 captures an image of the surgical site of the subject P has been described.
- the imaging device that captures the image of the surgical site of the subject P is not limited to the X-ray imaging device 3 .
- an image of the surgical site may be captured using a modality such as MRI (Magnetic Resonance Imaging) or an ultrasonic imaging device.
- (Appendix 1) An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each of the regions of interest.
- (Appendix 2) The notification unit notifies the user on the condition that the region of interest disappears from the image when a region including the tip of a catheter or the tip of a guidewire is set as the region of interest.
- the image processing device according to appendix 1.
- (Appendix 3) When a region including the tip of a catheter or the tip of a guidewire is set as the region of interest, the notification unit notifies the user on the condition that the distance between the region of interest and the edge of the image is less than a predetermined threshold distance. The image processing device according to appendix 1.
- (Appendix 4) The notification unit notifies the user on the condition that at least one of the moving distance, moving speed, and acceleration of the region of interest in the image exceeds a predetermined threshold.
- the image processing device according to any one of Appendices 1 to 3.
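The quantities named in Appendix 4 (moving distance, moving speed, and acceleration of a tracked region) can be derived from successive tracked center positions, as in this sketch. The function name and the fixed frame interval `dt` are assumptions.

```python
def motion_metrics(positions, dt=1.0):
    """Per-frame displacement, speed, and acceleration of a tracked region
    of interest, from its (x, y) center positions sampled every dt seconds."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    # Moving distance between consecutive frames.
    displacements = [dist(a, b) for a, b in zip(positions, positions[1:])]
    # Speed = distance per unit time; acceleration = change in speed per unit time.
    speeds = [d / dt for d in displacements]
    accels = [(s2 - s1) / dt for s1, s2 in zip(speeds, speeds[1:])]
    return displacements, speeds, accels
```

Each returned value can then be compared against its own predetermined threshold to decide whether to notify.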
- the notification unit causes a display device that displays the image to display the distance between the region of interest and the edge of the image.
- the image processing device according to any one of Appendices 1 to 4.
- the notification unit changes the display mode of the distance on the display device according to the size of the distance between the region of interest and the edge of the image.
- the notification unit notifies the user on the condition that a value obtained by dividing the distance between the region of interest and the edge of the image by the moving speed of the region of interest in the image is less than a predetermined threshold.
- the image processing device according to any one of Appendices 1 to 6.
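The condition of Appendix 7 amounts to a predicted time-to-edge: the distance to the image edge divided by the region's moving speed. A minimal sketch, assuming a stationary or receding region (speed not positive) never triggers; the names are hypothetical.

```python
def time_to_edge_alert(edge_distance, speed, threshold_seconds):
    """Notify when (distance to the image edge) / (moving speed) drops
    below a threshold, i.e. the device tip is predicted to leave the
    image soon. Sketch of Appendix 7's condition."""
    if speed <= 0:
        return False  # not approaching the edge
    return edge_distance / speed < threshold_seconds
```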
- Appendix 8 Further comprising a marker detection unit for detecting a marker provided on the delivery wire of the embolization coil, the marker approaching a region of interest set on a part of the microcatheter that guides the delivery wire, the tracking unit further tracks the detected marker;
- the notification unit notifies the user of the timing at which the embolization coil may be disconnected from the delivery wire, triggered by the overlap of the marker and the region of interest.
- The image processing device according to any one of Appendices 1 to 7.
- (Appendix 9) The notification unit notifies the user when the marker passes through the region of interest.
- the notification unit causes a display device to display a distance that the marker should move until the embolization coil is disconnected from the delivery wire.
- When the feature amount indicating the shape of the device included in the region of interest satisfies a predetermined condition, the notification unit notifies the user of the image processing device of that fact.
- the feature quantity is curvature
- the notification unit notifies the user on condition that the curvature of the device included in the region of interest exceeds a predetermined threshold curvature, or that the tip does not move even though the curvature is changing.
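One common way to obtain a curvature feature from a tracked device is the circumscribed-circle formula applied to three consecutive points along the device. This is a sketch of one possible realization of the feature amount above, not necessarily the patent's method; the names are assumptions.

```python
def curvature(p1, p2, p3):
    """Discrete curvature at p2 from three consecutive (x, y) points,
    via the circumscribed-circle formula 4 * area / (a * b * c).
    Returns 0.0 for collinear points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the triangle area, from the 2D cross product.
    area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    if area2 == 0.0:
        return 0.0
    def d(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    # 4 * (area2 / 2) / (a * b * c) = 2 * area2 / (a * b * c)
    return 2.0 * area2 / (d(p1, p2) * d(p2, p3) * d(p1, p3))
```

The result can be compared against the predetermined threshold curvature, and its change over time can be combined with tip motion to realize the second condition above.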
- the notification unit notifies the user when a value obtained by subtracting the length of the center line of the blood vessel included in the image or region of interest from the length of the device included in the image or region of interest exceeds a predetermined threshold length.
- the image processing device according to appendix 11 or 12.
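The condition above (device length minus vessel-centerline length exceeding a threshold, suggesting the device is bowing inside the vessel) can be sketched over polylines. The point-list inputs and names are hypothetical.

```python
def polyline_length(points):
    """Total length of a polyline given as a list of (x, y) points."""
    return sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in zip(points, points[1:]))

def slack_alert(device_points, centerline_points, threshold):
    """Notify when the device is longer than the vessel centerline it
    occupies by more than a threshold. Sketch of Appendix 13's condition."""
    return polyline_length(device_points) - polyline_length(centerline_points) > threshold
```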
- the notification unit notifies the user by displaying the region of interest in a different color from the image, changing the font, size, or color of displayed characters, changing the color of all or part of the screen of the display device, displaying a figure on the entire screen, outside the frame, or on part of the screen, magnifying the region of interest, or changing the color or size of the mark attached to the region of interest. The image processing device according to any one of Appendices 1 to 13.
- the image processing device further comprising a video extracting unit that extracts from the video storage unit a video of a certain period before and after the notification by the notification unit or a video of an arbitrary time or period specified by the user.
- (Appendix 18)
- (Appendix 19) The image processing device according to appendix 17 or 18, wherein the extracted video is displayed on a display device.
- (Appendix 20) The image processing device according to appendix 19, wherein the extracted video is automatically displayed repeatedly a predetermined number of times.
- (Appendix 21) The image processing device according to appendix 20, wherein the extracted video is displayed based on arbitrary operations including play, stop, fast-forward, rewind, frame-by-frame, slow play, and double-speed play.
- (Appendix 22) The image processing device according to any one of appendices 17 to 21, wherein the elapsed time from when the notification was generated, a comparison of the position of the region of interest between when the notification was generated and after an arbitrary time has passed, or the trajectory of the region of interest obtained by the tracking unit is further superimposed on the extracted video.
- the image processing device according to any one of appendices 17 to 22, wherein the extracted video is displayed by cutting out a partial region near the region of interest.
- (Appendix 24) The image processing device according to any one of appendices 17 to 23, wherein the extracted video is displayed at a position that does not interfere with the display of the region of interest.
- the image processing device according to any one of appendices 17 to 24, wherein the extracted video is displayed in an enlarged manner.
- the image processing device according to any one of appendices 17 to 25, wherein the extracted video is displayed at the same time as the notification is generated or after a predetermined time has passed since the notification is generated.
- (Appendix 27) The image processing device according to any one of appendices 17 to 26, wherein images shot from a plurality of directions are displayed simultaneously.
- the image processing device draws the user's attention by lighting, changing the color, or highlighting the frame portion of one of the two screens.
- (Appendix 32) The image processing apparatus according to any one of appendices 1 to 31, wherein the display device for displaying the image displays a product list of intravascular examination or treatment devices.
- (Appendix 33)
- (Appendix 34) The image processing device according to appendix 32 or 33, wherein the display device displays a list of recommended products based on image analysis results, facility information, or user preference information.
- (Appendix 35) The image processing apparatus according to any one of appendices 1 to 34, which, automatically or based on user selection, creates a surgical record including information on the devices used, the acquired images, and image analysis results.
- (Appendix 36) The image processing device according to any one of appendices 1 to 35, wherein the notification unit causes a display device that displays the image to display a numerical value, color, bar, or heat map according to the probability that at least one of the regions of interest satisfies a condition defined for each region of interest, or according to a value obtained by arbitrarily transforming a probability distribution.
- the notification unit colors the region of interest using a color or heat map based on the probability that at least one of the regions of interest satisfies a condition defined for each region of interest, or based on a value obtained by arbitrarily transforming a probability distribution, and displays it on a display device that displays the image; alternatively, the probability that the condition is satisfied is replaced with a numerical value or color and displayed on the display device that displays the image.
- the image processing device causes a display device that displays the image to display the distance between the region of interest and the edge of the specific range.
- the image processing device according to attachment 41, wherein the notification unit changes the display mode of the distance on the display device according to the size of the distance between the region of interest and the edge of the specific range.
- The image processing device according to any one of appendices 38 to 42, which notifies the user accordingly.
- (Appendix 44) The image processing device according to any one of appendices 3 to 43, wherein the distance is determined by either a straight-line distance or a distance along a blood vessel.
- (Appendix 45) The image processing device according to any one of appendices 1 to 44, including a storage unit that acquires and stores the position and/or shape of the intravascular examination or treatment device at any time, wherein the stored position and/or shape of the device is superimposed on the acquired image.
- (Appendix 46) The image processing device according to any one of appendices 1 to 43, which detects, in the above images, cerebral aneurysm, stenosis, occlusion, thrombus formation, vascular perforation, extravasation of contrast medium, shunt disease, feeding vessels or tumor enhancement of tumor vessels, venous thrombosis, or avascularity in the capillary phase.
- The image processing apparatus according to any one of appendices 1 to 43, further comprising an image recognition unit that compares angiograms in the image with previously acquired and stored angiograms and notifies the user of changes.
- (Appendix 48) An image processing method executed by the processor of the image processing device, comprising: a step of acquiring an image including at least an intravascular examination or treatment device as a subject; a step of obtaining one or more regions including at least a portion of the device included in the image as regions of interest; a step of tracking each of the regions of interest in the image; and a step of notifying a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each of the regions of interest.
- (Appendix 49) A program for causing a computer to realize: a function of acquiring an image including at least an intravascular examination or treatment device as a subject; a function of acquiring one or more regions including at least part of the device included in the X-ray image as regions of interest; a function of tracking each of the regions of interest in the X-ray image; and a function of notifying a user of the computer when at least one of the regions of interest satisfies a condition defined for each of the regions of interest.
- (Appendix 50) An image processing system comprising: the image processing device according to any one of Appendices 1 to 44; and an imaging device that captures an image of a person with an intravascular examination or treatment device inserted and transmits the image to the image processing device.
- a cerebrovascular catheter surgery support system an image processing device; an image capturing device that captures an X-ray image of a patient with one or more devices inserted into a blood vessel and transmits the image to the image processing device;
- the image processing device is an image acquisition unit that acquires X-ray images over time of a fixed region including at least a region of interest for achieving the purpose of surgery and a device inserted into a blood vessel; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracker for tracking each of the regions of interest in the image; a notification unit that notifies a user of the image processing device of the fact that at least one of the regions of interest satisfies a condition defined for each of the regions of interest; said one or more devices are catheters, guidewires, stents and/or balloons; When a region including a tip of a catheter or a tip of a guide wire, both ends
- a system for assisting cerebral aneurysm coil embolization comprising: an image processing device; an imaging device that captures an X-ray image of a patient in a state in which the guiding catheter and the delivery wire of the embolization coil are inserted into a blood vessel and transmits the image to the image processing device;
- the image processing device is an image acquisition unit that acquires X-ray images over time of a fixed region including at least an aneurysm in a blood vessel of a patient, a catheter inserted into the blood vessel, and a delivery wire for an embolization coil; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the guiding catheter included in the image; a marker detection unit that detects a marker provided on the delivery wire that approaches one or more regions of interest set on a portion of the catheter that guides the delivery wire;
- the notification unit can display, on a display device that displays the image, the distance between the region of interest and the edge of the image or the distance between the marker and the region of interest.
- (Appendix 53) The system according to appendix 51 or 52, wherein the notification unit can change the display mode of the distance on the display device according to the magnitude of the distance, and the change in display mode includes changing the font, size, or color of displayed characters according to the magnitude of the distance, changing the color of all or part of the screen of the display device according to the magnitude of the distance, displaying a figure on the whole screen, outside the frame, or on part of the screen, magnifying the region of interest according to the magnitude of the distance, or changing the color or size of the mark placed on the region of interest according to the magnitude of the distance.
- (Appendix 54) The system according to any one of appendices 51 to 53, wherein the notification unit can produce a notification sound or transmit a vibration depending on the magnitude of the distance.
- (Appendix 55) The system according to any one of appendices 51 to 54, wherein the distance is determined by either a straight-line distance or a distance along a vessel.
- a system for assisting cerebral aneurysm coil embolization comprising: an image processing device; an imaging device that captures an X-ray image of a patient in a state in which the guiding catheter and the delivery wire of the embolization coil are inserted into a blood vessel and transmits the image to the image processing device;
- the image processing device includes: a positional relationship storage unit that stores the positional relationship between the tip of the catheter and the 2nd marker; a position storage unit that stores the distance a between the aneurysm neckline and the 1st marker and the position of the 2nd marker at time t1; a distance estimator that calculates the movement distance b from the position of the 2nd marker at time t2 and estimates the distance a - b of the tip of the catheter from the aneurysm neckline; and a notification unit that notifies the user of the estimated distance.
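The distance estimation described above (the stored distance a at time t1, minus the travel b of the 2nd marker between t1 and t2) can be sketched as follows. The assumption that the marker's travel equals the catheter tip's advance is the passage's own premise; the function and parameter names are hypothetical.

```python
def estimate_tip_to_neck(distance_a, marker2_at_t1, marker2_at_t2):
    """Estimate the remaining distance a - b between the catheter tip and
    the aneurysm neckline: `distance_a` is the stored neckline-to-1st-marker
    distance at time t1, and b is the 2nd marker's straight-line travel
    between its (x, y) positions at t1 and t2."""
    b = ((marker2_at_t2[0] - marker2_at_t1[0]) ** 2 +
         (marker2_at_t2[1] - marker2_at_t1[1]) ** 2) ** 0.5
    return distance_a - b
```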
- An endovascular surgery support system comprising: a storage unit that stores a product list of intravascular examination or treatment devices; a unit that recommends a product to be used based on image analysis results, facility information, or user preference information; and a display unit that displays the recommended products.
- (Appendix A1) An image processing apparatus comprising an image acquisition unit capable of acquiring an image including at least an intravascular inspection or treatment device as a subject.
- (Appendix A2) An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user on condition that a device newly appearing in the image is newly detected as a region of interest.
- the image processing device according to appendix A1.
- Appendix A3 The image processing apparatus according to Appendix A2, wherein the notification unit notifies the user on condition that a device newly appearing in an edge portion or a specific region in the image is newly detected as the region of interest.
- Appendix A4 The image processing device according to appendix A3, wherein the specific area is specified by a user.
- Appendix A5 The image processing device according to appendix A3, wherein the specific area is automatically specified.
- Appendix A6 The image processing apparatus according to appendix A5, wherein the specific area is an area automatically determined by the image processing apparatus as an area to be confirmed.
- Appendix A7 The image processing device according to any one of Appendices A2 to A6, further comprising a tracking unit that tracks the region of interest in the image.
- the region of interest is set as a region including at least part of an intravascular examination or treatment device selected from the group consisting of a guiding catheter, a guide wire, an intermediate catheter, a microcatheter, an aspiration catheter for thrombectomy, a marker, a coil, a stent, a filter, an embolic material, an aneurysm embolization device, and a balloon. The image processing apparatus according to any one of Appendices A2 to A7.
- Appendix A9 The image processing apparatus according to any one of Appendices A2 to A8, wherein the region including at least part of the device includes the tip of an intravascular examination or treatment device.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user on condition that the region of interest passes through a specific boundary line specified on the image.
- the image processing device according to appendix A1.
- (Appendix A18) The image processing device according to any one of Appendices A9 to A16, wherein the notification unit notifies the user of one or more automatically identified regions of interest only when the condition is satisfied.
- Appendix A19 The image processing device according to any one of Appendices A9 to A18, wherein the specific boundary line is superimposed on the X-ray image.
- Appendix A20 The image processing device according to any one of Appendices A9 to A19, wherein the specific boundary line is redrawn according to a change in the range or magnification of the X-ray image.
- Appendix A21 The image processing device according to any one of Appendices A9 to A20, wherein the notification is performed by generating a warning sound or changing a border display format.
- (Appendix A22) The image processing device according to any one of Appendices A9 to A21, wherein different notification methods are used depending on the direction in which the region of interest passes through a specific boundary line designated on the image.
- Appendix A23 The image processing device according to any one of Appendices A9 to A22, wherein different notification methods are used depending on the speed at which the region of interest crosses the specific boundary line.
- Appendix A24 The image processing device according to any one of Appendices A9 to A23, wherein the notification unit causes a display device that displays the image to display the distance between the region of interest and the specific boundary line.
- Appendix A25 The image processing device according to any one of Appendices A9 to A24, wherein the notification unit changes the display mode of the distance on the display device according to the magnitude of the distance between the region of interest and the specific boundary line.
- the notification unit notifies the user on the condition that a value obtained by dividing the distance between the region of interest and the specific boundary line by the moving speed of the region of interest in the image is less than a predetermined threshold value.
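For illustration only (not part of the claimed subject matter), the condition above amounts to a time-to-crossing estimate: the distance to the boundary divided by the speed of the region of interest. A minimal sketch, assuming pixel coordinates, a fixed frame rate, and hypothetical function names:

```python
import math

def time_to_boundary(roi_pos, prev_pos, boundary_point, boundary_normal, fps=15.0):
    """Estimate seconds until the region of interest reaches a straight
    boundary line, from its position in the previous and current frames.

    boundary_point is any point on the line; boundary_normal is a unit
    vector perpendicular to it. Positions are (x, y) pixel coordinates.
    Returns math.inf when the region of interest is not approaching the line.
    """
    # Signed distance from the current position to the line (pixels).
    dist = ((roi_pos[0] - boundary_point[0]) * boundary_normal[0]
            + (roi_pos[1] - boundary_point[1]) * boundary_normal[1])
    # Per-frame velocity, and its component toward the line (pixels/frame).
    vx = roi_pos[0] - prev_pos[0]
    vy = roi_pos[1] - prev_pos[1]
    approach = -(vx * boundary_normal[0] + vy * boundary_normal[1])
    if math.copysign(1.0, dist) * approach <= 0:
        return math.inf  # moving away from (or parallel to) the line
    return abs(dist) / (abs(approach) * fps)

def should_notify(roi_pos, prev_pos, boundary_point, boundary_normal,
                  threshold_s=1.0, fps=15.0):
    # Notify when the estimated time to crossing falls below the threshold.
    return time_to_boundary(roi_pos, prev_pos, boundary_point,
                            boundary_normal, fps) < threshold_s
```

The threshold value and frame rate here are placeholders; in the claimed apparatus they would be the predetermined threshold of this appendix.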
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each of the regions of interest; wherein a region of interest included in a region specified by the user on the image is acquired as the region of interest. The image processing device according to appendix A1.
- Appendix A28 The image processing device according to appendix A27, wherein the area specified by the user on the image is represented by a rectangle, a circle, or an arbitrary closed curve.
- Appendix A29 The image processing device according to appendix A27 or A28, wherein candidate areas that can be specified by the user are automatically displayed.
- Appendix A30 The image processing device according to any one of Appendices A27 to A29, wherein the image temporarily becomes a still image when the user designates the area.
- Appendix A31 The image processing device according to any one of Appendices A27 to A29, wherein when a user designates an area, the image is temporarily played back with a delay.
- Appendix A32 The image processing device according to any one of Appendices A27 to A31, wherein a part of the image is enlarged and displayed when the user designates the area.
- Appendix A33 The image processing device according to any one of Appendices A27 to A32, wherein only the region of interest to be confirmed is automatically selected when a region including a plurality of regions of interest is specified.
- Appendix A34 The image processing device, wherein the regions of interest to be confirmed include: the tip of a guide wire being within a certain range; the tip of the delivery wire of a stent being within a certain range; the second marker of a catheter, used to determine when a coil can be cut, being within a certain range; the range of movement of a filter being within a certain range; the tip of a catheter being within a certain range; embolic material being within a certain range; and a device remaining inside a large blood vessel, an aneurysm, or a critical blood vessel being within a certain range.
- Appendix A35 The image processing apparatus according to appendix A33 or A34, wherein the region of interest to be confirmed is determined depending on the surgical situation.
- Appendix A36 The image processing device according to any one of Appendices A27 to A35, wherein the designated area is automatically adjusted when the image is changed.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; and a situation recognition unit that recognizes a specific surgical situation based on the image;
- the image processing device according to appendix A1.
- (Appendix A38) The image processing device according to appendix A37, wherein the specific surgical situation is at least one selected from the group consisting of: surgical content; disease name; vascular leakage; vascular perforation; occurrence of a thrombus; disappearance of peripheral blood vessels; guidance of a guidewire, catheter, or balloon; suction by a catheter; stent placement; balloon expansion; coil insertion; insertion of an aneurysm embolization device; timing of a coil cut; guidance, dwelling, or retrieval of a filter; injection of a liquid embolic material; injection of a contrast medium; continuation of a still image for a certain period of time or more; distinction between a mask image and a live image; determination of the imaging region or angle; and switching of images.
- (Appendix A39) The image processing apparatus according to Appendix A37 or A38, further comprising: a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each of the regions of interest.
- (Appendix A41) The image processing device according to appendix A38, which notifies a recommendation to reduce radiation exposure when continuation of a still image for a certain period of time or more is recognized.
- (Appendix A42) The image processing device according to any one of Appendices A37 to A41, further comprising a recording unit that records the recognized specific surgical situation.
- (Appendix A43) The image processing apparatus according to any one of Appendices A37 to A42, wherein the specific surgical situation is recognized by pattern matching, image recognition, time-series image recognition, time-series differencing, or object detection algorithms.
- (Appendix A44) The image processing apparatus according to any one of Appendices A37 to A43, wherein the recognized device is identified according to the recognized specific surgical situation.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and a recording unit that, when at least one of the regions of interest satisfies a condition defined for each region of interest, records a surgical situation corresponding to the condition.
- (Appendix A46) The image processing device according to appendix A45, wherein the condition includes at least one procedure selected from the group consisting of: puncture; arrival of a guiding catheter at the treatment site; balloon guidance; balloon expansion; stent deployment; catheter guidance; insertion of a first coil; insertion of second and subsequent coils; localization of a coil within an aneurysm; deviation of a coil into a parent blood vessel; catheter removal; coil removal; and termination of treatment.
- the image processing apparatus of Appendix A45 or A46, wherein the recorded surgical situation comprises still images, moving images, and/or textual information describing the surgical situation.
- the image processing device according to appendix A47, wherein the moving image is a moving image of a fixed period before and after the time when the condition is satisfied.
- Appendix A49 The image processing device according to any one of Appendices A45 to A48, wherein the recorded information is modifiable by the user.
- Appendix A50 The image processing device according to any one of Appendices A45 to A49, further comprising a report creation unit that creates a report based on the recorded information.
- Appendix A51 The image processing apparatus according to Appendix A50, wherein the report creation unit creates a surgical summary based on recorded still images, moving images, and/or textual information describing the surgical situation.
- Appendix A52 The image processing apparatus according to any one of Appendices A45 to A51, further comprising a notification unit for notifying a user when a specific surgical situation has been recorded.
- Appendix A53 The image processing apparatus according to any one of Appendices A45 to A52, further comprising a notification unit that notifies the user according to the lapse of time after the recording of a specific surgical situation.
- Appendix A54 The image processing apparatus of Appendix A53, wherein the elapsed time after a particular surgical situation is recorded is determined based on balloon inflation time; guidewire movement time, velocity, or acceleration; or coil movement time.
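For illustration only (not part of the claims), the elapsed-time notification of Appendices A53 and A54 can be sketched as a timer that records when a situation was logged and reports which configured thresholds have been exceeded. Event names and threshold values below are hypothetical:

```python
class SituationTimer:
    """Tracks when surgical situations were recorded and reports which
    elapsed-time thresholds have been exceeded (e.g. balloon inflation)."""

    def __init__(self, thresholds_s):
        # e.g. {"balloon_inflation": 60.0} - hypothetical threshold in seconds
        self.thresholds_s = thresholds_s
        self.recorded_at = {}

    def record(self, situation, t):
        # t is a timestamp in seconds (e.g. from time.monotonic()).
        self.recorded_at[situation] = t

    def overdue(self, now):
        # Situations whose elapsed time exceeds their configured threshold.
        return [s for s, t0 in self.recorded_at.items()
                if s in self.thresholds_s and now - t0 > self.thresholds_s[s]]
```

In the claimed apparatus, the `record` call would be driven by the recording unit, and `overdue` would feed the notification unit.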
- Appendix A55 The image processing apparatus according to any one of Appendices A45 to A54, further comprising a notification unit for notifying a user according to the number of times a particular surgical situation has been recorded.
- Appendix A56 The image processing device of Appendix A55, wherein the particular surgical situation is coil insertion.
- Appendix A57 The image processing device according to any one of Appendices A45 to A56, wherein the image is obtained from a moving image taken in real time or a moving image recorded in the past.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each region of interest; and a display unit that causes a display device connected to the image processing device to display the image including the region of interest and, beside the image, a mark corresponding only to the vertical coordinate of the region of interest. The image processing device according to appendix A1.
- Appendix A59 The image processing device according to Appendix A58, wherein the trajectory of the mark is displayed for a certain period of time.
- Appendix A60 The image processing device according to appendix A58 or A59, wherein the display format of the mark is different for each region of interest. (Appendix A61)
- Appendix A62 The image processing device according to any one of Appendices A58 to A61, which automatically recognizes and highlights a region of interest to be confirmed.
- Appendix A63 The image processing apparatus according to Appendix A62, wherein the region of interest to be confirmed is determined according to the surgical situation.
- Appendix A64 The image processing device according to any one of Appendices A58 to A63, wherein only a user-specified region of interest is displayed.
- Appendix A65 The image processing device according to any one of Appendices A58 to A64, wherein only the automatically selected region of interest is displayed.
- Appendix A66 The image processing device according to any one of Appendices A58 to A65, wherein the user is notified when the mark exceeds a boundary value specified by the user.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each region of interest; further comprising a display unit that, when a plurality of the regions of interest are tracked, displays the plurality of regions of interest in different display formats. The image processing device according to appendix A1.
- (Appendix A68) The image processing device according to Appendix A67, wherein the display mode is color, shape, pattern, symbol, character, caption, or animation.
- (Appendix A69) The image processing device according to Appendix A67 or A68, which automatically recognizes and highlights regions of interest to be confirmed.
- (Appendix A70) The image processing device according to any one of Appendices A67-A69, wherein the region of interest to be confirmed is determined according to the surgical situation.
- (Appendix A71) The image processing device according to any one of Appendices A67 to A70, wherein only a user-specified region of interest is displayed.
- (Appendix A72) The image processing device according to any one of Appendices A67 to A71, wherein only the automatically selected region of interest is displayed.
- Appendix B1 An image processing apparatus comprising an image acquisition unit capable of acquiring an image including at least an intravascular examination or treatment device as a subject.
- Appendix B2 An image processing device comprising: an image acquisition unit that acquires images for intravascular examination or treatment; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least a portion of the device, when the image includes an intravascular examination or treatment device; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each region of interest; wherein the notification unit notifies the user on the condition that a device, or a part thereof, newly appearing in the image or within a specific region is newly detected as a region of interest.
- the image processing device according to Appendix B1.
- Appendix B3 An image processing device comprising: an image acquisition unit that acquires images for intravascular examination or treatment; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least a portion of the device, when the image includes an intravascular examination or treatment device; and an output unit configured to output information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; wherein the output unit outputs to the device on the condition that a device, or a part thereof, newly appearing in the image or within a specific region is newly detected as a region of interest.
- the image processing device according to Appendix B1.
- the notification unit notifies the user on the condition that a device, or a part thereof, newly appearing in the image or within a certain distance from an edge of the image is newly detected as a region of interest;
- the image processing device according to appendix B2 or B3. (Appendix B5) The output unit outputs to the device on the condition that a device, or a part thereof, newly appearing in the image, within a certain distance from an edge of the image, or within a specific region is newly detected as a region of interest.
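For illustration only (not part of the claims), the new-appearance condition above can be sketched as a filter over tracked detections: a detection is reported when its tracking id was absent from the previous frame and its box lies within a margin of the image edge, where devices typically enter the field of view. The dictionary layout, id scheme, and margin value are assumptions:

```python
def near_edge(box, img_w, img_h, margin):
    """True when a detection box (x, y, w, h) lies within `margin` pixels of
    any image edge, where new devices typically enter the field of view."""
    x, y, w, h = box
    return (x < margin or y < margin
            or x + w > img_w - margin or y + h > img_h - margin)

def newly_appearing(prev_ids, detections, img_w, img_h, margin=32):
    """Return detections whose tracking id was not present in the previous
    frame and which appear near the image edge (the condition sketched above)."""
    return [d for d in detections
            if d["id"] not in prev_ids and near_edge(d["box"], img_w, img_h, margin)]
```

The same test with a user- or automatically specified region substituted for the edge margin would cover the specific-region variant.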
- Appendix B6 The image processing device according to appendix B2 or B3, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B7 The image processing device according to appendix B6, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B8 The image processing device according to appendix B6, which utilizes information from multiple screens to improve processing accuracy.
- Appendix B9 The image processing device according to appendix B2 or B3, wherein the specific area is specified by a user.
- Appendix B10 The image processing device according to appendix B2 or B3, wherein the specific area is automatically designated.
- Appendix B11 The image processing device according to appendix B2 or B3, wherein the specific area is an area automatically determined by the image processing apparatus as an area to be confirmed.
- Appendix B12 The image processing apparatus according to appendix B2 or B3, further comprising a tracking unit that tracks the region of interest in the image.
- Appendix B13 The image processing device according to appendix B2 or B3, further comprising a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- Appendix B14 The image processing apparatus according to appendix B2 or B3, wherein notification or output is provided when the region of interest enters a blood vessel having a diameter below a threshold or a manually or automatically specified blood vessel.
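For illustration only (not part of the claims), the vessel-diameter condition above can be approximated from a binary vessel segmentation mask: the distance from a pixel inside the vessel to the nearest background pixel is roughly the local radius. A minimal pure-Python sketch under that assumption (a production implementation would use a proper distance transform on the angiographic segmentation):

```python
from collections import deque

def local_diameter(mask, pos):
    """Approximate local vessel diameter (in pixels) at `pos` inside a binary
    mask (list of lists, 1 = vessel, 0 = background), as twice the chessboard
    distance to the nearest background pixel, found by breadth-first search."""
    h, w = len(mask), len(mask[0])
    seen = {pos}
    queue = deque([(pos, 0)])
    while queue:
        (y, x), d = queue.popleft()
        if mask[y][x] == 0:
            return 2 * d  # diameter is roughly twice the radius to the wall
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                    seen.add((ny, nx))
                    queue.append(((ny, nx), d + 1))
    return 2 * max(h, w)  # mask contains no background at all

def entered_narrow_vessel(mask, roi_pos, diameter_threshold):
    # Flag when the tracked region of interest sits in a vessel narrower
    # than the threshold, triggering the notification or output above.
    return local_diameter(mask, roi_pos) < diameter_threshold
```

The chessboard metric and the breadth-first search are illustrative simplifications; the threshold would come from the manual or automatic specification in the appendix.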
- the region of interest is selected from the group consisting of a guiding catheter, a guide wire, an intermediate catheter, a microcatheter, an aspiration catheter for thrombectomy, a marker, a coil, a stent, a filter, an embolic material, an aneurysm embolization device, and a balloon.
- the image processing apparatus according to appendix B2 or B3, wherein the region including at least part of the device includes the tip of an intravascular examination or treatment device.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each of the regions of interest; the image processing device according to appendix B1, wherein the notification unit notifies the user on the condition that the region of interest passes through or is predicted to pass through a specific boundary line specified on the image.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and an output unit configured to output information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; the image processing device according to appendix B1, wherein the output unit outputs the information on the condition that the region of interest passes through or is predicted to pass through a specific boundary line specified on the image.
- Appendix B19 The image processing apparatus according to appendix B17 or B18, further comprising a tracking unit that tracks the region of interest in the image.
- Appendix B20 The image processing apparatus according to appendix B17 or B18, further comprising a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- Appendix B21 The image processing apparatus according to appendix B17 or B18, wherein notification or output is provided when the region of interest enters a blood vessel having a diameter below a threshold or a manually or automatically specified blood vessel.
- Appendix B22 The image processing device according to appendix B17 or B18, wherein passage prediction is performed based on the distance between the region of interest and the boundary line, or on the value obtained by dividing that distance by the velocity of the region of interest.
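For illustration only (not part of the claims), one way to realize this passage prediction is to extrapolate the tracked position a few frames ahead and test whether the segment from the current to the extrapolated position changes side with respect to the boundary line. The linear-extrapolation choice, the lookahead length, and the function names below are assumptions:

```python
def side(p, a, b):
    """Sign of the cross product: which side of the directed line a->b point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def predicts_crossing(prev_pos, cur_pos, line_a, line_b, lookahead=3):
    """Predict whether the region of interest will cross the boundary line
    (through points line_a and line_b) within `lookahead` frames, assuming it
    keeps its current per-frame velocity."""
    vx = cur_pos[0] - prev_pos[0]
    vy = cur_pos[1] - prev_pos[1]
    future = (cur_pos[0] + lookahead * vx, cur_pos[1] + lookahead * vy)
    s_now = side(cur_pos, line_a, line_b)
    s_future = side(future, line_a, line_b)
    # A sign change means the extrapolated path passes through the line.
    return s_now * s_future < 0
```

The side-sign test handles boundary lines at any orientation, which suits the straight-line case of Appendix B32; polygonal or curved boundaries would apply the same test per segment.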
- Appendix B23 The image processing apparatus according to appendix B17 or B18, wherein the specific boundary line is specified by a user.
- Appendix B24 The image processing device according to appendix B17 or B18, wherein the specific boundary line is automatically designated.
- Appendix B25 The image processing apparatus according to appendix B17 or B18, wherein the specific boundary line is automatically designated when the region of interest has not moved for a certain period of time.
- Appendix B26 The image processing apparatus according to appendix B17 or B18, wherein the specific boundary line is automatically designated or automatically proposed according to the surgical situation and the user can accept or reject it.
- Appendix B27 The image processing apparatus according to appendix B17 or B18, wherein the specific boundary line is automatically canceled or automatically proposed and the user can accept or reject it depending on the surgical situation.
- Appendix B28 The image processing device according to appendix B17 or B18, wherein, when the display device for displaying the image has a plurality of screens, a specific boundary line designated on one screen is automatically designated at the corresponding position on another screen.
- Appendix B29 The image processing device according to appendix B17 or B18, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B30 The image processing device according to Appendix B29, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B31 The image processing device according to Appendix B29, which utilizes information on multiple screens to improve accuracy of processing.
- Appendix B32 The image processing device according to appendix B17 or B18, wherein the specific boundary line is represented by a straight line, a curved line, a circle, a rectangle, or any polygon or closed curve.
- Appendix B33 The image processing device according to appendix B17 or B18, wherein the boundary line can be moved, transformed, enlarged, or reduced by a user's operation.
- Appendix B34 The image processing apparatus according to Appendix B17 or B18, wherein the specific boundary line is used to detect entry into a blood vessel that the device is not desired to enter, a lesion site such as an aneurysm or stenosis, or an extravascular space.
- Appendix B35 The image processing device according to appendix B17 or B18, wherein the notification unit or the output unit notifies or outputs one or more regions for each of the regions of interest only when the condition is satisfied for the first time.
- Appendix B36 The image processing device according to appendix B17 or B18, wherein the notification unit or the output unit notifies or outputs one or more regions for each of the regions of interest only when the condition is satisfied for the first time after a specific point in time.
- Appendix B37
- Appendix B38 The image processing device according to appendix B17 or B18, wherein the notification unit or the output unit notifies or outputs one or more regions in the entire region of interest only when the condition is satisfied for the first time after a specific point in time.
- The image processing device according to appendix B17 or B18, wherein the notification unit or the output unit notifies or outputs one or more specific regions of interest specified by the user only when the condition is satisfied.
- Appendix B40 The image processing device according to appendix B17 or B18, wherein the notification unit or the output unit notifies or outputs one or more automatically identified regions of interest only when the condition is satisfied.
- Appendix B41 The image processing device according to appendix B17 or B18, wherein the specific boundary line is superimposed on the X-ray image.
- Appendix B42 The image processing apparatus according to appendix B17 or B18, wherein the specific boundary line is redrawn according to a change or movement of the range or magnification of the X-ray image.
- Appendix B43 The image processing apparatus according to appendix B17 or B18, wherein the particular boundary line is redrawn after interruption and reacquisition of the X-ray image.
- Appendix B44 The image processing device according to appendix B17, wherein the notification is performed by generating a warning sound or changing a border display format.
- Appendix B45 The image processing device according to appendix B17, wherein part or all of the notification, indicating which screen, which device, and which event, is delivered by voice.
- Appendix B46 The image processing apparatus according to appendix B17, wherein different notification methods are used depending on the direction in which the region of interest passes through a specific boundary line designated on the image.
- Appendix B47 The image processing device according to Appendix B17 or B18, wherein the direction in which the region of interest passes through a specific boundary line designated on the image can be automatically recognized according to the surgical situation, or can be automatically proposed so that the user can accept or reject it.
- Appendix B48 The image processing device according to appendix B17, wherein different notification methods are used depending on the speed at which the region of interest crosses the specific boundary line.
- Appendix B49 The image processing device according to Appendix B17, wherein the notification unit causes a display device that displays the image to display the distance between the region of interest and the specific boundary line.
- Appendix B50 The image processing device according to appendix B17, wherein the notification unit changes the display mode of the distance on the display device according to the size of the distance between the region of interest and the specific boundary line.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each region of interest; wherein a region of interest included in a region specified by a user on the image is acquired as the region of interest. The image processing device according to Appendix B1.
- Appendix B53 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; and an output unit configured to output information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; wherein a region of interest included in a region specified by a user on the image is acquired as the region of interest. The image processing device according to Appendix B1.
- Appendix B54 The image processing apparatus according to appendix B52 or B53, further comprising a tracking unit that tracks the region of interest in the image.
- Appendix B55 The image processing apparatus according to appendix B52 or B53, further comprising a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- Appendix B56 The image processing apparatus according to appendix B52 or B53, wherein notification or output is provided when the region of interest enters a blood vessel having a diameter below a threshold or a manually or automatically specified blood vessel.
- Appendix B57 The image processing apparatus according to appendix B52 or B53, wherein the area specified by the user on the image is represented by a rectangle, circle, or any closed curve.
- Appendix B58 The image processing device according to appendix B52 or B53, wherein candidate areas that can be specified by a user are automatically displayed.
- Appendix B59 The image processing apparatus according to appendix B52 or B53, wherein candidates for regions that can be specified by the user are automatically displayed according to surgical conditions.
- Appendix B60 The image processing device according to appendix B52 or B53, wherein the image temporarily becomes a still image when the user designates the area.
- Appendix B61 The image processing device according to appendix B52 or B53, wherein the image is temporarily delayed when the user designates the area.
- Appendix B62 The image processing device according to appendix B52 or B53, wherein a part of the image is enlarged and displayed when the user designates the area.
- Appendix B63 The image processing device according to appendix B52 or B53, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B64 The image processing device according to Appendix B63, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B65 The image processing device according to Appendix B63, which utilizes information of multiple screens to improve accuracy of processing.
- Appendix B66 The image processing device according to appendix B52 or B53, wherein only the region of interest to be confirmed is automatically selected when a region including a plurality of regions of interest is specified.
- Appendix B67 The image processing apparatus according to appendix B66, wherein the region of interest to be confirmed is determined based on the importance of the plurality of devices, the position of the region of interest within the screen, and the surgical scene.
- Appendix B68 The image processing apparatus according to Appendix B66, wherein the region of interest to be confirmed is determined according to the surgical situation.
- Appendix B69 The image processing apparatus according to appendix B52 or B53, wherein the specified region is automatically adjusted when the image is changed.
- the image processing device according to appendix B70, wherein the specific surgical situation is at least one selected from the group consisting of: surgical content; disease name; vascular leakage; vascular perforation; occurrence of a thrombus; disappearance of peripheral blood vessels; guidance of a guidewire, catheter, or balloon; suction by a catheter; stent placement; balloon expansion; coil insertion; insertion of an aneurysm embolization device; timing of cutting a coil; guidance, dwelling, or retrieval of a filter; injection of a liquid embolization material; injection of a contrast medium; continuation of a still image for a certain period of time or longer; distinction between a mask image and a live image; determination of the imaging region or angle; and switching of images.
- Appendix B72 a region-of-interest acquisition unit that acquires, as a region of interest, one or more regions including at least part of the device included in the image; a notification unit for notifying a user of the image processing apparatus that at least one of the regions of interest satisfies a condition defined for each region of interest;
- Appendix B74 The image processing device according to Appendix B70, wherein, when injection of a contrast agent is recognized, when a 3D image is being acquired, or when cone-beam CT (CBCT) is being acquired, acquisition of the region of interest is stopped, or the region of interest is acquired but no notification is given.
- Appendix B75 The image processing device according to appendix B70, which notifies a recommendation to reduce radiation exposure when continuation of a still image for a certain period of time or longer is recognized.
- Appendix B76 The image processing apparatus according to Appendix B70 or B71, further comprising a recording unit for recording recognized specific surgical situations.
- Appendix B80 The image processing device according to appendix B70 or B71, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B81 The image processing device according to Appendix B80, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B82 The image processing apparatus according to Appendix B80, which utilizes information from multiple screens to improve accuracy of processing.
- Appendix B83 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a recording unit that, when at least one of the regions of interest satisfies a condition defined for each region of interest, records the surgical situation corresponding to the condition.
- Appendix B84 The image processing device according to Appendix B83, wherein the condition corresponds to at least one procedure selected from the group consisting of: puncture, arrival of the guiding catheter at the treatment site, balloon guidance, balloon expansion, stent deployment, catheter guidance, first coil insertion, second and subsequent coil insertions, placement of the coil within the aneurysm, deviation of the coil into the parent vessel, catheter removal, coil removal, and treatment termination.
- Appendix B85 The image processing apparatus according to Appendix B83 or B84, wherein the recorded surgical situation includes still images, moving images, and/or text information describing the surgical situation.
- Appendix B86 The image processing device according to Appendix B83 or B84, wherein the moving image is a moving image of a fixed period before and after the time when the condition is satisfied.
- Appendix B87 The image processing apparatus according to Appendix B83 or B84, wherein the recorded information is modifiable by the user.
- Appendix B88 The image processing apparatus according to appendix B83 or B84, further including a report creation unit that creates a report based on the recorded information.
- Appendix B89 The image processing device according to Appendix B88, wherein the report creation unit creates a summary of the surgery based on the recorded still images, moving images, and/or text information describing the situation of the surgery.
- Appendix B90 The image processing apparatus according to appendix B83 or B84, further comprising a notification unit for notifying a user or an output unit for outputting when a specific surgical situation is recorded.
- Appendix B91 The image processing apparatus according to appendix B83 or B84, further comprising a notification unit for notifying a user or an output unit for outputting depending on the passage of time after the recording of a specific surgical situation.
- Appendix B92 The image processing apparatus according to Appendix B91, wherein the passage of time after a particular surgical situation is recorded is determined based on balloon inflation time, guidewire travel time, velocity, or acceleration, or coil travel time.
- Appendix B93 The image processing apparatus according to appendix B83 or B84, further comprising a notification unit for notifying a user or an output unit for outputting according to the number of times a particular surgical situation is recorded.
- Appendix B94 The image processing apparatus of Appendix B83 or B84, wherein the particular surgical situation is coil insertion.
- Appendix B95 The image processing device according to appendix B83 or B84, wherein the image is obtained from a moving image taken in real time or a moving image recorded in the past.
- Appendix B96 The image processing device according to appendix B83 or B84, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B97 The image processing device according to Appendix B96, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B98 The image processing apparatus according to Appendix B96, which utilizes information from multiple screens to improve accuracy of processing.
- Appendix B99 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; and a display unit that causes a display device connected to the image processing device to display the image including the region of interest and, beside the image, a mark corresponding only to the vertical coordinate of the region of interest. The image processing device according to Appendix B1.
- Appendix B100 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; an output unit that outputs information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; and a display unit that causes a display device connected to the image processing device to display the image including the region of interest and, beside the image, a mark corresponding only to the vertical coordinate of the region of interest. The image processing device according to Appendix B1.
- Appendix B101 The image processing apparatus according to Appendix B99 or B100, further comprising a tracking unit that tracks the region of interest in the image.
- Appendix B102 The image processing apparatus according to appendix B99 or B100, further comprising a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- Appendix B103 The image processing apparatus according to Appendix B99 or B100, wherein notification or output is provided when the region of interest enters a blood vessel having a diameter below a threshold or a manually or automatically specified blood vessel.
- Appendix B104 The image processing device according to appendix B99 or B100, wherein the trajectory of the mark is displayed for a certain period of time.
- Appendix B105 The image processing apparatus according to Appendix B99 or B100, wherein the display format of the mark is different for each region of interest.
- Appendix B106 The image processing device according to Appendix B99 or B100, wherein the display mode is color, shape, pattern, symbol, character, caption, or animation.
- Appendix B107 The image processing apparatus according to Appendix B99 or B100, which automatically recognizes and highlights a region of interest to be confirmed.
- Appendix B108 The image processing apparatus according to Appendix B107, wherein the region of interest to be confirmed is determined according to the surgical situation.
- Appendix B109 The image processing apparatus according to Appendix B99 or B100, wherein only a user-specified region of interest is displayed.
- Appendix B110 The image processing apparatus according to Appendix B99 or B100, wherein only the automatically selected region of interest is displayed.
- Appendix B111 The image processing apparatus according to Appendix B99 or B100, wherein the user is notified when the mark exceeds a user-specified boundary value.
- Appendix B112 The image processing device according to appendix B99 or B100, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B113 The image processing device according to Appendix B112, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B114 The image processing device according to appendix B112, which uses information of a plurality of screens to improve accuracy of processing.
- Appendix B115 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; and, when a plurality of the regions of interest are tracked, a display unit that displays the plurality of regions of interest in different display formats. The image processing device according to Appendix B1.
- Appendix B116 An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each of the regions of interest in the image; an output unit that outputs information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest; and, when a plurality of the regions of interest are tracked, a display unit that displays the plurality of regions of interest in different display formats. The image processing device according to Appendix B1.
- Appendix B117 The image processing apparatus according to appendix B115 or B116, further comprising a blood vessel recognition unit that recognizes at least part of blood vessels in the image.
- Appendix B118 The image processing apparatus according to Appendix B117, wherein notification or output is provided when the region of interest enters a blood vessel having a diameter less than or equal to a threshold or a manually or automatically specified blood vessel.
- Appendix B119 The image processing device according to appendix B115 or B116, wherein the display mode is color, shape, pattern, symbol, character, caption, or animation.
- Appendix B120 The image processing device according to appendix B115 or B116, which automatically recognizes and highlights a region of interest to be confirmed.
- Appendix B121 The image processing apparatus according to Appendix B120, wherein the region of interest to be confirmed is determined according to surgical conditions.
- Appendix B122 The image processing apparatus according to appendix B115 or B116, wherein only a user-specified region of interest is displayed.
- Appendix B123 The image processing apparatus according to Appendix B115 or B116, wherein only the automatically selected region of interest is displayed.
- Appendix B124 The image processing device according to appendix B115 or B116, wherein when a display device for displaying the image has a plurality of screens, each screen is processed.
- Appendix B125 The image processing device according to Appendix B124, wherein the plurality of screens display two or more images of a front live image, a side live image, a front mask image, and a side mask image.
- Appendix B126 The image processing device according to Appendix B124, which utilizes information of multiple screens to improve accuracy of processing.
- Appendix B127 A method for creating a machine learning model, characterized by performing training using cerebrovascular images acquired from multiple directions as input.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user when the region of interest includes a guidewire and the length or segmentation area of the guidewire within a specific region including the tip of the guidewire exceeds a certain threshold.
- An image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and an output unit that outputs information to a device connected to the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the output unit outputs information when the region of interest includes a guidewire and the length or segmentation area of the guidewire within a specific region including the tip of the guidewire exceeds a certain threshold.
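The guidewire length/segmentation-area condition above can be illustrated with a short sketch. This is not part of the disclosed embodiments; the function name, the boolean-mask input format, and the pixel-count criterion are assumptions for illustration only.

```python
import numpy as np

def guidewire_alert(seg_mask, tip_box, area_threshold):
    """Count guidewire segmentation pixels inside a specific region that
    includes the guidewire tip, and report whether the area exceeds a
    threshold (the notification/output condition described above).

    seg_mask: 2-D boolean array, True where a pixel is classified as
    guidewire; tip_box: (row1, col1, row2, col2) bounds of the region.
    """
    r1, c1, r2, c2 = tip_box
    area = int(seg_mask[r1:r2, c1:c2].sum())  # pixel count = segmentation area
    return area > area_threshold
```

The same structure would apply if guidewire length, rather than pixel area, were measured within the region.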
- Reference Signs List: 1 Image processing device; 10 Memory unit; 11 Control unit; 110 Image acquisition unit; 111 Region-of-interest acquisition unit; 112 Tracking unit; 113 Notification unit; 114 Marker detection unit; 115 Distance measurement unit; 116 Output unit; 117 Video storage unit; 118 Video extraction unit; 119 State estimation unit; 2 Display device; 3 X-ray imaging device; 4 External device; 20 CPU; 21 ROM; 22 RAM; 23 Storage; 24 Input/output interface; 25 Input unit; 26 Output unit; 27 Storage medium; 30 X-ray irradiation unit; 31 X-ray detection unit; 32 Bed; D Device; E Embolization coil; P Subject; S Image processing system
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Vascular Medicine (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
Description
FIG. 1 is a diagram schematically showing the appearance of an image processing system S according to an embodiment. The image processing system S includes an image processing device 1, a display device 2, and an X-ray imaging device 3. An overview of the embodiment is described below with reference to FIG. 1.
FIG. 2 is a diagram schematically showing the functional configuration of the image processing device 1 according to the embodiment. The image processing device 1 includes a memory unit 10 and a control unit 11. In FIG. 2, the arrows indicate the main data flows; there may be data flows not shown in FIG. 2. In FIG. 2, each functional block represents a functional unit rather than a hardware (device) unit. Accordingly, the functional blocks shown in FIG. 2 may be implemented in a single device or distributed across a plurality of devices. Data may be exchanged between the functional blocks via any means, such as a data bus, a network, or a portable storage medium. FIG. 33 is a diagram schematically showing the functional configuration of an image processing device 1 according to another embodiment. FIG. 33 shows that output is performed from the output unit 116 to the external device 4.
The detection method is not limited to machine learning. In the case of machine learning, for example, the input data for training may include actual surgery and examination videos, videos from an angiography apparatus using a phantom, and virtually created surgery videos. Still images may be used instead of videos. Virtual surgery videos may be created manually, generated by an algorithm that produces surgery-like images, or generated using Generative Adversarial Networks (GANs) or the like.
The image processing device 1 according to the embodiment can recognize not only the region of interest including the device but also blood vessels. The method is the same as the device recognition described above. Machine learning / deep learning (AI) or rule-based methods are used, but the method is not limited to these. For example, blood vessels are annotated (for example, filled in by segmentation) in live or mask videos or still images, and a deep learning model is trained on them. Alternatively, since the images are grayscale (monochrome), a threshold may be set to extract only the white or black areas. Blood vessels may be extracted by contour extraction. For videos, frame averaging may be applied. Blood vessels are extracted by a combination of these. The vessels may be recognized from one mask image or from multiple mask images over time, or from one or more contrast-enhanced images. Furthermore, the frontal image and the lateral image may be recognized together; recognition from images in two directions is expected to improve recognition accuracy.
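The rule-based part of the extraction described above (intensity thresholding combined with frame averaging) can be sketched as follows. This is an illustrative sketch only, not the embodiment itself; the function name, the fixed threshold value, and the assumption that vessels appear dark are hypothetical.

```python
import numpy as np

def extract_vessel_mask(frames, dark_threshold=80):
    """Extract a candidate vessel mask from grayscale angiography frames.

    frames: sequence of 2-D uint8 images in which contrast-filled vessels
    appear dark. Frames are averaged over time to suppress noise, then a
    simple intensity threshold keeps only the dark (vessel-like) pixels.
    Contour extraction or learned segmentation could refine this mask.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    averaged = stack.mean(axis=0)        # temporal averaging
    return averaged < dark_threshold     # boolean vessel candidate mask
```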
When recognizing blood vessels, anatomical information can be inferred at the same time. For example, the vessels may be recognized with left/right identification or by anatomical classification such as the internal carotid artery, external carotid artery, middle cerebral artery, posterior cerebral artery, posterior communicating artery, and anterior communicating artery. The internal carotid artery may be recognized as segments C1 to C5, and the middle cerebral artery as M1, M2 (superior trunk, inferior trunk, anterior temporal artery, etc.), M3, and so on. Furthermore, in addition to segmentation (filled-in, 2D), the vessels may be converted into a tree-like (linear, 1D) representation, for example by extracting the centerline. The recognition methods and representations are the same as those described above.
Lesions may also be recognized when recognizing blood vessels. For example, cerebral aneurysms, stenoses, cerebral arteriovenous malformations, occlusions, and the like may be recognized. The recognition methods and representations are the same as those described above.
Blood vessels that are important in surgery may be extracted manually or automatically and highlighted, or only those vessels may be displayed. Automatic extraction may, for example, select vessels whose diameter exceeds a threshold, extract main vessels such as the internal carotid artery based on the vascular anatomy recognition described above, or identify an aneurysm or stenosis based on the lesion recognition described above and extract only the vessels along it, but is not limited to these.
A specific example of blood vessel extraction is shown in FIG. 32. Such blood vessel extraction can be performed automatically.
Endovascular treatment is characteristically performed while viewing up to four X-ray images (two directions, frontal and lateral, times mask and live images). Therefore, training may use two to four of the four screens as input, rather than a single screen. Alternatively, an image generated by appropriately combining the four screens (for example, by taking differences) may be used as the input image. Alternatively, since a mask image is derived from the difference between a previously captured DSA image (digital subtraction angiography: the difference between live images before and after contrast injection) and a real-time DA (digital angiography) image, these underlying source images may be used as input images.
Faster R-CNN is a CNN (Convolutional Neural Network) that performs region proposal and recognition simultaneously. A convolutional neural network (CNN) is a deep neural network built by stacking layers with several characteristic functions, such as convolution layers and pooling layers, and it performs particularly well in the field of image recognition. Faster R-CNN can extract and recognize regions of interest from an input image in near real time (about 10 to 20 frames per second). Faster R-CNN enables end-to-end learning from image input to object detection.
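A detector of this kind returns candidate boxes with class labels and confidence scores; turning those detections into regions of interest is then a simple filtering step. The sketch below assumes a hypothetical output format (a list of dicts with "box", "label", and "score" keys) and is not tied to any specific Faster R-CNN implementation.

```python
def select_regions_of_interest(detections, score_threshold=0.5):
    """Keep only detector outputs whose confidence meets the threshold.

    detections: list of dicts such as
      {"box": (x1, y1, x2, y2), "label": "guidewire_tip", "score": 0.93}
    Returns the detections to be tracked as regions of interest.
    """
    return [d for d in detections if d["score"] >= score_threshold]
```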
Next, specific examples of the conditions set for the region of interest R will be described.
Next, as another example of a condition set for the region of interest R, a condition relating to the guiding catheter that guides the delivery wire of an embolization coil will be described.
Next, as yet another example of a condition set for the region of interest R, a condition relating to the shape of the device D (for example, a guidewire) will be described.
An aneurysm embolization catheter carries a tip marker (1st marker) and a 2nd marker (usually located 3 cm from the tip). It is important for the operator to know where the catheter tip is within the aneurysm in order to perform the surgery safely, but once coils enter the aneurysm, the position of the tip marker becomes difficult to see and safety decreases (see FIG. 17). For example, if the tip marker moves deep into the aneurysm, the tip, or a coil emerging from it, may perforate the aneurysm, leading to subarachnoid hemorrhage, a serious complication. Conversely, if the tip marker is about to exit the aneurysm, the catheter or coil may deviate out of the aneurysm and must be reinserted into it, an operation that carries a risk of perforating the aneurysm wall.
FIG. 22 is a flowchart for explaining the flow of the catheter tip position estimation process using the 2nd marker, executed by the image processing device according to the embodiment. The process in this flowchart starts, for example, when the image processing device 1 is started, or when the user or the image processing device 1 determines that the process needs to be started.
Because medical staff focus on a specific location during a procedure, if the system according to the present disclosure issues a notification while they are not watching the device for which the warning was issued, or while they are looking at another screen, they must try to grasp the situation only after the warning has occurred. However, since real-time video is updated moment by moment, it is often difficult to check the details around the time the warning was issued. Depending on the nature of the warning, it may also be necessary to grasp the details of the device movement itself, the difference between the state at the time of the warning and the present state, the elapsed time since the warning, and so on.
FIG. 23 is a flowchart for explaining the flow of the replay function process executed by the image processing device according to the embodiment. The process in this flowchart starts, for example, when the image processing device 1 is started.
It is dangerous for a region of interest such as a guidewire tip or guiding catheter tip to move out of the X-ray field of view (frame out), but the degree of danger depends on the amount of movement. Specifically, for example, if the guidewire tip goes only slightly out of the frame (for example, within 5 mm), the possibility of vessel perforation is low, but if it goes far out of the frame (for example, 20 mm or more), the risk of vessel perforation is high. When a region of interest frames out, it must be pulled back into the X-ray field of view, but depending on the situation it may not be possible to respond immediately. In such cases, it is important to know how far the framed-out region of interest has moved outside the X-ray field of view and how dangerous that movement is. Therefore, some aspects of the present invention relate to a device that estimates and displays the position, velocity, and acceleration of a framed-out region of interest.
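One way to realize such estimation is constant-acceleration extrapolation from the last state observed before the region left the frame. The following is an illustrative sketch under that assumption; it is not the claimed state estimation unit itself, and the function name and units are hypothetical.

```python
def extrapolate_out_of_frame(last_pos, last_vel, last_acc, dt):
    """Estimate the current position and velocity of a framed-out region
    of interest from its last observed position, velocity, and
    acceleration, assuming constant acceleration for dt seconds.

    All vectors are (x, y) tuples, e.g. in pixels, px/s, and px/s^2.
    """
    pos = tuple(p + v * dt + 0.5 * a * dt * dt
                for p, v, a in zip(last_pos, last_vel, last_acc))
    vel = tuple(v + a * dt for v, a in zip(last_vel, last_acc))
    return pos, vel
```

The estimated position can then be compared against a threshold distance from the frame edge to decide whether to warn the user.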
FIG. 24 is a flowchart for explaining the flow of the out-of-frame region-of-interest position estimation process executed by the image processing device according to the embodiment. The process in this flowchart starts, for example, when the image processing device 1 is started.
In some aspects of the present invention, the results of image analysis can be displayed on two screens of different sizes. As described above, endovascular surgery is generally performed while viewing multiple screens (for example, four), and the operator grasps three-dimensional information by viewing at least two screens (generally the frontal view (AP or F direction: Anterior-Posterior, Frontal) and the lateral view (RL or LAT direction: Right to Left, Lateral)). Displaying the frontal and lateral images is therefore important, and displaying them legibly on monitors of physically limited size is extremely important. In actual surgery, the monitors are often placed more than 1 m away, beyond the patient's bed, and instructions are often given to bring a monitor even 1 cm closer to the operator. Note that the frontal and lateral views are angled three-dimensionally; for example, the frontal tube is routinely angled, say 15 degrees to the right and 10 degrees cranially, to view obliquely. In this way, even though three dimensions are projected into two, the view is angled so that the anatomy remains clear in two dimensions. Furthermore, in the four-screen case, the screens are the frontal Live and Mask and the lateral Live and Mask. Live is an ordinary fluoroscopic image, similar to a common radiograph, viewed in real time. Mask is the difference (subtraction) from an arbitrary past Live image selected by the operator. As a result, the bones visible in Live disappear and, for example, only the contrast-filled vessels and the devices remain visible, giving the operator an easily understood image.
In some aspects of the present invention, since the region of interest in the screen is output as a probability distribution, the presence of the region of interest can also be expressed as a probability. The probability distribution of the region where it exists may be displayed as a numerical value, a color, a bar, or the like. Furthermore, whether a scene warrants notification can also be expressed as a probability, and may be displayed as a numerical value, color, bar, or the like corresponding to that probability. In addition, which part of the screen is responsible may be displayed as a probability distribution, for example as a heat map.
The probability distribution may be converted for display so that it is easy to understand. For example, 0-30% may be displayed as "low", 30-70% as "middle", and 70-100% as "high", either as text or in three colors. As another example, regions below 70% may be slightly darkened so that the region of interest or the region requiring notification appears brightly lit, like a spotlight.
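The three-level conversion described above amounts to a small mapping function. A sketch follows; the band boundaries mirror the example percentages and are otherwise arbitrary.

```python
def probability_label(p):
    """Map a probability in [0, 1] to the three-level display described
    above: below 30% -> "low", 30-70% -> "middle", 70% and above -> "high".
    """
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    if p < 0.30:
        return "low"
    if p < 0.70:
        return "middle"
    return "high"
```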
There are various devices used in endovascular treatment, and many types of each. Examples include various catheters, balloon catheters, guidewires, stents, flow-diverter stents (stents with a fine mesh), coils, embolic materials (liquids, particles, and the like), and other embolization devices (such as WEB). Each device also comes in many varieties. Catheters, for example, are specified by tip shape, length, inner lumen, outer diameter, stiffness, and so on. Coils are specified by manufacturer, thickness, total length, diameter, stiffness, and so on, and hundreds of types exist. It is impossible to remember all of them, and since it is unclear whether they are in stock, the operator proceeds with treatment while confirming with the vendor during surgery. Combinations are also important, and coil sizes are judged while viewing the images. Since typically about 5 to 15 coils are used for an aneurysm, the operator must consider which coil to use next, but it is difficult for the operator and assistants to remember the inventory and lineup. New products appear and old products are discontinued, making the situation difficult to track, and the lineup varies by facility. Currently, devices are selected by communicating with the vendor during treatment, but this is not smooth.
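Narrowing a device lineup by size and stock, as motivated above, is a straightforward filtering task. The sketch below is illustrative only; the inventory record keys ("name", "diameter_mm", "in_stock") are hypothetical.

```python
def filter_coils(inventory, max_diameter_mm=None, in_stock_only=True):
    """Narrow a coil product list by size and availability.

    inventory: list of dicts with hypothetical keys
      "name", "diameter_mm", "in_stock".
    Returns the records that pass both filters.
    """
    items = list(inventory)
    if max_diameter_mm is not None:
        items = [c for c in items if c["diameter_mm"] <= max_diameter_mm]
    if in_stock_only:
        items = [c for c in items if c["in_stock"]]
    return items
```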
For example, when a device is inserted to treat a cerebral aneurysm, the operator does not know when the device will enter the X-ray field of view, and must therefore keep emitting X-rays and watching the screen while advancing the device; normally the device is advanced quickly until it becomes visible and slowly thereafter. If the appearance of the device is overlooked and the speed is not reduced, there is a risk of vessel perforation or dissection. Until the device becomes visible, X-rays are emitted while the screen essentially does not change, increasing the radiation exposure of the patient, operator, assistants, and medical staff. Therefore, in some aspects of the present invention, the image processing device can notify the user (operator) on condition that a device newly appearing in the image, for example at the edge of the image, within a certain distance of the edge, or within a specific region, is newly detected as a region of interest (FIG. 25). More specifically, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image for intravascular examination or treatment; a region-of-interest acquisition unit that, when the image includes an intravascular examination or treatment device, acquires one or more regions including at least part of the device as regions of interest; and a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user on condition that a device, or part of a device, newly appearing in the image or in a specific region is newly detected as a region of interest. This can reduce the burden on the operator performing endovascular treatment.
For example, while a coil is being inserted into an aneurysm, the guiding catheter or an intermediate catheter may move, a stent may move, a balloon may move, or the tip of the guidewire guiding the balloon may move, but because the operator is focused on the aneurysm, such movements and changes go unnoticed. In some aspects of the present invention, the image processing device can notify the user (operator) on condition that a device newly appearing in the image, for example within a specific region, is newly detected as a region of interest (FIG. 37). More specifically, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image for intravascular examination or treatment; a region-of-interest acquisition unit that, when the image includes an intravascular examination or treatment device, acquires one or more regions including at least part of the device as regions of interest; and a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user on condition that a device, or part of a device, newly appearing in the image or in a specific region is newly detected as a region of interest. This can reduce the burden on the operator performing endovascular treatment. In addition, in some aspects of the present invention, the image processing device may further include a tracking unit that tracks the region of interest in the image. Here, the notification unit can notify the user on condition that a device newly appearing at the edge of the image, within a certain distance of the edge, or within a specific region is newly detected as a region of interest.
FIG. 26 is a flowchart for explaining the flow of the process, executed by the image processing device according to the embodiment, of recognizing a device that has newly entered the screen. The process in this flowchart starts, for example, when the image processing device is started.
(Notification based on designation of a specific boundary)
In some aspects of the present invention, the image processing device can notify the user (operator) on condition that a region of interest passes, or is predicted to pass, a specific boundary designated on the image. More specifically, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the notification unit notifies the user on condition that the region of interest passes, or is predicted to pass, a specific boundary designated on the image. This allows the conditions under which the image processing device issues notifications to be set more flexibly (FIGS. 27 and 28). The prediction of passage can be made based on the distance between the region of interest and the boundary, or on that distance divided by the velocity of the region of interest. This image processing device may further include a tracking unit that tracks each region of interest in the image. In some aspects of the present invention, instead of or in addition to the notification unit, there may be an output unit that outputs information to a device connected to the image processing device.
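The passage prediction based on distance divided by velocity can be sketched as a time-to-crossing test. The horizon parameter below is a hypothetical tuning value, not taken from the disclosure.

```python
def predict_boundary_crossing(distance_px, speed_px_per_s, horizon_s=1.0):
    """Predict whether a tracked region of interest will cross a designated
    boundary within horizon_s seconds, using distance / speed as the
    estimated time to crossing.

    distance_px: current distance to the boundary (<= 0 means already on
    or past it); speed_px_per_s: speed toward the boundary.
    """
    if distance_px <= 0:
        return True          # already crossed or touching the boundary
    if speed_px_per_s <= 0:
        return False         # stationary or moving away: no crossing
    return distance_px / speed_px_per_s <= horizon_s
```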
For example, when the user (operator) needs to designate a device on the screen during surgery with a pointing device such as a mouse or touch panel, it is not easy to pinpoint a moving device during the procedure. For example, a device may move unexpectedly during catheter treatment. If, instead, a device can be selected by enclosing a certain area on the screen with an arbitrary figure such as a rectangle, the device is less likely to be missed even while it is moving. Therefore, in some aspects of the present invention, the image processing device can acquire, as a region of interest, a region of interest contained within an area designated by the user on the image. More specifically, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein a region of interest contained within an area designated by the user on the image is acquired as the region of interest. This image processing device may further include a tracking unit that tracks each region of interest in the image. In some aspects of the present invention, instead of or in addition to the notification unit, there may be an output unit that outputs information to a device connected to the image processing device.
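Selection by enclosing an area, as described above, reduces to a point-in-rectangle test against the tracked device positions. An illustrative sketch follows; the function name and tuple formats are assumptions.

```python
def devices_in_user_box(device_centers, box):
    """Return the indices of tracked devices whose center lies inside a
    user-drawn rectangle, so a moving device can be selected by enclosing
    it rather than clicking it precisely.

    device_centers: list of (x, y) tuples; box: (x1, y1, x2, y2).
    """
    x1, y1, x2, y2 = box
    return [i for i, (x, y) in enumerate(device_centers)
            if x1 <= x <= x2 and y1 <= y <= y2]
```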
During real-time device recognition in surgery, running inference in scenes such as angiography, puncture, or noise occurrence can produce incorrect recognition results. It is therefore desirable to automatically recognize specific surgical situations and respond accordingly, for example by stopping device recognition. Therefore, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; and a situation recognition unit that recognizes a specific surgical situation based on the image.
Surgical videos are important for record keeping, for the education of medical staff, and for explanations to patients and their families. In the case of cerebrovascular surgery, however, there are videos corresponding to up to four screens, each one to two hours long, so viewing and editing take time. It is therefore desirable to use AI to automatically detect events and classify scenes, to save videos for surgical records, education, and explanation, and to save important images as still images. Therefore, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a recording unit that, when at least one of the regions of interest satisfies a condition defined for each region of interest, records the surgical situation corresponding to that condition.
In the case of real-time surgical support, when setting the ROI, the boundary (ROI) may be proposed, shown, removed, moved, enlarged, or reduced according to the scene classification. The presence and type of recognition and notification may also be changed. For example, when a scene is classified as contrast injection in progress, recognition or notification may be stopped. This is because, in general, the purpose of contrast injection is to view the blood vessels, devices rarely move at that time, and the need for notification is low. On the other hand, notification may still be issued, because, although rare, devices can move during contrast injection.
In images containing multiple devices, grasping the situation at a glance can become difficult as information on the screen increases. This problem is solved by providing, next to the main surgical screen, a sub-screen that displays only the vertical positions of the devices (FIG. 29). Therefore, some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest; and a display unit that causes a display device connected to the image processing device to display the image including the region of interest and, beside the image, marks corresponding only to the vertical coordinates of the regions of interest. This image processing device may further include a tracking unit that tracks each region of interest in the image. In some aspects of the present invention, instead of or in addition to the notification unit, there may be an output unit that outputs information to a device connected to the image processing device.
Some aspects of the present invention relate to an image processing device comprising: an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; a tracking unit that tracks each region of interest in the image; a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest; and, when multiple regions of interest are tracked, a display unit that displays the multiple regions of interest in different display formats. In such aspects, because multiple regions of interest are displayed in different display formats, individual devices are easy to distinguish. The display format is, for example, color, shape, pattern, symbol, character, caption, or animation.
Some aspects of the present invention relate to a vascular catheter surgery support system, for example a catheter surgery support system for the blood vessels of the brain, heart, peripheral limbs, and abdomen, in particular the cerebral blood vessels. Such a system comprises an image processing device and an imaging device that captures X-ray images of a patient with one or more devices inserted in a blood vessel and transmits them to the image processing device, wherein the image processing device comprises: an image acquisition unit that acquires, over time, X-ray images of a region (for example, a fixed region) including at least a site of attention for achieving the purpose of the surgery and a device inserted in the blood vessel; a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions including at least part of the device included in the image; and a notification unit that notifies the user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest, wherein the one or more devices are a catheter, a guidewire, a stent, and/or a balloon, and, when a region including the tip of a catheter or guidewire, both ends of a stent, or both ends of a balloon is set as the region of interest, the system notifies the user on condition that the region of interest disappears from the image, that the distance between the region of interest and the edge of the image becomes less than a predetermined threshold distance, or that the region of interest has shifted by a certain distance. The system may further include a tracking unit that tracks each region of interest in the image. In some aspects of the present invention, instead of or in addition to the notification unit, there may be an output unit that outputs information to a device connected to the image processing device.
FIG. 10 is a flowchart for explaining the flow of the image analysis process executed by the image processing device 1 according to the embodiment. The process in this flowchart starts, for example, when the image processing device 1 is started.
In angiographic examination and treatment, blood vessels are imaged with contrast and lesions are diagnosed, but findings may be overlooked, and judgments must be made by comparison with images taken earlier the same day or on another day, which can be time-consuming or difficult. Because the images are two-dimensional projections, judgment can also be difficult. Therefore, some aspects of the present invention relate to an image diagnosis device that, in contrast angiography, uses deep learning or the like to point out lesions or sites including, but not limited to, cerebral aneurysms, stenoses, occlusions, thrombus formation, vessel perforation (contrast extravasation), shunt diseases, feeding vessels and tumor staining of tumor vessels, venous thrombosis, avascular areas in the capillary phase (a finding of vessel occlusion), and collateral circulation. Accordingly, in some aspects of the present invention, the image processing device can further include a lesion recognition unit that recognizes a lesion in the image selected from the group consisting of: aneurysm; stenosis; vasospasm; dissection; occlusion; recanalization; thrombus formation; the site of a thrombus and the positions of its two ends; vessel perforation; extravasation of contrast medium; vascular calcification; arteriosclerosis; shunt disease and its feeding and draining vessels; reflux of blood (contrast medium); cerebral arteriovenous malformation; dural arteriovenous fistula; avascular area; bony landmarks (internal auditory canal, fundus, supraorbital margin, petrous part, foramen magnum, cervical vertebrae, clavicle, rib and vertebra numbers, femoral head, pelvis); feeding vessels and tumor staining of tumor vessels; venous occlusion; venous sinus thrombosis; avascular area in the capillary phase; vessel occlusion; the shape and distribution of coils within an aneurysm; the position, inflation, and shape of a balloon; deviation of a coil into a normal vessel; underexpansion of a stent, its degree of apposition to the vessel, or its twisting; stent migration; the positions of both ends of a stent; the positional relationship between the puncture site and the vessel (whether there is stenosis or proximity to a bifurcation); vessel tortuosity; the type of aortic arch (how far below the top of the aortic arch the right brachiocephalic artery lies); the penetration range of a liquid embolic material; delay or stagnation of blood (contrast) flow; vascular variations (the presence and degree of development of the anterior communicating artery, anterior cerebral artery A1, posterior communicating artery, posterior cerebral artery P1, posterior inferior cerebellar artery, anterior inferior cerebellar artery, superior cerebellar artery, superficial temporal artery, each venous sinus, and each vein); moyamoya vessels (stenosis/occlusion of the terminal internal carotid artery and development of collateral circulation beyond it); the positions of arterial bifurcations and segments (the petrous, cavernous, and ophthalmic segments of the internal carotid artery; the bifurcation of the middle cerebral artery M1); traces of previous surgery (clips, coils, plates, shunt tubes and valves, ventricular tubes, cisternal tubes); the position and degree of opening of a WEB device; foreign bodies (dentures, plates); and collateral circulation. Rather than making a definitive indication, the device may notify when an abnormal finding is suspected, or indicate the region that may be abnormal; in that case, the final judgment can be made by the physician. Similarly, some aspects of the present invention relate to an image diagnosis device that, when angiography is performed, points out changes compared with a previous acquisition or an acquisition on another day. Accordingly, in some aspects of the present invention, the image processing device can further include an image recognition unit that compares an angiographic image in the image with a previously acquired and stored angiographic image and notifies of changes. For example, it can point out changes in the degree of vasospasm, changes in thrombus formation (appearance, disappearance, enlargement, shrinkage, etc.), reopening of an occluded vessel, occlusion of a vessel, coil deviation, stent migration, and the like.
As described above, the image processing device 1 according to the embodiment can provide, for catheter-based vascular examination or treatment, a technique that allows the user, a medical professional, to concentrate on the work at the site of attention and that supports judgments about the site of attention.
The above description has mainly concerned examination and treatment of cerebral blood vessels. However, the present invention is not limited to cerebral blood vessels and can be applied to intravascular examination and treatment in the cardiovascular domain, including the heart, peripheral limbs, and abdomen.
The above description used a two-screen device such as that in FIGS. 7(a)-(b) as an example, but the number of screens is not limited to this and may be, for example, one, or three or more.
The above description concerned the case where the X-ray imaging device 3 captures images of the surgical site of the subject P. However, the imaging device that captures images of the surgical site of the subject P is not limited to the X-ray imaging device 3. For example, images of the surgical site may be captured using a modality such as MRI (Magnetic Resonance Imaging) or an ultrasound imaging device.
Some or all of the above embodiments may also be described as in the following appendices, but the disclosure of the present application is not limited to them.
(Appendix 1)
An image processing device comprising:
an image acquisition unit that acquires an image including at least an intravascular examination or treatment device as a subject;
a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least part of the device included in the image;
a tracking unit that tracks each of the regions of interest in the image; and
a notification unit that notifies a user of the image processing device when at least one of the regions of interest satisfies a condition defined for each region of interest.
(Appendix 2)
The image processing device according to Appendix 1, wherein, when a region including the tip of a catheter or the tip of a guidewire is set as the region of interest, the notification unit notifies the user on condition that the region of interest disappears from the image.
(Appendix 3)
The image processing device according to Appendix 1, wherein, when a region including the tip of a catheter or the tip of a guidewire is set as the region of interest, the notification unit notifies the user on condition that the distance between the region of interest and the edge of the image becomes less than a predetermined threshold distance.
(Appendix 4)
The image processing device according to any one of Appendices 1 to 3, wherein the notification unit notifies the user on condition that at least one of the movement distance, movement velocity, and acceleration of the region of interest within the image exceeds a predetermined threshold.
(Appendix 5)
The image processing device according to any one of Appendices 1 to 4, wherein the notification unit causes a display device that displays the image to display the distance between the region of interest and the edge of the image.
(Appendix 6)
The image processing device according to Appendix 5, wherein the notification unit changes the display mode of the distance on the display device according to the magnitude of the distance between the region of interest and the edge of the image.
(Appendix 7)
The image processing device according to any one of Appendices 1 to 6, wherein the notification unit notifies the user on condition that the value obtained by dividing the distance between the region of interest and the edge of the image by the movement velocity of the region of interest within the image falls below a predetermined threshold.
(Appendix 8)
The image processing device according to any one of Appendices 1 to 7, further comprising a marker detection unit that detects a marker provided on the delivery wire of an embolization coil, the marker approaching a region of interest set on part of the microcatheter guiding the delivery wire,
wherein the tracking unit further tracks the detected marker, and
the notification unit, triggered by the marker overlapping the region of interest, notifies the user of the timing at which the embolization coil may be detached from the delivery wire.
(Appendix 9)
The image processing device according to Appendix 8, wherein, when the marker has passed the region of interest, the notification unit notifies the user of that fact.
(Appendix 10)
The image processing device according to Appendix 8 or 9, wherein the notification unit causes a display device to display the distance the marker should move before the embolization coil is detached from the delivery wire.
(Appendix 11)
The image processing device according to any one of Appendices 1 to 10, wherein, when a feature quantity indicating the shape of the device included in the region of interest satisfies a predetermined condition, the notification unit notifies the user of the image processing device of that fact.
(Appendix 12)
The image processing device according to Appendix 11, wherein the feature quantity is curvature, and the notification unit notifies the user on condition that the curvature of the device included in the region of interest exceeds a predetermined threshold curvature, or that the curvature is changing while the tip does not move.
(Appendix 13)
The image processing device according to Appendix 11 or 12, wherein the notification unit notifies the user on condition that the value obtained by subtracting the length of the centerline of the blood vessel included in the image or region of interest from the length of the device included in the image or region of interest exceeds a predetermined threshold length.
(Appendix 14)
The image processing device according to any one of Appendices 1 to 13, wherein the notification unit notifies the user by displaying the region of interest in a color different from the image, changing the font, size, or color of displayed characters, changing the color of all or part of the screen of the display device, displaying a figure on all of the screen, outside the frame, or in part of the screen of the display device, displaying the region of interest enlarged, or changing the color or size of a mark attached to the region of interest.
(Appendix 15)
The image processing device according to any one of Appendices 1 to 14, wherein, when a region including the tip of a catheter or the tip of a guidewire is set as the region of interest, the notification unit notifies the user on condition that the region of interest has moved or that the region of interest has crossed a specific range designated on the image.
(Appendix 16)
The image processing device according to any one of Appendices 1 to 15, further comprising a video storage unit that stores, over time, images or videos obtained from the image acquisition unit.
(Appendix 17)
The image processing device according to Appendix 16, further comprising a video extraction unit that extracts, from the video storage unit, video of a fixed period before and after the notification unit issued a notification, or video of an arbitrary time or period designated by the user.
(Appendix 18)
The image processing device according to Appendix 17, wherein the extraction period of the video is determined automatically based on at least one of the movement distance, movement velocity, and acceleration of the region of interest at the time the notification occurred.
(Appendix 19)
The image processing device according to Appendix 17 or 18, wherein the extracted video is displayed on a display device.
(Appendix 20)
The image processing device according to Appendix 19, wherein the extracted video is automatically displayed repeatedly a predetermined number of times.
(Appendix 21)
The image processing device according to Appendix 20, wherein the extracted video is displayed based on arbitrary operations including play, stop, fast-forward, rewind, frame-by-frame advance, slow playback, and double-speed playback.
(Appendix 22)
The image processing device according to any one of Appendices 17 to 21, wherein the elapsed time since the notification occurred, a comparison of the position of the region of interest at the time of the notification and after an arbitrary period, or the trajectory of the region of interest acquired by the tracking unit is further displayed superimposed on the extracted video.
(Appendix 23)
The image processing device according to any one of Appendices 17 to 22, wherein the extracted video is displayed by cutting out a partial region near the region of interest.
(Appendix 24)
The image processing device according to any one of Appendices 17 to 23, wherein the extracted video is displayed at a position that does not obstruct the display of the region of interest.
(Appendix 25)
The image processing device according to any one of Appendices 17 to 24, wherein the extracted video is displayed enlarged.
(Appendix 26)
The image processing device according to any one of Appendices 17 to 25, wherein the extracted video is displayed at the same time as the notification occurs or after a predetermined time has elapsed since the notification occurred.
(Appendix 27)
The image processing device according to any one of Appendices 17 to 26, wherein videos captured from a plurality of directions are displayed simultaneously.
(Appendix 28)
The image processing device according to any one of Appendices 2 to 27, further comprising a state estimation unit that estimates the current position and/or velocity of the catheter tip or guidewire tip that has disappeared from the image, based on the position, velocity, and/or acceleration of the catheter tip or guidewire tip immediately before the region of interest disappeared from the image.
(Appendix 29)
The image processing device according to Appendix 28, wherein the user is warned when the current position and/or velocity of the region of interest estimated by the state estimation unit exceeds a predetermined threshold.
(Appendix 30)
The image processing device according to any one of Appendices 1 to 29, wherein a display device that displays the image displays two images on two screens of different sizes.
(Appendix 31)
The image processing device according to Appendix 30, wherein the display device draws the user's attention by lighting up, changing the color of, or highlighting the frame of one of the two screens.
(Appendix 32)
The image processing device according to any one of Appendices 1 to 31, wherein a display device that displays the image displays a product list of intravascular examination or treatment devices.
(Appendix 33)
The image processing device according to Appendix 32, wherein the display device displays a product list narrowed down by size or stock.
(Appendix 34)
The image processing device according to Appendix 32 or 33, wherein the display device displays a list of recommended products based on image analysis results, facility information, or user-preference information.
(Appendix 35)
The image processing device according to any one of Appendices 1 to 34, wherein a surgical record including information on the devices used, information on the acquired images, and image analysis results is created automatically or based on the user's selection.
(Appendix 36)
The image processing device according to any one of Appendices 1 to 35, wherein the notification unit displays, on a display device that displays the image, a numerical value, color, bar, or heat map corresponding to the probability that at least one of the regions of interest satisfies the condition defined for each region of interest, or a numerical value, color, bar, or heat map based on a value obtained by applying an arbitrary transformation to the probability distribution.
(Appendix 37)
The image processing device according to any one of Appendices 1 to 35, wherein the notification unit colors the region of interest with a color or heat map corresponding to the probability that at least one of the regions of interest satisfies the condition defined for each region of interest, or based on a value obtained by arbitrarily transforming the probability distribution, and displays it on a display device that displays the image, or replaces the probability of satisfying the condition with a numerical value or color and displays it on the display device that displays the image.
(Appendix 38)
The image processing device according to any one of Appendices 1 to 37, wherein, when a region including the tip of a catheter or the tip of a guidewire is set as the region of interest, the notification unit notifies the user on condition that the region of interest has moved or that the region of interest has crossed a specific range designated on the image.
(Appendix 39)
The image processing device according to Appendix 38, wherein the boundary of the specific range is represented by a straight line, a curve, a circle, a rectangle, or another polygon.
(Appendix 40)
The image processing device according to Appendix 38 or 39, wherein the specific range is displayed superimposed on the X-ray image.
(Appendix 41)
The image processing device according to any one of Appendices 38 to 40, wherein the notification unit causes a display device that displays the image to display the distance between the region of interest and the edge of the specific range.
(Appendix 42)
The image processing device according to Appendix 41, wherein the notification unit changes the display mode of the distance on the display device according to the magnitude of the distance between the region of interest and the edge of the specific range.
(Appendix 43)
The image processing device according to any one of Appendices 38 to 42, wherein the notification unit notifies the user on condition that the value obtained by dividing the distance between the region of interest and the edge of the specific range by the movement velocity of the region of interest within the image falls below a predetermined threshold.
(Appendix 44)
The image processing device according to any one of Appendices 3 to 43, wherein the distance is determined either as a straight-line distance or as a distance along the blood vessel.
(Appendix 45)
The image processing device according to any one of Appendices 1 to 44, comprising a memory unit that acquires and stores the position and/or shape of an intravascular examination or treatment device at an arbitrary point in time, wherein the stored position and/or shape of the device is displayed superimposed on subsequently acquired images.
(付記46)
前記画像中の脳動脈瘤、狭窄、閉塞、血栓形成、血管穿孔、造影剤の血管外への漏出、シャント疾患、腫瘍血管の栄養血管・腫瘍濃染、静脈血栓症、毛細血管相の無血管領域、血管閉塞、および側副血行路から成る群より選択される病変を認識する病変認識部をさらに含む、付記1から42のいずれか1項に記載の画像処理装置。
(付記47)
前記画像中の血管撮影像を以前に取得して記憶した血管撮影像と比較して、変化を通知する画像認識部をさらに含む、付記1から43のいずれか1項に記載の画像処理装置。
(付記48)
画像処理装置のプロセッサが、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得するステップと、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得するステップと、
前記画像において前記関心領域それぞれを追跡するステップと、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知するステップと、
を実行する画像処理方法。
(付記49)
コンピュータに、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する機能と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する機能と、
前記画像において前記関心領域それぞれを追跡する機能と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記コンピュータのユーザに通知する機能と、
を実現させるプログラム。
(付記50)
付記1から44のいずれか1項に記載の画像処理装置と、
血管内の検査用又は治療用のデバイスを挿入した状態の人物の画像を撮像して前記画像処理装置に送信する撮像装置と、
を備える画像処理システム。
(付記51)
脳血管カテーテル手術支援システムであって、
画像処理装置と、
血管内に1又は複数のデバイスを挿入した状態の患者のX線画像を撮像して前記画像処理装置に送信する画像撮像装置と
を備え、
前記画像処理装置が、
手術の目的を達成するための注目部と、血管内に挿入されたデバイスとを少なくとも含む固定領域のX線画像を経時的に取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか1つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と
を備え、
前記1又は複数のデバイスがカテーテル、ガイドワイヤー、ステント及び/又はバルーンであり、
前記関心領域としてカテーテルの先端部又はガイドワイヤーの先端部、ステントの両端、バルーンの両端を含む領域が設定された場合において、前記関心領域が前記画像から消失すること、又は前記関心領域と前記画像の縁部との距離が所定の閾距離未満となることを条件として前記ユーザに通知する、
システム。
(付記52)
脳動脈瘤コイル塞栓術補助システムであって、
画像処理装置と、
血管内にガイディングカテーテルと塞栓用コイルのデリバリーワイヤーとを挿入した状態の患者のX線画像を撮像して前記画像処理装置に送信する画像撮像装置と
を備え、
前記画像処理装置が、
患者の血管に生じた動脈瘤と、血管内に挿入されたカテーテルと、塞栓用コイルのデリバリーワイヤーとを少なくとも含む固定領域のX線画像を経時的に取得する画像取得部と、
前記画像に含まれる前記ガイディングカテーテルの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記デリバリーワイヤーに設けられたマーカーであって、前記デリバリーワイヤーを誘導する前記カテーテルの一部に設定された1又は複数の関心領域に接近するマーカーを検出するマーカー検出部と、
前記画像において前記関心領域及び前記マーカーそれぞれを追跡する追跡部と、
前記マーカーと前記関心領域とが重畳することを契機として、塞栓用コイルを該デリバリーワイヤーから切断してもよいタイミングをユーザに通知する通知部と
を備える、システム。
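付記52における「マーカーと関心領域との重畳」は、例えば軸平行な矩形同士の重なり判定として実装し得る。以下は仮定に基づくスケッチであり、矩形表現 (x1, y1, x2, y2) は説明用の仮定である。

```python
def boxes_overlap(box_a, box_b):
    """軸平行矩形 (x1, y1, x2, y2) 同士が重なるかを判定する。"""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def coil_detach_ready(marker_box, roi_box):
    """デリバリーワイヤーのマーカーと、カテーテルの一部に設定された
    関心領域とが重畳したことをもって、コイルを切断してよい
    タイミングと判定する(仮のスケッチ)。"""
    return boxes_overlap(marker_box, roi_box)
```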
(付記53)
前記通知部が、前記関心領域と前記画像の縁部との距離又は前記マーカーと前記関心領域との距離を、前記画像を表示する表示装置に表示させ、
ここで、前記通知部は、前記距離の大きさに応じて、前記表示装置における前記距離の表示態様を変化させることができ、表示態様の変化が、前記距離の大きさに応じて表示する文字のフォント、サイズ、もしくは色を変えること、前記距離の大きさに応じて表示装置の画面全体もしくは一部の場所の色を変えること、表示装置の画面全体もしくは枠外もしくは一部の場所に図形を表示すること、前記距離の大きさに応じて関心領域を拡大表示すること、又は前記距離の大きさに応じて関心領域に付するマークの色もしくはサイズを変えることを含む、付記51または52に記載のシステム。
(付記54)
前記通知部が、前記距離の大きさに応じて通知音を鳴らすことができる、もしくは振動を伝えることができる、付記51から53のいずれか1項に記載のシステム。
(付記55)
前記距離が、直線距離又は血管に沿った距離のいずれかにより決定される、付記51から54のいずれか1項に記載のシステム。
(付記56)
脳動脈瘤コイル塞栓術補助システムであって、
画像処理装置と、
血管内にガイディングカテーテルと塞栓用コイルのデリバリーワイヤーとを挿入した状態の患者のX線画像を撮像して前記画像処理装置に送信する画像撮像装置と
を備え、
前記画像処理装置が、
カテーテルの先端と2ndマーカーとの位置関係を記憶する位置関係記憶部と、
動脈瘤のネックラインと1stマーカーとの距離a及びその時点t1における2ndマーカーの位置を記憶する位置記憶部と、
時点t2における2ndマーカーの位置から移動距離bを算出し、カテーテルの先端の動脈瘤ネックラインからの距離a-bを推定する距離推定部と、
推定された距離をユーザに通知する通知部と
を備える、システム。
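付記56の距離推定は、時点t1で記憶した距離aから2ndマーカーの移動距離bを差し引く単純な演算として実装し得る。以下は仮定に基づくスケッチである(マーカー位置を2次元座標とし直線距離を用いる点は説明用の仮定)。

```python
import math

def estimate_tip_to_neck_distance(a, marker_pos_t1, marker_pos_t2):
    """時点t1で記憶した動脈瘤ネックラインとの距離aから、
    2ndマーカーの移動距離bを差し引き、カテーテル先端の
    ネックラインからの距離 a - b を推定する。"""
    b = math.dist(marker_pos_t1, marker_pos_t2)
    return a - b
```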
(付記57)
推定された距離が確率分布により表される、付記56記載のシステム。
(付記58)
推定されるカテーテル先端の位置が確率分布に基づき着色して表示される、付記56または57に記載のシステム。
(付記59)
血管内手術支援システムであって、血管内の検査用又は治療用のデバイスの製品一覧を記憶した記憶部と、画像解析結果、施設の情報、またはユーザの好みの情報に基づき使用する製品を推奨する推奨部と、推奨される製品を表示する表示部とを含む、システム。
(付記A1)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得可能な画像取得部を備える、画像処理装置。
(付記A2)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、画像中に新たに出現したデバイスが関心領域として新たに検出されることを条件として前記ユーザに通知する、
付記A1に記載の画像処理装置。
(付記A3)
前記通知部は、画像中の縁部または特定の領域内に新たに出現したデバイスが関心領域として新たに検出されることを条件として前記ユーザに通知する、付記A2に記載の画像処理装置。
(付記A4)
前記特定の領域がユーザにより指定される、付記A3に記載の画像処理装置。
(付記A5)
前記特定の領域が自動で指定される、付記A3に記載の画像処理装置。
(付記A6)
前記特定の領域が、画像処理装置が確認すべき領域であると自動で判断した領域である、付記A5に記載の画像処理装置。
(付記A7)
前記画像において前記関心領域を追跡する追跡部をさらに含む、付記A2~A6のいずれか1項に記載の画像処理装置。
(付記A8)
前記関心領域として、ガイディングカテーテル、ガイドワイヤー、中間カテーテル、マイクロカテーテル、血栓回収用吸引カテーテル、マーカー、コイル、ステント、フィルター、塞栓物質、動脈瘤塞栓器具、およびバルーンから成る群より選択される、血管内の検査用又は治療用のデバイスの少なくとも一部を含む領域が設定される、付記A2~A7のいずれか1項に記載の画像処理装置。
(付記A9)
前記デバイスの少なくとも一部を含む領域が、血管内の検査用又は治療用のデバイスの先端を含む、付記A2~A8のいずれか1項に記載の画像処理装置。
(付記A10)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、前記関心領域が、画像上において指定された特定の境界線を通過することを条件として前記ユーザに通知する、
付記A1に記載の画像処理装置。
(付記A11)
前記特定の境界線が、ユーザにより指定される、付記A10に記載の画像処理装置。
(付記A12)
前記特定の境界線が、自動で指定される、付記A10に記載の画像処理装置。
(付記A13)
前記特定の境界線が、直線、曲線、円形、矩形、または任意の多角形もしくは閉曲線により表される、付記A9~A12のいずれか1項に記載の画像処理装置。
(付記A14)
前記特定の境界線が、前記デバイスの侵入が望ましくない血管への侵入を検出するために用いられる、付記A9~A13のいずれか1項に記載の画像処理装置。
(付記A15)
前記通知部が、1又は複数の関心領域のそれぞれについて、前記条件が初めて満たされたときにのみ前記ユーザに通知する、付記A9~A14のいずれか1項に記載の画像処理装置。
(付記A16)
前記通知部が、1又は複数の関心領域の全体において、前記条件が初めて満たされたときにのみ前記ユーザに通知する、付記A9~A15のいずれか1項に記載の画像処理装置。
(付記A17)
前記通知部が、ユーザが指定した特定の1又は複数の関心領域について、前記条件が満たされたときにのみ前記ユーザに通知する、付記A9~A16のいずれか1項に記載の画像処理装置。
(付記A18)
前記通知部が、自動で特定された1又は複数の関心領域について、前記条件が満たされたときにのみ前記ユーザに通知する、付記A9~A16のいずれか1項に記載の画像処理装置。
(付記A19)
前記特定の境界線が、X線画像上に重畳表示される、付記A9~A18のいずれか1項に記載の画像処理装置。
(付記A20)
前記特定の境界線が、X線画像の範囲または拡大率の変更に応じて再描画される、付記A9~A19のいずれか1項に記載の画像処理装置。
(付記A21)
前記通知が、警告音の発生、または境界線の表示様式の変更により行われる、付記A9~A20のいずれか1項に記載の画像処理装置。
(付記A22)
前記関心領域が、画像上において指定された特定の境界線を通過する向きに応じて、異なる通知の方法が用いられる、付記A9~A21のいずれか1項に記載の画像処理装置。
(付記A23)
前記関心領域が、前記特定の境界線を越える際の速度に応じて、異なる通知の方法が用いられる、付記A9~A22のいずれか1項に記載の画像処理装置。
(付記A24)
前記通知部は、前記関心領域と前記特定の境界線との距離を、前記画像を表示する表示装置に表示させる、付記A9~A23のいずれか1項に記載の画像処理装置。
(付記A25)
前記通知部は、前記関心領域と前記特定の境界線との距離の大きさに応じて、前記表示装置における前記距離の表示態様を変化させる、付記A9~A24のいずれか1項に記載の画像処理装置。
(付記A26)
前記通知部は、前記関心領域と前記特定の境界線との距離を前記画像内における前記関心領域の移動速度で除算した値があらかじめ定められた閾値未満となることを条件として、前記ユーザに通知する、付記A9~A25のいずれか1項に記載の画像処理装置。
(付記A27)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記関心領域として、前記画像上でユーザが指定する領域内に含まれる関心領域が取得される、
付記A1に記載の画像処理装置。
(付記A28)
前記画像上でユーザが指定する領域が、矩形、円形、または任意の閉曲線により表される、付記A27に記載の画像処理装置。
(付記A29)
ユーザが指定しうる領域の候補が、自動で表示される、付記A27またはA28に記載の画像処理装置。
(付記A30)
ユーザが領域を指定する際に、一時的に画像が静止画となる、付記A27~A29のいずれか1項に記載の画像処理装置。
(付記A31)
ユーザが領域を指定する際に、一時的に画像が遅延再生となる、付記A27~A29のいずれか1項に記載の画像処理装置。
(付記A32)
ユーザが領域を指定する際に、画像の一部が拡大表示される、付記A27~A31のいずれか1項に記載の画像処理装置。
(付記A33)
複数の関心領域が含まれる領域が指定された場合、確認すべき関心領域のみが自動で選択される、付記A27~A32のいずれか1項に記載の画像処理装置。
(付記A34)
確認すべき領域が、ガイドワイヤーの先端が一定の範囲内にあること、ステントのデリバリーワイヤーの先端が一定の範囲内にあること、コイルを切断してよいタイミングを見るためにカテーテルの2ndマーカーが移動する範囲、フィルターが一定の範囲内にあること、カテーテルの先端が一定の範囲内にあること、塞栓物質が一定の範囲内に収まっていること、デバイスが太い血管の内部にとどまっている範囲、動脈瘤、または重要な血管に基づき判断される、付記A33に記載の画像処理装置。
(付記A35)
確認すべき関心領域が、手術の状況に応じて決定される、付記A33またはA34記載の画像処理装置。
(付記A36)
画像が変更された場合に指定領域が自動的に調整される、付記A27~A35のいずれか1項に記載の画像処理装置。
(付記A37)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に基づき特定の手術の状況を認識する状況認識部と、
を備える、付記A1に記載の画像処理装置。
(付記A38)
前記特定の手術の状況が、手術内容、疾患名、血管漏出、血管穿孔、血栓の発生、末梢血管の消失、ガイドワイヤー・カテーテル・バルーンの誘導、カテーテルによる吸引、ステント留置、バルーン拡張、コイルの挿入、動脈瘤塞栓デバイスの挿入、コイルを切るタイミング、フィルターの誘導・留置・回収、液体塞栓物質の注入、造影剤の注入、一定時間以上の静止画像の継続、マスク画像・ライブ画像の判別、撮像部位・角度の判別、画像の切り換え、およびノイズの発生から成る群より選択される少なくとも1つの状況である、付記A37に記載の画像処理装置。
(付記A39)
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
をさらに備える、付記A37またはA38に記載の画像処理装置。
(付記A40)
ノイズの発生が認識された場合に、前記関心領域の取得を停止する、付記A38に記載の画像処理装置。
(付記A41)
一定時間以上の静止画像の継続が認識された場合に、被曝の低減推奨を通知する、付記A38に記載の画像処理装置。
(付記A42)
認識された特定の手術の状況を記録する記録部をさらに含む、付記A37~A41のいずれか1項に記載の画像処理装置。
(付記A43)
前記特定の手術の状況が、パターンマッチング、画像認識、時系列画像認識、時系列差分、または物体検出アルゴリズムにより認識される、付記A37~A42のいずれか1項に記載の画像処理装置。
(付記A44)
認識された特定の手術の状況に応じて、認識されるデバイスが特定される、付記A37~A43のいずれか1項に記載の画像処理装置。
(付記A45)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか1つが前記関心領域毎に定められた条件を満たす場合、その条件に対応する手術の状況を記録する記録部と、
を備える、付記A1に記載の画像処理装置。
(付記A46)
条件が、穿刺、ガイディングカテーテルの処置部位到達、バルーン誘導、バルーン拡張、ステント展開、カテーテル誘導、1本目のコイル挿入、2本目以降のコイル挿入、コイルの動脈瘤内の局在、コイルの母血管への逸脱、カテーテル抜去、コイル抜去、処置終了から成る群より選択される少なくとも1つの処置を含む、付記A45に記載の画像処理装置。
(付記A47)
記録される手術の状況が、静止画、動画、および/または手術の状況を説明するテキスト情報を含む、付記A45またはA46に記載の画像処理装置。
(付記A48)
動画が、前記条件が満たされた時点の前後一定期間の動画である、付記A47に記載の画像処理装置。
(付記A49)
記録された情報が、ユーザにより修正可能である、付記A45~A48のいずれか1項に記載の画像処理装置。
(付記A50)
記録された情報に基づきレポートを作成するレポート作成部をさらに含む、付記A45~A49のいずれか1項に記載の画像処理装置。
(付記A51)
レポート作成部が、記録された静止画、動画、および/または手術の状況を説明するテキスト情報に基づき、手術のサマリーを作成する、付記A50に記載の画像処理装置。
(付記A52)
特定の手術の状況が記録された場合に、ユーザに通知する通知部をさらに含む、付記A45~A51のいずれか1項に記載の画像処理装置。
(付記A53)
特定の手術の状況が記録された後の時間経過に応じて、ユーザに通知する通知部をさらに含む、付記A45~A52のいずれか1項に記載の画像処理装置。
(付記A54)
特定の手術の状況が記録された後の時間経過が、バルーンの拡張時間、ガイドワイヤーの移動時間、速度、もしくは加速度、またはコイルの移動時間に基づき判断される、付記A53に記載の画像処理装置。
(付記A55)
特定の手術の状況が記録された回数に応じて、ユーザに通知する通知部をさらに含む、付記A45~A54のいずれか1項に記載の画像処理装置。
(付記A56)
特定の手術の状況がコイルの挿入である、付記A55に記載の画像処理装置。
(付記A57)
前記画像が、リアルタイムで撮影された動画、または過去に記録された動画から取得される、付記A45~A56のいずれか1項に記載の画像処理装置。
(付記A58)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記画像処理装置に接続された表示装置に、前記関心領域を含む前記画像と、前記画像の横に前記関心領域の縦の座標のみに対応するマークとを表示させる表示部をさらに備える、
付記A1に記載の画像処理装置。
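付記A58の「関心領域の縦の座標のみに対応するマーク」は、例えば各関心領域のy座標を画像横のバー上の位置に線形に写像することで実現し得る。以下は仮定に基づくスケッチであり、関心領域を中心座標 (x, y) で表す点は説明用の仮定である。

```python
def side_bar_marks(roi_centers, image_height, bar_height):
    """各関心領域の中心座標 (x, y) から y のみを取り出し、
    画像の横に表示するバー上のマーク位置に線形に写像する。"""
    return [int(y / image_height * bar_height) for (_x, y) in roi_centers]
```

横方向の位置を捨てることで、複数の関心領域の深さ方向(縦方向)の進行のみを一列で比較できる。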
(付記A59)
マークの軌跡が一定時間表示される、付記A58記載の画像処理装置。
(付記A60)
マークの表示様式が、関心領域ごとに異なる、付記A58またはA59に記載の画像処理装置。
(付記A61)
前記表示様式が、色、形、模様、記号、文字、キャプション、またはアニメーションである、付記A60記載の画像処理装置。
(付記A62)
確認すべき関心領域を自動で認識して強調表示する、付記A58~A61のいずれか1項に記載の画像処理装置。
(付記A63)
確認すべき関心領域が、手術の状況に応じて決定される、付記A62に記載の画像処理装置。
(付記A64)
ユーザが指定した関心領域のみを表示する、付記A58~A63のいずれか1項に記載の画像処理装置。
(付記A65)
自動で選択された関心領域のみを表示する、付記A58~A64のいずれか1項に記載の画像処理装置。
(付記A66)
ユーザが指定した境界値をマークが超えた場合、ユーザに通知を行う、付記A58~A65のいずれか1項に記載の画像処理装置。
(付記A67)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記関心領域が複数追跡される場合、複数の関心領域をそれぞれ異なる表示様式で表示する表示部をさらに備える、
付記A1に記載の画像処理装置。
(付記A68)
前記表示様式が、色、形、模様、記号、文字、キャプション、またはアニメーションである、付記A67記載の画像処理装置。
(付記A69)
確認すべき関心領域を自動で認識して強調表示する、付記A67またはA68記載の画像処理装置。
(付記A70)
確認すべき関心領域が、手術の状況に応じて決定される、付記A67~A69のいずれか1項に記載の画像処理装置。
(付記A71)
ユーザが指定した関心領域のみを表示する、付記A67~A70のいずれか1項に記載の画像処理装置。
(付記A72)
自動で選択された関心領域のみを表示する、付記A67~A71のいずれか1項に記載の画像処理装置。
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得可能な画像取得部を備える、画像処理装置。
(付記B2)
画像処理装置であって、
血管内の検査用又は治療用のために画像を取得する画像取得部と、
前記画像中に血管内の検査用又は治療用のデバイスが含まれている場合、前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、画像中または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記ユーザに通知する、
付記B1に記載の画像処理装置。
(付記B3)
画像処理装置であって、
血管内の検査用又は治療用のために画像を取得する画像取得部と、
前記画像中に血管内の検査用又は治療用のデバイスが含まれている場合、前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記出力部は、画像中または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記機器に出力する、
付記B1に記載の画像処理装置。
(付記B4)
前記通知部は、画像中の縁部もしくは縁部から一定の距離の範囲内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記ユーザに通知する、付記B2またはB3に記載の画像処理装置。
(付記B5)
前記出力部は、画像中の縁部もしくは縁部から一定の距離の範囲内、または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記機器に出力する、付記B2またはB3に記載の画像処理装置。
(付記B6)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B2またはB3に記載の画像処理装置。
(付記B7)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B6に記載の画像処理装置。
(付記B8)
処理の精度を高めるために複数の画面の情報を利用する、付記B6に記載の画像処理装置。
(付記B9)
前記特定の領域がユーザにより指定される、付記B2またはB3に記載の画像処理装置。
(付記B10)
前記特定の領域が自動で指定される、付記B2またはB3に記載の画像処理装置。
(付記B11)
前記特定の領域が、画像処理装置が確認すべき領域であると自動で判断した領域である、付記B2またはB3に記載の画像処理装置。
(付記B12)
前記画像において前記関心領域を追跡する追跡部をさらに含む、付記B2またはB3に記載の画像処理装置。
(付記B13)
前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、付記B2またはB3に記載の画像処理装置。
(付記B14)
前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、付記B2またはB3記載の画像処理装置。
(付記B15)
前記関心領域として、ガイディングカテーテル、ガイドワイヤー、中間カテーテル、マイクロカテーテル、血栓回収用吸引カテーテル、マーカー、コイル、ステント、フィルター、塞栓物質、動脈瘤塞栓器具、およびバルーンから成る群より選択される、血管内の検査用又は治療用のデバイスの少なくとも一部を含む領域が設定される、付記B2またはB3に記載の画像処理装置。
(付記B16)
前記デバイスの少なくとも一部を含む領域が、血管内の検査用又は治療用のデバイスの先端を含む、付記B2またはB3に記載の画像処理装置。
(付記B17)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、前記関心領域が、画像上において指定された特定の境界線を通過すること、もしくは通過が予測されることを条件として前記ユーザに通知する、付記B1に記載の画像処理装置。
(付記B18)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記出力部は、前記関心領域が、画像上において指定された特定の境界線を通過すること、もしくは通過が予測されることを条件として前記機器に出力する、付記B1に記載の画像処理装置。
(付記B19)
前記画像において前記関心領域を追跡する追跡部をさらに含む、付記B17またはB18に記載の画像処理装置。
(付記B20)
前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、付記B17またはB18に記載の画像処理装置。
(付記B21)
前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、付記B17またはB18に記載の画像処理装置。
(付記B22)
通過の予測が、前記関心領域と境界線との間の距離、または前記距離/前記関心領域の速度にもとづき行われる、付記B17またはB18に記載の画像処理装置。
(付記B23)
前記特定の境界線が、ユーザにより指定される、付記B17またはB18に記載の画像処理装置。
(付記B24)
前記特定の境界線が、自動で指定される、付記B17またはB18に記載の画像処理装置。
(付記B25)
前記特定の境界線が、関心領域が一定時間移動していない場合に自動で指定される、付記B17またはB18に記載の画像処理装置。
(付記B26)
前記特定の境界線が、手術の状況に応じて自動で指定される、もしくは、自動で提案されてそれをユーザが同意もしくは拒否することができる、付記B17またはB18に記載の画像処理装置。
(付記B27)
前記特定の境界線が、手術の状況に応じて自動で解除される、もしくは、自動で提案されてそれをユーザが同意もしくは拒否することができる、付記B17またはB18に記載の画像処理装置。
(付記B28)
前記画像を表示する表示装置が複数の画面を有する場合において、1つの画面において指定された前記特定の境界線が、他の画面の対応する位置に自動で指定される、付記B17またはB18に記載の画像処理装置。
(付記B29)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B17またはB18に記載の画像処理装置。
(付記B30)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B29に記載の画像処理装置。
(付記B31)
処理の精度を高めるために複数の画面の情報を利用する、付記B29に記載の画像処理装置。
(付記B32)
前記特定の境界線が、直線、曲線、円形、矩形、または任意の多角形もしくは閉曲線により表される、付記B17またはB18に記載の画像処理装置。
(付記B33)
前記境界線がユーザの操作により移動、変形、拡大、または縮小することが可能である、付記B17またはB18に記載の画像処理装置。
(付記B34)
前記特定の境界線が、前記デバイスの侵入が望ましくない血管、動脈瘤や狭窄などの病変部位、もしくは血管外への侵入を検出するために用いられる、付記B17またはB18に記載の画像処理装置。
(付記B35)
前記通知部または出力部が、1又は複数の関心領域のそれぞれについて、前記条件が初めて満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B36)
前記通知部または出力部が、1又は複数の関心領域のそれぞれについて、前記条件が特定の時点以降に初めて満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B37)
前記通知部または出力部が、1又は複数の関心領域の全体において、前記条件が初めて満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B38)
前記通知部または出力部が、1又は複数の関心領域の全体において、前記条件が特定の時点以降に初めて満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B39)
前記通知部または出力部が、ユーザが指定した特定の1又は複数の関心領域について、前記条件が満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B40)
前記通知部または出力部が、自動で特定された1又は複数の関心領域について、前記条件が満たされたときにのみ通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B41)
前記特定の境界線が、X線画像上に重畳表示される、付記B17またはB18に記載の画像処理装置。
(付記B42)
前記特定の境界線が、X線画像の範囲もしくは拡大率の変更または移動に応じて再描画される、付記B17またはB18に記載の画像処理装置。
(付記B43)
前記特定の境界線が、X線画像の中断および再取得後に再描画される、付記B17またはB18に記載の画像処理装置。
(付記B44)
前記通知が、警告音の発生、または境界線の表示様式の変更により行われる、付記B17に記載の画像処理装置。
(付記B45)
前記通知は、どの画面の、どのデバイスの、どのようなイベントか、の一部または全てを音声で伝える、付記B17に記載の画像処理装置。
(付記B46)
前記関心領域が、画像上において指定された特定の境界線を通過する向きに応じて、異なる通知の方法が用いられる、付記B17に記載の画像処理装置。
(付記B47)
前記関心領域が、画像上において指定された特定の境界線を通過する向きが、手術の状況に応じて自動で認識されるか、または自動で提案されてそれをユーザが同意もしくは拒否することができる、付記B17またはB18に記載の画像処理装置。
(付記B48)
前記関心領域が、前記特定の境界線を越える際の速度に応じて、異なる通知の方法が用いられる、付記B17に記載の画像処理装置。
(付記B49)
前記通知部は、前記関心領域と前記特定の境界線との距離を、前記画像を表示する表示装置に表示させる、付記B17に記載の画像処理装置。
(付記B50)
前記通知部は、前記関心領域と前記特定の境界線との距離の大きさに応じて、前記表示装置における前記距離の表示態様を変化させる、付記B17に記載の画像処理装置。
(付記B51)
前記通知部または出力部は、前記関心領域と前記特定の境界線との距離を前記画像内における前記関心領域の移動速度で除算した値があらかじめ定められた閾値未満となることを条件として、通知または出力を行う、付記B17またはB18に記載の画像処理装置。
(付記B52)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記関心領域として、前記画像上でユーザが指定する領域内に含まれる関心領域が取得される、
付記B1に記載の画像処理装置。
(付記B53)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記関心領域として、前記画像上でユーザが指定する領域内に含まれる関心領域が取得される、
付記B1に記載の画像処理装置。
(付記B54)
前記画像において前記関心領域を追跡する追跡部をさらに含む、付記B52またはB53に記載の画像処理装置。
(付記B55)
前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、付記B52またはB53に記載の画像処理装置。
(付記B56)
前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、付記B52またはB53に記載の画像処理装置。
(付記B57)
前記画像上でユーザが指定する領域が、矩形、円形、または任意の閉曲線により表される、付記B52またはB53に記載の画像処理装置。
(付記B58)
ユーザが指定しうる領域の候補が、自動で表示される、付記B52またはB53に記載の画像処理装置。
(付記B59)
ユーザが指定しうる領域の候補が、手術の状況に応じて自動で表示される、付記B52またはB53に記載の画像処理装置。
(付記B60)
ユーザが領域を指定する際に、一時的に画像が静止画となる、付記B52またはB53に記載の画像処理装置。
(付記B61)
ユーザが領域を指定する際に、一時的に画像が遅延再生となる、付記B52またはB53に記載の画像処理装置。
(付記B62)
ユーザが領域を指定する際に、画像の一部が拡大表示される、付記B52またはB53に記載の画像処理装置。
(付記B63)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B52またはB53に記載の画像処理装置。
(付記B64)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B63に記載の画像処理装置。
(付記B65)
処理の精度を高めるために複数の画面の情報を利用する、付記B63に記載の画像処理装置。
(付記B66)
複数の関心領域が含まれる領域が指定された場合、確認すべき関心領域のみが自動で選択される、付記B52またはB53に記載の画像処理装置。
(付記B67)
確認すべき関心領域は、複数のデバイスの重要度、画面内における関心領域の位置、手術のシーンに基づき判断される、付記B66に記載の画像処理装置。
(付記B68)
確認すべき関心領域が、手術の状況に応じて決定される、付記B66に記載の画像処理装置。
(付記B69)
画像が変更された場合に指定領域が自動的に調整される、付記B52またはB53に記載の画像処理装置。
(付記B70)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に基づき特定の手術の状況を認識する状況認識部と、
を備える、付記B1に記載の画像処理装置。
(付記B71)
前記特定の手術の状況が、手術内容、疾患名、血管漏出、血管穿孔、血栓の発生、末梢血管の消失、ガイドワイヤー・カテーテル・バルーンの誘導、カテーテルによる吸引、ステント留置、バルーン拡張、コイルの挿入、動脈瘤塞栓デバイスの挿入、コイルを切るタイミング、フィルターの誘導・留置・回収、液体塞栓物質の注入、造影剤の注入、一定時間以上の静止画像の継続、マスク画像・ライブ画像の判別、撮像部位・角度の判別、画像の切り換え、およびノイズの発生から成る群より選択される少なくとも1つの状況である、付記B70に記載の画像処理装置。
(付記B72)
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
をさらに備える、付記B70またはB71に記載の画像処理装置。
(付記B73)
ノイズの発生が認識された場合に、前記関心領域の取得を停止する、付記B70に記載の画像処理装置。
(付記B74)
造影剤の注入が認識された場合、もしくは3Dの画像を取得する場合、もしくはコーンビームCT(CBCT)を取得する場合に、前記関心領域の取得を停止する、もしくは、取得はするが通知は行わない、付記B70に記載の画像処理装置。
(付記B75)
一定時間以上の静止画像の継続が認識された場合に、被曝の低減推奨を通知する、付記B70に記載の画像処理装置。
(付記B76)
認識された特定の手術の状況を記録する記録部をさらに含む、付記B70またはB71に記載の画像処理装置。
(付記B77)
前記特定の手術の状況が、パターンマッチング、画像認識、時系列画像認識、時系列差分、または物体検出アルゴリズムにより認識される、付記B70またはB71に記載の画像処理装置。
(付記B78)
認識された特定の手術の状況に応じて、認識されるデバイスが特定される、付記B70またはB71に記載の画像処理装置。
(付記B79)
認識された特定の手術の状況に応じて、関心領域が画像上において指定された特定の境界線を通過することを条件としてユーザに通知するための前記特定の境界線が、自動で関心領域の周囲に指定または提案される、付記B70またはB71に記載の画像処理装置。
(付記B80)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B70またはB71に記載の画像処理装置。
(付記B81)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B80に記載の画像処理装置。
(付記B82)
処理の精度を高めるために複数の画面の情報を利用する、付記B80に記載の画像処理装置。
(付記B83)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか1つが前記関心領域毎に定められた条件を満たす場合、その条件に対応する手術の状況を記録する記録部と、
を備える、付記B1に記載の画像処理装置。
(付記B84)
条件が、穿刺、ガイディングカテーテルの処置部位到達、バルーン誘導、バルーン拡張、ステント展開、カテーテル誘導、1本目のコイル挿入、2本目以降のコイル挿入、コイルの動脈瘤内の局在、コイルの母血管への逸脱、カテーテル抜去、コイル抜去、処置終了から成る群より選択される少なくとも1つの処置を含む、付記B83に記載の画像処理装置。
(付記B85)
記録される手術の状況が、静止画、動画、および/または手術の状況を説明するテキスト情報を含む、付記B83またはB84に記載の画像処理装置。
(付記B86)
動画が、前記条件が満たされた時点の前後一定期間の動画である、付記B83またはB84に記載の画像処理装置。
(付記B87)
記録された情報が、ユーザにより修正可能である、付記B83またはB84に記載の画像処理装置。
(付記B88)
記録された情報に基づきレポートを作成するレポート作成部をさらに含む、付記B83またはB84に記載の画像処理装置。
(付記B89)
レポート作成部が、記録された静止画、動画、および/または手術の状況を説明するテキスト情報に基づき、手術のサマリーを作成する、付記B83またはB84に記載の画像処理装置。
(付記B90)
特定の手術の状況が記録された場合に、ユーザに通知する通知部または出力を行う出力部をさらに含む、付記B83またはB84に記載の画像処理装置。
(付記B91)
特定の手術の状況が記録された後の時間経過に応じて、ユーザに通知する通知部または出力を行う出力部をさらに含む、付記B83またはB84に記載の画像処理装置。
(付記B92)
特定の手術の状況が記録された後の時間経過が、バルーンの拡張時間、ガイドワイヤーの移動時間、速度、もしくは加速度、またはコイルの移動時間に基づき判断される、付記B91に記載の画像処理装置。
(付記B93)
特定の手術の状況が記録された回数に応じて、ユーザに通知する通知部または出力を行う出力部をさらに含む、付記B83またはB84に記載の画像処理装置。
(付記B94)
特定の手術の状況がコイルの挿入である、付記B83またはB84に記載の画像処理装置。
(付記B95)
前記画像が、リアルタイムで撮影された動画、または過去に記録された動画から取得される、付記B83またはB84に記載の画像処理装置。
(付記B96)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B83またはB84に記載の画像処理装置。
(付記B97)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B96に記載の画像処理装置。
(付記B98)
処理の精度を高めるために複数の画面の情報を利用する、付記B96に記載の画像処理装置。
(付記B99)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記画像処理装置に接続された表示装置に、前記関心領域を含む前記画像と、前記画像の横に前記関心領域の縦の座標のみに対応するマークとを表示させる表示部をさらに備える、
付記B1に記載の画像処理装置。
(付記B100)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記画像処理装置に接続された表示装置に、前記関心領域を含む前記画像と、前記画像の横に前記関心領域の縦の座標のみに対応するマークとを表示させる表示部をさらに備える、
付記B1に記載の画像処理装置。
(付記B101)
前記画像において前記関心領域を追跡する追跡部をさらに含む、付記B99またはB100に記載の画像処理装置。
(付記B102)
前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、付記B99またはB100に記載の画像処理装置。
(付記B103)
前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、付記B99またはB100に記載の画像処理装置。
(付記B104)
マークの軌跡が一定時間表示される、付記B99またはB100に記載の画像処理装置。
(付記B105)
マークの表示様式が、関心領域ごとに異なる、付記B99またはB100に記載の画像処理装置。
(付記B106)
前記表示様式が、色、形、模様、記号、文字、キャプション、またはアニメーションである、付記B99またはB100に記載の画像処理装置。
(付記B107)
確認すべき関心領域を自動で認識して強調表示する、付記B99またはB100に記載の画像処理装置。
(付記B108)
確認すべき関心領域が、手術の状況に応じて決定される、付記B107に記載の画像処理装置。
(付記B109)
ユーザが指定した関心領域のみを表示する、付記B99またはB100に記載の画像処理装置。
(付記B110)
自動で選択された関心領域のみを表示する、付記B99またはB100に記載の画像処理装置。
(付記B111)
ユーザが指定した境界値をマークが超えた場合、ユーザに通知を行う、付記B99またはB100に記載の画像処理装置。
(付記B112)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B99またはB100に記載の画像処理装置。
(付記B113)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B112に記載の画像処理装置。
(付記B114)
処理の精度を高めるために複数の画面の情報を利用する、付記B112に記載の画像処理装置。
(付記B115)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記関心領域が複数追跡される場合、複数の関心領域をそれぞれ異なる表示様式で表示する表示部をさらに備える、
付記B1に記載の画像処理装置。
(付記B116)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記画像において前記関心領域それぞれを追跡する追跡部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記関心領域が複数追跡される場合、複数の関心領域をそれぞれ異なる表示様式で表示する表示部をさらに備える、
付記B1に記載の画像処理装置。
(付記B117)
前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、付記B115またはB116に記載の画像処理装置。
(付記B118)
前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、付記B117に記載の画像処理装置。
(付記B119)
前記表示様式が、色、形、模様、記号、文字、キャプション、またはアニメーションである、付記B115またはB116に記載の画像処理装置。
(付記B120)
確認すべき関心領域を自動で認識して強調表示する、付記B115またはB116に記載の画像処理装置。
(付記B121)
確認すべき関心領域が、手術の状況に応じて決定される、付記B120に記載の画像処理装置。
(付記B122)
ユーザが指定した関心領域のみを表示する、付記B115またはB116に記載の画像処理装置。
(付記B123)
自動で選択された関心領域のみを表示する、付記B115またはB116に記載の画像処理装置。
(付記B124)
前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、付記B115またはB116に記載の画像処理装置。
(付記B125)
前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、付記B124に記載の画像処理装置。
(付記B126)
処理の精度を高めるために複数の画面の情報を利用する、付記B124に記載の画像処理装置。
(付記B127)
複数の方向から取得された脳血管画像を入力として用いてトレーニングを行うことを特徴とする、機械学習モデルの作成方法。
(付記B128)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、前記関心領域がガイドワイヤーを含み、ガイドワイヤー先端を含む特定の領域内におけるガイドワイヤーの長さもしくはセグメンテーションの面積が、一定の閾値を超えた場合に通知を行う、付記B1に記載の画像処理装置。
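付記B128の条件(ガイドワイヤー先端を含む特定の領域内におけるセグメンテーションの面積が一定の閾値を超えた場合の通知)は、例えば2値マスクの部分領域の画素数の和として判定し得る。以下は仮定に基づく最小限のスケッチであり、領域を (y0, y1, x0, x1) で表す点は説明用の仮定である。

```python
import numpy as np

def guidewire_area_alert(seg_mask, region, area_threshold):
    """region = (y0, y1, x0, x1) 内のガイドワイヤーの
    セグメンテーション面積(画素数)が閾値を超えたかを判定する。
    seg_mask はガイドワイヤー画素を1とする2値マスク。"""
    y0, y1, x0, x1 = region
    area = int(seg_mask[y0:y1, x0:x1].sum())
    return area > area_threshold
```

先端近傍で面積(または長さ)が急増することは、ワイヤーのたわみや座屈の兆候として利用し得る。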
(付記B129)
画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記出力部は、前記関心領域がガイドワイヤーを含み、ガイドワイヤー先端を含む特定の領域内におけるガイドワイヤーの長さもしくはセグメンテーションの面積が、一定の閾値を超えた場合に出力を行う、付記B1に記載の画像処理装置。
10・・・記憶部
11・・・制御部
110・・・画像取得部
111・・・関心領域取得部
112・・・追跡部
113・・・通知部
114・・・マーカー検出部
115・・・測距部
116・・・出力部
117・・・映像記憶部
118・・・映像抽出部
119・・・状態推定部
2・・・表示装置
3・・・X線撮像装置
4・・・外部デバイス
20・・・CPU
21・・・ROM
22・・・RAM
23・・・ストレージ
24・・・入出力インターフェース
25・・・入力部
26・・・出力部
27・・・記憶媒体
30・・・X線照射器
31・・・X線検出器
32・・・寝台
D・・・デバイス
E・・・塞栓用コイル
P・・・被験者
S・・・画像処理システム
Claims (129)
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得可能な画像取得部を備える、画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のために画像を取得する画像取得部と、
前記画像中に血管内の検査用又は治療用のデバイスが含まれている場合、前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、画像中または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記ユーザに通知する、
請求項1に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のために画像を取得する画像取得部と、
前記画像中に血管内の検査用又は治療用のデバイスが含まれている場合、前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記出力部は、画像中または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記機器に出力する、
請求項1に記載の画像処理装置。
- 前記通知部は、画像中の縁部もしくは縁部から一定の距離の範囲内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記ユーザに通知する、請求項2または3に記載の画像処理装置。
- 前記出力部は、画像中の縁部もしくは縁部から一定の距離の範囲内、または特定の領域内に新たに出現したデバイスまたはその一部が関心領域として新たに検出されることを条件として前記機器に出力する、請求項2または3に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項2または3に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項6に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項6に記載の画像処理装置。
- 前記特定の領域がユーザにより指定される、請求項2または3に記載の画像処理装置。
- 前記特定の領域が自動で指定される、請求項2または3に記載の画像処理装置。
- 前記特定の領域が、画像処理装置が確認すべき領域であると自動で判断した領域である、請求項2または3に記載の画像処理装置。
- 前記画像において前記関心領域を追跡する追跡部をさらに含む、請求項2または3に記載の画像処理装置。
- 前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、請求項2または3に記載の画像処理装置。
- 前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、請求項2または3記載の画像処理装置。
- 前記関心領域として、ガイディングカテーテル、ガイドワイヤー、中間カテーテル、マイクロカテーテル、血栓回収用吸引カテーテル、マーカー、コイル、ステント、フィルター、塞栓物質、動脈瘤塞栓器具、およびバルーンから成る群より選択される、血管内の検査用又は治療用のデバイスの少なくとも一部を含む領域が設定される、請求項2または3に記載の画像処理装置。
- 前記デバイスの少なくとも一部を含む領域が、血管内の検査用又は治療用のデバイスの先端を含む、請求項2または3に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記通知部は、前記関心領域が、画像上において指定された特定の境界線を通過すること、もしくは通過が予測されることを条件として前記ユーザに通知する、請求項1に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記出力部は、前記関心領域が、画像上において指定された特定の境界線を通過すること、もしくは通過が予測されることを条件として前記機器に出力する、請求項1に記載の画像処理装置。
- 前記画像において前記関心領域を追跡する追跡部をさらに含む、請求項17または18に記載の画像処理装置。
- 前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、請求項17または18に記載の画像処理装置。
- 前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、請求項17または18に記載の画像処理装置。
- 通過の予測が、前記関心領域と境界線との間の距離、または前記距離/前記関心領域の速度にもとづき行われる、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、ユーザにより指定される、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、自動で指定される、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、関心領域が一定時間移動していない場合に自動で指定される、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、手術の状況に応じて自動で指定される、もしくは、自動で提案されてそれをユーザが同意もしくは拒否することができる、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、手術の状況に応じて自動で解除される、もしくは、自動で提案されてそれをユーザが同意もしくは拒否することができる、請求項17または18に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、1つの画面において指定された前記特定の境界線が、他の画面の対応する位置に自動で指定される、請求項17または18に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項17または18に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項29に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項29に記載の画像処理装置。
- 前記特定の境界線が、直線、曲線、円形、矩形、または任意の多角形もしくは閉曲線により表される、請求項17または18に記載の画像処理装置。
- 前記境界線がユーザの操作により移動、変形、拡大、または縮小することが可能である、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、前記デバイスの侵入が望ましくない血管、動脈瘤や狭窄などの病変部位、もしくは血管外への侵入を検出するために用いられる、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、1又は複数の関心領域のそれぞれについて、前記条件が初めて満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、1又は複数の関心領域のそれぞれについて、前記条件が特定の時点以降に初めて満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、1又は複数の関心領域の全体において、前記条件が初めて満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、1又は複数の関心領域の全体において、前記条件が特定の時点以降に初めて満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、ユーザが指定した特定の1又は複数の関心領域について、前記条件が満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記通知部または出力部が、自動で特定された1又は複数の関心領域について、前記条件が満たされたときにのみ通知または出力を行う、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、X線画像上に重畳表示される、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、X線画像の範囲もしくは拡大率の変更または移動に応じて再描画される、請求項17または18に記載の画像処理装置。
- 前記特定の境界線が、X線画像の中断および再取得後に再描画される、請求項17または18に記載の画像処理装置。
- 前記通知が、警告音の発生、または境界線の表示様式の変更により行われる、請求項17に記載の画像処理装置。
- 前記通知は、どの画面の、どのデバイスの、どのようなイベントか、の一部または全てを音声で伝える、請求項17に記載の画像処理装置。
- 前記関心領域が、画像上において指定された特定の境界線を通過する向きに応じて、異なる通知の方法が用いられる、請求項17に記載の画像処理装置。
- 前記関心領域が、画像上において指定された特定の境界線を通過する向きが、手術の状況に応じて自動で認識されるか、または自動で提案されてそれをユーザが同意もしくは拒否することができる、請求項17または18に記載の画像処理装置。
- 前記関心領域が、前記特定の境界線を越える際の速度に応じて、異なる通知の方法が用いられる、請求項17に記載の画像処理装置。
- 前記通知部は、前記関心領域と前記特定の境界線との距離を、前記画像を表示する表示装置に表示させる、請求項17に記載の画像処理装置。
- 前記通知部は、前記関心領域と前記特定の境界線との距離の大きさに応じて、前記表示装置における前記距離の表示態様を変化させる、請求項17に記載の画像処理装置。
- 前記通知部または出力部は、前記関心領域と前記特定の境界線との距離を前記画像内における前記関心領域の移動速度で除算した値があらかじめ定められた閾値未満となることを条件として、通知または出力を行う、請求項17または18に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記関心領域として、前記画像上でユーザが指定する領域内に含まれる関心領域が取得される、
請求項1に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記関心領域として、前記画像上でユーザが指定する領域内に含まれる関心領域が取得される、
請求項1に記載の画像処理装置。
- 前記画像において前記関心領域を追跡する追跡部をさらに含む、請求項52または53に記載の画像処理装置。
- 前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、請求項52または53に記載の画像処理装置。
- 前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、請求項52または53に記載の画像処理装置。
- 前記画像上でユーザが指定する領域が、矩形、円形、または任意の閉曲線により表される、請求項52または53に記載の画像処理装置。
- ユーザが指定しうる領域の候補が、自動で表示される、請求項52または53に記載の画像処理装置。
- ユーザが指定しうる領域の候補が、手術の状況に応じて自動で表示される、請求項52または53に記載の画像処理装置。
- ユーザが領域を指定する際に、一時的に画像が静止画となる、請求項52または53に記載の画像処理装置。
- ユーザが領域を指定する際に、一時的に画像が遅延再生となる、請求項52または53に記載の画像処理装置。
- ユーザが領域を指定する際に、画像の一部が拡大表示される、請求項52または53に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項52または53に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項63に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項63に記載の画像処理装置。
- 複数の関心領域が含まれる領域が指定された場合、確認すべき関心領域のみが自動で選択される、請求項52または53に記載の画像処理装置。
- 確認すべき関心領域は、複数のデバイスの重要度、画面内における関心領域の位置、手術のシーンに基づき判断される、請求項66に記載の画像処理装置。
- 確認すべき関心領域が、手術の状況に応じて決定される、請求項66に記載の画像処理装置。
- 画像が変更された場合に指定領域が自動的に調整される、請求項52または53に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に基づき特定の手術の状況を認識する状況認識部と、
を備える、請求項1に記載の画像処理装置。
- 前記特定の手術の状況が、手術内容、疾患名、血管漏出、血管穿孔、血栓の発生、末梢血管の消失、ガイドワイヤー・カテーテル・バルーンの誘導、カテーテルによる吸引、ステント留置、バルーン拡張、コイルの挿入、動脈瘤塞栓デバイスの挿入、コイルを切るタイミング、フィルターの誘導・留置・回収、液体塞栓物質の注入、造影剤の注入、一定時間以上の静止画像の継続、マスク画像・ライブ画像の判別、撮像部位・角度の判別、画像の切り換え、およびノイズの発生から成る群より選択される少なくとも1つの状況である、請求項70に記載の画像処理装置。
- 前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
をさらに備える、請求項70または71に記載の画像処理装置。
- ノイズの発生が認識された場合に、前記関心領域の取得を停止する、請求項70に記載の画像処理装置。
- 造影剤の注入が認識された場合、もしくは3Dの画像を取得する場合、もしくはコーンビームCT(CBCT)を取得する場合に、前記関心領域の取得を停止する、もしくは、取得はするが通知は行わない、請求項70に記載の画像処理装置。
- 一定時間以上の静止画像の継続が認識された場合に、被曝の低減推奨を通知する、請求項70に記載の画像処理装置。
- 認識された特定の手術の状況を記録する記録部をさらに含む、請求項70または71に記載の画像処理装置。
- 前記特定の手術の状況が、パターンマッチング、画像認識、時系列画像認識、時系列差分、または物体検出アルゴリズムにより認識される、請求項70または71に記載の画像処理装置。
- 認識された特定の手術の状況に応じて、認識されるデバイスが特定される、請求項70または71に記載の画像処理装置。
- 認識された特定の手術の状況に応じて、関心領域が画像上において指定された特定の境界線を通過することを条件としてユーザに通知するための前記特定の境界線が、自動で関心領域の周囲に指定または提案される、請求項70または71に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項70または71に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項80に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項80に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか1つが前記関心領域毎に定められた条件を満たす場合、その条件に対応する手術の状況を記録する記録部と、
を備える、請求項1に記載の画像処理装置。
- 条件が、穿刺、ガイディングカテーテルの処置部位到達、バルーン誘導、バルーン拡張、ステント展開、カテーテル誘導、1本目のコイル挿入、2本目以降のコイル挿入、コイルの動脈瘤内の局在、コイルの母血管への逸脱、カテーテル抜去、コイル抜去、処置終了から成る群より選択される少なくとも1つの処置を含む、請求項83に記載の画像処理装置。
- 記録される手術の状況が、静止画、動画、および/または手術の状況を説明するテキスト情報を含む、請求項83または84に記載の画像処理装置。
- 動画が、前記条件が満たされた時点の前後一定期間の動画である、請求項83または84に記載の画像処理装置。
- 記録された情報が、ユーザにより修正可能である、請求項83または84に記載の画像処理装置。
- 記録された情報に基づきレポートを作成するレポート作成部をさらに含む、請求項83または84に記載の画像処理装置。
- レポート作成部が、記録された静止画、動画、および/または手術の状況を説明するテキスト情報に基づき、手術のサマリーを作成する、請求項83または84に記載の画像処理装置。
- 特定の手術の状況が記録された場合に、ユーザに通知する通知部または出力を行う出力部をさらに含む、請求項83または84に記載の画像処理装置。
- 特定の手術の状況が記録された後の時間経過に応じて、ユーザに通知する通知部または出力を行う出力部をさらに含む、請求項83または84に記載の画像処理装置。
- 特定の手術の状況が記録された後の時間経過が、バルーンの拡張時間、ガイドワイヤーの移動時間、速度、もしくは加速度、またはコイルの移動時間に基づき判断される、請求項91に記載の画像処理装置。
- 特定の手術の状況が記録された回数に応じて、ユーザに通知する通知部または出力を行う出力部をさらに含む、請求項83または84に記載の画像処理装置。
- 特定の手術の状況がコイルの挿入である、請求項83または84に記載の画像処理装置。
- 前記画像が、リアルタイムで撮影された動画、または過去に記録された動画から取得される、請求項83または84に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項83または84に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項96に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項96に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、そのことを前記画像処理装置のユーザに通知する通知部と、
を備え、
前記画像処理装置に接続された表示装置に、前記関心領域を含む前記画像と、前記画像の横に前記関心領域の縦の座標のみに対応するマークとを表示させる表示部をさらに備える、
請求項1に記載の画像処理装置。
- 画像処理装置であって、
血管内の検査用又は治療用のデバイスを少なくとも被写体に含む画像を取得する画像取得部と、
前記画像に含まれる前記デバイスの少なくとも一部を含む1又は複数の領域を関心領域として取得する関心領域取得部と、
前記関心領域の少なくともいずれか一つが前記関心領域毎に定められた条件を満たす場合、その情報を前記画像処理装置に接続された機器に出力する出力部と、
を備え、
前記画像処理装置に接続された表示装置に、前記関心領域を含む前記画像と、前記画像の横に前記関心領域の縦の座標のみに対応するマークとを表示させる表示部をさらに備える、
請求項1に記載の画像処理装置。 - 前記画像において前記関心領域を追跡する追跡部をさらに含む、請求項99または100に記載の画像処理装置。
- 前記画像中の血管の少なくとも一部を認識する血管認識部をさらに含む、請求項99または100に記載の画像処理装置。
- 前記関心領域が、閾値以下の直径を有する血管、または手動もしくは自動で指定された血管の中に入った場合に、通知または出力が行われる、請求項99または100に記載の画像処理装置。
- マークの軌跡が一定時間表示される、請求項99または100に記載の画像処理装置。
- マークの表示様式が、関心領域ごとに異なる、請求項99または100に記載の画像処理装置
- 前記表示様式が、色、形、模様、記号、文字、キャプション、またはアニメーションである、請求項99または100に記載の画像処理装置。
- 確認すべき関心領域を自動で認識して強調表示する、請求項99または100に記載の画像処理装置。
- 確認すべき関心領域が、手術の状況に応じて決定される、請求項92に記載の画像処理装置。
- ユーザが指定した関心領域のみを表示する、請求項99または100に記載の画像処理装置。
- 自動で選択された関心領域のみを表示する、請求項99または100に記載の画像処理装置。
- ユーザが指定した境界値をマークが超えた場合、ユーザに通知を行う、請求項99または100に記載の画像処理装置。
- 前記画像を表示する表示装置が複数の画面を有する場合において、それぞれの画面に対して処理が行われる、請求項99または100に記載の画像処理装置。
- 前記複数の画面が、正面ライブ画像、側面ライブ画像、正面マスク画像、側面マスク画像のうちの2つ以上の画像を表示する、請求項112に記載の画像処理装置。
- 処理の精度を高めるために複数の画面の情報を利用する、請求項112に記載の画像処理装置。
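As an illustration only, and not part of the claims, the side-of-image mark that reflects only the vertical coordinate of each region of interest can be sketched as follows; the margin width and the (x, y, w, h) ROI format are assumptions introduced for the sketch:

```python
# Illustrative sketch (not part of the patent claims): compute positions for
# marks drawn in a margin beside the image, where each mark keeps only the
# vertical (y) coordinate of its ROI and discards the horizontal coordinate.

def side_marks(rois, image_width, margin=20):
    """Return (x, y) mark positions, one per ROI given as (x, y, w, h).

    Every mark sits in the same fixed column to the right of the image,
    so only the vertical coordinate of the ROI center is conveyed.
    """
    mark_x = image_width + margin // 2      # fixed column: horizontal info discarded
    marks = []
    for (x, y, w, h) in rois:
        center_y = y + h // 2
        marks.append((mark_x, center_y))
    return marks
```

Rendering these positions each frame yields a column of marks that rise and fall with the devices, which is the display described in the claim.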
- The image processing apparatus according to claim 1, comprising:
an image acquisition unit that acquires an image including, as a subject, at least an intravascular inspection or treatment device;
a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least a part of the device in the image;
a tracking unit that tracks each region of interest in the image; and
a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest,
the image processing apparatus further comprising a display unit that, when a plurality of regions of interest are tracked, displays the plurality of regions of interest in mutually different display styles.
- The image processing apparatus according to claim 1, comprising:
an image acquisition unit that acquires an image including, as a subject, at least an intravascular inspection or treatment device;
a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least a part of the device in the image;
a tracking unit that tracks each region of interest in the image; and
an output unit that outputs, when at least one of the regions of interest satisfies a condition defined for each region of interest, information to that effect to equipment connected to the image processing apparatus,
the image processing apparatus further comprising a display unit that, when a plurality of regions of interest are tracked, displays the plurality of regions of interest in mutually different display styles.
- The image processing apparatus according to claim 115 or 116, further comprising a blood vessel recognition unit that recognizes at least a part of a blood vessel in the image.
- The image processing apparatus according to claim 117, wherein notification or output is performed when the region of interest enters a blood vessel having a diameter equal to or smaller than a threshold, or a blood vessel designated manually or automatically.
- The image processing apparatus according to claim 115 or 116, wherein the display style is a color, shape, pattern, symbol, character, caption, or animation.
- The image processing apparatus according to claim 115 or 116, wherein a region of interest to be checked is automatically recognized and highlighted.
- The image processing apparatus according to claim 120, wherein the region of interest to be checked is determined in accordance with the surgical situation.
- The image processing apparatus according to claim 115 or 116, wherein only regions of interest designated by the user are displayed.
- The image processing apparatus according to claim 115 or 116, wherein only automatically selected regions of interest are displayed.
- The image processing apparatus according to claim 115 or 116, wherein, when a display device that displays the image has a plurality of screens, processing is performed for each of the screens.
- The image processing apparatus according to claim 124, wherein the plurality of screens display two or more of a frontal live image, a lateral live image, a frontal mask image, and a lateral mask image.
- The image processing apparatus according to claim 124, wherein information from the plurality of screens is used to improve processing accuracy.
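As an illustration only, and not part of the claims, assigning a distinct and stable display style to each tracked region of interest can be sketched as a small registry; the palette, the use of color as the style, and the ROI-id keying are assumptions introduced for the sketch:

```python
import itertools

# Illustrative sketch (not part of the patent claims): give each tracked ROI
# a distinct display style (here, an RGB color) that stays stable across
# frames, so multiple regions of interest can be told apart on screen.

PALETTE = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

class StyleAssigner:
    def __init__(self):
        self._styles = {}                  # ROI id -> color, stable across frames
        self._next = itertools.cycle(PALETTE)

    def style_for(self, roi_id):
        """Return the color for an ROI, assigning the next palette entry if new."""
        if roi_id not in self._styles:
            self._styles[roi_id] = next(self._next)
        return self._styles[roi_id]
```

The same mechanism extends to shapes, patterns, or captions by storing a richer style record instead of a bare color.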
- A method of creating a machine learning model, characterized by performing training using, as input, cerebral blood vessel images acquired from a plurality of directions.
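As an illustration only, and not part of the claims, one common way to feed multi-directional vessel images to a machine learning model is to stack the views along a channel axis; the two-view (frontal/lateral) setup and array shapes are assumptions introduced for the sketch:

```python
import numpy as np

# Illustrative sketch (not part of the patent claims): build one training
# input from cerebral vessel images acquired from multiple directions by
# stacking the views along a leading channel axis.

def make_multiview_sample(frontal, lateral):
    """Stack a frontal and a lateral view, each (H, W), into one (2, H, W) input."""
    assert frontal.shape == lateral.shape, "views must share resolution"
    return np.stack([frontal, lateral], axis=0)
```

A model trained on such stacked inputs can exploit information from both projections at once, which is the point of using multiple acquisition directions.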
- The image processing apparatus according to claim 1, comprising:
an image acquisition unit that acquires an image including, as a subject, at least an intravascular inspection or treatment device;
a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least a part of the device in the image; and
a notification unit that notifies a user of the image processing apparatus when at least one of the regions of interest satisfies a condition defined for each region of interest,
wherein the notification unit performs notification when the region of interest includes a guidewire and the length of the guidewire, or the area of its segmentation, within a specific region including the guidewire tip exceeds a certain threshold.
- The image processing apparatus according to claim 1, comprising:
an image acquisition unit that acquires an image including, as a subject, at least an intravascular inspection or treatment device;
a region-of-interest acquisition unit that acquires, as regions of interest, one or more regions each including at least a part of the device in the image; and
an output unit that outputs, when at least one of the regions of interest satisfies a condition defined for each region of interest, information to that effect to equipment connected to the image processing apparatus,
wherein the output unit performs output when the region of interest includes a guidewire and the length of the guidewire, or the area of its segmentation, within a specific region including the guidewire tip exceeds a certain threshold.
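As an illustration only, and not part of the claims, the guidewire-tip condition above can be sketched as counting segmented pixels inside a window around the detected tip; the window size, threshold, and mask format are assumptions introduced for the sketch:

```python
import numpy as np

# Illustrative sketch (not part of the patent claims): decide whether to
# notify based on the guidewire segmentation area inside a square region
# centered on the detected guidewire tip.

def tip_area_exceeds(seg_mask, tip_xy, half_window=16, area_threshold=100):
    """True if the segmented-guidewire pixel count near the tip exceeds a threshold.

    seg_mask: 2D boolean array, True where a pixel belongs to the guidewire.
    tip_xy:   (x, y) coordinates of the detected guidewire tip.
    """
    x, y = tip_xy
    h, w = seg_mask.shape
    y0, y1 = max(0, y - half_window), min(h, y + half_window + 1)
    x0, x1 = max(0, x - half_window), min(w, x + half_window + 1)
    area = int(seg_mask[y0:y1, x0:x1].sum())  # segmentation area near the tip
    return area > area_threshold
```

A large area near the tip suggests the wire is folding or coiling within that region, which is one plausible reading of why the claim treats exceeding the threshold as the trigger.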
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3237390A CA3237390A1 (en) | 2021-11-09 | 2022-11-08 | Image processing apparatus, image processing method, program, and image processing system |
EP22891193.9A EP4431023A1 (en) | 2021-11-09 | 2022-11-08 | Image-processing device, image-processing method, program, and image-processing system |
CN202280087710.5A CN118524807A (zh) | 2021-11-09 | 2022-11-08 | Image processing device, image processing method, program, and image processing system
JP2023538665A JP7409627B2 (ja) | 2021-11-09 | 2022-11-08 | Image processing apparatus, image processing method, program, and image processing system
JP2023209980A JP7523831B2 (ja) | 2021-11-09 | 2023-12-13 | Image processing apparatus, image processing method, program, and image processing system
JP2024110241A JP2024138391A (ja) | 2021-11-09 | 2024-07-09 | Image processing apparatus, image processing method, program, and image processing system
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021182423 | 2021-11-09 | ||
JP2021-182423 | 2021-11-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023085253A1 true WO2023085253A1 (ja) | 2023-05-19 |
Family
ID=86336097
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/041503 WO2023085253A1 (ja) | Image processing apparatus, image processing method, program, and image processing system |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4431023A1 (ja) |
JP (3) | JP7409627B2 (ja) |
CN (1) | CN118524807A (ja) |
CA (1) | CA3237390A1 (ja) |
WO (1) | WO2023085253A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017139894A1 (en) | 2016-02-16 | 2017-08-24 | Goyal Mayank | Systems and methods for routing a vessel line such as a catheter within a vessel |
JP2017185007A (ja) * | 2016-04-05 | 2017-10-12 | 株式会社島津製作所 | Radiographic apparatus, object detection program for radiographic images, and object detection method in radiographic images |
JP2021094410A (ja) * | 2016-11-21 | 2021-06-24 | 東芝エネルギーシステムズ株式会社 | Medical image processing apparatus, medical image processing method, medical image processing program, moving-object tracking apparatus, and radiotherapy system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5209730A (en) | 1989-12-19 | 1993-05-11 | Scimed Life Systems, Inc. | Method for placement of a balloon dilatation catheter across a stenosis and apparatus therefor |
US7650179B2 (en) | 2005-12-09 | 2010-01-19 | Siemens Aktiengesellschaft | Computerized workflow method for stent planning and stenting procedure |
JP3928978B1 (ja) | 2006-09-22 | 2007-06-13 | 国立大学法人岐阜大学 | Medical image processing apparatus, medical image processing method, and program |
US9675310B2 (en) | 2014-04-30 | 2017-06-13 | Siemens Healthcare Gmbh | Regression for periodic phase-dependent modeling in angiography |
JP7195868B2 (ja) | 2018-10-19 | 2022-12-26 | キヤノンメディカルシステムズ株式会社 | Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing program |
JP7179288B2 (ja) | 2018-11-27 | 2022-11-29 | 株式会社アールテック | Catheter operation support apparatus and operating method thereof, program, and X-ray medical system |
2022
- 2022-11-08 WO PCT/JP2022/041503 patent/WO2023085253A1/ja active Application Filing
- 2022-11-08 JP JP2023538665A patent/JP7409627B2/ja active Active
- 2022-11-08 CA CA3237390A patent/CA3237390A1/en active Pending
- 2022-11-08 CN CN202280087710.5A patent/CN118524807A/zh active Pending
- 2022-11-08 EP EP22891193.9A patent/EP4431023A1/en active Pending
2023
- 2023-12-13 JP JP2023209980A patent/JP7523831B2/ja active Active
2024
- 2024-07-09 JP JP2024110241A patent/JP2024138391A/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2024020647A (ja) | 2024-02-14 |
CN118524807A (zh) | 2024-08-20 |
JP7409627B2 (ja) | 2024-01-09 |
CA3237390A1 (en) | 2023-05-19 |
JP2024138391A (ja) | 2024-10-08 |
JPWO2023085253A1 (ja) | 2023-05-19 |
JP7523831B2 (ja) | 2024-07-29 |
EP4431023A1 (en) | 2024-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4988557B2 (ja) | Viewing device for control of PTCA angiograms | |
JP5727583B2 (ja) | Device for recording vasculature during a medical intervention and method of operating the device | |
EP2482726B1 (en) | Vascular roadmapping | |
US8295577B2 (en) | Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ | |
US20080275467A1 (en) | Intraoperative guidance for endovascular interventions via three-dimensional path planning, x-ray fluoroscopy, and image overlay | |
EP2114252B1 (en) | Phase-free cardiac roadmapping | |
JP2010519002A5 (ja) | ||
JP2017185007A (ja) | Radiographic apparatus, object detection program for radiographic images, and object detection method in radiographic images | |
US10022101B2 (en) | X-ray/intravascular imaging colocation method and system | |
US12048492B2 (en) | Estimating the endoluminal path of an endoluminal device along a lumen | |
CN107347249B (zh) | 自动移动检测 | |
WO2023085253A1 (ja) | Image processing apparatus, image processing method, program, and image processing system | |
WO2021225155A1 (ja) | Image processing apparatus, image processing method, program, and image processing system | |
US20140294149A1 (en) | Image Support | |
US20230190212A1 (en) | Actuation method for x-ray device and x-ray device | |
EP4275639A1 (en) | System and method for assistance in a surgical procedure | |
US20250062000A1 (en) | Image-guided therapy system | |
EP4202951A1 (en) | Image-guided therapy system | |
JP2024142135A (ja) | プログラム、情報処理方法、情報処理装置および学習済モデル生成方法 | |
CN118634033A (zh) | 引导介入手术的方法、系统和存储介质 | |
JP2023130134A (ja) | プログラム、情報処理方法及び情報処理装置 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | WIPO information: entry into national phase | Ref document number: 2023538665; Country of ref document: JP
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22891193; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 3237390; Country of ref document: CA
| WWE | WIPO information: entry into national phase | Ref document number: 2022891193; Country of ref document: EP
| ENP | Entry into the national phase | Ref document number: 2022891193; Country of ref document: EP; Effective date: 20240610
| WWE | WIPO information: entry into national phase | Ref document number: 202280087710.5; Country of ref document: CN