WO2024195100A1 - Endoscopy assistance device, endoscopy assistance method, and recording medium - Google Patents
Endoscopy assistance device, endoscopy assistance method, and recording medium
- Publication number
- WO2024195100A1 (PCT application PCT/JP2023/011446)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- endoscopic
- polyp
- image
- display
- major axis
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
Definitions
- This disclosure relates to technology that supports endoscopic examinations.
- Patent Document 1 discloses a method of displaying a cross line on an image captured by an endoscope and displaying a scale indicating the actual size on the cross line.
- Patent Document 1 merely displays intersecting lines with scale marks near polyps, etc., and requires a doctor or other examiner to visually determine the size of the polyp.
- One objective of this disclosure is to estimate and display the size of polyps from endoscopic images.
- According to one aspect of the present disclosure, an endoscopic examination support apparatus includes: an image acquisition means for acquiring an endoscopic image captured by an endoscopic camera; a detection means for detecting a polyp from the endoscopic image; a three-dimensional reconstruction means for reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image; a major axis calculation means for calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and a display control means for causing a display image including the value of the major axis to be displayed on a display device.
- In another aspect of the present disclosure, an endoscopic examination support method executed by a computer comprises: acquiring an endoscopic image captured by an endoscopic camera; detecting a polyp from the endoscopic image; reconstructing a three-dimensional point cloud of the imaging region from the endoscopic image; calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and displaying a display image including the value of the major axis on a display device.
- According to yet another aspect of the present disclosure, a recording medium records a program for causing a computer to execute a process of: acquiring an endoscopic image captured by an endoscopic camera; detecting a polyp from the endoscopic image; reconstructing a three-dimensional point cloud of the imaging region from the endoscopic image; calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and displaying a display image including the value of the major axis on a display device.
- FIG. 1 is a block diagram showing a schematic configuration of an endoscopic examination system.
- FIG. 2 is a block diagram showing the hardware configuration of the endoscopic examination support device.
- FIG. 3 is a block diagram showing the functional configuration of the endoscopic examination support device.
- FIG. 4 is a flowchart of a polyp information display process.
- FIG. 5 shows a first display example of a display image.
- FIG. 6 shows a second display example of a display image.
- FIG. 7 shows a third display example of a display image.
- FIG. 8 shows a fourth display example of a display image.
- FIG. 9 shows a fifth display example of a display image.
- FIG. 10 shows a sixth display example of a display image.
- FIG. 11 shows a seventh display example of a display image.
- FIG. 12 is a block diagram showing the functional configuration of an endoscopic examination support device according to a second embodiment.
- FIG. 13 is a flowchart of a process performed by the endoscopic examination support device of the second embodiment.
- FIG. 1 shows a schematic configuration of an endoscopic examination system 100.
- the endoscopic examination system 100 estimates and displays the size of a polyp from an endoscopic image during examination (including treatment) using an endoscope.
- the endoscopic examination system 100 mainly comprises an endoscopic examination support device 1, a display device 2, and an endoscope scope 3 connected to the endoscopic examination support device 1.
- the endoscopic examination support device 1 acquires from the endoscope scope 3 images (i.e., video images; hereinafter, also referred to as "endoscopic images Ic") captured by the endoscope scope 3 during an endoscopic examination, and displays on the display device 2 display data for confirmation by the examiner performing the endoscopic examination. Specifically, the endoscopic examination support device 1 acquires video images of the large intestine captured by the endoscope scope 3 during an endoscopic examination as the endoscopic images Ic. The endoscopic examination support device 1 extracts frame images from the endoscopic images Ic, estimates the size of the polyps based on the frame images, and displays information regarding the size of the polyps on the display device 2 together with the endoscopic images.
- the display device 2 is a display or the like that displays a predetermined image based on a display signal supplied from the endoscopic examination support device 1.
- the endoscope scope 3 mainly comprises an operation section 36 that allows the examiner to input instructions such as air supply, water supply, angle adjustment, and imaging instructions, a flexible shaft 37 that is inserted into the subject's organ to be examined, a tip section 38 that incorporates an imaging section such as a miniature imaging element, and a connection section 39 for connecting to the endoscopic examination support device 1.
- the imaging section provided in the endoscope scope 3 will be referred to as the "endoscopic camera" below.
- the endoscopic examination support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter, referred to as "DB") 17. These elements are connected via a data bus 19.
- the processor 11 executes a predetermined process by executing a program stored in the memory 12.
- the processor 11 may be a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an MPU (Micro Processing Unit), an FPU (Floating Point number Processing Unit), a PPU (Physics Processing Unit), a TPU (Tensor Processing Unit), a quantum processor, a microcontroller, or a combination of these.
- the processor 11 may be composed of multiple processors.
- the processor 11 is an example of a computer.
- the memory 12 is composed of various volatile memories used as working memories, such as RAM (Random Access Memory) and ROM (Read Only Memory), and non-volatile memories that store information necessary for the processing of the endoscopic examination support device 1.
- the memory 12 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a storage medium such as a removable flash memory or disk medium.
- the memory 12 stores programs that allow the endoscopic examination support device 1 to execute each process in this embodiment.
- the memory 12 temporarily stores a series of endoscopic images Ic captured by the endoscope scope 3 during an endoscopic examination.
- the interface 13 performs interface operations between the endoscopic examination support device 1 and an external device.
- the interface 13 supplies the display data Id generated by the processor 11 to the display device 2.
- the interface 13 also supplies illumination light generated by the light source unit 15 to the endoscope scope 3.
- the interface 13 also supplies an electrical signal indicating the endoscopic image Ic supplied from the endoscope scope 3 to the processor 11.
- the interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
- the input unit 14 generates an input signal based on the examiner's operation.
- the input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, etc.
- the light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3.
- the light source unit 15 may also incorporate a pump or the like for sending water or air to be supplied to the endoscope 3.
- the sound output unit 16 outputs sound based on the control of the processor 11.
- DB17 stores endoscopic images and the like acquired during past endoscopic examinations of the subject.
- DB17 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a removable storage medium such as a flash memory.
- DB17 may be provided on an external server, etc., and related information may be acquired from the server via communication.
- the endoscopic examination support device 1 may also be equipped with a sensor capable of measuring the rotation and translation of the endoscopic camera, such as a magnetic sensor.
- [Functional configuration] FIG. 3 is a block diagram showing the functional configuration of the endoscopic examination support device 1.
- the endoscopic examination support device 1 functionally includes an interface 13, a depth estimation unit 21, a three-dimensional reconstruction unit 22, a lesion detection unit 23, a three-dimensional reconstruction unit 24, a polyp region mapping unit 25, a polyp long axis calculation unit 26, and a display image generation unit 27.
- the depth estimation unit 21 receives an endoscopic image from the interface 13.
- the depth estimation unit 21 estimates the depth from the endoscopic camera (i.e., the tip of the endoscopic scope 3) corresponding to each pixel contained in the endoscopic image from the endoscopic image, and generates depth information.
- "Depth" is the distance from the endoscopic camera to each pixel in the endoscopic image, and "depth information" is information indicating the depth for each pixel in the endoscopic image. The depth information corresponding to one endoscopic image is also called a "depth image."
- Depth can be estimated by applying an image processing technique such as Structure from Motion (hereinafter, "SfM") or a machine learning technique such as deep learning.
- As a machine learning technique, a monocular depth estimation technique that estimates depth on a pixel-by-pixel basis for a single frame image may be used, with a depth estimation model trained in advance.
- The depth estimation model can be trained with a supervised learning technique in which correct depth data is given, or with a self-supervised learning technique in which learning is performed using video data as input.
- the depth estimation unit 21 outputs the generated depth information to the 3D restoration units 22 and 24.
- the 3D restoration unit 22 generates a 3D point cloud using the depth information input from the depth estimation unit 21 and the internal parameters of the endoscopic camera provided in the endoscope scope 3. That is, the 3D restoration unit 22 restores the 2D endoscopic image to a 3D point cloud in a 3D coordinate system.
- the endoscopic image is a 3D shape of the photographed area in the large intestine that has been converted into a 2D shape using the internal parameters of the endoscopic camera. Therefore, the 3D restoration unit 22 restores the endoscopic image to its original 3D shape (3D point cloud) using the internal parameters of the camera and the depth information of the endoscopic image.
- the internal parameters of the camera may be treated as known, or may be estimated from the endoscopic image.
- the three-dimensional restoration unit 22 may perform three-dimensional restoration using multiple endoscopic images (frame images).
- the three-dimensional restoration unit 22 calculates the movement of the camera position in other frame images relative to the camera position in the reference frame image (transformation of the camera position and orientation: referred to as "external parameters").
- the three-dimensional restoration unit 22 transforms each frame image into the coordinate system of the reference frame image using the external parameters, and integrates them in the coordinate system of the reference frame image.
- the three-dimensional restoration unit 22 may use an optimization method to improve the accuracy of the three-dimensional restoration.
- the three-dimensional restoration unit 22 may perform bundle adjustment to minimize reprojection errors based on corresponding feature points between the frame images.
- the three-dimensional restoration unit 22 may use NeRF (Neural Radiance Field) to estimate the distance between the object and the camera, and the brightness of the object obtained from the distance.
- the 3D reconstruction unit 22 outputs a 3D point cloud of the imaging site in the endoscopic image to the polyp region mapping unit 25.
- the lesion detection unit 23 receives an endoscopic image from the interface 13.
- the lesion detection unit 23 detects lesion (also called "polyp") areas from the endoscopic image using a pre-prepared image recognition model, etc., and generates a polyp image including the polyp area.
- the polyp area in the polyp image may be a rectangle surrounding the polyp, or may be a pixel-by-pixel area detected by segmentation.
- the lesion detection unit 23 outputs the polyp image to the 3D restoration unit 24 and the display image generation unit 27.
- the three-dimensional restoration unit 24 three-dimensionally restores the polyp region included in the polyp image, and generates a three-dimensional point cloud of the polyp region in a three-dimensional coordinate system.
- the three-dimensional restoration unit 24 basically generates the three-dimensional point cloud of the polyp region in the same manner as the three-dimensional restoration unit 22, and in practice the three-dimensional restoration units 22 and 24 may be configured as a single common three-dimensional restoration unit.
- the three-dimensional restoration unit 24 outputs the three-dimensional point cloud of the polyp region to the polyp region mapping unit 25.
- the polyp long diameter calculation unit 26 calculates the long diameter of the polyp using the three-dimensional point cloud onto which the polyp region is mapped. Polyps come in a variety of shapes, but the polyp long diameter calculation unit 26 determines the longest part of the polyp as the long diameter. Specifically, the polyp long diameter calculation unit 26 calculates the distance between the two farthest points in the three-dimensional point cloud of the polyp indicated by the polyp region as the long diameter. The polyp long diameter calculation unit 26 outputs the calculated long diameter to the display image generation unit 27.
- the display image generating unit 27 generates a display image using the endoscopic image input from the interface 13, the polyp image input from the lesion detection unit 23, and the polyp long diameter input from the polyp long diameter calculation unit 26. Specifically, the display image generating unit 27 generates a display image including an image showing a polyp region on the endoscopic image and the value of the polyp long diameter. An example of the display image will be described later.
- the display image generating unit 27 outputs the display data Id of the generated display image to the display device 2 and causes it to be displayed on the display device 2.
- the interface 13 is an example of an image acquisition means
- the three-dimensional reconstruction unit 22 is an example of a three-dimensional reconstruction means
- the lesion detection unit 23 is an example of a detection means
- the polyp long diameter calculation unit 26 is an example of a long diameter calculation means
- the display image generation unit 27 is an example of a display control means.
- Fig. 4 is a flowchart of the polyp information display process. This process is realized by the processor 11 shown in Fig. 2 executing a program prepared in advance and operating as each element shown in Fig. 3.
- the endoscopic video Ic is input from the endoscope scope 3 to the interface 13.
- the interface 13 acquires an endoscopic image from the input endoscopic video Ic (step S11).
- the depth estimation unit 21 estimates the depth from the endoscopic image and generates depth information (step S12).
- the three-dimensional reconstruction unit 22 uses the depth information and the internal parameters of the endoscopic camera to reconstruct a three-dimensional point cloud of the captured area in the endoscopic image (step S13).
- the lesion detection unit 23 also detects polyps from the endoscopic image (step S14).
- the three-dimensional reconstruction unit 24 generates a three-dimensional point cloud of the polyp region from the polyp image (step S15).
- the polyp region mapping unit 25 maps the three-dimensional point cloud of the polyp region onto the three-dimensional point cloud of the imaging site of the endoscopic image (step S16).
- the polyp long diameter calculation unit 26 calculates the long diameter of the polyp region using the three-dimensional point cloud onto which the polyp region is mapped (step S17).
- the display image generation unit 27 generates a display image using the endoscopic image, the polyp image, and the long diameter value of the polyp, and displays it on the display device 2 (step S18).
- the three-dimensional reconstruction units 22 and 24 perform three-dimensional reconstruction based on the internal parameters and depth information of the camera. Instead, the three-dimensional reconstruction units 22 and 24 may perform three-dimensional reconstruction by applying image processing to the endoscopic image without performing depth estimation. Specifically, the three-dimensional reconstruction units 22 and 24 may use SfM, which estimates the camera posture and three-dimensional coordinates of feature points by performing feature point matching. In addition, since SfM can only perform three-dimensional reconstruction of sparse point clouds, a dense point cloud may be restored using MVS (Multi-View Stereo). In this case, instead of the depth estimation process, three-dimensional reconstruction is performed from the endoscopic image using the following processing procedure. (1) Estimation of camera position using SfM and 3D reconstruction of feature points (2) Reconstruction of dense 3D point cloud using MVS
- (First display example) FIG. 5 shows a first display example of the display image displayed on the display device 2. The display image of the first display example includes a video area 41 and an analysis result area 42, and a rectangle (frame) 44 surrounding the detected polyp 43 is superimposed on the endoscopic image displayed in the video area 41. FIG. 5(B) is an enlarged view of the rectangle 44.
- the top side 44a of the rectangle 44 has a scale 44b indicating the length.
- the distance between two adjacent scales 44b is 1 mm.
- the length indicated by the scale 44b is not the length on the displayed image, but the actual length of the imaged part in the endoscopic image, and is calculated based on the three-dimensional point cloud of the imaged part described above.
- the examiner can estimate the size of the polyp 43 from the endoscopic image displayed in the video area 41.
- the scale 44b is on the top side of the rectangle 44, but the scale 44b may be on any one of the four sides of the rectangle.
- the analysis result area 42 displays an analysis result image 45 and a polyp's long diameter value 46.
- the analysis result image 45 is basically the same as the endoscopic image displayed in the video area 41, and a rectangle 45x indicating the position of the polyp is superimposed.
- the polyp's long diameter value 46 is the long diameter value calculated by the polyp long diameter calculation unit 26 based on the polyp region on the three-dimensional point cloud.
- When no polyp is detected, the analysis result image 45 and the polyp's long diameter value 46 are not displayed. Because the long diameter value 46 calculated from the endoscopic image is included in the display image in this way, the examiner can consider treatment for the detected polyp by referring to the long diameter value calculated by the endoscopic examination support device 1.
- Fig. 6 shows a second display example of the display image displayed on the display device 2.
- a display image 40x of the second display example includes a video area 41 and an analysis result area 42, similar to the first display example.
- the second display example is different from the first display example in the rectangle 47 surrounding the polyp, but is otherwise similar to the first display example.
- FIG. 6(B) is an enlarged view of rectangle 47.
- Rectangle 47 in the second display example has scale marks 47b on its upper side 47a.
- scale marks 47b are provided every 5 mm.
- a numerical value 47c indicating the interval between scale marks 47b is displayed near scale marks 47b. This allows the examiner to estimate the size of the polyp on the assumption that the interval between scale marks 47b is 5 mm.
- polyp long diameter value 46 is displayed in analysis result area 42.
- FIG. 7 shows a third display example of a display image displayed on the display device 2.
- a display image 40y of the third display example includes a video area 41 and an analysis result area 42, similar to the first display example.
- the third display example is different from the first display example in the area within a rectangle 48 surrounding a polyp, but is otherwise similar to the first display example.
- FIG. 7B is an enlarged view of the rectangle 48.
- In the first and second display examples, a scale is provided on one side of the rectangle. In the third display example, by contrast, a line segment (axis) 48a indicating the long diameter of the polyp is provided inside the rectangle 48, and a scale 48b is provided on the line segment 48a.
- the line segment 48a is displayed so as to extend in the direction of the long diameter used in the calculation by the polyp long diameter calculation unit 26.
- the scale 48b is provided at intervals of 5 mm, as in the second display example. Note that the polyp long diameter value 46 is also displayed in the analysis result area 42 in the third display example.
- Because the line segment 48a indicating the long diameter of the polyp is displayed, the examiner can see from which part of the polyp 43 the long diameter value 46 shown in the analysis result area 42 was measured, and can judge the reliability of the displayed value.
- FIG. 8 shows a fourth display example of the display image displayed on the display device 2.
- the display image 40z of the fourth display example includes a video area 41 and an analysis result area 42, similar to the first display example.
- a contour 49 of the polyp 43 is superimposed on the endoscopic image.
- the fourth display example can be used when the lesion detection unit 23 shown in FIG. 3 detects the area of the polyp in pixel units by segmentation. Other than this, the fourth display example is the same as the third display example.
- As in the third display example, a line segment 48a indicating the long axis of the polyp is displayed, and a scale is displayed on the line segment 48a. The examiner can therefore see from which part of the polyp 43 the long axis value 46 displayed in the analysis result area 42 was measured.
- Fig. 9 shows a fifth display example of a display image displayed on the display device 2.
- a display image 50 in the fifth display example includes only an image area 51.
- An endoscopic image is displayed in the image area 51.
- a polyp 53 is detected in the endoscopic image, and a rectangle (frame) 54 surrounding the polyp 53 is displayed superimposed on the endoscopic image displayed in the image area 51.
- the rectangle 54 is basically the same as in the first display example, and a scale indicating the length is displayed on the upper side of the rectangle 54.
- a box 55 indicating the long diameter value of the polyp is displayed near the rectangle 54 (in this example, below). By looking at the box 55, the examiner can know the long diameter value of the polyp calculated by the endoscopic examination support device 1.
- (Sixth display example) FIG. 10 shows a sixth display example of a display image displayed on the display device 2.
- a display image 50x in the sixth display example includes only an image area 51.
- the sixth display example is obtained by using a rectangle 56 similar to the rectangle 47 in the second display example in the fifth display example. That is, the rectangle 56 has scale marks at 5 mm intervals on its upper side. Apart from this, the sixth display example is the same as the fifth display example.
- (Second Embodiment) FIG. 12 is a block diagram showing the functional configuration of an endoscopic examination support device 70 according to the second embodiment.
- the endoscopic examination support device 70 includes an image acquisition unit 71, a detection unit 72, a three-dimensional reconstruction unit 73, a major axis calculation unit 74, and a display control unit 75.
- FIG. 13 is a flowchart of processing by the endoscopic examination support device of the second embodiment.
- the image acquisition means 71 acquires an endoscopic image captured by an endoscopic camera (step S71).
- the detection means 72 detects polyps from the endoscopic image (step S72).
- the three-dimensional reconstruction means 73 reconstructs a three-dimensional point cloud of the photographed area from the endoscopic image (step S73).
- the major axis calculation means 74 calculates the major axis of the polyp based on the three-dimensional point cloud and the polyp (step S74).
- the display control means 75 causes a display image including the major axis value to be displayed on the display device (step S75).
- the second embodiment of the endoscopic examination support device 70 makes it possible to estimate and display the size of polyps from endoscopic images.
- (Appendix 1) An endoscopic examination support device comprising: an image acquisition means for acquiring an endoscopic image captured by an endoscopic camera; a detection means for detecting a polyp from the endoscopic image; a three-dimensional reconstruction means for reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image; a major axis calculation means for calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and a display control means for causing a display image including the value of the major axis to be displayed on a display device.
- (Appendix 2) The endoscopic examination support device according to Appendix 1, further comprising a depth estimation means for estimating a depth of the endoscopic image, wherein the three-dimensional reconstruction means reconstructs the three-dimensional point cloud of the imaging area based on the depth and parameters of the endoscopic camera.
- (Appendix 3) The endoscopic examination support device according to Appendix 1, further comprising a mapping means for mapping the polyp onto the three-dimensional point cloud, wherein the long diameter calculation means calculates the long diameter based on a region of the polyp on the three-dimensional point cloud.
- (Appendix 5) The endoscopic examination support device according to Appendix 4, wherein the display image includes a rectangle displayed at the position of the polyp on the endoscopic image, and the scale is displayed on one side of the rectangle.
- The endoscopic examination support device according to Appendix 4, wherein the display image has a first area for displaying the endoscopic image and a second area different from the first area, and wherein the scale is displayed in the first area and a numerical value indicating the value of the major diameter is displayed in the second area.
- (Appendix 8) The endoscopic examination support device according to Appendix 4, wherein a numerical value indicating the value of the major axis is displayed on the endoscopic image.
Abstract
In this endoscopy assistance device, an image acquisition means acquires an endoscopic image taken by an endoscopic camera. A detection means detects a polyp from the endoscopic image. A three-dimensional restoration means restores a three-dimensional point cloud, for an imaged site, from the endoscopic image. A major axis calculation means calculates the major axis of the polyp on the basis of the three-dimensional point cloud and the polyp. A display control means causes a display device to display a display image including the value of the major axis.
Description
This disclosure relates to technology that supports endoscopic examinations.
In endoscopic examinations, determining the size of polyps found is important in order to determine whether or not they need to be removed, the method of removal, and surveillance. Patent Document 1 discloses a method of displaying a cross line on an image captured by an endoscope and displaying a scale indicating the actual size on the cross line.
However, Patent Document 1 merely displays intersecting lines with scale marks near polyps, etc., and requires a doctor or other examiner to visually determine the size of the polyp.
One objective of this disclosure is to estimate and display the size of polyps from endoscopic images.
According to one aspect of the present disclosure, an endoscopic examination support apparatus includes: an image acquisition means for acquiring an endoscopic image captured by an endoscopic camera; a detection means for detecting a polyp from the endoscopic image; a three-dimensional reconstruction means for reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image; a major axis calculation means for calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and a display control means for causing a display image including the value of the major axis to be displayed on a display device.
In another aspect of the present disclosure, an endoscopic examination support method executed by a computer comprises: acquiring an endoscopic image captured by an endoscopic camera; detecting a polyp from the endoscopic image; reconstructing a three-dimensional point cloud of the imaging region from the endoscopic image; calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and displaying a display image including the value of the major axis on a display device.
According to yet another aspect of the present disclosure, a recording medium records a program for causing a computer to execute a process of: acquiring an endoscopic image captured by an endoscopic camera; detecting a polyp from the endoscopic image; reconstructing a three-dimensional point cloud of the imaging region from the endoscopic image; calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and displaying a display image including the value of the major axis on a display device.
According to this disclosure, it is possible to estimate and display the size of polyps from endoscopic images.
Hereinafter, preferred embodiments of the present disclosure will be described with reference to the drawings.
First Embodiment
[System configuration]
Fig. 1 shows a schematic configuration of an endoscopic examination system 100. The endoscopic examination system 100 estimates and displays the size of a polyp from an endoscopic image during examination (including treatment) using an endoscope.
As shown in FIG. 1, the endoscopic examination system 100 mainly comprises an endoscopic examination support device 1, a display device 2, and an endoscope scope 3 connected to the endoscopic examination support device 1.
The endoscopic examination support device 1 acquires from the endoscope scope 3 images (i.e., video images; hereinafter, also referred to as "endoscopic images Ic") captured by the endoscope scope 3 during an endoscopic examination, and displays on the display device 2 display data for confirmation by the examiner performing the endoscopic examination. Specifically, the endoscopic examination support device 1 acquires video images of the large intestine captured by the endoscope scope 3 during an endoscopic examination as the endoscopic images Ic. The endoscopic examination support device 1 extracts frame images from the endoscopic images Ic, estimates the size of the polyps based on the frame images, and displays information regarding the size of the polyps on the display device 2 together with the endoscopic images.
The display device 2 is a display or the like that displays a predetermined image based on a display signal supplied from the endoscopic examination support device 1.
The endoscope scope 3 mainly comprises an operation section 36 that allows the examiner to input instructions such as air supply, water supply, angle adjustment, and imaging instructions, a flexible shaft 37 that is inserted into the subject's organ to be examined, a tip section 38 that incorporates an imaging section such as a miniature imaging element, and a connection section 39 for connecting to the endoscopic examination support device 1. The imaging section provided in the endoscope scope 3 will be referred to as the "endoscopic camera" below.
[Hardware configuration]
FIG. 2 shows the hardware configuration of the endoscopic examination support device 1. The endoscopic examination support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter referred to as "DB") 17. These elements are connected via a data bus 19.
The processor 11 executes a predetermined process by executing a program stored in the memory 12. The processor 11 may be a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an MPU (Micro Processing Unit), an FPU (Floating Point number Processing Unit), a PPU (Physics Processing Unit), a TPU (Tensor Processing Unit), a quantum processor, a microcontroller, or a combination of these. The processor 11 may be composed of multiple processors. The processor 11 is an example of a computer.
The memory 12 is composed of various volatile memories used as working memories, such as RAM (Random Access Memory) and ROM (Read Only Memory), and non-volatile memories that store information necessary for the processing of the endoscopic examination support device 1. The memory 12 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a storage medium such as a removable flash memory or disk medium. The memory 12 stores programs that allow the endoscopic examination support device 1 to execute each process in this embodiment.
In addition, under the control of the processor 11, the memory 12 temporarily stores a series of endoscopic images Ic captured by the endoscope scope 3 during an endoscopic examination.
The interface 13 performs interface operations between the endoscopic examination support device 1 and an external device. For example, the interface 13 supplies the display data Id generated by the processor 11 to the display device 2. The interface 13 also supplies illumination light generated by the light source unit 15 to the endoscope scope 3. The interface 13 also supplies an electrical signal indicating the endoscopic image Ic supplied from the endoscope scope 3 to the processor 11. The interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
The input unit 14 generates an input signal based on the examiner's operation. The input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, etc. The light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3. The light source unit 15 may also incorporate a pump or the like for sending water or air to be supplied to the endoscope 3. The sound output unit 16 outputs sound based on the control of the processor 11.
DB17 stores endoscopic images and the like acquired during past endoscopic examinations of the subject. DB17 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a removable storage medium such as a flash memory. Instead of providing DB17 within the endoscopic examination system 100, DB17 may be provided on an external server, etc., and related information may be acquired from the server via communication.
The endoscopic examination support device 1 may also be equipped with a sensor capable of measuring the rotation and translation of the endoscopic camera, such as a magnetic sensor.
[Functional configuration]
FIG. 3 is a block diagram showing the functional configuration of the endoscopic examination support device 1. The endoscopic examination support device 1 functionally includes an interface 13, a depth estimation unit 21, a three-dimensional reconstruction unit 22, a lesion detection unit 23, a three-dimensional reconstruction unit 24, a polyp region mapping unit 25, a polyp long diameter calculation unit 26, and a display image generation unit 27.
The endoscopic examination support device 1 receives an endoscopic video Ic from the endoscope scope 3. The endoscopic video Ic is input to the interface 13. The interface 13 extracts frame images (hereinafter also referred to as "endoscopic images") from the input endoscopic video Ic, and outputs them to the depth estimation unit 21, the lesion detection unit 23, and the display image generation unit 27.
The depth estimation unit 21 receives an endoscopic image from the interface 13. The depth estimation unit 21 estimates the depth from the endoscopic camera (i.e., the tip of the endoscopic scope 3) corresponding to each pixel contained in the endoscopic image from the endoscopic image, and generates depth information. "Depth" is the distance from the endoscopic camera to each pixel in the endoscopic image, and "depth information" is information indicating the depth for each pixel in the endoscopic image. Note that the depth information corresponding to one endoscopic image is also called a "depth image."
For example, image processing techniques such as Structure from Motion (hereinafter, referred to as "SfM") or machine learning techniques such as deep learning can be applied to estimate depth. As a machine learning technique, a monocular depth estimation technique that estimates depth on a pixel-by-pixel basis for one frame image may be used. In the machine learning technique, a depth estimation model that has been trained in advance is used. In this case, as a technique for training the depth estimation model, a supervised learning technique in which correct depth data is given, or a self-supervised learning technique in which learning is performed using video data as input can be used. The depth estimation unit 21 outputs the generated depth information to the 3D restoration units 22 and 24.
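As an illustration of the monocular depth estimation variant described above, the following is a minimal Python sketch. It assumes a hypothetical pretrained single-image depth network wrapped as a PyTorch module (the disclosure does not name a specific model or framework) that maps a 1x3xHxW tensor to a 1x1xHxW depth map; the function name is illustrative.

```python
import numpy as np
import torch


def estimate_depth(frame_bgr: np.ndarray, depth_model: torch.nn.Module) -> np.ndarray:
    """Estimate a depth image (one depth value per pixel) from a single endoscopic frame.

    `depth_model` is assumed to be a monocular depth estimation network trained in
    advance (with supervised or self-supervised learning, as described above).
    """
    x = torch.from_numpy(frame_bgr.astype(np.float32) / 255.0)  # normalize to [0, 1]
    x = x.permute(2, 0, 1).unsqueeze(0)                         # 1 x 3 x H x W
    with torch.no_grad():
        depth = depth_model(x)                                  # assumed 1 x 1 x H x W output
    return depth.squeeze().cpu().numpy()                        # H x W depth image
```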
The 3D restoration unit 22 generates a 3D point cloud using the depth information input from the depth estimation unit 21 and the internal parameters of the endoscopic camera provided in the endoscope scope 3. That is, the 3D restoration unit 22 restores the 2D endoscopic image to a 3D point cloud in a 3D coordinate system. The endoscopic image is a 3D shape of the photographed area in the large intestine that has been converted into a 2D shape using the internal parameters of the endoscopic camera. Therefore, the 3D restoration unit 22 restores the endoscopic image to its original 3D shape (3D point cloud) using the internal parameters of the camera and the depth information of the endoscopic image. At this time, the internal parameters of the camera may be treated as known, or may be estimated from the endoscopic image.
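The restoration of the depth image to a 3D point cloud amounts to inverting the pinhole projection for every pixel with the camera's internal parameters. A minimal sketch, assuming a standard 3x3 intrinsic matrix K (the disclosure leaves the exact parameterization open):

```python
import numpy as np


def backproject_depth(depth: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Restore a 2D depth image to a 3D point cloud in the camera coordinate system.

    depth : H x W array of per-pixel depth values (from the depth estimation unit).
    K     : 3 x 3 camera intrinsic matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
    Returns an (H*W) x 3 array of 3D points.
    """
    h, w = depth.shape
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx                            # invert the pinhole projection
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```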
The three-dimensional restoration unit 22 may perform three-dimensional restoration using multiple endoscopic images (frame images). In this case, the three-dimensional restoration unit 22 calculates the movement of the camera position in other frame images relative to the camera position in the reference frame image (transformation of the camera position and orientation: referred to as "external parameters"). Then, the three-dimensional restoration unit 22 transforms each frame image into the coordinate system of the reference frame image using the external parameters, and integrates them in the coordinate system of the reference frame image. At this time, the three-dimensional restoration unit 22 may use an optimization method to improve the accuracy of the three-dimensional restoration. In one example of the optimization method, the three-dimensional restoration unit 22 may perform bundle adjustment to minimize reprojection errors based on corresponding feature points between the frame images. In another example of the optimization method, the three-dimensional restoration unit 22 may use NeRF (Neural Radiance Field) to estimate the distance between the object and the camera, and the brightness of the object obtained from the distance. The 3D reconstruction unit 22 outputs a 3D point cloud of the imaging site in the endoscopic image to the polyp region mapping unit 25.
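The multi-frame case can be illustrated as a rigid transform of each frame's point cloud into the coordinate system of the reference frame using the external parameters. The sketch below covers only this integration step under that assumption; bundle adjustment and NeRF-based refinement are omitted.

```python
import numpy as np


def integrate_point_clouds(reference_cloud, other_clouds, extrinsics):
    """Merge per-frame point clouds into the reference frame's coordinate system.

    other_clouds : list of N_i x 3 point clouds, one per non-reference frame.
    extrinsics   : list of (R, t) pairs (3x3 rotation, length-3 translation) mapping
                   each frame's camera coordinates into the reference frame.
    """
    merged = [np.asarray(reference_cloud)]
    for cloud, (R, t) in zip(other_clouds, extrinsics):
        cloud = np.asarray(cloud)
        merged.append(cloud @ np.asarray(R).T + np.asarray(t))  # rigid transform per frame
    return np.vstack(merged)
```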
The lesion detection unit 23 receives an endoscopic image from the interface 13. The lesion detection unit 23 detects lesion (also called "polyp") areas from the endoscopic image using a pre-prepared image recognition model, etc., and generates a polyp image including the polyp area. The polyp area in the polyp image may be a rectangle surrounding the polyp, or may be a pixel-by-pixel area detected by segmentation. The lesion detection unit 23 outputs the polyp image to the 3D restoration unit 24 and the display image generation unit 27.
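The image recognition model itself is left open by the disclosure. The sketch below therefore only shows how a per-pixel segmentation output could be converted into the two polyp-region representations mentioned above (a pixel-wise mask and a surrounding rectangle), assuming the model yields an H x W probability map; the names are illustrative.

```python
import cv2
import numpy as np


def polyp_region_from_probability(prob_map: np.ndarray, threshold: float = 0.5):
    """Convert a per-pixel polyp probability map into a binary mask and a bounding rectangle.

    prob_map : H x W array of polyp probabilities from an image recognition model.
    Returns (mask, rect) with rect = (x, y, width, height), or (mask, None)
    if no pixel exceeds the threshold.
    """
    mask = (prob_map >= threshold).astype(np.uint8)
    points = cv2.findNonZero(mask)          # coordinates of all polyp pixels
    rect = cv2.boundingRect(points) if points is not None else None
    return mask, rect
```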
The three-dimensional restoration unit 24 three-dimensionally restores the polyp region included in the polyp image, and generates a three-dimensional point cloud of the polyp region in a three-dimensional coordinate system. The three-dimensional restoration unit 24 basically generates the three-dimensional point cloud of the polyp region in the same manner as the three-dimensional restoration unit 22; in practice, the three-dimensional restoration units 22 and 24 may be configured as a single common three-dimensional restoration unit. The three-dimensional restoration unit 24 outputs the three-dimensional point cloud of the polyp region to the polyp region mapping unit 25.
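One possible realization of this second reconstruction, assuming the same depth image and intrinsics as in the sketches above, is to back-project only the pixels inside the detected polyp mask:

```python
import numpy as np


def polyp_point_cloud(depth: np.ndarray, mask: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Generate a 3D point cloud for the polyp region only.

    depth : H x W depth image; mask : H x W binary polyp mask; K : 3 x 3 intrinsics.
    """
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    v, u = np.nonzero(mask)                 # pixel coordinates inside the polyp region
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)     # N x 3 polyp points
```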
The polyp region mapping unit 25 maps the three-dimensional point cloud of the polyp region input from the three-dimensional reconstruction unit 24 onto the three-dimensional point cloud of the imaged area input from the three-dimensional reconstruction unit 22. This makes it possible to identify the polyp region on the three-dimensional point cloud of the imaged area. The polyp region mapping unit 25 outputs the three-dimensional point cloud onto which the polyp region is mapped to the polyp long diameter calculation unit 26.
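The mapping itself is not specified in detail; a nearest-neighbour association, as sketched below, is one plausible interpretation that marks which points of the imaging-region point cloud belong to the polyp region. The distance threshold is an assumption in the units of the point cloud.

```python
import numpy as np
from scipy.spatial import cKDTree


def map_polyp_onto_cloud(region_cloud: np.ndarray, polyp_cloud: np.ndarray,
                         max_dist: float = 1e-3) -> np.ndarray:
    """Return a boolean flag per point of `region_cloud` marking the polyp region.

    region_cloud : M x 3 point cloud of the imaged site.
    polyp_cloud  : N x 3 point cloud of the polyp region.
    max_dist     : association threshold in the same units as the point clouds (assumed).
    """
    tree = cKDTree(region_cloud)
    dist, idx = tree.query(polyp_cloud, k=1)     # nearest region point for each polyp point
    flags = np.zeros(len(region_cloud), dtype=bool)
    flags[idx[dist <= max_dist]] = True
    return flags
```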
The polyp long diameter calculation unit 26 calculates the long diameter of the polyp using the three-dimensional point cloud onto which the polyp region is mapped. Polyps come in a variety of shapes, but the polyp long diameter calculation unit 26 determines the longest part of the polyp as the long diameter. Specifically, the polyp long diameter calculation unit 26 calculates the distance between the two farthest points in the three-dimensional point cloud of the polyp indicated by the polyp region as the long diameter. The polyp long diameter calculation unit 26 outputs the calculated long diameter to the display image generation unit 27.
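Since the long diameter is defined as the distance between the two farthest points of the polyp's 3D point cloud, it can be computed as the maximum pairwise distance; restricting the search to the convex hull keeps the computation cheap for larger clouds. A minimal sketch:

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.spatial.distance import pdist


def polyp_major_axis(polyp_points: np.ndarray) -> float:
    """Return the long diameter: the largest distance between any two points of the polyp."""
    pts = np.asarray(polyp_points, dtype=float)
    if len(pts) < 2:
        return 0.0
    if len(pts) >= 4:
        try:
            # The farthest pair always lies on the convex hull, which keeps pdist small.
            pts = pts[ConvexHull(pts).vertices]
        except Exception:
            pass  # degenerate (e.g. nearly coplanar) clouds: fall back to all points
    return float(pdist(pts).max())
```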
The display image generating unit 27 generates a display image using the endoscopic image input from the interface 13, the polyp image input from the lesion detection unit 23, and the polyp long diameter input from the polyp long diameter calculation unit 26. Specifically, the display image generating unit 27 generates a display image including an image showing a polyp region on the endoscopic image and the value of the polyp long diameter. An example of the display image will be described later. The display image generating unit 27 outputs the display data Id of the generated display image to the display device 2 and causes it to be displayed on the display device 2.
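A minimal sketch of the overlay step, assuming OpenCV drawing primitives (the disclosure does not prescribe a drawing library): it draws the rectangle of the polyp region and prints the long diameter value, here assumed to be in millimetres, near the rectangle, roughly as in the fifth display example.

```python
import cv2
import numpy as np


def render_display_image(endoscopic_image: np.ndarray, rect, major_axis_mm: float) -> np.ndarray:
    """Draw the polyp rectangle and the long diameter value onto a copy of the frame.

    rect : (x, y, width, height) of the detected polyp region.
    """
    display = endoscopic_image.copy()
    x, y, w, h = rect
    cv2.rectangle(display, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(display, f"{major_axis_mm:.1f} mm", (x, y + h + 20),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return display
```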
In the above configuration, the interface 13 is an example of an image acquisition means, the three-dimensional reconstruction unit 22 is an example of a three-dimensional reconstruction means, the lesion detection unit 23 is an example of a detection means, the polyp long diameter calculation unit 26 is an example of a long diameter calculation means, and the display image generation unit 27 is an example of a display control means.
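As a rough illustration of the display step handled by the display image generation unit 27, the sketch below overlays a bounding box and the long-diameter value on a frame with OpenCV; colours, fonts, and layout are arbitrary choices, not part of the disclosure.

```python
# Sketch of the display-image step: box plus long-diameter value overlaid on the frame.
import cv2
import numpy as np

def make_display_image(frame: np.ndarray, box, long_diameter_mm: float) -> np.ndarray:
    x, y, w, h = box
    out = frame.copy()
    cv2.rectangle(out, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(out, f"long diameter: {long_diameter_mm:.1f} mm",
                (x, max(0, y - 10)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return out
```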
[Polyp information display processing]
Next, a description will be given of the polyp information display process executed by the endoscopic examination support device 1. Fig. 4 is a flowchart of the polyp information display process. This process is realized by the processor 11 shown in Fig. 2 executing a program prepared in advance and operating as each element shown in Fig. 3.
First, the endoscopic video Ic is input from the endoscope scope 3 to the interface 13. The interface 13 acquires an endoscopic image from the input endoscopic video Ic (step S11). Next, the depth estimation unit 21 estimates the depth from the endoscopic image and generates depth information (step S12). Next, the three-dimensional reconstruction unit 22 uses the depth information and the internal parameters of the endoscopic camera to reconstruct a three-dimensional point cloud of the captured area in the endoscopic image (step S13).
The lesion detection unit 23 also detects polyps from the endoscopic image (step S14). Next, the three-dimensional reconstruction unit 24 generates a three-dimensional point cloud of the polyp region from the polyp image (step S15).
Next, the polyp region mapping unit 25 maps the three-dimensional point cloud of the polyp region onto the three-dimensional point cloud of the imaging site of the endoscopic image (step S16). Next, the polyp long diameter calculation unit 26 calculates the long diameter of the polyp region using the three-dimensional point cloud onto which the polyp region is mapped (step S17). Next, the display image generation unit 27 generates a display image using the endoscopic image, the polyp image, and the long diameter value of the polyp, and displays it on the display device 2 (step S18).
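Reading steps S11 to S18 together, a single-frame pipeline might look like the sketch below. It reuses the helper sketches above, folds steps S13, S15 and S16 into one back-projection of the polyp mask from the shared depth map, and treats estimate_depth as a placeholder for the depth-estimation model; all of these are simplifications for illustration only.

```python
# Sketch of the overall flow of Fig. 4 (steps S11-S18), chaining the earlier sketches.
import numpy as np

def estimate_depth(frame: np.ndarray) -> np.ndarray:
    raise NotImplementedError("plug in a monocular depth-estimation model")

def polyp_display_pipeline(frame, fx, fy, cx, cy):
    depth = estimate_depth(frame)                                      # S12
    polyp_mask = make_polyp_mask(frame)                                # S14
    if not polyp_mask.any():                                           # nothing detected
        return frame
    polyp_pts = backproject_region(depth, polyp_mask, fx, fy, cx, cy)  # S13/S15/S16 folded together
    diameter = polyp_long_diameter(polyp_pts)                          # S17
    v, u = np.nonzero(polyp_mask)
    box = (int(u.min()), int(v.min()), int(u.max() - u.min()), int(v.max() - v.min()))
    return make_display_image(frame, box, diameter)                    # S18
```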
[Modification]
In the functional configuration of FIG. 3, the three-dimensional reconstruction units 22 and 24 perform three-dimensional reconstruction based on the internal parameters of the camera and the depth information. Instead, the three-dimensional reconstruction units 22 and 24 may perform three-dimensional reconstruction by applying image processing to the endoscopic images without performing depth estimation. Specifically, the three-dimensional reconstruction units 22 and 24 may use SfM, which estimates the camera pose and the three-dimensional coordinates of feature points by performing feature point matching. Since SfM can reconstruct only a sparse point cloud, a dense point cloud may additionally be restored using MVS (Multi-View Stereo). In this case, instead of the depth estimation process, three-dimensional reconstruction is performed from the endoscopic images by the following procedure.
(1) Estimation of the camera position by SfM and three-dimensional reconstruction of feature points
(2) Reconstruction of a dense three-dimensional point cloud by MVS
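For step (1), a two-view sketch with OpenCV is given below as one possible realization (ORB matching, essential-matrix estimation, pose recovery, sparse triangulation); the dense MVS stage of step (2) is omitted. Note that a purely monocular reconstruction like this is only determined up to scale, so recovering absolute millimetre sizes would require additional calibration, which this sketch does not address.

```python
# Sketch of two-view sparse SfM with OpenCV. K is the 3x3 intrinsic matrix of
# the endoscopic camera (assumed known).
import cv2
import numpy as np

def two_view_sparse_reconstruction(img1, img2, K):
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])     # first camera at the origin
    P2 = K @ np.hstack([R, t])                             # recovered relative pose
    X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)      # 4xN homogeneous points
    return (X[:3] / X[3]).T                                # sparse (N, 3) point cloud
```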
[Display example]
Next, an example of a display image generated by the display image generating unit 27 will be described.
(First display example)
Fig. 5 shows a first display example of a display image displayed on the display device 2. As shown in Fig. 5(A), a display image 40 of the first display example mainly includes a video area 41 and an analysis result area 42. The video area 41 displays an endoscopic image input from the endoscope 3. The analysis result area 42 displays the analysis results of the endoscopic image, etc. In the example of Fig. 5(A), a polyp 43 is detected in the endoscopic image, and a rectangle (frame) 44 surrounding the polyp 43 is displayed superimposed on the endoscopic image displayed in the video area 41. A scale indicating the length is displayed on one side of the rectangle 44.
FIG. 5(B) is an enlarged view of the rectangle 44. In FIG. 5(B), the top side 44a of the rectangle 44 has a scale 44b indicating length. In this example, the distance between two adjacent scale marks 44b is 1 mm. The length indicated by the scale 44b is not a length on the displayed image but the actual length of the imaged part in the endoscopic image, and is calculated based on the three-dimensional point cloud of the imaged part described above. By looking at the polyp 43 in the endoscopic image and the scale 44b on one side of the rectangle 44 at the same time, the examiner can estimate the size of the polyp 43 from the endoscopic image displayed in the video area 41. Note that in the example of FIG. 5(B) the scale 44b is provided on the top side of the rectangle 44, but the scale 44b may be provided on any one of the four sides of the rectangle.
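One way to realize such physically scaled ticks is sketched below: the physical length of the box's top edge is taken from the 3D points reconstructed at its two end pixels, and the pixel spacing per millimetre follows from that. The lookup point_at_pixel is a hypothetical helper (for example, a depth-map back-projection) introduced only for this illustration.

```python
# Sketch of placing millimetre ticks along the top edge of the bounding box.
import cv2
import numpy as np

def draw_mm_ticks(image, box, point_at_pixel, tick_mm=1.0, tick_len_px=6):
    x, y, w, h = box
    p_left = point_at_pixel(y, x)            # 3D point at the top-left corner
    p_right = point_at_pixel(y, x + w)       # 3D point at the top-right corner
    edge_mm = float(np.linalg.norm(p_right - p_left))
    if edge_mm <= 0:
        return image
    px_per_mm = w / edge_mm
    n_ticks = int(edge_mm // tick_mm)
    for i in range(n_ticks + 1):
        tx = int(round(x + i * tick_mm * px_per_mm))
        cv2.line(image, (tx, y), (tx, y - tick_len_px), (0, 255, 0), 1)
    return image
```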
In FIG. 5(A), the analysis result area 42 displays an analysis result image 45 and a polyp's long diameter value 46. When a polyp is detected, the analysis result image 45 is basically the same as the endoscopic image displayed in the video area 41, and a rectangle 45x indicating the position of the polyp is superimposed. The polyp's long diameter value 46 is the long diameter value calculated by the polyp long diameter calculation unit 26 based on the polyp region on the three-dimensional point cloud. When a polyp is not detected, the analysis result image 45 and the polyp's long diameter value 46 are not displayed. In this way, the polyp's long diameter value 46 calculated from the endoscopic image is displayed in the displayed image, so the examiner can consider treatment for the detected polyp by referring to the polyp's long diameter value calculated by the endoscopic examination support device 1.
(Second display example)
Fig. 6 shows a second display example of the display image displayed on the display device 2. As shown in Fig. 6(A), a display image 40x of the second display example includes a video area 41 and an analysis result area 42, similar to the first display example. The second display example is different from the first display example in the rectangle 47 surrounding the polyp, but is otherwise similar to the first display example.
FIG. 6(B) is an enlarged view of the rectangle 47. The rectangle 47 in the second display example has scale marks 47b on its upper side 47a. In the second display example, however, the scale marks 47b are provided every 5 mm. In addition, a numerical value 47c indicating the interval between the scale marks 47b (in this example, "5" indicating 5 mm) is displayed near the scale marks 47b. This allows the examiner to estimate the size of the polyp on the premise that the interval between the scale marks 47b is 5 mm. In the second display example as well, the polyp long diameter value 46 is displayed in the analysis result area 42.
(Third display example)
Fig. 7 shows a third display example of a display image displayed on the display device 2. As shown in Fig. 7(A), a display image 40y of the third display example includes a video area 41 and an analysis result area 42, similar to the first display example. The third display example differs from the first display example in the contents of the rectangle 48 surrounding the polyp, but is otherwise similar to the first display example.
FIG. 7(B) is an enlarged view of the rectangle 48. In the first and second display examples, the scale is provided on one side of the rectangle. In contrast, in the rectangle 48 of the third display example, a line segment (axis) 48a indicating the long diameter of the polyp is provided inside the rectangle 48, and a scale 48b is provided on the line segment 48a. The line segment 48a is displayed so as to extend in the direction of the long diameter used in the calculation by the polyp long diameter calculation unit 26. In the example of FIG. 7(B), the scale marks 48b are provided at 5 mm intervals, as in the second display example. In the third display example as well, the polyp long diameter value 46 is displayed in the analysis result area 42. Because the line segment 48a indicating the long diameter of the polyp is displayed, the examiner can see which part of the polyp 43 was measured to obtain the long diameter value 46 displayed in the analysis result area 42, and can therefore judge the reliability of the displayed long diameter value 46.
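Drawing the long-diameter segment of this example amounts to projecting the two farthest 3D points back into the image; a minimal pinhole-projection sketch is shown below (tick placement along the segment would reuse the per-millimetre spacing idea above). The intrinsic parameters are assumptions for illustration.

```python
# Sketch of projecting the two 3D endpoints of the long diameter into the image
# and drawing the connecting segment.
import cv2
import numpy as np

def project_point(p, fx, fy, cx, cy):
    x, y, z = p
    return int(round(fx * x / z + cx)), int(round(fy * y / z + cy))

def draw_long_diameter_segment(image, p_a, p_b, fx, fy, cx, cy):
    ua = project_point(p_a, fx, fy, cx, cy)
    ub = project_point(p_b, fx, fy, cx, cy)
    cv2.line(image, ua, ub, (0, 255, 255), 2)
    return image
```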
(Fourth display example)
FIG. 8 shows a fourth display example of the display image displayed on the display device 2. As shown in FIG. 8, the display image 40z of the fourth display example includes a video area 41 and an analysis result area 42, similar to the first display example. In the fourth display example, instead of displaying a rectangle surrounding the polyp, a contour 49 of the polyp 43 is superimposed on the endoscopic image. The fourth display example can be used when the lesion detection unit 23 shown in FIG. 3 detects the polyp area in pixel units by segmentation. Other than this, the fourth display example is the same as the third display example. That is, in the fourth display example, as in the third display example, a line segment 48a indicating the long diameter of the polyp is displayed, and a scale is displayed on the line segment 48a. Therefore, the examiner can see which part of the polyp 43 was measured to obtain the long diameter value 46 displayed in the analysis result area 42.
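When the detector outputs a pixel-wise mask, the outline overlay of this example can be produced with standard contour extraction; the OpenCV 4.x sketch below is one straightforward way to do it.

```python
# Sketch of drawing the polyp outline from a binary segmentation mask (OpenCV 4.x).
import cv2
import numpy as np

def draw_polyp_outline(image, polyp_mask):
    contours, _ = cv2.findContours(polyp_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cv2.drawContours(image, contours, -1, (0, 255, 0), 2)
    return image
```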
(Fifth display example)
Fig. 9 shows a fifth display example of a display image displayed on the display device 2. As shown in Fig. 9, a display image 50 in the fifth display example includes only an image area 51. An endoscopic image is displayed in the image area 51. In the example of Fig. 9, a polyp 53 is detected in the endoscopic image, and a rectangle (frame) 54 surrounding the polyp 53 is displayed superimposed on the endoscopic image displayed in the image area 51. The rectangle 54 is basically the same as in the first display example, and a scale indicating the length is displayed on the upper side of the rectangle 54.
In addition, in the fifth display example, a box 55 indicating the long diameter value of the polyp is displayed near the rectangle 54 (in this example, below). By looking at the box 55, the examiner can know the long diameter value of the polyp calculated by the endoscopic examination support device 1.
(Sixth display example)
Fig. 10 shows a sixth display example of a display image displayed on the display device 2. A display image 50x in the sixth display example includes only an image area 51. The sixth display example replaces the rectangle of the fifth display example with a rectangle 56 similar to the rectangle 47 of the second display example. That is, the rectangle 56 has scale marks at 5 mm intervals on its upper side. Apart from this, the sixth display example is the same as the fifth display example.
(Seventh display example)
Fig. 11 shows a seventh display example of a display image displayed on the display device 2. A display image 50y in the seventh display example includes only an image area 51. The seventh display example replaces the rectangle of the fifth display example with a rectangle 57 similar to the rectangle 48 of the third display example. That is, a line segment 54d indicating the long diameter of the polyp is displayed inside the rectangle 57, and a scale is displayed on the line segment 54d. Apart from this, the seventh display example is the same as the fifth display example.
<Second Embodiment>
FIG. 12 is a block diagram showing the functional configuration of an endoscopic examination support device according to the second embodiment. The endoscopic examination support device 70 includes an image acquisition means 71, a detection means 72, a three-dimensional reconstruction means 73, a major axis calculation means 74, and a display control means 75.
FIG. 13 is a flowchart of processing by the endoscopic examination support device of the second embodiment. The image acquisition means 71 acquires an endoscopic image captured by an endoscopic camera (step S71). The detection means 72 detects polyps from the endoscopic image (step S72). The three-dimensional reconstruction means 73 reconstructs a three-dimensional point cloud of the photographed area from the endoscopic image (step S73). The major axis calculation means 74 calculates the major axis of the polyp based on the three-dimensional point cloud and the polyp (step S74). The display control means 75 causes a display image including the major axis value to be displayed on the display device (step S75).
According to the endoscopic examination support device 70 of the second embodiment, the size of a polyp can be estimated from an endoscopic image and displayed.
Some or all of the above embodiments can be described as follows, but are not limited to the following:
(Appendix 1)
An endoscopic examination support device comprising:
an image acquisition means for acquiring an endoscopic image captured by an endoscopic camera;
a detection means for detecting a polyp from the endoscopic image;
a three-dimensional reconstruction means for reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
a major axis calculation means for calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
a display control means for causing a display image including the value of the major axis to be displayed on a display device.
(Appendix 2)
The endoscopic examination support device according to Appendix 1, comprising a depth estimation means for estimating a depth of the endoscopic image, wherein the three-dimensional reconstruction means reconstructs the three-dimensional point cloud of the imaging region based on the depth and parameters of the endoscopic camera.
(Appendix 3)
The endoscopic examination support device according to Appendix 1, comprising a mapping means for mapping the polyp onto the three-dimensional point cloud, wherein the major axis calculation means calculates the major axis based on a region of the polyp on the three-dimensional point cloud.
(Appendix 4)
The endoscopic examination support device according to Appendix 1, wherein the display image includes the endoscopic image, a scale displayed at the position of the polyp on the endoscopic image, and a numerical value indicating the value of the major axis.
(Appendix 5)
The endoscopic examination support device according to Appendix 4, wherein the display image includes a rectangle displayed at the position of the polyp on the endoscopic image, and the scale is displayed on one side of the rectangle.
(Appendix 6)
The endoscopic examination support device according to Appendix 4, wherein the display image includes a line segment indicating the major axis of the polyp on the endoscopic image, and the scale is displayed on the line segment.
(Appendix 7)
The endoscopic examination support device according to Appendix 4, wherein the display image has a first area for displaying the endoscopic image and a second area different from the first area, the scale is displayed in the first area, and a numerical value indicating the value of the major axis is displayed in the second area.
(Appendix 8)
The endoscopic examination support device according to Appendix 4, wherein a numerical value indicating the value of the major axis is displayed on the endoscopic image.
(Appendix 9)
An endoscopic examination support method executed by a computer, comprising:
acquiring an endoscopic image captured by an endoscopic camera;
detecting a polyp from the endoscopic image;
reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
displaying a display image including the value of the major axis on a display device.
(Appendix 10)
A recording medium having recorded thereon a program for causing a computer to execute a process comprising:
acquiring an endoscopic image captured by an endoscopic camera;
detecting a polyp from the endoscopic image;
reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
displaying, on a display device, a display image including the value of the major axis.
The present disclosure has been described above with reference to embodiments and examples, but the present disclosure is not limited to the above embodiments and examples. Various modifications that can be understood by a person skilled in the art can be made to the configuration and details of the present disclosure within the scope of the present disclosure.
REFERENCE SIGNS LIST
1 Endoscopic examination support device
2 Display device
3 Endoscope scope
11 Processor
12 Memory
13 Interface
21 Depth estimation unit
22, 24 Three-dimensional reconstruction unit
23 Lesion detection unit
25 Polyp region mapping unit
26 Polyp long diameter calculation unit
27 Display image generation unit
100 Endoscopic examination system
Claims (10)
- An endoscopic examination support device comprising:
an image acquisition means for acquiring an endoscopic image captured by an endoscopic camera;
a detection means for detecting a polyp from the endoscopic image;
a three-dimensional reconstruction means for reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
a major axis calculation means for calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
a display control means for causing a display image including the value of the major axis to be displayed on a display device.
- The endoscopic examination support device according to claim 1, comprising a depth estimation means for estimating a depth of the endoscopic image, wherein the three-dimensional reconstruction means reconstructs the three-dimensional point cloud of the imaging region based on the depth and parameters of the endoscopic camera.
- The endoscopic examination support device according to claim 1, comprising a mapping means for mapping the polyp onto the three-dimensional point cloud, wherein the major axis calculation means calculates the major axis based on a region of the polyp on the three-dimensional point cloud.
- The endoscopic examination support device according to claim 1, wherein the display image includes the endoscopic image, a scale displayed at the position of the polyp on the endoscopic image, and a numerical value indicating the value of the major axis.
- The endoscopic examination support device according to claim 4, wherein the display image includes a rectangle displayed at the position of the polyp on the endoscopic image, and the scale is displayed on one side of the rectangle.
- The endoscopic examination support device according to claim 4, wherein the display image includes a line segment indicating the major axis of the polyp on the endoscopic image, and the scale is displayed on the line segment.
- The endoscopic examination support device according to claim 4, wherein the display image has a first area for displaying the endoscopic image and a second area different from the first area, the scale is displayed in the first area, and a numerical value indicating the value of the major axis is displayed in the second area.
- The endoscopic examination support device according to claim 4, wherein a numerical value indicating the value of the major axis is displayed on the endoscopic image.
- An endoscopic examination support method executed by a computer, comprising:
acquiring an endoscopic image captured by an endoscopic camera;
detecting a polyp from the endoscopic image;
reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
displaying a display image including the value of the major axis on a display device.
- A recording medium having recorded thereon a program for causing a computer to execute a process comprising:
acquiring an endoscopic image captured by an endoscopic camera;
detecting a polyp from the endoscopic image;
reconstructing a three-dimensional point cloud of an imaging region from the endoscopic image;
calculating a major axis of the polyp based on the three-dimensional point cloud and the polyp; and
displaying, on a display device, a display image including the value of the major axis.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/011446 WO2024195100A1 (en) | 2023-03-23 | 2023-03-23 | Endoscopy assistance device, endoscopy assistance method, and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/011446 WO2024195100A1 (en) | 2023-03-23 | 2023-03-23 | Endoscopy assistance device, endoscopy assistance method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024195100A1 true WO2024195100A1 (en) | 2024-09-26 |
Family
ID=92841390
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/011446 WO2024195100A1 (en) | 2023-03-23 | 2023-03-23 | Endoscopy assistance device, endoscopy assistance method, and recording medium |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024195100A1 (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010512173A (en) * | 2006-08-21 | 2010-04-22 | エスティーアイ・メディカル・システムズ・エルエルシー | Computer-aided analysis using video from an endoscope |
JP2021014989A (en) * | 2017-11-07 | 2021-02-12 | シャープ株式会社 | Measurement device, measurement device control method, measurement program and recording medium |
JP2020141712A (en) * | 2019-03-04 | 2020-09-10 | 富士フイルム株式会社 | Endoscope apparatus, calibration device and calibration method |
JP2022535873A (en) * | 2019-06-04 | 2022-08-10 | マゼンティーク アイ リミテッド | Systems and methods for processing colon images and videos |
JP2021045337A (en) * | 2019-09-18 | 2021-03-25 | 富士フイルム株式会社 | Medical image processing device, processor device, endoscope system, medical image processing method, and program |
KR20210150695A (en) * | 2020-06-04 | 2021-12-13 | 계명대학교 산학협력단 | Image-based size estimation system and method for calculating lesion size through endoscopic imaging |
WO2022190366A1 (en) * | 2021-03-12 | 2022-09-15 | オリンパス株式会社 | Shape measurement system for endoscope and shape measurement method for endoscope |
WO2022230160A1 (en) * | 2021-04-30 | 2022-11-03 | オリンパスメディカルシステムズ株式会社 | Endoscopic system, lumen structure calculation system, and method for creating lumen structure information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109069097B (en) | Dental three-dimensional data processing device and method thereof | |
JP6348078B2 (en) | Branch structure determination apparatus, operation method of branch structure determination apparatus, and branch structure determination program | |
JP6594133B2 (en) | Endoscope position specifying device, operation method of endoscope position specifying device, and endoscope position specifying program | |
CN100399978C (en) | Endoscope system | |
KR101930851B1 (en) | A skin analysis and diagnosis system for 3D face modeling | |
JP6254053B2 (en) | Endoscopic image diagnosis support apparatus, system and program, and operation method of endoscopic image diagnosis support apparatus | |
US10939800B2 (en) | Examination support device, examination support method, and examination support program | |
JP6824078B2 (en) | Endoscope positioning device, method and program | |
JP2015033556A (en) | Image processing system and image processing method | |
WO2021234907A1 (en) | Image processing device, control method, and storage medium | |
US10970875B2 (en) | Examination support device, examination support method, and examination support program | |
JPH11104072A (en) | Medical support system | |
JP7562886B2 (en) | PROGRAM, INFORMATION PROCESSING METHOD AND ENDOSCOPIC SYSTEM | |
CN117392109A (en) | Mammary gland focus three-dimensional reconstruction method and system | |
JP7441934B2 (en) | Processing device, endoscope system, and method of operating the processing device | |
KR20160057024A (en) | Markerless 3D Object Tracking Apparatus and Method therefor | |
WO2024195100A1 (en) | Endoscopy assistance device, endoscopy assistance method, and recording medium | |
KR20210150633A (en) | System and method for measuring angle and depth of implant surgical instrument | |
JP7023195B2 (en) | Inspection support equipment, methods and programs | |
EP4193589B1 (en) | Real time augmentation | |
JP2015136480A (en) | Three-dimensional medical image display control device and operation method for the same, and three-dimensional medical image display control program | |
WO2024028934A1 (en) | Endoscopy assistance device, endoscopy assistance method, and recording medium | |
JP6745748B2 (en) | Endoscope position specifying device, its operating method and program | |
WO2023275974A1 (en) | Image processing device, image processing method, and storage medium | |
WO2024121886A1 (en) | Endoscopic examination assistance device, endoscopic examination assistance method, and recording medium |