WO2016170656A1 - Image processing apparatus, image processing method, and image processing program
Image processing apparatus, image processing method, and image processing program
- Publication number
- WO2016170656A1 (PCT/JP2015/062428)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tube
- unit
- specific area
- image processing
- image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- the present invention relates to an image processing apparatus, an image processing method, and an image processing program for performing image processing on an in-tube image.
- A technique is disclosed for detecting an abnormal region in a living body based on pixel value gradient information of an endoscopic image, that is, a pixel value surface shape feature amount, or based on edge information, that is, a contour feature amount of the endoscopic image (see, for example, Patent Document 1). This technique detects an abnormal region by evaluating whether the pixel value gradient is isotropic, that is, whether an equivalent gradient occurs in every surrounding direction, and whether the edge shape is an arc of a predetermined size.
- Endoscopic images include images taken under various conditions: facing the inner wall of the tube from the front or from an oblique direction, from a long distance or close to the subject, and with defocus or motion blur. Because the appearance of a predetermined specific region such as an abnormal region changes with the imaging situation, applying the same detection technique without accounting for these changes limits the detection performance.
- The present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus, an image processing method, and an image processing program capable of detecting a specific region in a tube with high accuracy.
- An image processing apparatus according to the present invention includes an in-tube imaging situation analysis unit that analyzes an in-tube imaging situation determined based on the relationship between the subject and the side on which the subject is imaged in an in-tube image obtained by imaging the inside of a tube, and a specific region detection unit that calculates specific region identification indexes and detects the specific region by integrated determination of the indexes according to the in-tube imaging situation.
- An image processing method according to the present invention includes an in-tube imaging situation analysis step of analyzing an in-tube imaging situation determined based on the relationship between the subject and the side on which the subject is imaged in an in-tube image obtained by imaging the inside of a tube, and a specific region detection step of calculating specific region identification indexes and detecting the specific region by integrated determination of the indexes according to the in-tube imaging situation.
- An image processing program according to the present invention includes an in-tube imaging situation analysis step of analyzing an in-tube imaging situation determined based on the relationship between the subject and the side on which the subject is imaged in an in-tube image obtained by imaging the inside of a tube, and a specific region detection step of calculating specific region identification indexes and detecting the specific region by integrated determination of the indexes according to the in-tube imaging situation.
- According to the present invention, the specific region in the tube can be detected with high accuracy.
- FIG. 1 is a diagram (part 1) for explaining an overview of an embodiment of the present invention.
- FIG. 2 is a diagram (part 2) for explaining the outline of the embodiment of the present invention.
- FIG. 3 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a flowchart showing an outline of processing performed by the tube depth region detection unit of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 6 is a block diagram showing a functional configuration of the image processing apparatus according to Modification 1-1 of Embodiment 1 of the present invention.
- FIG. 7 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 1-1 of Embodiment 1 of the present invention.
- FIG. 8 is a flowchart showing an outline of processing performed by the inner wall gradient calculation unit of the image processing apparatus according to Modification 1-1 of Embodiment 1 of the present invention.
- FIG. 9 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 1-2 of Embodiment 1 of the present invention.
- FIG. 10 is a diagram schematically illustrating a setting example of a feature amount calculation region set by the shape direction identification unit of the image processing apparatus according to the modified example 1-2 of the first embodiment of the present invention.
- FIG. 11 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 1-2 of Embodiment 1 of the present invention.
- FIG. 12 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 1-3 of Embodiment 1 of the present invention.
- FIG. 13 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 1-3 of Embodiment 1 of the present invention.
- FIG. 14 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 15 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 16 is a flowchart illustrating an outline of processing performed by the shooting distance estimation unit of the image processing apparatus according to the second embodiment of the present invention.
- FIG. 17 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 2-1 of Embodiment 2 of the present invention.
- FIG. 18 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 2-1 of Embodiment 2 of the present invention.
- FIG. 19 is a flowchart showing an outline of processing performed by the defocus analysis unit of the image processing apparatus according to Modification 2-1 of Embodiment 2 of the present invention.
- FIG. 20 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 2-2 of Embodiment 2 of the present invention.
- FIG. 21 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 2-2 of Embodiment 2 of the present invention.
- FIG. 22 is a flowchart showing an outline of processing performed by the motion blur analysis unit of the image processing apparatus according to Modification 2-2 of Embodiment 2 of the present invention.
- FIG. 23 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 2-3 of Embodiment 2 of the present invention.
- FIG. 24 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 2-3 of Embodiment 2 of the present invention.
- FIG. 25 is a block diagram showing a functional configuration of an image processing apparatus according to Modification 2-4 of Embodiment 2 of the present invention.
- FIG. 26 is a flowchart showing an outline of processing performed by the image processing apparatus according to Modification 2-4 of Embodiment 2 of the present invention.
- FIG. 27 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 28 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 29 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 4 of the present invention.
- FIG. 30 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 4 of the present invention.
- FIGS. 1 and 2 are diagrams for explaining the outline of the embodiments of the present invention. Specifically, FIGS. 1 and 2 schematically show images of the inside of a living body (in-tube images) taken by an endoscope introduced into the living body to observe it.
- An endoscope often photographs the mucosal surface of the inner wall of a living body tube from an oblique direction.
- In that case, the endoscopic image shows the mucosal surface from a short imaging distance up to the mucosal surface in the deep part of the tube at a long imaging distance, and a lesion may appear anywhere in this range.
- On the other hand, the endoscope may also photograph the mucosal surface of the inner wall from the front.
- In that case, the deep part of the tube is not captured, and the abnormal region is imaged differently from the case of photographing from an oblique direction.
- In addition, the imaging distance to the inner wall mucosal surface of the living body tube varies from image to image, and some images are affected by defocus or motion blur.
- The image processing apparatus according to the embodiments described below analyzes these differences in the imaging situation and adaptively detects a specific region, including an abnormal region, according to the result.
- Here, the specific region is a region in which the property or state of the subject in the in-tube image satisfies a predetermined condition; in the case of a living body, it is a region in which the tissue property or the in-vivo state satisfies a predetermined condition, such as an abnormal region.
- The specific region may be a partial region of the image or the entire region of the image.
- The image captured by the endoscope is assumed to be a color image having pixel values for the R (red), G (green), and B (blue) wavelength components at each pixel position, but the image is not limited to this.
- FIG. 3 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
- The image processing apparatus 1 shown in FIG. 3 includes a calculation unit 2 and a storage unit 3.
- The calculation unit 2 includes an in-tube imaging situation analysis unit 4 that analyzes the in-tube imaging situation determined based on the relationship between the subject and the side on which the subject is imaged in the in-tube image, and a specific region detection unit 5 that detects the specific region according to the in-tube imaging situation.
- The in-tube imaging situation analysis unit 4 includes a tube deep region detection unit 41 that detects the tube deep region in the in-tube image.
- The tube deep region detection unit 41 includes a low absorption wavelength component selection unit 411 that selects the low absorption wavelength component that is least absorbed and scattered in the living body, an edge peripheral region exclusion unit 412 that excludes pixels in edge peripheral regions of the in-tube image of the low absorption wavelength component, and a low pixel value region detection unit 413 that detects regions whose pixel values are below a predetermined threshold in the low absorption wavelength component image after the edge peripheral pixels have been excluded.
- The tube deep region detection unit 41 applies a known labeling process (reference: CG-ARTS Association: Digital Image Processing: 181P, labeling) to the pixels detected by the low pixel value region detection unit 413, merges connected pixels into regions, and then detects the largest region among those whose area is equal to or greater than a predetermined threshold as the tube deep region. If no region reaches the threshold, the tube deep region detection unit 41 determines that there is no tube deep region.
- For an image composed of R, G, and B components, the low absorption wavelength component selection unit 411 selects the R component, which lies away from the blood absorption band, has a long wavelength, and is therefore least affected by absorption and scattering in the living body.
- The edge peripheral region exclusion unit 412 identifies edge regions by applying, for example, a known edge extraction process (reference: CG-ARTS Association: Digital Image Processing: 114P, edge extraction; 209P, contour detection), and then specifies and excludes their peripheral regions by applying a known dilation process (reference: CG-ARTS Association: Digital Image Processing: 179P, contraction/dilation processing) to the identified edge regions.
- The low pixel value region detection unit 413 detects pixels whose pixel value is equal to or smaller than a predetermined threshold in the low absorption wavelength component image from which the edge peripheral regions have been excluded.
- Alternatively, the low pixel value region detection unit 413 may detect pixels whose pixel value is equal to or smaller than a threshold set adaptively from the range of pixel values actually taken by the in-tube image of the low absorption wavelength component after the edge peripheral regions have been excluded.
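- The following is a minimal sketch of the tube deep region detection described above, assuming an RGB in-tube image held as a NumPy array and using OpenCV for edge extraction, dilation, and labeling; the thresholds, kernel size, and minimum area are illustrative assumptions rather than values from this disclosure.

```python
import numpy as np
import cv2

def detect_tube_deep_region(rgb_image, low_value_thresh=40, canny_thresh=(50, 150),
                            dilate_px=15, min_area=500):
    """Detect the tube deep region as the largest dark connected region of the
    R channel (low absorption wavelength component), excluding edge peripheries."""
    # Low absorption wavelength component selection: R is least absorbed/scattered.
    r = rgb_image[:, :, 0]

    # Edge peripheral region exclusion: extract edges, dilate them, mask them out.
    edges = cv2.Canny(r, *canny_thresh)
    kernel = np.ones((dilate_px, dilate_px), np.uint8)
    edge_periphery = cv2.dilate(edges, kernel) > 0

    # Low pixel value region detection: dark pixels imply a long imaging distance.
    low_value = (r <= low_value_thresh) & ~edge_periphery

    # Labeling: keep the largest connected component above the area threshold.
    n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(
        low_value.astype(np.uint8))
    best_label, best_area = 0, 0
    for lab in range(1, n_labels):
        area = stats[lab, cv2.CC_STAT_AREA]
        if area >= min_area and area > best_area:
            best_label, best_area = lab, area
    return labels == best_label if best_label else None  # None: no tube deep region

# Synthetic example: a dark patch on the right plays the role of the tube deep part.
img = np.full((200, 200, 3), 180, np.uint8)
img[60:160, 130:200] = 20
mask = detect_tube_deep_region(img)
print(mask is not None and int(mask.sum()))
```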
- The specific region detection unit 5 includes a feature range identification unit 51 that calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions with different ranges, and an integrated determination unit 52 that detects the specific region by integrated determination of the specific region identification indexes according to the in-tube imaging situation.
- The feature range identification unit 51 first sets a plurality of feature amount calculation regions, ranging from small to large, at arbitrary positions in the image; regions sharing the same center position but having different ranges may also be set. It then calculates feature amounts from each region. Various known features can be used, such as color, contour (edge), pixel value surface shape (pixel value gradient), and texture. The feature amounts calculated from one feature amount calculation region are gathered into a feature vector, and one feature vector is generated per feature amount calculation region.
- Next, the feature range identification unit 51 calculates a specific region identification index based on each feature vector.
- For example, a specific region identification index P(x) indicating whether the feature vector x satisfies the specific condition can be calculated based on the probability model shown in the following formula (1):
- P(x) = (1 / ((2π)^(k/2) |Z|^(1/2))) · exp(−(1/2) (x − μ)^T Z^(−1) (x − μ))   …(1)
- Here, k is the number of dimensions of the feature vector, x is the feature vector (k×1 matrix) of the examination region to be identified, μ is the mean vector (k×1 matrix) of the feature vectors in samples of the specific region, Z is the variance-covariance matrix (k×k matrix) of the feature vectors in samples of the specific region, |Z| is its determinant, and Z^(−1) is its inverse matrix.
- Although a specific region identification index calculation method using a probability model is illustrated here, any method capable of calculating a specific region identification index may be used.
- For example, the specific region identification index may be calculated using a method based on the feature space distance to a representative feature vector, a method based on the distance from an identification boundary in the feature space, or the like.
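- A brief sketch of the probability-model index of formula (1), assuming that feature vectors sampled from known specific regions are available as rows of a matrix; the two-dimensional feature values below are synthetic and purely illustrative.

```python
import numpy as np

def fit_gaussian_model(sample_features):
    """Estimate the mean vector and variance-covariance matrix from feature vectors
    (one row per sample) collected from known specific regions."""
    mu = sample_features.mean(axis=0)
    cov = np.cov(sample_features, rowvar=False)
    return mu, cov

def identification_index(x, mu, cov):
    """Specific region identification index P(x) of formula (1): the multivariate
    normal density of feature vector x under the specific-region model."""
    k = mu.shape[0]
    diff = x - mu
    cov_inv = np.linalg.inv(cov)
    norm = 1.0 / (np.power(2.0 * np.pi, k / 2.0) * np.sqrt(np.linalg.det(cov)))
    return float(norm * np.exp(-0.5 * diff @ cov_inv @ diff))

# Illustrative example with 2-dimensional feature vectors (e.g., color + edge strength)
rng = np.random.default_rng(0)
samples = rng.normal(loc=[0.6, 0.3], scale=0.05, size=(100, 2))  # specific-region samples
mu, cov = fit_gaussian_model(samples)
print(identification_index(np.array([0.61, 0.29]), mu, cov))  # high index (fits the model)
print(identification_index(np.array([0.20, 0.80]), mu, cov))  # low index (does not fit)
```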
- When the tube deep region is present, the tube inner wall is considered to be imaged from an oblique direction, so the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is large. More specifically, the integrated determination unit 52 calculates a final determination index by multiplying each of the plurality of specific region identification indexes by a weight corresponding to its degree of emphasis and summing the results, and detects the specific region when the final determination index is equal to or greater than a threshold. Emphasizing a specific region identification index thus means performing the calculation with its weight set larger than the weights of the other specific region identification indexes.
- On the other hand, when the tube deep region is not present, the integrated determination unit 52 performs the integrated determination with emphasis on the specific region identification index based on the feature amount (local information) whose feature amount calculation range is small, and detects the specific region.
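- A minimal sketch of the weighted integrated determination described above: each specific region identification index is multiplied by a weight chosen according to the imaging situation and the weighted sum is compared with a threshold; the weights and threshold shown are illustrative assumptions.

```python
from typing import Sequence

def integrated_determination(indexes: Sequence[float],
                             weights: Sequence[float],
                             threshold: float = 0.5) -> bool:
    """Return True (specific region detected) when the weighted sum of the
    specific region identification indexes reaches the threshold."""
    assert len(indexes) == len(weights)
    final_index = sum(p * w for p, w in zip(indexes, weights))
    return final_index >= threshold

# Indexes computed from small / medium / large feature amount calculation ranges
indexes = [0.2, 0.5, 0.9]

# Tube deep region present (oblique view): emphasize the large-range index
print(integrated_determination(indexes, weights=[0.1, 0.2, 0.7]))
# Tube deep region absent (frontal view): emphasize the small-range (local) index
print(integrated_determination(indexes, weights=[0.7, 0.2, 0.1]))
```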
- The calculation unit 2 is realized using a general-purpose processor such as a CPU (Central Processing Unit), or a dedicated processor such as an arithmetic circuit that executes a specific function, for example an ASIC (Application Specific Integrated Circuit). The processor reads the various programs stored in the storage unit 3 and controls the overall operation of the image processing apparatus 1 by issuing instructions to and transferring data among the units constituting the apparatus. The processor may execute the various processes by itself, or the processor and the storage unit 3 may cooperate or be combined to execute the various processes using the various data stored in the storage unit 3.
- The calculation units described in the embodiments and modifications described later are also realized in the same manner as the calculation unit 2.
- The storage unit 3 is realized by various IC memories such as ROM (Read Only Memory) and RAM (Random Access Memory), a built-in hard disk or one connected via a data communication terminal, or an information recording device such as a CD-ROM together with its reading device.
- In addition to the image data of the in-tube images acquired by the image processing apparatus 1, the storage unit 3 stores programs for operating the image processing apparatus 1 and for causing it to execute various functions, as well as data used during execution of these programs.
- Specifically, the storage unit 3 stores the image processing program according to the first embodiment and various parameters, such as thresholds, used in the image processing. It goes without saying that the storage units described in the embodiments described later are realized in the same manner as the storage unit 3.
- Various programs such as an image processing program stored in the storage unit 3 can be recorded on a computer-readable recording medium.
- the recording of various programs in the storage unit 3 or the recording medium may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
- the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
- The image processing apparatus 1 having the above configuration may be realized using a single computer or a plurality of computers. In the latter case, the computers can process data in cooperation with one another while exchanging data via a communication network.
- The computer here can be configured by, for example, a general-purpose personal computer or a server. The same applies to the image processing apparatuses described in the embodiments and modifications described later.
- The functions of the image processing apparatus 1 described above may also be provided in a processor that controls an entire endoscope system, as part of an endoscope system that is introduced into a subject to observe the subject. The same applies to the image processing apparatuses described in the embodiments and modifications described later.
- FIG. 4 is a flowchart showing an outline of processing executed by the image processing apparatus 1.
- First, the calculation unit 2 acquires the in-tube image to be processed (step S1).
- FIG. 5 is a flowchart showing an outline of the processing performed by the tube deep region detection unit 41.
- First, the low absorption wavelength component selection unit 411 selects the low absorption wavelength component that is least absorbed and scattered in the living body (step S11). For example, in the case of an image composed of R, G, and B components, the low absorption wavelength component selection unit 411 selects the R component as described above.
- Next, the edge peripheral region exclusion unit 412 excludes pixels in the edge peripheral regions of the in-tube image of the low absorption wavelength component (step S12). This prevents edge peripheral regions from being erroneously detected as the tube deep region.
- The low pixel value region detection unit 413 then detects regions with low pixel values, that is, pixel regions whose pixel values are equal to or smaller than a predetermined threshold, in the low absorption wavelength component image from which the edge peripheral regions have been excluded (step S13). As described above, the imaging distance to the tube deep portion is long, so its pixel values in the low absorption wavelength component image are low.
- Finally, the tube deep region detection unit 41 detects the tube deep region by performing a known labeling process or the like on the regions detected by the low pixel value region detection unit 413 (step S14). This completes the tube deep region detection process (step S2) performed by the tube deep region detection unit 41.
- Here, a method of detecting the tube deep region based on pixel values correlated with the imaging distance has been shown, but this is only an example; the tube deep region may instead be detected based on, for example, the method disclosed in Japanese Patent Application Laid-Open No. 2003-93328.
- Before this detection, processing such as correction of pixel value unevenness caused by the optical system or illumination system, or exclusion of non-mucosal regions such as specular reflection, residue, and bubbles, may be performed. This suppresses a loss of accuracy in each subsequent process.
- In step S3, the feature range identification unit 51 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different ranges.
- Specifically, the feature range identification unit 51 calculates the specific region identification index P(x) based on, for example, the probability model shown in formula (1) above.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the presence or absence of the tube deep region, and detects the specific region (step S4).
- Finally, the calculation unit 2 outputs the detection result of the specific region (step S5).
- The image processing apparatus 1 then ends the series of processes. Note that the order of the tube deep region detection process in step S2 and the specific region identification index calculation process in step S3 may be reversed, or the two may be performed in parallel.
- According to the first embodiment described above, in the change of the image caused by the difference in the imaging direction (oblique or front) with respect to the tube inner wall, the specific region identification index based on the more effective feature amount can be emphasized, and the specific region can be detected with high accuracy.
- FIG. 6 is a block diagram illustrating a functional configuration of the image processing apparatus according to the modified example 1-1 of the first embodiment.
- In the image processing apparatus 1A shown in FIG. 6, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference numerals.
- The image processing apparatus 1A includes a calculation unit 2A and a storage unit 3.
- The calculation unit 2A includes an in-tube imaging situation analysis unit 4A and a specific region detection unit 5.
- The in-tube imaging situation analysis unit 4A includes an inner wall gradient calculation unit 42 that calculates the gradient of the tube inner wall (inner wall gradient) in the in-tube image.
- The inner wall gradient calculation unit 42 includes a low absorption wavelength component selection unit 411 that selects the low absorption wavelength component that is least absorbed and scattered in the living body, and a pixel value gradient calculation unit 421 that calculates the pixel value gradient of that low absorption wavelength component.
- The pixel value gradient calculation unit 421 calculates the magnitude and direction of the pixel value gradient from the output ΔX of an X-direction first derivative filter of a predetermined size and the output ΔY of a Y-direction first derivative filter of the same size (reference: CG-ARTS Association: Digital Image Processing: 115P, differential filter). The pixel value gradient calculation unit 421 may calculate the gradient of the tube inner wall at every pixel position or at a predetermined sampling interval.
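- A short sketch of the pixel value gradient calculation, assuming the R channel as the low absorption wavelength component and Sobel filters as the X- and Y-direction first derivative filters; the filter size and sampling interval are illustrative.

```python
import numpy as np
import cv2

def inner_wall_gradient(rgb_image, ksize=7, step=16):
    """Compute the magnitude and direction of the pixel value gradient of the
    low absorption wavelength (R) component, sampled at a regular interval."""
    r = rgb_image[:, :, 0].astype(np.float32)
    dx = cv2.Sobel(r, cv2.CV_32F, 1, 0, ksize=ksize)  # X-direction first derivative
    dy = cv2.Sobel(r, cv2.CV_32F, 0, 1, ksize=ksize)  # Y-direction first derivative
    magnitude = np.sqrt(dx ** 2 + dy ** 2)
    direction = np.arctan2(dy, dx)  # radians
    # Sample at a predetermined interval instead of every pixel position.
    return magnitude[::step, ::step], direction[::step, ::step]

# Synthetic image that brightens toward the right, i.e. a sloped inner wall.
img = np.tile(np.linspace(30, 220, 200, dtype=np.uint8), (200, 1))
img = np.dstack([img] * 3)
mag, ang = inner_wall_gradient(img)
print(float(mag.mean()))  # a large average gradient suggests an oblique view of the wall
```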
- In Modification 1-1, the integrated determination unit 52 of the specific region detection unit 5 performs the integrated determination while changing which specific region identification index is emphasized according to the magnitude of the inner wall gradient, and detects the specific region. If the average magnitude of the inner wall gradients calculated at multiple locations is equal to or greater than a predetermined threshold, the tube inner wall is considered to be imaged obliquely, and the detection accuracy of the specific region based on global information using the entire tube structure is considered to be high. In this case, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is large.
- On the other hand, if the average magnitude of the inner wall gradients is smaller than the predetermined threshold, the tube inner wall is considered to be imaged from the front, and the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is small.
- FIG. 7 is a flowchart showing an outline of processing performed by the image processing apparatus 1A.
- In FIG. 7, the same step numbers are assigned to the same processes as those in the flowchart shown in FIG. 4. The processing following step S1 is described below.
- In step S2A, the inner wall gradient calculation unit 42 calculates the gradient of the tube inner wall in the in-tube image.
- FIG. 8 is a flowchart showing an outline of processing performed by the inner wall gradient calculation unit 42.
- The processing of the inner wall gradient calculation unit 42 is described below with reference to FIG. 8.
- First, the low absorption wavelength component selection unit 411 selects the low absorption wavelength component that is least absorbed and scattered in the living body (step S21).
- Next, the pixel value gradient calculation unit 421 calculates the pixel value gradient of the selected low absorption wavelength component (step S22). This completes the gradient calculation process (step S2A) for the tube inner wall performed by the inner wall gradient calculation unit 42.
- In step S3 following step S2A, the feature range identification unit 51 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different ranges.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the magnitude of the inner wall gradient, and detects the specific region (step S4A).
- Finally, the calculation unit 2A outputs the detection result of the specific region (step S5).
- The image processing apparatus 1A then ends the series of processes. Note that the order of the inner wall gradient calculation process in step S2A and the specific region identification index calculation process in step S3 may be reversed, or the two may be performed in parallel.
- According to Modification 1-1 described above, in the change of the image caused by the difference in the imaging direction with respect to the tube inner wall, the specific region identification index based on the more effective feature amount can be emphasized, and the specific region can be detected with high accuracy.
- Note that the in-tube imaging situation analysis unit 4A may further include the tube deep region detection unit 41 described in the first embodiment. In this case, the integrated determination unit 52 performs the integrated determination according to both the presence or absence of the tube deep region and the magnitude of the inner wall gradient.
- FIG. 9 is a block diagram illustrating a functional configuration of the image processing apparatus according to the modified example 1-2 of the first embodiment.
- In the image processing apparatus 1B shown in FIG. 9, components similar to those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference numerals.
- The image processing apparatus 1B includes a calculation unit 2B and a storage unit 3.
- The calculation unit 2B includes an in-tube imaging situation analysis unit 4 and a specific region detection unit 5B.
- The specific region detection unit 5B includes a shape/direction identification unit 53 that calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different shapes and/or directions, and an integrated determination unit 52.
- FIG. 10 is a diagram schematically illustrating a setting example of the feature amount calculation region set by the shape direction identification unit 53.
- As shown in FIG. 10, the shape/direction identification unit 53 sets a feature amount calculation region group 202 composed of a circular region and a plurality of elliptical regions with mutually different major axis directions (four elliptical regions in FIG. 10), centered on an arbitrary position C of the in-tube image 201.
- The shape/direction identification unit 53 may set a plurality of feature amount calculation region groups 202 while scanning the center position of the region group over the in-tube image 201 according to a predetermined rule, or may set a plurality of feature amount calculation region groups 202 with different center positions in parallel.
- The shapes of the regions in the feature amount calculation region group are not limited to those shown in FIG. 10. For example, the shape/direction identification unit 53 may set one region and then set a plurality of feature amount calculation regions by scanning the in-tube image 201 while rotating that region.
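- A sketch of building a feature amount calculation region group like the one in FIG. 10: one circular mask and several elliptical masks whose major axes point in different directions around a common center; the sizes and the number of directions are illustrative assumptions.

```python
import numpy as np
import cv2

def region_group_masks(image_shape, center, radius=30, axes=(45, 18), n_dirs=4):
    """Return boolean masks for a circular region and n_dirs elliptical regions
    with different major-axis directions, all centered on `center`."""
    h, w = image_shape[:2]
    masks = []
    circle = np.zeros((h, w), np.uint8)
    cv2.circle(circle, center, radius, 1, thickness=-1)
    masks.append(circle.astype(bool))
    for i in range(n_dirs):
        angle = 180.0 * i / n_dirs  # major-axis direction in degrees
        ellipse = np.zeros((h, w), np.uint8)
        cv2.ellipse(ellipse, center, axes, angle, 0, 360, 1, thickness=-1)
        masks.append(ellipse.astype(bool))
    return masks

# One feature vector would then be computed per mask (e.g., mean color inside it).
masks = region_group_masks((200, 200), center=(100, 100))
print([int(m.sum()) for m in masks])  # pixel counts of the circle and four ellipses
```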
- In Modification 1-2, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the direction of the tube deep region, and detects the specific region.
- When the tube deep region is present, the tube inner wall is imaged from an oblique direction. In this case, an abnormal region tends to appear short along the direction of the tube deep region (the depth direction) and long in the direction perpendicular to it, so the detection accuracy of the specific region based on feature amounts calculated from regions with a shape close to this is considered to be high.
- The integrated determination unit 52 therefore makes the determination with emphasis on the specific region identification index based on the feature amount calculated from the region that is longer in the direction orthogonal to the direction of the tube deep region.
- When the tube deep region is not present, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount calculated from the circular region, which is independent of direction.
- Alternatively, the integrated determination unit 52 may make the determination with emphasis on the specific region identification index based on the feature amount calculated from a region elongated in the direction aligned with the direction of the tube deep region.
- FIG. 11 is a flowchart showing an outline of processing performed by the image processing apparatus 1B.
- In FIG. 11, the same step numbers are assigned to the same processes as those in the flowchart shown in FIG. 4. The processing following step S2 is described below.
- In step S3B, the shape/direction identification unit 53 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different shapes and/or directions.
- Specifically, the shape/direction identification unit 53 calculates feature amounts from the plurality of feature amount calculation regions using, for example, the feature amount calculation region group 202 shown in FIG. 10.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the direction of the tube deep region, and detects the specific region (step S4B).
- Finally, the calculation unit 2B outputs the detection result of the specific region (step S5).
- The image processing apparatus 1B then ends the series of processes. Note that the order of the tube deep region detection process in step S2 and the specific region identification index calculation process in step S3B may be reversed, or the two may be performed in parallel.
- According to Modification 1-2 described above, in the change of the image resulting from the difference in the imaging direction with respect to the tube inner wall, the specific region identification index based on the more effective feature amount can be emphasized, and the specific region can be detected with high accuracy.
- Note that the image processing apparatus 1B may include the in-tube imaging situation analysis unit 4A described in Modification 1-1 instead of the in-tube imaging situation analysis unit 4.
- In this case, the integrated determination unit 52 performs the integrated determination according to the magnitude of the inner wall gradient.
- When the inner wall gradient is large, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount calculated from the region that is longer in the direction orthogonal to the direction of the inner wall gradient.
- When the inner wall gradient is small, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount calculated from the circular region, which is independent of direction.
- Alternatively, the integrated determination unit 52 may make the determination with emphasis on the specific region identification index based on the feature amount calculated from a region elongated in the direction aligned with the direction of the inner wall gradient.
- The in-tube imaging situation analysis unit 4 may also further include the inner wall gradient calculation unit 42 described in Modification 1-1. In this case, the integrated determination unit 52 performs the integrated determination according to both the direction of the tube deep region and the direction of the inner wall gradient.
- FIG. 12 is a block diagram illustrating a functional configuration of an image processing apparatus according to Modification 1-3 of Embodiment 1.
- In the image processing apparatus 1C shown in FIG. 12, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference numerals as in FIG. 3.
- The image processing apparatus 1C includes a calculation unit 2C and a storage unit 3.
- The calculation unit 2C includes an in-tube imaging situation analysis unit 4 and a specific region detection unit 5C.
- The specific region detection unit 5C includes a feature type identification unit 54 that calculates a plurality of specific region identification indexes based on different types of feature amounts, and an integrated determination unit 52.
- The feature type identification unit 54 sets a feature amount calculation region at an arbitrary position in the image and calculates a specific region identification index for each feature type, such as color, contour, pixel value surface shape, and texture.
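- A minimal sketch of computing one feature vector per feature type inside a single feature amount calculation region; the color, contour, and texture measures used here are generic stand-ins, not the exact feature definitions of this disclosure.

```python
import numpy as np
import cv2

def feature_vectors_by_type(rgb_image, mask):
    """Return one feature vector per feature type (color, contour, texture)
    computed inside the boolean region `mask`."""
    pixels = rgb_image[mask].astype(np.float32)
    gray = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2GRAY)

    color = pixels.mean(axis=0)                  # mean R, G, B inside the region
    edges = cv2.Canny(gray, 50, 150)
    contour = np.array([edges[mask].mean()])     # edge density inside the region
    texture = np.array([gray[mask].std()])       # simple texture measure (local contrast)
    return {"color": color, "contour": contour, "texture": texture}

# Each vector would feed its own identification index, e.g. formula (1), and the
# integrated determination would weight them according to the imaging situation.
img = np.random.default_rng(1).integers(0, 255, (120, 120, 3), dtype=np.uint8)
mask = np.zeros((120, 120), bool)
mask[40:80, 40:80] = True
feats = feature_vectors_by_type(img, mask)
print({k: v.round(1) for k, v in feats.items()})
```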
- In Modification 1-3, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the presence or absence of the tube deep region.
- When the tube deep region is present, the tube inner wall is imaged obliquely, so the contour lines of the specific region surface become clear (see FIG. 1). For this reason, the detection accuracy of the specific region based on the contour feature amount is considered to be high, and the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the contour feature amount.
- When the tube deep region is not present, the integrated determination unit 52 performs the integrated determination with emphasis on the specific region identification index based on the pixel value surface shape feature amount or the texture feature amount.
- FIG. 13 is a flowchart showing an outline of processing performed by the image processing apparatus 1C.
- In FIG. 13, the same step numbers are assigned to the same processes as those in the flowchart shown in FIG. 4. The processing following step S2 is described below.
- In step S3C, the feature type identification unit 54 calculates a plurality of specific region identification indexes based on different types of feature amounts.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the presence or absence of the tube deep region, and detects the specific region (step S4C).
- Finally, the calculation unit 2C outputs the detection result of the specific region (step S5).
- The image processing apparatus 1C then ends the series of processes. Note that the order of the tube deep region detection process in step S2 and the specific region identification index calculation process in step S3C may be reversed, or the two may be performed in parallel.
- According to Modification 1-3 described above, in the change of the image resulting from the difference in the imaging direction (oblique, front, etc.) with respect to the tube inner wall, the specific region identification index based on the more effective feature amount can be emphasized, and the specific region can be detected with high accuracy.
- Note that the image processing apparatus 1C may include the in-tube imaging situation analysis unit 4A described in Modification 1-1 instead of the in-tube imaging situation analysis unit 4.
- In this case, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the magnitude of the inner wall gradient.
- When the inner wall gradient is large, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the contour feature amount.
- When the inner wall gradient is small, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the pixel value surface shape feature amount or the texture feature amount.
- The in-tube imaging situation analysis unit 4 may also further include the inner wall gradient calculation unit 42 described in Modification 1-1. In this case, the integrated determination unit 52 performs the integrated determination according to both the presence or absence of the tube deep region and the magnitude of the inner wall gradient.
- FIG. 14 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
- In the image processing apparatus 6 shown in FIG. 14, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference numerals.
- The image processing apparatus 6 includes a calculation unit 7 and a storage unit 3.
- The calculation unit 7 includes an in-tube imaging situation analysis unit 8 and a specific region detection unit 5.
- The in-tube imaging situation analysis unit 8 includes an imaging distance estimation unit 81 that estimates the imaging distance to the tube inner wall.
- Various methods for estimating the imaging distance are known. Here, a method that estimates the imaging distance by assuming the imaging target to be a uniform diffusion surface is described.
- The imaging distance estimation unit 81 includes a low absorption wavelength component selection unit 811 that selects the low absorption wavelength component that is least absorbed and scattered in the living body. This is done to obtain the pixel value information most strongly correlated with the imaging distance to the mucosal surface while suppressing pixel value drops caused by blood vessels and the like appearing on the mucosal surface. For example, in the case of an image composed of R, G, and B components, the R component is selected, as described in the first embodiment.
- Next, the imaging distance estimation unit 81 estimates the imaging distance assuming a uniform diffusion surface, based on the pixel values of the selected low absorption wavelength component. Specifically, the imaging distance estimation unit 81 estimates the imaging distance r at a plurality of locations in the in-tube image according to the following formula (2):
- r = sqrt(I · K · cos θ / L)   …(2)
- Here, I is the radiation intensity of the light source, measured in advance; K is the diffuse reflection coefficient of the mucosal surface, for which an average value is measured in advance; θ is the angle between the normal vector of the mucosal surface and the vector from that surface to the light source, set in advance as an average value determined by the positional relationship between the endoscope tip and the mucosal surface; and L is the R component value of the pixel at which the surface whose imaging distance is to be estimated appears.
- Note that instead of estimating the imaging distance r defined by formula (2), the imaging distance estimation unit 81 may use a pixel value correlated with the imaging distance r in the subsequent adaptive processing.
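- A short sketch of the imaging distance estimation under the uniform diffusion surface assumption, where the reflected intensity falls off with the square of the distance so that r = sqrt(I·K·cos θ / L); the light source intensity, reflection coefficient, and angle are illustrative calibration values, and if the actual formula (2) differs it should be substituted here.

```python
import numpy as np

def estimate_imaging_distance(r_channel, I=255.0, K=0.8, theta_deg=30.0):
    """Estimate the imaging distance r at each pixel from the R component value L,
    assuming a uniform diffusion surface: L = I*K*cos(theta) / r**2."""
    L = np.clip(r_channel.astype(np.float64), 1.0, None)  # avoid division by zero
    return np.sqrt(I * K * np.cos(np.deg2rad(theta_deg)) / L)

r_channel = np.array([[200.0, 100.0, 25.0]])  # bright (near) to dark (far) pixels
print(estimate_imaging_distance(r_channel).round(2))  # distance grows as the pixel darkens
```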
- In the second embodiment, when the imaging distance is short, the subject appears over a larger range of the image, so the integrated determination unit 52 performs the integrated determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is large.
- Conversely, when the imaging distance is long, the subject appears over a smaller range, so the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is small.
- FIG. 15 is a flowchart showing an outline of processing executed by the image processing apparatus 6.
- First, the calculation unit 7 acquires the in-tube image to be processed (step S31).
- FIG. 16 is a flowchart illustrating an outline of the processing performed by the imaging distance estimation unit 81.
- The processing of the imaging distance estimation unit 81 is described below with reference to FIG. 16.
- First, the low absorption wavelength component selection unit 811 selects the low absorption wavelength component that is least absorbed and scattered in the living body (step S41).
- Next, the imaging distance estimation unit 81 estimates the imaging distance assuming a uniform diffusion surface, based on the pixel values of the selected low absorption wavelength component (step S42). Specifically, the imaging distance estimation unit 81 estimates the imaging distance according to formula (2) above. This completes the imaging distance estimation process (step S32) performed by the imaging distance estimation unit 81.
- Before estimating the imaging distance, the calculation unit 7 may perform processing such as correcting pixel value unevenness caused by the optical system or illumination system and excluding non-mucosal regions such as specular reflection, residue, and bubbles. This suppresses a loss of accuracy in each subsequent process.
- Alternatively, a detection means such as a distance measuring sensor may be provided in the endoscope, and the imaging distance estimation unit 81 may estimate the imaging distance based on its detection result.
- In step S33, the feature range identification unit 51 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different ranges. This process is the same as step S3 described in the first embodiment.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the imaging distance, and detects the specific region (step S34).
- Finally, the calculation unit 7 outputs the detection result of the specific region (step S35). The image processing apparatus 6 then ends the series of processes. Note that the order of the imaging distance estimation process in step S32 and the specific region identification index calculation process in step S33 may be reversed, or the two may be performed in parallel.
- According to the second embodiment described above, in the change of the image caused by the difference in imaging distance to the tube inner wall, the specific region identification index based on the more effective feature amount can be emphasized, and the specific region can be detected with high accuracy.
- FIG. 17 is a block diagram illustrating a functional configuration of the image processing apparatus according to the modified example 2-1 of the second embodiment.
- In the image processing apparatus 6A shown in FIG. 17, components having the same functions as those of the image processing apparatus 6 shown in FIG. 14 are denoted by the same reference numerals.
- The image processing apparatus 6A includes a calculation unit 7A and a storage unit 3.
- The calculation unit 7A includes an in-tube imaging situation analysis unit 8A and a specific region detection unit 5.
- The in-tube imaging situation analysis unit 8A includes a defocus analysis unit 82 that analyzes defocus in the in-tube image.
- The defocus analysis unit 82 includes a specular reflection exclusion unit 821 and a spatial frequency analysis unit 822.
- The specular reflection exclusion unit 821 identifies and excludes specular reflections in the in-tube image based on, for example, the method disclosed in Japanese Patent Application Laid-Open No. 2012-11137.
- The spatial frequency analysis unit 822 applies a known two-dimensional Fourier transform (reference: CG-ARTS Association: Digital Image Processing: 128P, two-dimensional Fourier transform) to a predetermined component (for example, the G component) of the in-tube image to obtain a two-dimensional Fourier spectrum, and then calculates its radial distribution.
- The radial distribution is obtained by summing the spectrum within an annular region whose distance from the center, which represents the low-frequency components, lies in a predetermined range, while varying that distance. In this distribution, portions at a small distance correspond to low-frequency components of the in-tube image, and portions at a large distance correspond to high-frequency components.
- An image with few high-frequency components has a large degree of defocus.
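- A sketch of the radial distribution analysis, assuming the G channel as the analyzed component: the two-dimensional Fourier spectrum is shifted so that the low frequencies lie at the center, summed over annuli of increasing radius, and a small high-frequency share is read as strong defocus; the band split and the example blur are illustrative choices.

```python
import numpy as np
import cv2

def radial_spectrum(g_channel, n_bins=32):
    """Sum of the 2-D Fourier amplitude spectrum over annuli of increasing distance
    from the (low-frequency) center of the shifted spectrum."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(g_channel.astype(np.float64))))
    h, w = spec.shape
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(yy - h / 2.0, xx - w / 2.0)
    edges = np.linspace(0.0, dist.max() + 1e-9, n_bins + 1)
    return np.array([spec[(dist >= edges[i]) & (dist < edges[i + 1])].sum()
                     for i in range(n_bins)])

def defocus_degree(g_channel, high_freq_from=0.5):
    """Larger value = fewer high-frequency components = stronger defocus."""
    radial = radial_spectrum(g_channel)
    start = int(len(radial) * high_freq_from)
    return 1.0 - radial[start:].sum() / radial.sum()

rng = np.random.default_rng(2)
sharp = (rng.random((128, 128)) * 255).astype(np.uint8)
blurred = cv2.GaussianBlur(sharp, (15, 15), 5)
print(defocus_degree(sharp) < defocus_degree(blurred))  # True: blur removes high frequencies
```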
- FIG. 18 is a flowchart showing an outline of processing performed by the image processing apparatus 6A.
- In FIG. 18, the same step numbers are assigned to the same processes as those in the flowchart shown in FIG. 15. The processing following step S31 is described below.
- In step S32A, the defocus analysis unit 82 analyzes the defocus state of the in-tube image.
- FIG. 19 is a flowchart illustrating an outline of processing performed by the defocus analysis unit 82.
- First, the specular reflection exclusion unit 821 identifies and excludes specular reflections in the in-tube image (step S51).
- Next, the spatial frequency analysis unit 822 applies the two-dimensional Fourier transform to a predetermined component of the in-tube image and then calculates the radial distribution of the resulting two-dimensional Fourier spectrum (step S52).
- Finally, the defocus analysis unit 82 analyzes the defocus state based on the radial distribution of the two-dimensional Fourier spectrum (step S53). Specifically, the defocus analysis unit 82 determines that the degree of defocus is larger as the in-tube image contains fewer high-frequency components (portions at larger distances in the radial distribution).
- In step S33, the feature range identification unit 51 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different ranges.
- Next, the integrated determination unit 52 performs the integrated determination while changing which specific region identification index is emphasized according to the degree of defocus, and detects the specific region (step S34A).
- When the degree of defocus is large, the subject appears spread over a wider range of the image than when the degree of defocus is small. The integrated determination unit 52 therefore performs the integrated determination with emphasis on the specific region identification index based on the feature amount whose feature amount calculation range is larger as the degree of defocus increases.
- Finally, the calculation unit 7A outputs the detection result of the specific region (step S35). The image processing apparatus 6A then ends the series of processes.
- Note that the order of the defocus state analysis process in step S32A and the specific region identification index calculation process in step S33 may be reversed, or the two may be performed in parallel.
- Note that the in-tube imaging situation analysis unit 8A may further include the imaging distance estimation unit 81 described in the second embodiment. In this case, the integrated determination unit 52 performs the integrated determination according to both the imaging distance and the degree of defocus.
- FIG. 20 is a block diagram illustrating a functional configuration of an image processing apparatus according to Modification 2-2 of Embodiment 2.
- In FIG. 20, components similar to those of the image processing apparatus 6A shown in FIG. 17 are denoted by the same reference numerals as in FIG. 17.
- The image processing apparatus 6B includes a calculation unit 7B and a storage unit 3.
- The calculation unit 7B includes an in-tube imaging situation analysis unit 8B and a specific region detection unit 5.
- The in-tube imaging situation analysis unit 8B includes a motion blur analysis unit 83 that analyzes motion blur in the in-tube image.
- The motion blur analysis unit 83 includes a specular reflection exclusion unit 821 and a spatial frequency analysis unit 831.
- The spatial frequency analysis unit 831 calculates the angular distribution and radial distribution of the two-dimensional Fourier spectrum. Specifically, the spatial frequency analysis unit 831 applies the two-dimensional Fourier transform to a predetermined component (for example, the G component) of the in-tube image to obtain a two-dimensional Fourier spectrum, and then obtains the angular distribution by summing the spectrum within a fan-shaped region whose angle with respect to a horizontal line through the center, which represents the low-frequency components, lies in a predetermined range, while varying that angle.
- The spatial frequency analysis unit 831 also calculates the radial distribution within each fan-shaped region whose angle lies in the predetermined range, using the same method as described in Modification 2-1.
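- A sketch of the angular distribution used for the motion blur analysis, again assuming the G channel: the spectrum is summed over fan-shaped sectors measured from the horizontal line through the center, and a sector with a markedly large sum indicates the blur direction, since the spectrum of a motion-blurred image concentrates perpendicular to the blur; the sector count and the exclusion of the lowest frequencies are illustrative choices.

```python
import numpy as np

def angular_spectrum(g_channel, n_sectors=18, exclude_radius=3):
    """Sum of the 2-D Fourier amplitude spectrum over fan-shaped sectors, with the
    angle measured from the horizontal line through the (low-frequency) center."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(g_channel.astype(np.float64))))
    h, w = spec.shape
    yy, xx = np.mgrid[0:h, 0:w]
    dy, dx = yy - h / 2.0, xx - w / 2.0
    dist = np.hypot(dy, dx)
    angle = np.mod(np.degrees(np.arctan2(dy, dx)), 180.0)
    valid = dist > exclude_radius  # drop the DC / lowest frequencies at the center
    edges = np.linspace(0.0, 180.0, n_sectors + 1)
    return np.array([spec[valid & (angle >= edges[i]) & (angle < edges[i + 1])].sum()
                     for i in range(n_sectors)])

# Horizontal motion blur suppresses high horizontal frequencies, so the spectrum
# energy concentrates in the sectors near 90 degrees (perpendicular to the blur).
rng = np.random.default_rng(3)
img = rng.random((128, 128))
kernel = np.ones(9) / 9.0
blurred = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, img)
sectors = angular_spectrum(blurred)
peak_deg = (np.argmax(sectors) + 0.5) * 180.0 / len(sectors)
print(round(peak_deg, 1))  # close to 90: an estimate of the blur-perpendicular direction
```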
- FIG. 21 is a flowchart showing an outline of processing performed by the image processing apparatus 6B.
- In FIG. 21, the same step numbers are assigned to the same processes as those in the flowchart shown in FIG. 15. The processing following step S31 is described below.
- In step S32B, the motion blur analysis unit 83 analyzes the motion blur state of the in-tube image.
- FIG. 22 is a flowchart illustrating an outline of processing performed by the motion blur analysis unit 83.
- First, the specular reflection exclusion unit 821 excludes specular reflections in the in-tube image (step S61).
- Next, the spatial frequency analysis unit 831 calculates the angular distribution and radial distribution of the two-dimensional Fourier spectrum (step S62).
- Finally, the motion blur analysis unit 83 analyzes the motion blur state based on the angular distribution and radial distribution (step S63). Specifically, the motion blur analysis unit 83 analyzes the direction of motion blur from the angular distribution, and analyzes the motion blur state from the radial distribution of the region narrowed down by angle according to that result. For example, when motion blur occurs in a roughly constant direction, the spectrum is relatively concentrated in the angular range corresponding to that direction; in this case, the motion blur analysis unit 83 analyzes the motion blur state based on the radial distribution near the region where the spectrum is relatively concentrated.
- The motion blur analysis unit 83 may also analyze motion blur for each of the R, G, and B wavelength components, or analyze motion blur between the wavelength components. To analyze motion blur between wavelength components, a composite image may be generated by summing the wavelength component images, and the spatial frequency analysis described above may be applied to this composite image.
- the feature range identification unit 51 calculates a plurality of specific region identification indexes based on the feature amounts calculated from a plurality of regions having different ranges (step S33).
- The integration determination unit 52 then performs the integrated determination by changing the specific area identification index to be emphasized according to the degree of motion blur, and detects the specific area (step S34B).
- When the degree of motion blur is large, the subject image spreads over a wider range than when the degree of motion blur is small.
- Accordingly, the larger the degree of motion blur, the more the integrated determination unit 52 emphasizes, in the integrated determination, the specific region identification index based on a feature amount having a large feature amount calculation range.
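- A minimal sketch of this weighting idea follows; the linear weight rule and the 0.5 detection threshold are illustrative assumptions, not the integration rule actually used by the integrated determination unit 52.

```python
# Illustrative sketch only: weight indexes from larger calculation ranges more as blur grows.
def integrate_indexes(indexes_by_range, blur_degree, threshold=0.5):
    """indexes_by_range: {calculation_range_in_pixels: identification_index in [0, 1]}."""
    ranges = sorted(indexes_by_range)
    max_range = ranges[-1]
    # Weight grows with the calculation range; the growth is steeper when blur is strong.
    weights = {r: 1.0 + blur_degree * (r / max_range) for r in ranges}
    total = sum(weights.values())
    score = sum(weights[r] * indexes_by_range[r] for r in ranges) / total
    return score, score >= threshold        # integrated index and detection decision

# Example: indexes computed from 16-, 32- and 64-pixel calculation ranges under strong blur
score, detected = integrate_indexes({16: 0.40, 32: 0.55, 64: 0.80}, blur_degree=0.8)
```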
- the calculation unit 7B outputs the detection result of the specific area (step S35).
- the image processing device 6B ends the series of processes. Note that the order of the motion blur state analysis process in step S32B and the specific area identification index calculation process in step S33 may be reversed or may be performed in parallel.
- As a result, the specific area identification index based on the more effective feature amount can be emphasized, and the specific area can be detected with high accuracy.
- Note that the process of excluding specular reflection does not necessarily have to be performed.
- The in-tube photographing state analysis unit 8B may further include the photographing distance estimation unit 81 described in Embodiment 2 and/or the defocus analysis unit 82 described in Modification 2-1.
- In that case, the integrated determination unit 52 performs the integrated determination according to the shooting distance and/or the degree of defocus as well as the degree of motion blur.
- FIG. 23 is a block diagram illustrating a functional configuration of an image processing apparatus according to Modification 2-3 of Embodiment 2.
- In the image processing device 6C shown in the figure, components similar to those of the image processing device 1C shown in FIG. 12 and the image processing device 6 shown in FIG. 14 are denoted by the same reference signs as in FIGS. 12 and 14.
- the image processing apparatus 6C includes a calculation unit 7C and a storage unit 3.
- the calculation unit 7C includes an in-pipe photographing state analysis unit 8 and a specific area detection unit 5C.
- The specific area detection unit 5C includes a feature type identification unit 54 and an integrated determination unit 52.
- The integration determination unit 52 performs the integrated determination by changing the specific area identification index to be emphasized according to the shooting distance, and detects the specific area.
- For example, when the shooting distance is smaller than a predetermined threshold, that is, when the shooting distance is close, the integrated determination unit 52 performs the integrated determination with an emphasis on the specific region identification index based on the texture feature amount or the contour feature amount.
- On the other hand, when the shooting distance is equal to or greater than the predetermined threshold, that is, when the shooting distance is far, the integrated determination unit 52 makes the determination with emphasis on the specific region identification index based on the color feature amount or the pixel value surface shape feature amount.
- Alternatively, the integrated determination unit 52 may make the determination without emphasizing the specific region identification index based on the color feature amount.
- FIG. 24 is a flowchart showing an outline of processing performed by the image processing apparatus 6C.
- In FIG. 24, the same step numbers are assigned to the same processes as those in the flowchart described earlier. Hereinafter, the processing following step S32 will be described.
- The feature type identification unit 54 calculates a plurality of specific area identification indexes based on different types of feature amounts (step S33C).
- the integration determination unit 52 performs integration determination by changing the specific area identification index to be emphasized according to the distance of the shooting distance, and detects the specific area (step S34).
- the calculation unit 7C outputs the detection result of the specific area (step S35).
- the image processing apparatus 6C ends the series of processes. Note that the order of the shooting distance estimation process in step S32 and the specific area identification index calculation process in step S33C may be reversed or may be performed in parallel.
- Note that, instead of the in-tube shooting state analysis unit 8, an in-tube shooting state analysis unit 8A having the defocus analysis unit 82 or an in-tube shooting state analysis unit 8B having the motion blur analysis unit 83 may be provided.
- In the former case, the integration determination unit 52 performs the integrated determination by changing the specific area identification index to be emphasized according to the degree of defocus.
- For example, when the degree of defocus is small, the integrated determination unit 52 performs the integrated determination with an emphasis on the specific region identification index based on the texture feature amount or the contour feature amount.
- When the degree of defocus is large, the integrated determination unit 52 performs the integrated determination with an emphasis on the specific area identification index based on the color feature amount or the pixel value surface shape feature amount.
- In the latter case, the integration determination unit 52 performs the integrated determination by changing the specific area identification index to be emphasized according to the degree of motion blur. For example, when the degree of motion blur is small, the texture and contour of the surface of the specific area appear clearly, so the integrated determination unit 52 performs the integrated determination with an emphasis on the specific area identification index based on the texture feature amount or the contour feature amount.
- On the other hand, when the degree of motion blur is large, the integrated determination unit 52 makes the determination with emphasis on the specific area identification index based on the color feature amount or the pixel value surface shape feature amount.
- Alternatively, the integrated determination unit 52 may make the determination without emphasizing the specific area identification index based on the color feature amount.
- The in-tube shooting state analysis unit 8 may also include any two or all of the shooting distance estimation unit 81, the defocus analysis unit 82, and the motion blur analysis unit 83.
- FIG. 25 is a block diagram illustrating a functional configuration of an image processing apparatus according to Modification 2-4 of Embodiment 2.
- In the image processing device 6D shown in the figure, components similar to those of the image processing device 1B shown in FIG. 9 and the image processing device 6B shown in FIG. 20 are denoted by the same reference signs as in FIGS. 9 and 20.
- the image processing device 6D includes a calculation unit 7D and a storage unit 3.
- the calculation unit 7D includes an in-pipe photographing state analysis unit 8B and a specific area detection unit 5B.
- FIG. 26 is a flowchart showing an outline of processing performed by the image processing apparatus 6D.
- In FIG. 26, the same step numbers are assigned to the same processes as those in the flowchart described earlier. Hereinafter, the processing following step S32B will be described.
- The shape/direction identification unit 53 calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different shapes and/or directions (step S33D).
- For this calculation, the shape/direction identification unit 53 obtains the feature amounts from a plurality of regions using, for example, the feature amount calculation region group 202 described above.
- the integration determination unit 52 performs integration determination by changing the specific area identification index to be emphasized according to the direction and degree of motion blur, and detects the specific area (step S34D).
- For example, the integration determination unit 52 performs the integrated determination with an emphasis on the specific region identification index based on the feature amount calculated from a feature amount calculation region whose shape is elongated in the direction of motion blur by an amount corresponding to the degree of motion blur.
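- The sketch below shows one hypothetical way to build such an elongated feature amount calculation region: an elliptical mask whose long axis follows the blur direction and whose elongation grows with the blur degree; the base radius and maximum stretch factor are illustrative assumptions.

```python
# Illustrative sketch only: elliptical calculation region stretched along the blur direction.
import numpy as np

def elongated_region_mask(shape, center, blur_direction, blur_degree,
                          base_radius=16, max_stretch=3.0):
    """Boolean mask of an ellipse whose long axis is aligned with blur_direction (radians)."""
    h, w = shape
    cy, cx = center
    stretch = 1.0 + blur_degree * (max_stretch - 1.0)        # 1x for no blur .. max_stretch x
    y, x = np.indices((h, w))
    dy, dx = y - cy, x - cx
    # Rotate coordinates so that u runs along the blur direction and v across it
    u = dx * np.cos(blur_direction) + dy * np.sin(blur_direction)
    v = -dx * np.sin(blur_direction) + dy * np.cos(blur_direction)
    return (u / (base_radius * stretch)) ** 2 + (v / base_radius) ** 2 <= 1.0

# Example: a region around pixel (120, 200), blur along 30 degrees with degree 0.7
mask = elongated_region_mask((480, 640), (120, 200), np.deg2rad(30), 0.7)
```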
- the calculation unit 7D outputs the detection result of the specific area (step S35).
- the image processing apparatus 6D ends a series of processes. Note that the order of the motion blur state analysis process in step S32B and the specific area identification index calculation process in step S33D may be reversed or performed in parallel.
- FIG. 27 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 3 of the present invention.
- The image processing apparatus 9 shown in the figure includes a calculation unit 10 and a storage unit 11.
- the calculation unit 10 includes an in-pipe photographing state analysis unit 12 and a specific area detection unit 13.
- the storage unit 11 includes a parameter storage unit 111.
- the in-pipe shooting state analysis unit 12 may be any one of a plurality of in-pipe shooting state analysis units described in the first and second embodiments, or may be configured by appropriately combining them.
- The specific area detection unit 13 includes a discriminator-specific identification unit 131 that calculates a plurality of specific area identification indexes based on discriminators having different identification parameters, and an integrated determination unit 132 that detects the specific area by performing integrated determination with emphasis on the specific area identification index based on the discriminator whose identification parameters were created from teacher data obtained in an in-tube imaging situation equivalent to the analyzed in-tube imaging situation.
- The discriminator-specific identification unit 131 first sets feature amount calculation regions at arbitrary positions in the image. Subsequently, the discriminator-specific identification unit 131 calculates a feature amount from each feature amount calculation region and, based on these feature amounts, calculates a plurality of specific region identification indexes using discriminators having different identification parameters.
- Examples of the identification parameters include an identification boundary in the feature space, a distribution model corresponding to the in-tube photographing situation, an identification function, and a representative pattern (template).
- Here, "equivalent" means that the analysis results of the in-tube imaging situation analysis unit 12 (the presence and direction of the tube deep region, the magnitude and direction of the inner wall gradient, the shooting distance, the presence or absence of defocus, the presence or absence of motion blur, and so on) are substantially the same within a predetermined allowable error. In practice, a plurality of images obtained in advance are analyzed in the same manner as the in-tube imaging situation analysis (this analysis may be automatic processing or manual work), the images are grouped according to the analysis results, identification parameters corresponding to each analysis result are created based on the teacher data of each group, and a plurality of specific area identification indexes are calculated using these parameters.
- For example, for a specific region appearing in an image in which the tube inner wall is photographed from an oblique direction, an identification parameter created based on teacher data of specific regions appearing in images obtained by photographing the tube inner wall from an oblique direction is suitable.
- Likewise, for a specific region appearing in an image in which the tube inner wall is photographed from the front, an identification parameter created based on teacher data of specific regions appearing in images obtained by photographing the tube inner wall from the front is suitable.
- More generally, identification parameters created based on teacher data obtained in an equivalent in-tube shooting situation are suitable.
- That is, the discriminator-specific identification unit 131 creates in advance a plurality of identification parameters corresponding to differences in the in-tube photographing situation, and calculates a specific area identification index for each of these identification parameters by applying the feature amount (feature vector) obtained from each feature amount calculation region.
- the parameter storage unit 111 included in the storage unit 11 stores identification parameters created based on a plurality of teacher data respectively corresponding to a plurality of in-tube shooting situations in association with the in-tube shooting situations.
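- The sketch below illustrates this storage idea with a simple representative-pattern (template) discriminator: identification parameters are created from teacher data grouped by in-tube shooting situation, stored keyed by that situation, and the index is computed with the parameters of the equivalent situation. The situation keys and the template-distance classifier are illustrative stand-ins, not the discriminators actually used.

```python
# Illustrative sketch only: per-situation identification parameters (here: class templates).
import numpy as np

class ParameterStore:
    def __init__(self):
        self.templates = {}   # situation key -> (normal template, specific-area template)

    def train(self, situation, features, labels):
        """Create identification parameters from teacher data of one shooting situation."""
        features, labels = np.asarray(features, float), np.asarray(labels)
        self.templates[situation] = (features[labels == 0].mean(axis=0),
                                     features[labels == 1].mean(axis=0))

    def index_for(self, situation, feature_vector):
        """Identification index computed with the parameters of the equivalent situation."""
        normal, specific = self.templates[situation]
        d_normal = np.linalg.norm(feature_vector - normal)
        d_specific = np.linalg.norm(feature_vector - specific)
        return d_normal / (d_normal + d_specific + 1e-12)   # closer to 'specific' -> higher

# Example: store parameters for two hypothetical situations and query one of them
store = ParameterStore()
store.train("deep_region_present", [[0.2, 0.1], [0.8, 0.9]], [0, 1])
store.train("frontal_view",        [[0.3, 0.2], [0.7, 0.6]], [0, 1])
index = store.index_for("frontal_view", np.array([0.65, 0.6]))
```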
- The identification parameters may instead be stored in an external device, and the integrated determination unit 132 may acquire them from the external device.
- FIG. 28 is a flowchart showing an outline of processing executed by the image processing apparatus 9.
- First, the calculation unit 10 acquires an in-tube image to be processed (step S71).
- the in-pipe photographing state analysis unit 12 analyzes the photographing state of the in-pipe image (step S72).
- The discriminator-specific identification unit 131 calculates a plurality of specific area identification indexes based on discriminators having different identification parameters (step S73).
- The integrated determination unit 132 then extracts from the parameter storage unit 111 the identification parameters created based on teacher data in an in-tube shooting situation equivalent to the analyzed in-tube shooting situation, performs the integrated determination with emphasis on the specific region identification index based on the discriminator of those identification parameters, and detects the specific area (step S74).
- Finally, the calculation unit 10 outputs the specific area detection result (step S75).
- the image processing apparatus 9 ends the series of processes. Note that the order of the in-pipe photographing state analysis process in step S72 and the specific area identification index calculation process in step S73 may be reversed or may be performed in parallel.
- According to the third embodiment of the present invention, a plurality of specific area identification indexes based on discriminators with different identification parameters are integrated and determined according to the in-tube imaging situation. Therefore, regardless of changes in the image caused by differences in the in-tube imaging situation, the specific area identification index based on a more effective discriminator can be emphasized, and the specific area can be detected with high accuracy.
- the integration determination unit 132 may further perform the integration determination described in the first and second embodiments.
- FIG. 29 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 4 of the present invention.
- In the image processing apparatus 14 shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference signs as in FIG. 3.
- the image processing apparatus 14 includes a calculation unit 15 and a storage unit 3.
- the computing unit 15 includes an area dividing unit 16, an in-tube photographing state analyzing unit 17, and a specific region detecting unit 18.
- The region dividing unit 16 divides the in-tube image into regions.
- Examples of the region division method include division into rectangles of a predetermined size and region division based on edges (see Japanese Patent Application Laid-Open No. 2012-238041). When rectangular division is performed, the manner of division may be chosen as appropriate.
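- A minimal sketch of such rectangular division follows; the 64-pixel block size and the optional overlapping step are illustrative assumptions rather than values from the cited publication.

```python
# Illustrative sketch only: divide an image into fixed-size rectangles (optionally overlapping).
def divide_into_rectangles(height, width, block=64, step=None):
    """Yield (y0, y1, x0, x1) rectangles covering the image; step < block gives overlap."""
    step = step or block
    for y0 in range(0, height, step):
        for x0 in range(0, width, step):
            yield y0, min(y0 + block, height), x0, min(x0 + block, width)

regions = list(divide_into_rectangles(480, 640, block=64))   # 8 x 10 = 80 rectangles
```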
- the in-pipe shooting state analysis unit 17 may be any one of a plurality of in-pipe shooting state analysis units described in the first and second embodiments, or may be a combination of them.
- the specific area detection unit 18 may be any one of a plurality of specific area detection units described in the first to third embodiments, or may be a combination of them.
- FIG. 30 is a flowchart showing an outline of processing executed by the image processing apparatus 14.
- First, the calculation unit 15 acquires an in-tube image to be processed (step S81).
- the region dividing unit 16 divides the in-pipe image into regions (step S82).
- the in-pipe shooting state analysis unit 17 analyzes the in-pipe shooting state in each divided region (step S83).
- the specific area detection unit 18 calculates a plurality of specific area identification indexes in each divided area (step S84).
- the specific area detection unit 18 performs the integrated determination of the specific area identification index according to the in-tube photographing situation in each divided area, and detects the specific area (step S85).
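- The sketch below strings these per-region steps together (analysis of the shooting situation, calculation of several identification indexes, and integrated determination for each divided region); `analyze_situation`, `compute_indexes`, and `integrate` are hypothetical placeholders for the units described above.

```python
# Illustrative sketch only: per-region detection loop corresponding to steps S83-S85.
def detect_specific_regions(image, regions, analyze_situation, compute_indexes, integrate):
    detections = []
    for y0, y1, x0, x1 in regions:
        patch = image[y0:y1, x0:x1]
        situation = analyze_situation(patch)      # in-tube shooting situation of this region
        indexes = compute_indexes(patch)          # plural specific-area identification indexes
        score, is_specific = integrate(indexes, situation)
        if is_specific:
            detections.append(((y0, y1, x0, x1), score))
    return detections
```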
- Finally, the calculation unit 15 outputs the specific area detection result (step S86).
- the image processing apparatus 14 ends the series of processes. Note that the order of the in-pipe photographing state analysis process in step S83 and the specific area identification index calculation process in step S84 may be reversed or may be performed in parallel.
- Since the specific area is detected for each divided region in accordance with the in-tube shooting situation of that region, the specific area can be detected with high accuracy.
- In addition to endoscopic images of a living body, the present invention can also be applied to intraluminal images of a virtual endoscope generated by CT colonography and to in-tube images captured by an industrial endoscope.
- the present invention can include various embodiments not described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Radiology & Medical Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Signal Processing (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Endoscopes (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
FIG. 3 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 1 of the present invention. The image processing apparatus 1 shown in the figure includes a calculation unit 2 and a storage unit 3.
FIG. 6 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 1-1 of Embodiment 1. In the image processing apparatus 1A shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 1 are denoted by the same reference signs as in FIG. 1.
FIG. 9 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 1-2 of Embodiment 1. In the image processing apparatus 1B shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 1 are denoted by the same reference signs as in FIG. 1.
FIG. 12 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 1-3 of Embodiment 1. In the image processing apparatus 1C shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 1 are denoted by the same reference signs as in FIG. 1.
FIG. 14 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 2 of the present invention. In the image processing apparatus 6 shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference signs as in FIG. 3.
FIG. 17 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 2-1 of Embodiment 2. In the image processing apparatus 6A shown in the figure, components having the same functions as those of the image processing apparatus 6 shown in FIG. 14 are denoted by the same reference signs as in FIG. 14.
FIG. 20 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 2-2 of Embodiment 2. In the image processing apparatus 6B shown in the figure, components having the same functions as those of the image processing apparatus 6A shown in FIG. 17 are denoted by the same reference signs as in FIG. 17.
FIG. 23 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 2-3 of Embodiment 2. In the image processing apparatus 6C shown in the figure, components having the same functions as those of the image processing apparatus 1C shown in FIG. 12 and the image processing apparatus 6 shown in FIG. 14 are denoted by the same reference signs as in FIGS. 12 and 14.
FIG. 25 is a block diagram showing the functional configuration of the image processing apparatus according to Modification 2-4 of Embodiment 2. In the image processing apparatus 6D shown in the figure, components having the same functions as those of the image processing apparatus 1B shown in FIG. 9 and the image processing apparatus 6B shown in FIG. 20 are denoted by the same reference signs as in FIGS. 9 and 20.
FIG. 27 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 3 of the present invention. The image processing apparatus 9 shown in the figure includes a calculation unit 10 and a storage unit 11. The calculation unit 10 includes an in-tube imaging situation analysis unit 12 and a specific region detection unit 13. The storage unit 11 includes a parameter storage unit 111.
FIG. 29 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 4 of the present invention. In the image processing apparatus 14 shown in the figure, components having the same functions as those of the image processing apparatus 1 shown in FIG. 3 are denoted by the same reference signs as in FIG. 3.
The modes for carrying out the present invention have been described above, but the present invention should not be limited only to Embodiments 1 to 4 described above. For example, besides endoscopic images of a living body, the invention can also be applied to intraluminal images of a virtual endoscope generated by CT colonography and to in-tube images captured by an industrial endoscope.
2, 2A, 2B, 2C, 7, 7A, 7B, 7C, 7D, 10, 15 Calculation unit
3, 11 Storage unit
4, 4A, 8, 8A, 8B, 12, 17 In-tube imaging situation analysis unit
5, 5B, 5C, 13, 18 Specific region detection unit
16 Region division unit
41 Tube deep region detection unit
42 Inner wall gradient calculation unit
51 Feature-range-specific identification unit
52, 132 Integrated determination unit
53 Shape/direction-specific identification unit
54 Feature-type-specific identification unit
81 Shooting distance estimation unit
82 Defocus analysis unit
83 Motion blur analysis unit
111 Parameter storage unit
131 Discriminator-specific identification unit
201 In-tube image
202 Feature amount calculation region group
411, 811 Low-absorption wavelength component selection unit
412 Edge peripheral region exclusion unit
413 Low pixel value region detection unit
421 Pixel value gradient calculation unit
821 Specular reflection exclusion unit
822, 831 Spatial frequency analysis unit
Claims (23)
- 1. An image processing apparatus comprising: an in-tube imaging situation analysis unit that analyzes, for an in-tube image obtained by imaging the inside of a tube, an in-tube imaging situation determined by the relationship between a subject and the side that images the subject; and a specific region detection unit that calculates a plurality of specific region identification indexes for the in-tube image and detects a specific region by integrated determination of the specific region identification indexes according to the in-tube imaging situation.
- 2. The image processing apparatus according to claim 1, wherein the specific region detection unit includes: a feature-range-specific identification unit that calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different ranges; and an integrated determination unit that performs the integrated determination while changing, according to the in-tube imaging situation, which of the plurality of specific region identification indexes is emphasized.
- 3. The image processing apparatus according to claim 2, wherein the in-tube imaging situation analysis unit includes at least one of a tube deep region detection unit that detects a tube deep region in the in-tube image and an inner wall gradient calculation unit that calculates a gradient of the tube inner wall in the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to at least one of the presence or absence of the tube deep region and the gradient of the tube inner wall.
- 4. The image processing apparatus according to claim 3, wherein, when the tube deep region exists, the integrated determination unit places greater emphasis on the specific region identification index based on a feature amount having a larger calculation range than when the tube deep region does not exist.
- 5. The image processing apparatus according to claim 3, wherein, when the gradient of the tube inner wall is equal to or greater than a threshold, the integrated determination unit places greater emphasis on the specific region identification index based on a feature amount having a larger calculation range than when the gradient of the tube inner wall is less than the threshold.
- 6. The image processing apparatus according to claim 2, wherein the in-tube imaging situation analysis unit includes at least one of a shooting distance estimation unit that estimates a shooting distance to the tube inner wall in the in-tube image, a defocus analysis unit that analyzes a defocus state of the in-tube image, and a motion blur analysis unit that analyzes a motion blur state of the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to at least one of the shooting distance, the defocus state, and the motion blur state.
- 7. The image processing apparatus according to claim 6, wherein the integrated determination unit places greater emphasis on the specific region identification index based on a feature amount having a larger calculation range as the shooting distance is larger, as the degree of defocus is larger, or as the degree of motion blur is larger.
- 8. The image processing apparatus according to claim 1, wherein the specific region detection unit includes: a shape/direction-specific identification unit that calculates a plurality of specific region identification indexes based on feature amounts calculated from a plurality of regions having different shapes and/or directions; and an integrated determination unit that performs the integrated determination while changing, according to the in-tube imaging situation, which of the plurality of specific region identification indexes is emphasized.
- 9. The image processing apparatus according to claim 8, wherein the in-tube imaging situation analysis unit includes at least one of a tube deep region detection unit that detects a tube deep region from the in-tube image and an inner wall gradient calculation unit that calculates a gradient of the tube inner wall in the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to at least one of the direction of the tube deep region and the direction of the gradient of the tube inner wall.
- 10. The image processing apparatus according to claim 9, wherein the integrated determination unit emphasizes the specific region identification index based on a feature amount whose calculation region shape is elongated in a direction orthogonal to the direction of the tube deep region or the direction of the gradient of the tube inner wall.
- 11. The image processing apparatus according to claim 8, wherein the in-tube imaging situation analysis unit includes a motion blur analysis unit that analyzes a motion blur state of the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to the motion blur state.
- 12. The image processing apparatus according to claim 1, wherein the specific region detection unit includes: a feature-type-specific identification unit that calculates a plurality of specific region identification indexes based on feature amounts of different types, each classified as one of color, contour, pixel value surface shape, and texture; and an integrated determination unit that performs the integrated determination while changing, according to the in-tube imaging situation, which of the plurality of specific region identification indexes is emphasized.
- 13. The image processing apparatus according to claim 12, wherein the in-tube imaging situation analysis unit includes at least one of a tube deep region detection unit that detects a tube deep region in the in-tube image and an inner wall gradient calculation unit that calculates a gradient of the tube inner wall in the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to either the presence or absence of the tube deep region or the gradient of the tube inner wall.
- 14. The image processing apparatus according to claim 13, wherein, when the tube deep region exists or when the gradient of the tube inner wall is equal to or greater than a predetermined value, the integrated determination unit emphasizes the specific region identification index based on a contour feature amount.
- 15. The image processing apparatus according to claim 13, wherein, when the tube deep region does not exist or when the gradient of the tube inner wall is less than a predetermined value, the integrated determination unit emphasizes the specific region identification index based on a pixel value surface shape feature amount or a texture feature amount.
- 16. The image processing apparatus according to claim 12, wherein the in-tube imaging situation analysis unit includes at least one of a shooting distance estimation unit that estimates a shooting distance to the tube inner wall in the in-tube image, a defocus analysis unit that analyzes a defocus state of the in-tube image, and a motion blur analysis unit that analyzes a motion blur state of the in-tube image, and the integrated determination unit performs the integrated determination while changing the emphasized specific region identification index according to at least one of the shooting distance, the defocus state, and the motion blur state.
- 17. The image processing apparatus according to claim 16, wherein, when the shooting distance is large, when the degree of defocus is equal to or greater than a predetermined degree, or when the degree of motion blur is equal to or greater than a predetermined degree, the integrated determination unit emphasizes the specific region identification index based on a color feature amount or a pixel value surface shape feature amount.
- 18. The image processing apparatus according to claim 16, wherein, when the shooting distance is small, when the degree of defocus is smaller than a predetermined degree, or when the degree of motion blur is smaller than a predetermined degree, the integrated determination unit emphasizes the specific region identification index based on a texture feature amount.
- 19. The image processing apparatus according to claim 1, wherein the specific region detection unit includes: a discriminator-specific identification unit that calculates a plurality of specific region identification indexes based on a plurality of different discriminators; and an integrated determination unit that performs the integrated determination while changing, according to the in-tube imaging situation, which of the plurality of specific region identification indexes is emphasized.
- 20. The image processing apparatus according to claim 19, wherein the integrated determination unit emphasizes the specific region identification index based on a discriminator whose identification parameters were created based on teacher data in an in-tube imaging situation equivalent to the analyzed in-tube imaging situation.
- 21. The image processing apparatus according to claim 1, further comprising a region division unit that divides the in-tube image into regions, wherein the integrated determination unit detects a specific region for each of the regions divided by the region division unit, by integrated determination of the specific region identification indexes according to the in-tube imaging situation.
- 22. An image processing method comprising: an in-tube imaging situation analysis step of analyzing, for an in-tube image obtained by imaging the inside of a tube, an in-tube imaging situation determined by the relationship between a subject and the side that images the subject; and a specific region detection step of calculating a plurality of specific region identification indexes for the in-tube image and detecting a specific region by integrated determination of the specific region identification indexes according to the in-tube imaging situation.
- 23. An image processing program that causes a computer to execute: an in-tube imaging situation analysis step of analyzing, for an in-tube image obtained by imaging the inside of a tube, an in-tube imaging situation determined by the relationship between a subject and the side that images the subject; and a specific region detection step of calculating a plurality of specific region identification indexes for the in-tube image and detecting a specific region by integrated determination of the specific region identification indexes according to the in-tube imaging situation.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/062428 WO2016170656A1 (ja) | 2015-04-23 | 2015-04-23 | 画像処理装置、画像処理方法および画像処理プログラム |
DE112015006378.1T DE112015006378T5 (de) | 2015-04-23 | 2015-04-23 | Bildverarbeitungsvorrichtung, Bildverarbeitungsverfahren und Bildverarbeitungsprogramm |
CN201580079117.6A CN107529963B (zh) | 2015-04-23 | 2015-04-23 | 图像处理装置、图像处理方法和存储介质 |
JP2017513916A JP6598850B2 (ja) | 2015-04-23 | 2015-04-23 | 画像処理装置、画像処理方法および画像処理プログラム |
US15/787,759 US10540765B2 (en) | 2015-04-23 | 2017-10-19 | Image processing device, image processing method, and computer program product thereon |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/062428 WO2016170656A1 (ja) | 2015-04-23 | 2015-04-23 | 画像処理装置、画像処理方法および画像処理プログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/787,759 Continuation US10540765B2 (en) | 2015-04-23 | 2017-10-19 | Image processing device, image processing method, and computer program product thereon |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016170656A1 true WO2016170656A1 (ja) | 2016-10-27 |
Family
ID=57142983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062428 WO2016170656A1 (ja) | 2015-04-23 | 2015-04-23 | 画像処理装置、画像処理方法および画像処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10540765B2 (ja) |
JP (1) | JP6598850B2 (ja) |
CN (1) | CN107529963B (ja) |
DE (1) | DE112015006378T5 (ja) |
WO (1) | WO2016170656A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018180206A1 (ja) * | 2017-03-30 | 2018-10-04 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
EP3590412A4 (en) * | 2017-03-03 | 2020-06-03 | Fujifilm Corporation | ENDOSCOPE SYSTEM, PROCESSOR DEVICE AND ENDOSCOPE SYSTEM OPERATING METHOD |
WO2021039441A1 (ja) * | 2019-08-23 | 2021-03-04 | ライトタッチテクノロジー株式会社 | 生体組織の識別方法、生体組織識別装置、および生体組織識別プログラム |
WO2021054419A1 (ja) * | 2019-09-20 | 2021-03-25 | 株式会社Micotoテクノロジー | 内視鏡画像処理システム及び内視鏡画像処理方法 |
WO2023144936A1 (ja) * | 2022-01-26 | 2023-08-03 | 日本電気株式会社 | 画像判定装置、画像判定方法、及び、記録媒体 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111064934A (zh) * | 2019-12-30 | 2020-04-24 | 元力(天津)科技有限公司 | 一种医用影像处理系统及方法 |
CN111709912A (zh) * | 2020-05-18 | 2020-09-25 | 北京配天技术有限公司 | 一种圆弧边缘检测方法、装置及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012073953A (ja) * | 2010-09-29 | 2012-04-12 | Olympus Corp | 画像処理装置、画像処理方法、および画像処理プログラム |
JP2013039344A (ja) * | 2011-08-15 | 2013-02-28 | Toshiba Corp | 医用画像処理装置、医用画像処理方法および異常検出プログラム |
WO2013180147A1 (ja) * | 2012-05-31 | 2013-12-05 | オリンパス株式会社 | 内視鏡装置 |
JP2014104293A (ja) * | 2012-11-29 | 2014-06-09 | Olympus Corp | 画像処理装置、画像処理方法、及び画像処理プログラム |
WO2014103237A1 (ja) * | 2012-12-25 | 2014-07-03 | 富士フイルム株式会社 | 画像処理装置および画像処理方法、並びに画像処理プログラム |
JP2014188223A (ja) * | 2013-03-27 | 2014-10-06 | Olympus Corp | 画像処理装置、内視鏡装置、プログラム及び画像処理方法 |
JP2015008782A (ja) * | 2013-06-27 | 2015-01-19 | オリンパス株式会社 | 画像処理装置、内視鏡装置及び画像処理方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4885388B2 (ja) | 2001-09-25 | 2012-02-29 | オリンパス株式会社 | 内視鏡挿入方向検出方法 |
JP4891637B2 (ja) | 2006-03-14 | 2012-03-07 | オリンパスメディカルシステムズ株式会社 | 画像解析装置 |
JP5658931B2 (ja) | 2010-07-05 | 2015-01-28 | オリンパス株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
US9412054B1 (en) * | 2010-09-20 | 2016-08-09 | Given Imaging Ltd. | Device and method for determining a size of in-vivo objects |
JP5771442B2 (ja) | 2011-05-09 | 2015-08-26 | オリンパス株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
CN103747718B (zh) * | 2012-03-21 | 2016-03-30 | 奥林巴斯株式会社 | 图像处理装置 |
JP5980555B2 (ja) * | 2012-04-23 | 2016-08-31 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
JP6265588B2 (ja) * | 2012-06-12 | 2018-01-24 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
- 2015
- 2015-04-23 WO PCT/JP2015/062428 patent/WO2016170656A1/ja active Application Filing
- 2015-04-23 DE DE112015006378.1T patent/DE112015006378T5/de not_active Withdrawn
- 2015-04-23 CN CN201580079117.6A patent/CN107529963B/zh active Active
- 2015-04-23 JP JP2017513916A patent/JP6598850B2/ja active Active
- 2017
- 2017-10-19 US US15/787,759 patent/US10540765B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012073953A (ja) * | 2010-09-29 | 2012-04-12 | Olympus Corp | 画像処理装置、画像処理方法、および画像処理プログラム |
JP2013039344A (ja) * | 2011-08-15 | 2013-02-28 | Toshiba Corp | 医用画像処理装置、医用画像処理方法および異常検出プログラム |
WO2013180147A1 (ja) * | 2012-05-31 | 2013-12-05 | オリンパス株式会社 | 内視鏡装置 |
JP2014104293A (ja) * | 2012-11-29 | 2014-06-09 | Olympus Corp | 画像処理装置、画像処理方法、及び画像処理プログラム |
WO2014103237A1 (ja) * | 2012-12-25 | 2014-07-03 | 富士フイルム株式会社 | 画像処理装置および画像処理方法、並びに画像処理プログラム |
JP2014188223A (ja) * | 2013-03-27 | 2014-10-06 | Olympus Corp | 画像処理装置、内視鏡装置、プログラム及び画像処理方法 |
JP2015008782A (ja) * | 2013-06-27 | 2015-01-19 | オリンパス株式会社 | 画像処理装置、内視鏡装置及び画像処理方法 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3590412A4 (en) * | 2017-03-03 | 2020-06-03 | Fujifilm Corporation | ENDOSCOPE SYSTEM, PROCESSOR DEVICE AND ENDOSCOPE SYSTEM OPERATING METHOD |
US11259692B2 (en) | 2017-03-03 | 2022-03-01 | Fujifilm Corporation | Endoscope system, processor device, and method for operating endoscope system |
WO2018180206A1 (ja) * | 2017-03-30 | 2018-10-04 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
JPWO2018180206A1 (ja) * | 2017-03-30 | 2019-12-26 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
WO2021039441A1 (ja) * | 2019-08-23 | 2021-03-04 | ライトタッチテクノロジー株式会社 | 生体組織の識別方法、生体組織識別装置、および生体組織識別プログラム |
WO2021054419A1 (ja) * | 2019-09-20 | 2021-03-25 | 株式会社Micotoテクノロジー | 内視鏡画像処理システム及び内視鏡画像処理方法 |
WO2023144936A1 (ja) * | 2022-01-26 | 2023-08-03 | 日本電気株式会社 | 画像判定装置、画像判定方法、及び、記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
CN107529963A (zh) | 2018-01-02 |
US10540765B2 (en) | 2020-01-21 |
CN107529963B (zh) | 2020-06-12 |
DE112015006378T5 (de) | 2017-12-14 |
US20180040127A1 (en) | 2018-02-08 |
JPWO2016170656A1 (ja) | 2018-02-15 |
JP6598850B2 (ja) | 2019-10-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6598850B2 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
US11514270B2 (en) | Speckle contrast analysis using machine learning for visualizing flow | |
JP6265588B2 (ja) | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム | |
JP7188514B2 (ja) | 診断支援装置、及び診断支援装置における画像処理方法、並びにプログラム | |
US9324153B2 (en) | Depth measurement apparatus, image pickup apparatus, depth measurement method, and depth measurement program | |
US10004448B2 (en) | Osteoporosis diagnostic support apparatus | |
WO2014115371A1 (ja) | 画像処理装置、内視鏡装置、画像処理方法及び画像処理プログラム | |
JP6422198B2 (ja) | 画像処理装置、画像処理方法、及び画像処理プログラム | |
TW201131512A (en) | Distance evaluation methods and apparatuses, and machine readable medium thereof | |
JP6704933B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
WO2016194177A1 (ja) | 画像処理装置、内視鏡装置及び画像処理方法 | |
JP6603709B2 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
JP6664486B2 (ja) | 画像処理装置、画像処理装置の作動方法及び画像処理装置の作動プログラム | |
JP7228341B2 (ja) | 画像処理装置、画像処理方法、及び画像処理プログラム | |
CN112712499B (zh) | 一种物体检测方法、装置以及计算机可读存储介质 | |
JP2015084894A (ja) | 心胸郭比算出装置 | |
CN118229555B (zh) | 一种图像融合方法、装置、设备及计算机可读存储介质 | |
JP2009129221A (ja) | 画像のぼけ量測定装置、ぼけ量測定方法、および、ぼけ量測定プログラム | |
TWI590807B (zh) | Image processing device and image processing program | |
JP2013092884A (ja) | 画像処理装置、方法、及びプログラム | |
JP2003067741A (ja) | 画像処理装置及び画像処理方法 | |
Leroy et al. | An efficient method for monocular depth from defocus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15889895; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2017513916; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 112015006378; Country of ref document: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 15889895; Country of ref document: EP; Kind code of ref document: A1 |