WO2011033619A1 - Image processing apparatus, image processing method, image processing program, and storage medium - Google Patents
Image processing apparatus, image processing method, image processing program, and storage medium
- Publication number
- WO2011033619A1 (PCT/JP2009/066154)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- texture
- acquired
- base
- image processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
- G06T2207/20028—Bilateral filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20192—Edge enhancement; Edge preservation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Definitions
- the present invention relates to a technical field of an image processing apparatus, an image processing method, an image processing program, and a storage medium using, for example, a bilateral filter.
- Patent Document 1 (Japanese Patent Application Laid-Open No. H10-228561) and the like disclose a technique for making the rise and fall of the contour portion of an image steep, without causing overshoot or undershoot, by using an enlargement/reduction circuit.
- Non-Patent Document 1 and the like disclose a technique related to a bilateral filter as a non-linear filter capable of removing a noise component without blurring the outline of an image.
- Non-Patent Document 2 and the like disclose a technique related to a bilateral filter that makes the inclination of a pixel value in a spatial direction steep in the contour portion of an image.
- Non-Patent Document 3 and the like disclose a technique for image enlargement based on separating an image into a skeleton component and a texture component. This technique separates the input image into the two components and adopts an interpolation method suited to each, thereby preserving fine texture components and keeping the outline sharp without causing jaggies or ringing near the contour.
- the present invention has been made in view of, for example, the above-described conventional problems, and it is an object of the present invention to provide an image processing apparatus, an image processing method, an image processing program, and a storage medium capable of effectively suppressing the generation of noise components and more appropriately improving image quality.
- an image processing apparatus according to the present invention includes an acquisition unit that acquires a first image, an extraction unit that extracts a texture image from the acquired first image, a first enlargement unit that enlarges the extracted texture image, a second enlargement unit that enlarges the acquired first image, a base image acquisition unit that acquires a base image whose contour has been sharpened by applying a sharpening process to the contour of the enlarged first image, and a synthesis unit that synthesizes the enlarged texture image and the acquired base image.
- the first image is acquired by an acquisition unit configured to include a memory, a processor, and the like, for example.
- a texture image is extracted from the acquired first image by an extraction unit including a memory, a processor, and the like.
- the “first image” according to the present invention means, for example, an image such as a photograph, a frame image constituting a moving image, a color image, a monochrome image, or the like taken by a camera, a video camera, or the like.
- the “texture image” according to the present invention means an image composed of components in which the change in pixel value of each pixel relative to the surrounding pixels is small.
- a “texture image” means an image composed of pixels whose change in pixel value is minute.
- the “base image” according to the present invention means an image in which the texture component is almost or completely eliminated from the image.
- the “base image” is composed of a contour portion where the change in pixel value is large and a flat portion where the change in pixel value is uniform.
- the “pixel value” according to the present invention refers to an index indicating the level of the characteristic level of a pixel such as luminance, chromaticity, or saturation, for example, in units of pixels.
- the “extraction” according to the present invention typically means that only the texture image in an image is, directly or indirectly, “extracted”, “identified”, “selected”, “distinguished”, “recognized”, “sorted out”, or the like.
- the extracted texture image is enlarged by the first enlargement unit configured to include a memory, a processor, and the like.
- the acquired first image is enlarged by a second enlargement unit including a memory, a processor, and the like.
- a base image with a sharpened outline is obtained by performing a sharpening process on the outline of the enlarged first image, for example, by a base image acquisition means configured to include a memory, a processor, and the like.
- the “sharpening process” means an image process that sharpens the gradient of the change in the spatial direction of the pixel value of the contour of the image.
- the enlarged texture image and the acquired base image are synthesized by the synthesis means.
- the first image subjected to the enlargement process is subjected to the sharpening process using the bilateral filter or the trilateral filter.
- the bilateral filter or the trilateral filter also has a noise removal effect.
- See Non-Patent Document 1 for the bilateral filter and its noise removal effect. Since noise mainly consists of minute changes in an image, it resembles a texture component. For this reason, the texture component is also removed from the sharpened image. The base image obtained by the sharpening process therefore loses graininess and a sense of detail, and is visually uncomfortable.
- the texture image is extracted from the first image by subtracting, from the first image, the first image filtered using a bilateral filter or an ε filter.
- the extracted texture image is enlarged and then combined with the base image to obtain an output image.
- since the second enlargement unit is applied directly to the input image, image processing can be performed on an image that has not been degraded and from which no image information is missing, so that the contour portion becomes clearer, a visual effect that is very preferable in practice.
- the base image acquisition unit acquires the base image using a bilateral filter or a trilateral filter.
- the extraction unit subtracts, from the acquired first image, an image obtained by applying bilateral filtering or ε filtering to the acquired first image, and thereby extracts the texture image.
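As an illustrative sketch (not the claimed implementation), the subtraction-based texture extraction can be imitated in a few lines of Python; the toy `smooth_edge_preserving` function below stands in for the bilateral or ε filter and is an assumption of this example:

```python
import numpy as np

def smooth_edge_preserving(x, eps=8.0):
    # Toy edge-preserving smoother: 3-tap average that ignores
    # neighbors differing from the center by more than eps, so
    # large edge steps are never averaged across.
    x = np.asarray(x, dtype=float)
    out = x.copy()
    for n in range(1, len(x) - 1):
        nbrs = [x[n - 1], x[n], x[n + 1]]
        kept = [v for v in nbrs if abs(v - x[n]) <= eps]
        out[n] = sum(kept) / len(kept)
    return out

def extract_texture(image):
    # Texture = input minus its edge-preserving low-pass version,
    # so only small pixel-value fluctuations remain.
    return np.asarray(image, dtype=float) - smooth_edge_preserving(image)
```

The resulting texture signal contains only small values even across a large edge, which is exactly why enlarging it produces little jaggy or ringing.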
- the degree of occurrence of noise components such as jaggies and ringing is increased by the image enlargement process.
- noise components such as jaggies and ringing are likely to occur at positions where there is a large difference in pixel values between adjacent pixel groups, such as around the contour of an image, so-called edge periphery.
- for this reason, if the sharpening process is performed after the enlargement process on the acquired first image, noise components such as jaggies and ringing generated when enlarging the first image are further emphasized, and as a result of the image processing there is a technical problem that the degree of occurrence of noise components in the output image increases.
- the bilateral filter or trilateral filter in the base image acquisition unit can smooth and reduce noise components such as jaggies and ringing generated in the enlarged first image, by virtue of the noise removal action inherent in these filters.
- since the texture image is the difference between the first image and an image obtained by applying bilateral filtering or ε filtering to the first image, the differences in pixel values between the pixels constituting the texture image are very small. Thus, when the enlargement process by the first enlargement unit is performed on the texture image, the generation of noise components such as jaggies and ringing can be remarkably suppressed.
- the image processing apparatus may further include a correction unit that performs at least one of a first correction process, which corrects the enlarged texture image in accordance with the properties of the enlarged texture image, and a second correction process, which corrects the acquired base image in accordance with the properties of the acquired base image; the synthesis unit then combines the images subjected to the at least one correction process.
- the texture image and the base image can be distinguished and corrected by an appropriate method according to the properties of the texture image and the base image.
- the image quality of the output image can be further improved as a result of the image processing.
- the correction unit performs, as the first correction process, at least one of a three-dimensional noise reduction process, an isolated point removal process, a non-linear process, and a multiplication process to correct the enlarged texture image.
- as the first correction process, the correction means performs three-dimensional noise reduction (3DNR) processing, a filter process in the time axis direction, only on the texture image, which does not include the edge portions, i.e., the contours, of the image.
- accordingly, the 3DNR process does not affect the contour of the image.
- the generation of afterimages can be effectively reduced while removing random noise with 3DNR, which is very useful in practice.
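A minimal sketch of the idea that 3DNR reduces to temporal filtering of the texture component alone; the recursive-average form and the `alpha` parameter are assumptions of this illustration, not the patent's specification:

```python
import numpy as np

def temporal_nr(texture_frames, alpha=0.5):
    # 3DNR sketch: recursive (IIR) average along the time axis,
    # applied only to the texture component, so image edges
    # (which live in the base image) are untouched.
    acc = np.asarray(texture_frames[0], dtype=float)
    out = [acc.copy()]
    for frame in texture_frames[1:]:
        acc = alpha * np.asarray(frame, dtype=float) + (1 - alpha) * acc
        out.append(acc.copy())
    return out
```

Because only the texture channel passes through this filter, any motion blur or afterimage it might introduce cannot affect the contours carried by the base image.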
- the correction means performs isolated point removal processing only on the texture image as the first correction processing.
- the isolated point removal process is performed with little or no influence from pixel portions whose pixel values change greatly, such as the edge portions of an image. Therefore, noise detection accuracy can be improved and noise removed effectively, which is very useful in practice.
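For illustration only, isolated point removal restricted to the texture signal might look like the following 1-D sketch, where the two-sided threshold test and the median replacement are assumed details, not the patent's specification:

```python
import numpy as np

def remove_isolated_points(texture, thresh=20.0):
    # Isolated-point removal sketch: a texture sample whose value
    # far exceeds both of its neighbors is treated as impulse noise
    # and replaced by the local median.  Because this runs on the
    # texture image, no image edge can trigger the test.
    t = np.asarray(texture, dtype=float)
    out = t.copy()
    for n in range(1, len(t) - 1):
        if abs(t[n] - t[n - 1]) > thresh and abs(t[n] - t[n + 1]) > thresh:
            out[n] = np.median(t[n - 1:n + 2])
    return out
```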
- the correction means performs non-linear processing or multiplication processing on the texture image as the first correction process, and does not perform non-linear processing or multiplication processing on the base image.
- as a result, the pixel values of the edge portions and flat portions in the base image are maintained, so the sense of detail of the image can be improved and its contrast enhanced without the white-out or black-crush that raising or lowering the pixel values of the entire image would cause.
- the correction unit performs, as the second correction process, at least one of a gradation correction process and a transient correction process on the acquired base image, thereby correcting the acquired base image.
- the gradation correction process is performed on the base image with the texture component reduced as the second correction process, a good gradation correction process can be performed.
- the pixel value can be linearly changed according to the stepwise change in the pixel value, and a good gradation correction process can be performed.
- if a gradation correction process is applied to an image that contains many texture components, a technical problem occurs in that the gradation correction process may not be performed normally because of interference from the texture components.
- since the transient correction process is performed on the base image in which the texture component is reduced, a good transient correction process can be performed. Specifically, in the base image with reduced texture components, the gradient of the contour can be made steep in accordance with the stepwise change in pixel values around the contour, without affecting the texture component.
- the measurement unit measures the distribution of frequency components in an arbitrary region in each first image among the acquired first image or the plurality of first image groups.
- the extraction means extracts the texture image based on the measured distribution of frequency components.
- the base image acquisition means changes at least one of the number of taps and a filter coefficient in accordance with the distribution of the measured frequency components, and acquires the base image.
- the extraction means changes at least one of the number of taps and the filter coefficient in accordance with the distribution of the measured frequency components, and extracts the texture image.
- the number of taps according to the present invention means a value representing the range of pixels to be subjected to image processing in units of pixels.
- the filter coefficient means a parameter for controlling the characteristics of the filter. Typically, in the case of an ε filter it means the selection of the ε value or the nonlinear function, and in the case of a bilateral filter or trilateral filter it means the σ value or the δ value.
- if the frequency of the above-described high-frequency components, or the time integral of that frequency, exceeds a predetermined value, the number of taps may be changed in the increasing direction; if it does not exceed the predetermined value, the number of taps may be changed in the decreasing direction.
- the number of taps is changed based on the distribution of the measured frequency components for the various first images that are input, so that the degree of granularity and detail of the images can be accurately adjusted to a certain level. This is very preferable in practice because it can be maintained.
- the measurement unit initializes the measurement of the frequency component for each scene or each channel when the first image is acquired.
- the base image acquisition unit acquires the base image based on enlargement information for enlarging the first image in addition to the measured distribution.
- the product of the number of taps based on the measured distribution of frequency components and the magnification ratio included in the magnification information may be set as the number of taps.
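The tap-count product described above can be sketched as follows; the rounding to an odd integer is an assumption added here so the filter remains centered, not something the passage specifies:

```python
def scaled_tap_count(base_taps, magnification):
    # Per the passage: the tap count derived from the measured
    # frequency distribution is multiplied by the enlargement ratio.
    # Rounding up to an odd value (our assumption) keeps a center tap.
    taps = int(round(base_taps * magnification))
    return taps if taps % 2 == 1 else taps + 1
```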
- an image processing method of the present invention includes an acquisition step of acquiring a first image, an extraction step of extracting a texture image from the acquired first image, a first enlargement step of enlarging the extracted texture image, a second enlargement step of enlarging the acquired first image, a base image acquisition step of acquiring a base image whose contour has been sharpened by applying a sharpening process to the contour of the enlarged first image, and a synthesis step of synthesizing the enlarged texture image and the acquired base image.
- according to the image processing method of the present invention, it is possible to enjoy the various benefits of the above-described image processing apparatus of the present invention.
- the image processing method of the present invention can also adopt various aspects.
- the image processing program of the present invention is an image processing program executed by an apparatus including a computer, and causes the computer to function as: an acquisition unit that acquires a first image; an extraction unit that extracts a texture image from the acquired first image; a first enlargement means for enlarging the extracted texture image; a second enlargement means for enlarging the acquired first image; a base image acquisition unit that acquires a base image whose contour has been sharpened by applying a sharpening process to the contour of the enlarged first image; and a synthesis unit that synthesizes the enlarged texture image and the acquired base image.
- when the computer program is read from a recording medium that stores it, such as a ROM, CD-ROM, DVD-ROM, or hard disk, and executed by the computer, or when the computer program is downloaded to the computer via communication means and then executed, the above-described embodiment of the image processing apparatus of the present invention can be realized relatively easily.
- the embodiments of the image processing program of the present invention can also adopt various aspects.
- the storage medium of the present invention stores the above-described image processing program (including various aspects).
- by causing the computer to read the above-described image processing program, it is possible to make the computer function appropriately as the embodiment of the above-described image processing apparatus of the present invention.
- FIG. 1 is a block diagram showing the overall configuration of the image processing apparatus according to the first embodiment.
- FIG. 2 is a block diagram showing the detailed configuration of the texture separation unit 110 according to the first embodiment.
- FIG. 3 is a set of graphs (FIGS. 3(a) to 3(d)) showing examples of the nonlinear function in the ε filter, which is an example of the filter processing unit of the texture separation unit according to the first embodiment.
- FIG. 4 is a flowchart showing the flow of operations of the image processing apparatus according to the first embodiment.
- FIG. 5 is a waveform diagram expressing, by position on the image and pixel value, images obtained by the various processes in the texture separation unit of the image processing apparatus according to the first embodiment.
- FIG. 6 shows other waveform diagrams (FIGS. 6(a) to 6(d)) expressing, by position on the image and pixel value, images obtained by the various processes in the image processing apparatus according to the first embodiment.
- FIG. 7 is a block diagram showing the overall configuration of the image processing apparatus according to a comparative example.
- FIG. 8 is a block diagram showing the overall configuration of the image processing apparatus according to the second embodiment.
- FIG. 9 shows a waveform diagram (FIG. 9(a)) of an image subjected to the gradation correction process according to the second embodiment and a waveform diagram (FIG. 9(b)) of an image subjected to a gradation correction process according to a comparative example.
- FIG. 1 is a block diagram showing the overall configuration of the image processing apparatus according to the first embodiment.
- the image processing apparatus 100 includes a texture separation unit 110, an enlargement processing unit 120, an enlargement processing unit 130, a sharpening processing unit 140, and an adder 150.
- the input image is input to the texture separation unit 110 and the enlargement processing unit 130.
- the input image constitutes an example of the first image according to the present invention.
- the texture separation unit 110 separates the texture image from the input image and outputs it. Further, the enlargement information is input to the enlargement processing unit 120 and the enlargement processing unit 130.
- the enlargement information may be information related to a magnification that specifies how many times the input image is enlarged, or the enlargement information may be information related to the number of pixels that specifies the number of pixels after enlargement.
- An example of the acquisition unit according to the present invention and an example of the extraction unit according to the present invention are configured by the texture separation unit 110.
- the enlargement processing unit 130 enlarges the input image to a predetermined number of pixels and outputs it to the sharpening processing unit 140.
- An example of the second enlargement unit according to the present invention is configured by the enlargement processing unit 130.
- the enlargement processing unit 120 enlarges the texture image to a predetermined number of pixels and outputs it to the adder 150.
- An example of the first enlargement unit according to the present invention is configured by the enlargement processing unit 120.
- the sharpening processing unit 140 performs edge sharpening processing on the image obtained by enlarging the input image by the enlargement processing unit 130 and outputs a base image.
- An example of the base image acquisition unit according to the present invention is configured by the sharpening processing unit 140.
- the output image is obtained by combining the base image and the enlarged texture image by the adder 150.
- An example of the synthesizing means according to the present invention is constituted by the adder 150.
- FIG. 2 is a block diagram illustrating a detailed configuration of the texture separation unit 110 according to the first embodiment.
- the texture separation unit 110 includes a filter processing unit 111 and a subtractor 112.
- the input image is input to the filter processing unit 111 and the subtractor 112.
- the filter processing unit 111 performs edge-preserving filter processing on the input image and outputs the result.
- a texture image is obtained by the subtractor 112 subtracting the edge-preserving-filtered image from the input image.
- the filter processing unit 111 is configured by a filter having an edge maintaining effect, and may use one of an ⁇ filter and a bilateral filter.
- the enlargement processing unit 130 and the enlargement processing unit 120 can use nearest neighbor, bilinear interpolation, bicubic interpolation, or interpolation by a Lanczos-windowed sinc filter, although other enlargement methods are not excluded. The two units may use the same enlargement method or different ones. However, it is recommended that the enlargement processing unit 130 use a method such as nearest neighbor, which suppresses the occurrence of jaggies and ringing, and that the enlargement processing unit 120 use bicubic interpolation or Lanczos-windowed sinc interpolation, which can enlarge high-frequency components satisfactorily.
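A rough illustration of the two enlargement roles, assuming nearest neighbor for the path that must avoid ringing and simple linear interpolation as a stand-in for the bicubic or Lanczos methods named above (both stand-ins are assumptions of this sketch):

```python
import numpy as np

def enlarge_nearest(img, scale):
    # Nearest-neighbor enlargement: no new pixel values are invented,
    # so no ringing can appear (though edges become blocky).
    img = np.asarray(img)
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def enlarge_linear_1d(row, scale):
    # Linear interpolation along one axis, standing in for the
    # higher-order (bicubic / Lanczos) methods that reproduce
    # high-frequency texture more faithfully.
    row = np.asarray(row, dtype=float)
    xs = np.arange(len(row))
    new_xs = np.linspace(0, len(row) - 1, len(row) * scale)
    return np.interp(new_xs, xs, row)
```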
- the sharpening processing unit 140 may use either a bilateral filter, which has an edge-sharpening effect, or a trilateral filter.
- here, the bilateral filter may mean a filter whose weighting factor is determined by two elements: (i) the spatial distance between the pixel of interest and a neighboring pixel, and (ii) the difference between the pixel value of the pixel of interest and the pixel value of that neighboring pixel.
- the trilateral filter is a filter in which a third function is added to the bilateral filter.
- the trilateral filter may be a filter that uses a third weight as an impulse noise detector, or a filter that uses a function based on the gradient between the pixel of interest and its surrounding pixels.
- FIG. 3 is a set of graphs (FIGS. 3(a) to 3(d)) showing specific examples of the nonlinear function in the ε filter, which is an example of the filter processing unit of the texture separation unit according to the first embodiment.
- the horizontal axis indicates x, which is the difference between the pixel value xn−k and the pixel value xn
- the vertical axis indicates the nonlinear function F (x).
- the ε filter, a non-linear smoothing filter, is a digital filter effective for smoothing an image without impairing sharp changes in pixel values.
- the ε filter is expressed by the following equation (1) when the pixels to be filtered are one-dimensional with 2N+1 taps.
- the function F(x) is a nonlinear function whose absolute value |F(x)| (where |a| denotes the absolute value of a) is suppressed to |F(x)| ≤ ε0.
- an example is shown in FIG. 3. In the ε filter of the above equation (1), the difference between the input and output pixel values is suppressed to a finite value determined by the following equation (2).
- the ε filter compares the absolute value |xn − xn−k| of the difference between the pixel value xn of the central pixel and the pixel value xn−k of a surrounding pixel with a predetermined threshold ε0.
- when the absolute value |xn − xn−k| is smaller than the predetermined threshold ε0, the pixel value xn−k is substituted for bn−k (otherwise the central pixel value xn is substituted), and a normal low-pass filter operation using ak as each tap coefficient is applied to the values bn−k.
- the ⁇ filter may be configured by applying a one-dimensional ⁇ filter in the horizontal direction and the vertical direction of an image, respectively, or may be configured by a two-dimensional ⁇ filter.
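For reference, a direct 1-D ε-filter sketch with 2N+1 taps and uniform coefficients; the uniform a_k and the hard-threshold selection are assumptions of this illustration, and the patent's equation (1) may use a different nonlinear function F(x):

```python
import numpy as np

def epsilon_filter(x, N=2, eps0=10.0):
    # 1-D epsilon filter with 2N+1 taps and uniform coefficients
    # a_k = 1/(2N+1).  A neighbor x[n-k] is used as-is when
    # |x[n] - x[n-k]| <= eps0, otherwise the center value x[n]
    # is substituted (b_{n-k} = x[n]).  This caps |y[n] - x[n]|
    # at eps0, so large edges survive while small texture is smoothed.
    x = np.asarray(x, dtype=float)
    y = x.copy()
    taps = 2 * N + 1
    for n in range(N, len(x) - N):
        acc = 0.0
        for k in range(-N, N + 1):
            v = x[n - k]
            acc += v if abs(x[n] - v) <= eps0 else x[n]
        y[n] = acc / taps
    return y
```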
- the bilateral filter is a non-linear filter and is characterized by smoothing noise without smoothing edges.
- the bilateral filter uses a Gaussian function as a weighting coefficient, and weights the spatial direction and the pixel value direction (gradation direction).
- the input pixel value at the spatial coordinates (x, y) is d (x, y)
- the output pixel value at the coordinates (x, y) is f (x, y)
- the number of taps is 2N + 1
- the bilateral filter is expressed by the following equation (4).
- σ and δ are the coefficients of the bilateral filter.
- when σ is decreased, the spatial smoothing range is expanded, and when δ is decreased, the gradation smoothing range is expanded.
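A straightforward (unoptimized) bilateral filter sketch using the common Gaussian parametrization; note that in this conventional 1/(2σ²) form, increasing σ or δ widens the respective smoothing range, so the sign convention may differ from that of the patent's equation (4), which is not reproduced here:

```python
import numpy as np

def bilateral_filter(img, N=2, sigma=1.5, delta=15.0):
    # 2-D bilateral filter with (2N+1)^2 taps.  Each weight is a
    # Gaussian in spatial distance (sigma) times a Gaussian in
    # pixel-value difference (delta), normalized so the weights
    # sum to one.  Neighbors across a strong edge get ~zero weight,
    # so the edge is not smoothed.
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = img.copy()
    for y in range(N, h - N):
        for x in range(N, w - N):
            num = den = 0.0
            for j in range(-N, N + 1):
                for i in range(-N, N + 1):
                    v = img[y + j, x + i]
                    ws = np.exp(-(i * i + j * j) / (2 * sigma ** 2))
                    wr = np.exp(-((img[y, x] - v) ** 2) / (2 * delta ** 2))
                    num += ws * wr * v
                    den += ws * wr
            out[y, x] = num / den
    return out
```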
- for bilateral filters, see Kiichi Urahama, “Noise Reduction and Illustration-Based Image Generation by Bilateral Filters”, Journal of the Institute of Image Information and Television Engineers, Vol. 62, No. 8, pp. 1268–1273 (2008).
- for the edge-sharpening property of bilateral filters, see Kiichi Urahama and Kohei Inoue, “Edge Enhancement by Bilateral Filters”, IEICE Transactions, 2003/3, Vol. J86-A, No. 3.
- FIG. 4 is a flowchart showing the flow of the operation of the image processing apparatus according to the first embodiment.
- FIG. 5 is a waveform diagram of an image in which an image obtained by performing various processes in the texture separation unit of the image processing apparatus according to the first embodiment is expressed using a position on the image and a pixel value.
- FIG. 6 is another waveform diagram of an image in which the image obtained by performing various processes in the image processing apparatus according to the first embodiment is expressed using the position on the image and the pixel value (FIG. 6A).
- the horizontal axis represents the position on the image (that is, the pixel position)
- the vertical axis represents the pixel value.
- the pixel value I indicating the pixel in the image is acquired by the texture separation unit 110 (step S10).
- the enlargement processing unit 130 acquires a pixel value I indicating a pixel in the image (step S50).
- FIG. 5A shows an input image.
- the input image includes a contour component with a large change in pixel value, a base component indicating flat portions in which the change in pixel value is uniform (that is, an example of the base image according to the present invention), and a texture component indicating minute changes in the image (that is, an example of the texture image according to the present invention).
- the filter processing unit 111 of the texture separation unit 110 generates a pixel value LP (I) that has been filtered from the pixel value I (step S20).
- the waveform diagram of the image shown in FIG. 5B is obtained by performing a filter process that maintains the level change of the pixel value in the contour portion, that is, the edge portion of the input image.
- the texture component is removed while maintaining the level change of the pixel value at the edge portion.
- the texture image “I-LP (I)” is acquired by subtracting the pixel value LP (I) from the pixel value I by the subtractor 112 of the texture separation unit 110 (step S30). Specifically, the texture image shown in FIG. 5C is obtained. This texture image is obtained by subtracting the image after filtering that maintains the edge from the input image.
- the enlargement processing unit 120 performs an enlargement process on the texture image “I-LP (I)”, thereby generating an image “EX1 (I-LP (I))” (step S40).
- FIG. 6A shows a texture image obtained by enlarging the texture image shown in FIG.
- the enlargement processing unit 130 performs an enlargement process on the acquired pixel value I, thereby generating the image “EX2(I)” (step S60). Specifically, FIG. 6B shows an image obtained by enlarging the input image.
- FIG. 6C shows a base image, in which the enlarged input image is sharpened to sharpen the change in the pixel value at the edge portion.
- the texture component is removed from the image.
- in step S80, the image "EX1(I - LP(I))" generated in step S40 and the image "BI(EX2(I))" generated in step S70 are added by the adder 150, and the image "EX1(I - LP(I)) + BI(EX2(I))" is generated and output.
- FIG. 6D shows the output image: the image "EX1(I - LP(I)) + BI(EX2(I))" obtained by combining the enlarged texture image and the base image.
- the bilateral filter described above is a non-linear filter that smooths noise without smoothing edge portions, and it also has the property of steepening (sharpening) the edge portion.
- in the present embodiment, the edge portion of the enlarged input image is steepened using this property to obtain a sharp base image.
- however, texture components representing fine changes in the image are also removed by this filtering. Therefore, in the present embodiment, a texture image is generated from the input image, and the enlarged texture image and the base image are combined to obtain an image that maintains a sense of detail while sharpening the edge portions.
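The separation, enlargement, and synthesis steps just described can be sketched in a few lines of Python (an illustrative 1-D sketch only, not part of the disclosure: the 3-tap ε-style low-pass, the nearest-neighbour enlargement, and the reuse of the same edge-preserving filter as a stand-in for the bilateral filter BI are all simplifying assumptions):

```python
import numpy as np

def epsilon_lp(x, eps0):
    # edge-preserving low-pass LP(): 3-tap mean in which a neighbour that
    # differs from the centre pixel by more than eps0 is replaced by the
    # centre value, so edges larger than eps0 pass through unsmoothed
    y = x.astype(float).copy()
    for n in range(1, len(x) - 1):
        acc = 0.0
        for k in (-1, 0, 1):
            v = float(x[n + k])
            acc += v if abs(v - float(x[n])) <= eps0 else float(x[n])
        y[n] = acc / 3.0
    return y

def enlarge(x, factor):
    # stand-in for the enlargement units EX1/EX2: nearest-neighbour repeat
    return np.repeat(x, factor)

def process(I, eps0=0.2, factor=2):
    texture = I.astype(float) - epsilon_lp(I, eps0)   # I - LP(I), step S30
    base = epsilon_lp(enlarge(I, factor), eps0)       # stand-in for BI(EX2(I))
    return enlarge(texture, factor) + base            # adder 150, step S80
```

On a pure step edge the texture term is zero and the output is simply the enlarged, edge-preserved base image.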
- FIG. 7 is a block diagram showing the overall configuration of the image processing apparatus according to the comparative example.
- the image processing apparatus 100c includes an enlargement processing unit 101c and a sharpening processing unit 140c.
- the enlargement processing unit 101c performs processing for enlarging the input image to a predetermined number of pixels.
- the sharpening processing unit 140c performs a sharpening process on the enlarged input image and outputs it as an output image.
- noise components such as jaggies and ringing are generated not only by image enlargement processing but by other image processing as well.
- noise components such as jaggies and ringing are likely to occur at pixel positions where the difference in pixel values between adjacent pixels is large, such as around the contour of an image, so-called edge periphery.
- in the comparative example, noise components such as jaggies and ringing generated in the enlargement processing unit 101c are further emphasized in the sharpening processing unit 140c, which causes the problem that the degree of noise components appearing in the output image increases.
- in the present embodiment, noise components such as jaggies and ringing that are generated when the input image is enlarged by the enlargement processing unit 130 are removed by the bilateral filter, or are smoothed and reduced by its edge-sharpening action.
- since the texture image is the difference between the input image and the image obtained by the filter processing unit 111 filtering the input image, the differences in pixel value between the pixels constituting the texture image are very small, as shown in FIG. 5C. Thereby, when the enlargement processing unit 120 performs enlargement processing on the texture image, the generation of noise components such as jaggies and ringing can be remarkably suppressed.
- the skeleton component means a component substantially similar to the base component. In skeleton-image interpolation processing, conversion to a frequency band and iterative calculation processing using a TV norm are also performed. For this reason, in the technique of image enlargement based on separating the skeleton component and the texture component, the amount of image processing becomes enormous; for example, in online image processing over a communication line, the image processing time becomes long, which is a technical problem. Further, in this technique, the skeleton image is generated by the separation processing in addition to the texture image. That is, the relationship of the following equation (5) holds.
- Input image = skeleton image + texture image + residual …… (5)
- the skeleton image is obtained by applying separation processing to the input image; part of the edge portion of the skeleton image is smoothed during the separation processing, and when the smoothed skeleton image is further interpolated, the sharpening effect at the edge portion of the skeleton image is reduced.
- in the present embodiment, by contrast, neither conversion of the image signal to a frequency band nor iterative calculation processing is performed.
- since the sharpening processing is performed after the enlargement processing is applied to the input image, the edge portion is preserved without being smoothed, and a good sharpening effect can be obtained.
- FIG. 8 is a block diagram showing the overall configuration of the image processing apparatus according to the second embodiment.
- FIG. 9 shows a waveform diagram of an image subjected to the gradation correction processing according to the second embodiment (FIG. 9A) and a waveform diagram of an image subjected to gradation correction processing according to a comparative example (FIG. 9B).
- the image processing apparatus 200 includes a texture separation unit 110, an enlargement processing unit 120, a noise removal processing unit 210, a nonlinear processing unit 220, a multiplier 230, an enlargement processing unit 130, a sharpening processing unit 140, a base image correction processing unit 240, and an adder 150.
- an example of correction means for performing the first correction processing according to the present invention is configured by at least one of the noise removal processing unit 210, the nonlinear processing unit 220, and the multiplier 230.
- an example of a correction unit that performs the second correction processing according to the present invention is configured by the base image correction processing unit 240.
- the processing in the noise removal processing unit 210 may be 3DNR (three-dimensional noise reduction) processing or isolated point removal processing.
- the 3DNR process can remove random noise and the like by filtering in the time axis direction.
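A minimal sketch of the kind of recursive time-axis filtering that 3DNR performs (illustrative only; the recursion coefficient k and the function name are assumed, not taken from the patent):

```python
import numpy as np

def tnr_step(prev_out, cur_frame, k=0.25):
    # one step of recursive filtering in the time-axis direction:
    # out = (1 - k) * previous output + k * current frame.
    # Random noise that is uncorrelated between frames is averaged down,
    # while a static scene passes through unchanged.
    return (1.0 - k) * np.asarray(prev_out, dtype=float) \
         + k * np.asarray(cur_frame, dtype=float)
```

Because such recursion mixes past frames into the output, applying it only to the texture image (as this embodiment does) keeps any temporal lag away from the edge portions.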
- the isolated point removal processing provides a noise reduction effect by assuming that genuine texture components are distributed over a region of at least a certain size, judging texture components that are not so distributed to be noise, and removing them.
- the nonlinear processing unit 220 performs nonlinear filter processing on the texture image. For example, by applying an S-curve characteristic that reduces low-level texture components as noise, expands the range of intermediate-level components, which are likely to contain much of the image's original texture, and suppresses components above a certain level, the overall image quality is improved.
- Multiplier 230 controls the amount of the texture component, and is designated with a magnification of L times.
- when L = 0, the image generated from the texture image becomes 0, and only the image created from the base image forms the output image.
- when 0 < L < 1, the image generated from the texture image is attenuated, combined with the image generated from the base image, and output.
- when L > 1, the image generated from the texture image is emphasized and combined with the image generated from the base image to become the output image.
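One possible reading of the S-curve and multiplier stages, sketched in Python (the thresholds `low` and `high`, the mid-range `gain`, and the function names are chosen purely for illustration and are not specified by the patent):

```python
import numpy as np

def s_curve(t, low=0.05, high=0.5, gain=1.6):
    # hypothetical S-curve characteristic: magnitudes below `low` are
    # treated as noise and crushed to 0, mid-level magnitudes are expanded
    # by `gain`, and anything above `high` is clipped (suppressed)
    sign = np.sign(t)
    mag = np.abs(t)
    out = np.where(mag < low, 0.0, np.minimum(mag * gain, high))
    return sign * out

def apply_texture_gain(texture, L):
    # multiplier 230: L = 0 drops the texture image entirely, 0 < L < 1
    # attenuates it, and L > 1 emphasises it before the adder combines it
    # with the image generated from the base image
    return L * np.asarray(texture, dtype=float)
```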
- as the base image correction processing unit 240, an image processing unit that performs gradation correction processing and transient correction processing can be considered.
- in the gradation correction processing, when an area where the gradation changes gradually (a gradation area) is identified, low-pass filtering or linear interpolation is performed in that area so that the gradation changes uniformly within it.
- the transient correction processing is image processing that steepens the slope of an edge by spatial processing, and is performed on at least one of the luminance signal and the color signal.
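The transient correction can be approximated by an unsharp-mask-style operation on a 1-D signal (a sketch under the assumption that a scaled discrete second derivative is subtracted; the patent does not specify the correction at this level of detail, and `gain` is an assumed value):

```python
import numpy as np

def transient_correct(x, gain=0.5):
    # subtract a scaled discrete second derivative so that the slope of the
    # pixel-value change at a contour becomes steeper (with a small
    # overshoot/undershoot on either side of the edge)
    x = np.asarray(x, dtype=float)
    lap = np.zeros_like(x)
    lap[1:-1] = x[:-2] - 2.0 * x[1:-1] + x[2:]
    return x - gain * lap
```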
- the base image output while being sharpened by the sharpening processing unit 140 is input to the adder 150 via the base image correction processing unit 240.
- the texture image output while being enlarged by the enlargement processing unit 120 is input to the adder 150 via the noise removal processing unit 210, the nonlinear processing unit 220, and the multiplier 230.
- the adder 150 synthesizes the input base image and texture image and outputs them as an output image.
- the base image after the sharpening processing by the sharpening processing unit 140 has been smoothed while retaining its edges; this base image is composed of edge portions and flat portions, and its texture components are greatly reduced.
- in the base image correction processing unit 240 according to the second embodiment, the gradation correction processing is performed on a base image in which the texture components have been reduced, so good gradation correction processing can be performed.
- as shown in FIG. 9A, in the gradation correction processing according to the second embodiment, the pixel values in the base image, whose texture components have been reduced, can be changed linearly in accordance with the stepwise change of the pixel values, and good gradation correction processing can be performed.
- in the comparative example, on the other hand, there is a technical problem in that the gradation correction processing may not be performed normally because of interference from texture components.
- likewise, in the base image correction processing unit 240, since the transient correction processing is performed on a base image with reduced texture components, good transient correction processing can be performed. Specifically, in such a base image, the gradient of the change in pixel value at a contour can be made steep in accordance with the stepwise change in pixel value around the contour, without being affected by texture components, so good transient correction processing is possible.
- random noise included in an image is a component composed of minute changes in the pixel value of the image, and thus is classified as a component similar to a texture component.
- when three-dimensional noise reduction, that is, filtering in the time-axis direction (so-called 3DNR processing), is performed on an input image, so-called afterimages become a problem.
- such afterimages tend to be noticed mainly at the contours of the image, that is, at the edge portions.
- in the present embodiment, the noise removal processing unit 210 performs 3DNR processing only on the texture image, which does not include the image contours, that is, the edge portions.
- the 3DNR process in the noise removal processing unit 210 does not affect the edge portion of the image.
- noise components that are generally removed by the isolated point removal processing are components that are composed of minute changes in the pixel values of the image, and thus are classified as components that are similar to texture components.
- further, the noise removal processing unit 210 performs the isolated point removal processing on an image that receives little or no influence from pixel portions where the pixel value changes greatly, such as the contours of the image, that is, the edge portions. It is therefore possible to improve the noise detection accuracy and remove noise effectively, which is very useful in practice.
- the above-described nonlinear processing and multiplier processing are performed on the texture image, and the above-described nonlinear processing and multiplier processing are not performed on the base image.
- FIG. 10 is a block diagram showing the overall configuration of the image processing apparatus according to the third embodiment.
- FIG. 11 shows histograms (FIGS. 11A and 11B) illustrating the quantitative and qualitative relationship between the frequency components of the input image and the frequency of occurrence of each frequency component according to the third embodiment.
- the image processing apparatus 300 includes a texture separation unit 110, an enlargement processing unit 120, an enlargement processing unit 130, a sharpening processing unit 140, an adder 150, and a frequency analysis processing unit 310.
- An example of the measuring means according to the present invention is configured by the frequency analysis processing unit 310.
- the frequency analysis processing unit 310 analyzes the spatial frequency components of the input image, sets at least one of the number of taps and the filter coefficients in the sharpening processing unit based on the analysis result and the enlargement information, and also sets at least one of the number of taps and the filter coefficients in the texture separation unit based on the analysis result.
- the input image is input to each of the texture separation unit 110, the enlargement processing unit 120, and the frequency analysis processing unit 310.
- the above-described enlargement information is input to the enlargement processing unit 120, the enlargement processing unit 130, and the frequency analysis processing unit 310, respectively.
- Information on the result of frequency analysis by the frequency analysis processing unit 310 is input to the texture separation unit 110 and the sharpening processing unit 140, respectively.
- for the analysis, a wavelet transform, a Fourier transform, a DCT (discrete cosine transform), or a Hadamard transform is performed to obtain frequency distribution statistics.
- the image sharpness is determined from these statistics, and at least one of the number of taps and the filter coefficients is set.
- the filter settings passed from the frequency analysis processing unit to the filter processing unit of the texture separation unit include, in the case of an ε filter, the ε value and the selection of the nonlinear function.
- in the case of a bilateral filter, the filter coefficients include the coefficient α and the coefficient β.
- the parameter settings passed from the frequency analysis processing unit to the sharpening processing unit likewise include the coefficient α and the coefficient β.
- the ⁇ value means ⁇ 0 in the above equation (3).
- the selection of the nonlinear function means that one of the nonlinear functions shown in FIGS. 3A to 3D is selected.
- the coefficient ⁇ means ⁇ in the above formula (4).
- the coefficient ⁇ means ⁇ in the above equation (4).
- for example, the input image is Fourier transformed and expanded in the frequency domain. Histogram processing is performed on the image data in the frequency domain to obtain frequency distribution statistics. Based on these statistics, as shown in FIG. 11B, when high-frequency components occur with roughly the same frequency as low-frequency components, that is, when many high-frequency components are present, it may be determined that the sharpness is high. On the other hand, as shown in FIG. 11A, when the frequency of occurrence decreases toward higher frequencies compared with that of the low-frequency components, so that high-frequency components are scarce, it may be determined that the sharpness is low.
- when the sharpness is high, the number of taps may be increased, and when the sharpness is low, the number of taps may be decreased. To avoid abrupt changes, a transition section in which the number of taps does not change regardless of the sharpness level may be provided.
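The sharpness decision described above can be sketched as follows (the cut-off frequency `hf_cut`, the energy threshold `thresh`, and the tap adjustment are hypothetical values; the patent leaves the exact criterion open):

```python
import numpy as np

def is_sharp(row, hf_cut=0.25, thresh=0.1):
    # compute the magnitude spectrum of one image line and call the image
    # "sharp" when the energy above the normalised frequency hf_cut is a
    # non-negligible fraction of the total AC energy
    row = np.asarray(row, dtype=float)
    spec = np.abs(np.fft.rfft(row))
    freqs = np.fft.rfftfreq(len(row))
    ac = spec[1:].sum()                    # ignore the DC term
    if ac == 0.0:
        return False
    return (spec[freqs > hf_cut].sum() / ac) > thresh

def choose_taps(sharp, base_taps=5, step=2):
    # high sharpness -> more taps, low sharpness -> fewer taps
    return base_taps + step if sharp else base_taps - step
```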
- the above processing may be performed on an entire image, or only on a single block area.
- the frequency distribution statistics may be obtained from the accumulated or averaged values of a plurality of images, in consideration of changes in the frequency distribution depending on the picture content.
- the frequency analysis may be reset at a scene change or a channel change.
- further, the number of taps is increased in consideration of the enlargement information.
- when the enlargement information designates enlarging the numbers of horizontal and vertical pixels by a factor of n, the specified number of taps may be determined by the following equation (6).
- Specified number of taps = (number of taps from the analysis result) × n …… (6)
- where n is the enlargement rate.
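Equation (6) amounts to a one-line scaling of the analysed tap count (a trivial sketch; the function name is assumed):

```python
def specified_taps(analysis_taps, n):
    # equation (6): scale the tap count obtained from the frequency
    # analysis by the enlargement rate n
    return analysis_taps * n
```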
- as described above, in the third embodiment, the frequency analysis processing unit 310 determines the sharpness of the input image in advance, so the graininess and sense of detail of the image described above can be maintained at a certain level for a wide variety of input images, which is very useful in practice.
- the present invention is not limited to the above-described embodiments and can be changed as appropriate without departing from the spirit or idea of the invention that can be read from the claims and the entire specification; an image processing apparatus, an image processing method, an image processing program, and a storage medium with such changes are also included in the technical scope of the present invention.
- the present invention can be used in, for example, digital cameras; display devices such as liquid crystal TVs, PDPs, and organic EL displays; image reproducing devices such as DVD, Blu-ray, and HD-DVD players, HDD recorders, and personal computers; digital broadcast receivers such as terrestrial digital broadcast receiving terminals, cable digital broadcast receiving terminals, satellite digital broadcast receiving terminals, IP broadcast receiving terminals, car navigation systems, mobile phones, and One Seg receivers; and in image processing methods in such image processing apparatuses.
- the present invention can be used for image processing methods such as still image and moving image editing software, still image and moving image reproduction software, an image processing program, and a storage medium in which the image processing program is stored.
Abstract
Description
(Overall Configuration)
First, a first embodiment of the present invention will be described. FIG. 1 is a block diagram showing the overall configuration of the image processing apparatus according to the first embodiment.
Next, the detailed configuration of the texture separation unit 110 according to the first embodiment will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the detailed configuration of the texture separation unit 110 according to the first embodiment.
Next, the ε filter, which is an example of the filter processing unit of the texture separation unit, will be described with reference to FIG. 3. FIG. 3 presents graphs (FIGS. 3A to 3D) showing concrete examples of the nonlinear function in the ε filter, an example of the filter processing unit of the texture separation unit according to the first embodiment. In FIGS. 3A to 3D, the horizontal axis represents x, the difference between the pixel value x_{n-k} and the pixel value x_n, and the vertical axis represents the nonlinear function F(x).
…… (1)
Here, the function F(x) is a nonlinear function whose absolute value |F(x)| is bounded so that |F(x)| ≤ ε0. Examples are shown in FIG. 3. In the ε filter of equation (1) above, the difference between the input and output pixel values is limited to a finite value determined by the following equation (2).
…… (2)
This limits the difference between input and output pixel values to within ±ε, so that steep changes in pixel value are maintained.
Here, when F(x) of FIG. 3A, for example, is adopted for b_{n-k}, b_{n-k} is expressed by the following equation (3).
…… (3)
At this time, the ε filter compares the absolute value |x_n - x_{n-k}| of the difference between the pixel value x_n of the center pixel of the filtering and the pixel value x_{n-k} of a peripheral pixel with the predetermined threshold ε0. If the absolute value |x_n - x_{n-k}| is smaller than the threshold ε0, the pixel value x_{n-k} is substituted into b_{n-k}, and processing equivalent to an ordinary low-pass filter with tap coefficients a_k is performed, smoothing the image around the center pixel. If, on the other hand, the absolute value |x_n - x_{n-k}| is larger than the threshold ε0, the pixel value x_n is substituted into b_{n-k}; that is, the pixel value x_{n-k} is replaced by the pixel value x_n before the low-pass filtering centered on the center pixel, so that smoothing is performed with the pixel value x_{n-k} ignored. In substantially the same manner, the F(x) shown in FIG. 3B, FIG. 3C, or FIG. 3D may be adopted for b_{n-k}. For details of the ε filter, see Kaoru Arakawa, "Nonlinear Digital Filters and Their Applications," Journal of the IEICE, Vol. 77, No. 8, pp. 844-852, August 1994.
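The substitution behaviour of the ε filter described above can be sketched in Python (a hypothetical 1-D sketch; the tap coefficients `a` and the threshold `eps0` in the test are illustrative values, not those of the patent):

```python
import numpy as np

def epsilon_filter(x, a, eps0):
    # substitution form of the epsilon filter: a neighbour x[n-k] whose
    # difference from the centre pixel x[n] exceeds eps0 is replaced by
    # x[n] before the ordinary FIR low-pass with taps a is applied, so
    # edges larger than eps0 survive while small variations are smoothed
    r = len(a) // 2                     # taps a must have odd length
    x = np.asarray(x, dtype=float)
    y = x.copy()
    for n in range(r, len(x) - r):
        acc = 0.0
        for j, coef in enumerate(a):
            k = j - r
            v = x[n - k]
            b = v if abs(x[n] - v) <= eps0 else x[n]   # b_{n-k}
            acc += coef * b
        y[n] = acc
    return y
```

With a small eps0 a step edge passes unchanged; with a very large eps0 the filter degenerates into the ordinary low-pass.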
Next, the bilateral filter, which is an example of the sharpening processing unit and also an example of the filter processing unit of the texture separation unit, will be described.
…… (4)
Here, α and β are coefficients of the bilateral filter; decreasing α widens the smoothing range in the spatial direction, and decreasing β widens the smoothing range in the gradation direction. For details of the bilateral filter, see Kiichi Urahama, "Noise Removal and Illustration-Style Image Generation by Bilateral Filters," Journal of the ITE, Vol. 62, No. 8, pp. 1268-1273 (2008). For the edge-steepening property of the bilateral filter, see Kiichi Urahama and Kohei Inoue, "Edge Enhancement Property of Bilateral Filters," IEICE Transactions, Vol. J86-A, No. 3, March 2003.
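The roles of α and β can be illustrated with a 1-D bilateral filter sketch (the Gaussian weight form below is an assumption made for illustration; the exact kernel of equation (4) is given in the original figures):

```python
import numpy as np

def bilateral_1d(x, alpha, beta, radius=2):
    # spatial weight exp(-alpha * k^2), tonal weight exp(-beta * dI^2):
    # smaller alpha widens the spatial smoothing range, smaller beta
    # widens the gradation-direction smoothing range
    x = np.asarray(x, dtype=float)
    y = x.copy()
    for n in range(len(x)):
        acc, wsum = 0.0, 0.0
        for k in range(-radius, radius + 1):
            m = n + k
            if 0 <= m < len(x):
                w = np.exp(-alpha * k * k) * np.exp(-beta * (x[m] - x[n]) ** 2)
                acc += w * x[m]
                wsum += w
        y[n] = acc / wsum
    return y
```

With a large β a step edge is preserved almost exactly; with β = 0 the filter degenerates into a plain Gaussian blur.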
Next, the operating principle of the image processing apparatus according to the first embodiment will be described with reference to FIGS. 4 to 6. FIG. 4 is a flowchart showing the operation flow of the image processing apparatus according to the first embodiment.
Next, the action and effects of the image processing apparatus according to the first embodiment will be examined with reference to FIG. 7. FIG. 7 is a block diagram showing the overall configuration of an image processing apparatus according to a comparative example.
Next, the action and effects of the image processing apparatus according to the first embodiment will be examined further.
For this reason, the skeleton image is obtained by applying separation processing to the input image; part of the edge portion of the skeleton image is smoothed during the separation processing, and when this smoothed skeleton image is further interpolated, a technical problem arises in that the sharpening effect at the edge portion of the skeleton image is reduced.
(Overall Configuration)
Next, a second embodiment of the present invention will be described with reference to FIGS. 8 and 9. FIG. 8 is a block diagram showing the overall configuration of the image processing apparatus according to the second embodiment. FIG. 9 shows a waveform diagram of an image subjected to the gradation correction processing according to the second embodiment (FIG. 9A) and a waveform diagram of an image subjected to gradation correction processing according to a comparative example (FIG. 9B).
(Overall Configuration)
Next, a third embodiment of the present invention will be described with reference to FIGS. 10 and 11. FIG. 10 is a block diagram showing the overall configuration of the image processing apparatus according to the third embodiment. FIG. 11 shows histograms (FIGS. 11A and 11B) illustrating the quantitative and qualitative relationship between the frequency components of the input image and the frequency of each frequency component according to the third embodiment.
Here, n is the enlargement rate.
Claims (13)
- An image processing apparatus comprising: acquisition means for acquiring a first image; extraction means for extracting a texture image from the acquired first image; first enlargement means for enlarging the extracted texture image; second enlargement means for enlarging the acquired first image; base image acquisition means for acquiring a base image whose contours have been sharpened by applying sharpening processing to the contours of the enlarged first image; and synthesis means for synthesizing the enlarged texture image and the acquired base image.
- The image processing apparatus according to claim 1, wherein the base image acquisition means acquires the base image using a bilateral filter or a trilateral filter.
- The image processing apparatus according to claim 1, wherein the extraction means extracts the texture image by subtracting, from the acquired first image, an image obtained by applying bilateral filtering or ε filtering to the acquired first image.
- The image processing apparatus according to claim 1, further comprising correction means for applying at least one of a first correction process, which corrects the enlarged texture image in accordance with the properties of the enlarged texture image, and a second correction process, which corrects the acquired base image in accordance with the properties of the acquired base image, wherein the synthesis means synthesizes the texture image and the base image after the at least one correction process has been applied.
- The image processing apparatus according to claim 4, wherein, as the first correction process, the correction means corrects the enlarged texture image by applying at least one of three-dimensional noise reduction processing, isolated point removal processing, nonlinear processing, and multiplication processing to the enlarged texture image.
- The image processing apparatus according to claim 4, wherein, as the second correction process, the correction means corrects the acquired base image by applying at least one of gradation correction processing and transient correction processing to the acquired base image.
- The image processing apparatus according to claim 1, further comprising measurement means for measuring a distribution of frequency components in an arbitrary region of the acquired first image, or of each first image in a group of acquired first images, wherein, in addition to or instead of the base image acquisition means acquiring the base image based on the measured distribution of frequency components, the extraction means extracts the texture image based on the measured distribution of frequency components.
- The image processing apparatus according to claim 7, wherein, in addition to or instead of the base image acquisition means changing at least one of the number of taps and the filter coefficients in accordance with the measured distribution of frequency components to acquire the base image, the extraction means changes at least one of the number of taps and the filter coefficients in accordance with the measured distribution of frequency components to extract the texture image.
- The image processing apparatus according to claim 7, wherein the measurement means initializes the measurement of the frequency components for each scene or each channel when the first image is acquired.
- The image processing apparatus according to claim 7, wherein the base image acquisition means acquires the base image based on enlargement information for enlarging the first image in addition to the measured distribution.
- An image processing method comprising: an acquisition step of acquiring a first image; an extraction step of extracting a texture image from the acquired first image; a first enlargement step of enlarging the extracted texture image; a second enlargement step of enlarging the acquired first image; a base image acquisition step of acquiring a base image whose contours have been sharpened by applying sharpening processing to the contours of the enlarged first image; and a synthesis step of synthesizing the enlarged texture image and the acquired base image.
- An image processing program executed by an apparatus comprising a computer, the program causing the computer to function as: acquisition means for acquiring a first image; extraction means for extracting a texture image from the acquired first image; first enlargement means for enlarging the extracted texture image; second enlargement means for enlarging the acquired first image; base image acquisition means for acquiring a base image whose contours have been sharpened by applying sharpening processing to the contours of the enlarged first image; and synthesis means for synthesizing the enlarged texture image and the acquired base image.
- A storage medium storing the image processing program according to claim 12.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/395,797 US20120189208A1 (en) | 2009-09-16 | 2009-09-16 | Image processing apparatus, image processing method, image processing program, and storage medium |
JP2011531688A JPWO2011033619A1 (ja) | 2009-09-16 | 2009-09-16 | Image processing apparatus, image processing method, image processing program, and storage medium |
PCT/JP2009/066154 WO2011033619A1 (ja) | 2009-09-16 | 2009-09-16 | Image processing apparatus, image processing method, image processing program, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/066154 WO2011033619A1 (ja) | 2009-09-16 | 2009-09-16 | Image processing apparatus, image processing method, image processing program, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011033619A1 true WO2011033619A1 (ja) | 2011-03-24 |
Family
ID=43758246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/066154 WO2011033619A1 (ja) | 2009-09-16 | 2009-09-16 | Image processing apparatus, image processing method, image processing program, and storage medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US20120189208A1 (ja) |
JP (1) | JPWO2011033619A1 (ja) |
WO (1) | WO2011033619A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011076274A (ja) * | 2009-09-30 | 2011-04-14 | Casio Computer Co Ltd | Image processing apparatus, image processing method, and program |
WO2013151163A1 (ja) * | 2012-04-05 | 2013-10-10 | Sharp Corporation | Image processing device, image display device, image processing method, computer program, and recording medium |
JP2014017742A (ja) * | 2012-07-10 | 2014-01-30 | Ns Solutions Corp | Image processing apparatus, image processing method, and program |
JP2014179689A (ja) * | 2013-03-13 | 2014-09-25 | Nec Corp | Image processing method and image processing apparatus |
US9390485B2 (en) | 2014-03-07 | 2016-07-12 | Ricoh Company, Ltd. | Image processing device, image processing method, and recording medium |
JP2016527611A (ja) * | 2013-06-19 | 2016-09-08 | Qualcomm Technologies, Inc. | Systems and methods for single-frame-based super-resolution interpolation for digital cameras |
US9940718B2 (en) | 2013-05-14 | 2018-04-10 | Samsung Electronics Co., Ltd. | Apparatus and method for extracting peak image from continuously photographed images |
JP2018169948A (ja) * | 2017-03-30 | 2018-11-01 | MegaChips Corporation | Super-resolution image generation device, program, and integrated circuit |
CN111445398A (zh) * | 2020-03-11 | 2020-07-24 | Zhejiang Dahua Technology Co., Ltd. | Thermal imaging image processing method, device, and computer-readable storage medium |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11308589B2 (en) * | 2018-05-03 | 2022-04-19 | Canon Virginia, Inc. | Devices, systems, and methods for enhancing images |
JP2012249079A (ja) * | 2011-05-27 | 2012-12-13 | Semiconductor Components Industries Llc | Contour correction device |
US9007441B2 (en) * | 2011-08-04 | 2015-04-14 | Semiconductor Components Industries, Llc | Method of depth-based imaging using an automatic trilateral filter for 3D stereo imagers |
JP6020904B2 (ja) * | 2012-02-01 | 2016-11-02 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and imaging apparatus |
JP6429444B2 (ja) * | 2013-10-02 | 2018-11-28 | Canon Inc | Image processing apparatus, imaging apparatus, and image processing method |
US9495731B2 (en) | 2015-04-15 | 2016-11-15 | Apple Inc. | Debanding image data based on spatial activity |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10290368A (ja) * | 1997-04-15 | 1998-10-27 | Fuji Photo Film Co Ltd | Contour emphasis method |
JP2002135653A (ja) * | 2000-10-19 | 2002-05-10 | Sanyo Electric Co Ltd | Image signal processing device |
JP2002170114A (ja) * | 2000-12-04 | 2002-06-14 | Fuji Photo Film Co Ltd | Image processing method and apparatus, and recording medium |
JP2002199235A (ja) * | 2000-12-26 | 2002-07-12 | Canon Inc | Image processing apparatus and control method therefor |
JP2004007395A (ja) * | 2002-03-27 | 2004-01-08 | Sanyo Electric Co Ltd | Stereoscopic image processing method and apparatus |
JP2004112728A (ja) * | 2002-09-20 | 2004-04-08 | Ricoh Co Ltd | Image processing apparatus |
JP2005340900A (ja) * | 2004-05-24 | 2005-12-08 | Sony Corp | Signal processing apparatus and method, recording medium, and program |
JP2007181186A (ja) * | 2005-12-26 | 2007-07-12 | Samsung Electronics Co Ltd | Resolution conversion apparatus and method adaptive to input video |
JP2008033692A (ja) * | 2006-07-28 | 2008-02-14 | Canon Inc | Image processing apparatus, control method therefor, computer program, and computer-readable storage medium |
JP2008118374A (ja) * | 2006-11-02 | 2008-05-22 | NEC Display Solutions, Ltd. | Image quality control circuit and image quality control method |
JP2008242696A (ja) * | 2007-03-27 | 2008-10-09 | Casio Computer Co Ltd | Image processing device and camera |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001338288A (ja) * | 2000-05-25 | 2001-12-07 | Nec Corp | Image processing method and system, and image display control device |
JP2004318693A (ja) * | 2003-04-18 | 2004-11-11 | Konica Minolta Photo Imaging Inc | Image processing method, image processing apparatus, and image processing program |
US8339421B2 (en) * | 2008-03-03 | 2012-12-25 | Mitsubishi Electric Corporation | Image processing apparatus and method and image display apparatus and method |
-
2009
- 2009-09-16 US US13/395,797 patent/US20120189208A1/en not_active Abandoned
- 2009-09-16 WO PCT/JP2009/066154 patent/WO2011033619A1/ja active Application Filing
- 2009-09-16 JP JP2011531688A patent/JPWO2011033619A1/ja not_active Ceased
Non-Patent Citations (2)
Title |
---|
YUKI ISHII ET AL.: "Josangata Kokkaku/Texture Bunri ni Motozuku Gazo Kakudai", Eizo Media Shori Symposium, Dai 11 Kai Symposium Shiryo, 8 November 2006, pages 121-122 *
YUKI ISHII ET AL.: "Josangata Kokkaku/Texture Gazo Bunri no Gazo Shori eno Oyo", THE IEICE TRANSACTIONS ON COMMUNICATIONS, vol. J90-D, no. 7, 1 July 2007 (2007-07-01), pages 1682 - 1685 * |
Also Published As
Publication number | Publication date |
---|---|
US20120189208A1 (en) | 2012-07-26 |
JPWO2011033619A1 (ja) | 2013-02-07 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09849478; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 2011531688; Country of ref document: JP
| WWE | Wipo information: entry into national phase | Ref document number: 13395797; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 09849478; Country of ref document: EP; Kind code of ref document: A1
Ref document number: 09849478 Country of ref document: EP Kind code of ref document: A1 |