CN114792283A - Image processing method, apparatus, device, and computer-readable storage medium
- Publication number
- CN114792283A (application number CN202110105672.9A)
- Authority
- CN
- China
- Prior art keywords
- kernel
- determining
- type
- defocus
- brightness
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
Abstract
The present application provides an image processing method, apparatus, device, and computer-readable storage medium, relating to the technical field of image processing, and aims to process an initial image according to the brightness stretching weight and the defocus blur kernel of each pixel point in the initial image, thereby enriching the rendering effects of image processing on a mobile terminal. The method includes: determining a brightness stretching weight corresponding to a first pixel point of an initial image, and determining the type of a defocus blur kernel of the initial image; and processing the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, an image processing device, and a computer-readable storage medium.
Background
With the popularization of mobile terminals (e.g., mobile phones, wearable devices, tablet computers, etc.), users can take pictures anywhere with a mobile terminal. When taking pictures with a mobile terminal, users often want to obtain the rendering effects of a single-lens reflex (SLR) camera. However, some SLR rendering effects cannot currently be achieved on mobile terminals.
Disclosure of Invention
The embodiments of the present application provide an image processing method, apparatus, device, and computer-readable storage medium, which can process an initial image according to the brightness stretching weight and the defocus blur kernel of each pixel point in the initial image, enriching the rendering effects of image processing on a mobile terminal.
In view of the above, in a first aspect, the present application provides an image processing method, including:
determining a brightness stretching weight corresponding to a first pixel point of an initial image, and determining the type of a defocus blur kernel of the initial image;
and processing the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
Optionally, the processing the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image includes: determining a mapping relationship according to the brightness stretching weight and the type of the defocus blur kernel; and processing the initial image through the mapping relationship and the defocus blur kernel to obtain the target image.
Optionally, the determining a brightness stretching weight corresponding to the first pixel point of the initial image includes: determining the brightness stretching weight corresponding to the first pixel point according to the magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold.
Optionally, the determining, according to the magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold, a brightness stretching weight corresponding to the first pixel point includes: determining the brightness stretching weight of the first pixel point as a first numerical value under the condition that the brightness value of the first pixel point is greater than the brightness threshold value;
determining the brightness stretching weight of the first pixel point as a second numerical value under the condition that the brightness value of the first pixel point is smaller than or equal to the brightness threshold value;
wherein the second value is less than the first value.
Optionally, the determining the type of the defocus blur kernel of the initial image includes: performing binarization processing on the defocus blur kernel to obtain a discrete defocus kernel; and determining the type of the defocus blur kernel of the initial image according to the non-zero region in the discrete defocus kernel.
Optionally, the determining a type of the defocus blur kernel of the initial image according to a non-zero region in the discrete defocus kernel includes: determining that the type of the defocus blur kernel comprises a first type defocus blur kernel if each row of non-zero regions in the discrete defocus kernel is a single rectangle and any column of the non-zero regions is not a single rectangle; determining that the type of the defocus blur kernel comprises a second type of defocus blur kernel if each column of a non-zero region in the discrete defocus kernel is a single rectangle and any row of the non-zero region is not a single rectangle; determining that the type of the defocus blur kernel comprises a third type of defocus blur kernel if each row of non-zero regions in the discrete defocus kernel is a single rectangle and each column of the non-zero regions is a single rectangle; determining the type of the defocus blur kernel comprises a fourth type of defocus blur kernel if any row of a non-zero region in the discrete defocus kernel is not a single rectangle, any column of the non-zero region is not a single rectangle, and the discrete defocus kernel is an axisymmetric structure.
Optionally, the determining a mapping relationship according to the brightness stretching weight and the type of the defocus blur kernel includes: determining a target integration direction according to the type of the defocus blur kernel; and determining the mapping relationship according to the brightness stretching weight in the target integration direction.
Optionally, the determining a target integration direction according to the type of the defocus blur kernel includes: determining that the target integration direction comprises a row integration direction if the type of the defocus blur kernel is the first type of defocus blur kernel; determining that the target integration direction comprises a column integration direction if the type of the defocus blur kernel is the second type of defocus blur kernel; determining that the target integration direction comprises a row integration direction or a column integration direction if the type of the defocus blur kernel is the third type of defocus blur kernel; and, if the type of the defocus blur kernel is the fourth type of defocus blur kernel, segmenting the fourth type of defocus blur kernel along its symmetry axis and determining the target integration direction according to the segmented fourth type of defocus blur kernel.
Optionally, the processing the initial image through the mapping relationship and the defocus blur kernel to obtain the target image includes: acquiring a unidirectional edge of a discrete defocus kernel corresponding to the defocus blur kernel; and processing the initial image through the elements enclosed inside the unidirectional edge of the discrete defocus kernel and the mapping relationship to obtain the target image.
Optionally, the method further comprises:
determining the defocus blur kernel in response to a user's selection operation on the defocus blur kernel;
or,
acquiring a depth image corresponding to the initial image, coordinate information of a focusing target in the initial image, and a blurring degree; calculating a blurring radius corresponding to the first pixel point according to the depth image, the coordinate information, and the blurring degree; and determining a defocus blur kernel corresponding to the first pixel point according to the blurring radius; wherein the blurring degree is proportional to the aperture value.
With this method, the defocus blur kernel can be used to blur the depth of field of the image, and the brightness stretching weight is related to the light spots; therefore, the initial image can be processed using the defocus blur kernel and the brightness stretching weight to obtain effects such as spot rendering and blurring rendering. In addition, the mapping relationship improves the efficiency of image processing. Furthermore, since the type of the defocus blur kernel determines the shape of the light spots, various types of spot rendering are realized by selecting the defocus blur kernel, diversifying the spot effects.
In a second aspect, the present application provides an image processing apparatus comprising:
a determining module, configured to determine a brightness stretching weight corresponding to a first pixel point of an initial image and determine the type of a defocus blur kernel of the initial image;
and a processing module, configured to process the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
In a third aspect, the present application provides an image processing apparatus comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, the processor implementing the method according to the first aspect or any alternative of the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements a method according to the first aspect or any of the alternatives of the first aspect.
In a fifth aspect, the present application provides a computer program product for causing an image processing apparatus to perform the steps of the method of the first aspect or any alternative of the first aspect when the computer program product is run on the image processing apparatus.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic interface diagram of a mobile terminal according to an embodiment of the present application;
FIG. 3 is a schematic interface diagram of another mobile terminal according to an embodiment of the present application;
fig. 4 is a schematic interface diagram of another mobile terminal provided in the embodiment of the present application;
FIG. 5 is a schematic diagram of a processing procedure of a defocus blur kernel according to an embodiment of the present application;
FIG. 6 is a diagram illustrating another defocus blur kernel processing procedure according to an embodiment of the present application;
FIG. 7 is a diagram illustrating another defocus blur kernel processing procedure provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items. Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
It should also be appreciated that reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
The following describes an exemplary image processing method provided by the present application with a specific embodiment.
Referring to fig. 1, fig. 1 is a schematic flowchart of an image processing method provided in the present application. The execution subject of the image processing method in this embodiment may be a mobile terminal, for example, an Android phone, an iOS phone, a smart watch, a smart band, or another device with a photographing function. As shown in fig. 1, the image processing method may include:
s101, determining a brightness stretching weight corresponding to a first pixel point of the initial image and determining the type of a defocus blur kernel of the initial image.
The initial image may be an image currently captured by the user with the mobile terminal, an image stored locally on the mobile terminal, an image currently downloaded by the user with the mobile terminal (for example, an image downloaded from a web page or saved from a chat history), and the like.
It can be understood that the first pixel point is any pixel point in the initial image.
In this embodiment of the present application, determining the brightness stretching weight corresponding to the first pixel point of the initial image may include: determining the brightness stretching weight corresponding to the first pixel point according to the magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold, where the brightness threshold may be determined empirically.
Further, under the condition that the brightness value of the first pixel point is larger than the brightness threshold, determining the brightness stretching weight of the first pixel point as a first numerical value; determining the brightness stretching weight of the first pixel point as a second numerical value under the condition that the brightness value of the first pixel point is smaller than or equal to the brightness threshold value; wherein the second value is less than the first value. Illustratively, the second value is 1 and the first value is a value greater than 1.
In some embodiments of the present application, when the brightness value of the first pixel point is greater than the brightness threshold, the brightness stretching weight corresponding to the first pixel point may also be determined in, but is not limited to, the following manners:
in the first mode, an exponential function about the brightness value of the pixel point and the brightness stretching weight of the pixel point can be constructed in advance. The larger the brightness value of the pixel point is, the larger the brightness stretching weight of the pixel point is, and conversely, the smaller the brightness value of the pixel point is, the smaller the brightness stretching weight of the pixel point is. Therefore, the brightness value of the first pixel point can be input into a pre-constructed exponential function to obtain the brightness stretching weight of the first pixel point.
In a second mode, a corresponding relation between a brightness interval and a brightness stretching weight can be constructed in advance, wherein the brightness interval is an interval related to a brightness value; if the brightness values in one brightness interval are all larger than the brightness values in another brightness interval, the brightness stretching weight corresponding to the one brightness interval is larger than the brightness stretching weight corresponding to the other brightness interval. Therefore, a target brightness interval where the brightness value of the first pixel point is located can be determined, and the brightness stretching weight corresponding to the target brightness interval is determined to be the brightness stretching weight of the first pixel point, wherein the brightness interval comprises the target brightness interval.
In a third mode, a correspondence between brightness difference intervals and brightness stretching weights can be pre-established, where a brightness difference interval is an interval of the difference between the brightness value and the brightness threshold; if the brightness differences in one brightness difference interval are all greater than those in another brightness difference interval, the brightness stretching weight corresponding to the former interval is greater than that corresponding to the latter. Therefore, the difference between the brightness value of the first pixel point and the brightness threshold can be calculated to obtain a target brightness difference, the target brightness difference interval containing the target brightness difference is determined, and the brightness stretching weight corresponding to the target brightness difference interval is taken as the brightness stretching weight of the first pixel point, where the brightness difference intervals include the target brightness difference interval.
Therefore, in the first to third modes, the brightness stretching weight of the pixel point is determined according to the corresponding brightness value, so that the brighter pixel point is brighter, and the visual effect is more consistent.
Similarly, for the pixel points with the brightness values smaller than or equal to the brightness threshold, the brightness stretching weight can also be determined according to the corresponding brightness values, so that darker pixel points are darker.
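As an illustration of the threshold-based scheme above, the following sketch computes a per-pixel brightness stretching weight map. It is a minimal example assuming an 8-bit brightness map; the function name and the threshold and first/second values used here (200, 3.0, 1.0) are illustrative placeholders rather than values taken from this application.

```python
import numpy as np

def brightness_stretch_weights(luma, threshold=200.0, bright_weight=3.0, dark_weight=1.0):
    """Assign a brightness stretching weight to every pixel.

    Pixels whose brightness exceeds the threshold get the larger first value
    (bright_weight); all other pixels get the smaller second value (dark_weight),
    as in the threshold-based mode described above.
    """
    luma = np.asarray(luma, dtype=np.float32)
    return np.where(luma > threshold, bright_weight, dark_weight).astype(np.float32)
```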
In an optional embodiment of the present application, if the initial image is a gray image, the brightness value of the first pixel point may be determined according to the gray value; the larger the gray value of the first pixel point is, the larger the corresponding brightness value is, and conversely, the smaller the gray value of the first pixel point is, the smaller the corresponding brightness value is.
If the initial image is an RGB image, one manner is to convert the RGB image (where R denotes the red component, G the green component, and B the blue component) into a YUV image (where Y denotes luminance and U and V denote chrominance) and obtain the brightness value of the first pixel point from the YUV image; another manner is to take the maximum value among the three color components of the first pixel point in the RGB image as its brightness value; yet another manner is to take the luminance value of the first pixel point in the RGB image as its brightness value. The above are merely examples, and the present application is not limited thereto.
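For the RGB case, one of the manners mentioned above is to take the maximum of the three color components as the brightness value. The sketch below assumes an HxWx3 RGB array and illustrates only that manner; it is not the only option.

```python
import numpy as np

def luminance_from_rgb(rgb):
    """Per-pixel brightness value taken as the maximum of the R, G and B components."""
    rgb = np.asarray(rgb, dtype=np.float32)
    return rgb.max(axis=-1)  # shape (H, W)
```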
In some embodiments of the present application, determining the type of defocus blur kernel for the initial image may include: carrying out binarization processing on the defocusing fuzzy core to obtain a discrete defocusing core; and determining the type of the defocusing fuzzy core of the initial image according to the non-zero area in the discrete defocusing core.
It can be understood that, since the discrete defocus kernel is a convolution kernel obtained by performing binarization processing on the defocus blur kernel, the pixel value of the pixel point in the discrete defocus kernel is 0 or 1. Thus, the non-zero region in the discrete defocus kernel includes a region formed by pixel points having a pixel value of 1.
Alternatively, the defocus blur kernel may be a grayscale image.
Further, determining the type of the defocus blur kernel of the initial image according to the non-zero region in the discrete defocus kernel may include: determining that the type of the defocus blur kernel comprises a first type of defocus blur kernel if each row of the non-zero region in the discrete defocus kernel is a single rectangle and at least one column of the non-zero region is not a single rectangle; determining that the type of the defocus blur kernel comprises a second type of defocus blur kernel if each column of the non-zero region in the discrete defocus kernel is a single rectangle and at least one row of the non-zero region is not a single rectangle; determining that the type of the defocus blur kernel comprises a third type of defocus blur kernel if each row of the non-zero region in the discrete defocus kernel is a single rectangle and each column of the non-zero region is a single rectangle; and determining that the type of the defocus blur kernel comprises a fourth type of defocus blur kernel if at least one row of the non-zero region in the discrete defocus kernel is not a single rectangle, at least one column of the non-zero region is not a single rectangle, and the discrete defocus kernel has an axisymmetric structure.
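A minimal sketch of this classification is given below: the input is assumed to be an already binarized 0/1 discrete defocus kernel, and the test checks whether every row and every column of the non-zero region is a single contiguous run of 1s. The helper names, the string labels, and the left-right mirror test used for axisymmetry are illustrative assumptions, not definitions from this application.

```python
import numpy as np

def _single_run(vec):
    """True if the 1s in a 0/1 vector form one contiguous block (or there are no 1s)."""
    idx = np.flatnonzero(vec)
    return idx.size == 0 or (idx[-1] - idx[0] + 1) == idx.size

def classify_discrete_kernel(kernel):
    """Classify a 0/1 discrete defocus kernel into the four types described above."""
    k = (np.asarray(kernel) > 0).astype(np.uint8)
    rows_ok = all(_single_run(r) for r in k)    # every row a single rectangle
    cols_ok = all(_single_run(c) for c in k.T)  # every column a single rectangle
    if rows_ok and cols_ok:
        return "third"        # e.g. circle, diamond
    if rows_ok:
        return "first"        # rows are single runs, some column is not
    if cols_ok:
        return "second"       # columns are single runs, some row is not (e.g. heart)
    if np.array_equal(k, k[:, ::-1]):  # axisymmetric about the vertical axis
        return "fourth"       # e.g. pentagram, to be split along the symmetry axis
    return "unsupported"
```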
In other embodiments of the present application, the type of the defocus blur kernel may be classified directly according to the specific shape of the defocus blur kernel selected by the user. Exemplary types of defocus blur kernels may include: a circular defocus blur kernel, a diamond-shaped defocus blur kernel, a triangular defocus blur kernel, a regular-polygon defocus blur kernel (e.g., a regular hexagonal defocus blur kernel), a heart-shaped defocus blur kernel, a pentagram defocus blur kernel, and so on.
It will be appreciated that the type of the defocus blur kernel in this application determines the spot shape. Exemplarily, if the defocus blur kernel is a circular defocus blur kernel, the light spots are circularly symmetric; if the defocus blur kernel is a heart-shaped defocus blur kernel, the light spots are heart-shaped; and if the defocus blur kernel is a pentagram defocus blur kernel, the light spots are pentagram-shaped. Therefore, different spot shapes can be generated through the selected defocus blur kernel, diversifying the spot shapes and improving the rendering effect.
In alternative embodiments of the present application, the user-selected defocus blur kernel may be determined by, but is not limited to, the following manner.
In a first manner, the defocus blur kernel is determined in response to a user's selection operation on the defocus blur kernel.
In some embodiments, the mobile terminal displays a defocus blur kernel list in the interface displaying the initial image, wherein the defocus blur kernel list includes an identification of a plurality of candidate defocus blur kernels. In this way, the selecting operation is an operation in which the user selects the identifier of the defocus blur kernel from the identifiers of the plurality of candidate defocus blur kernels.
For example, fig. 2 shows a schematic interface diagram of a mobile terminal; in this application, a mobile phone is taken as an example of the mobile terminal. As shown in fig. 2, the interface for displaying the initial image may include a display area 21, and the display area 21 includes the defocus blur kernel list 22, that is, the identifier 221 of the circular defocus blur kernel, the identifier 222 of the regular hexagonal defocus blur kernel, the identifier 223 of the pentagram defocus blur kernel, the identifier 224 of the diamond-shaped defocus blur kernel, and the identifier 225 of the heart-shaped defocus blur kernel in fig. 2. In this way, the user may perform a trigger operation on the identifier of the defocus blur kernel to be selected; for example, the trigger operation may be a long-press operation (i.e., a press whose duration is greater than a preset duration), a double-click operation, or a drag operation along a certain trajectory. Fig. 2 illustrates an example of a drag operation performed on the identifier 221 of the circular defocus blur kernel.
Further, considering that the user may want to perform different spot renderings in different image areas of the initial image, in the embodiment of the application the mobile terminal can divide the initial image into a plurality of image areas in response to the user's division operation on the initial image, and acquire the defocus blur kernels corresponding to the different image areas in response to the user's selection operations of defocus blur kernels for those areas. In this way, multiple kinds of spot rendering within a single image are realized, making the rendering effect more diverse.
For example, fig. 3 shows an interface schematic diagram of a mobile terminal, where the interface for displaying an initial image may include: a first area 32 regarding the defocus blur kernel list 31, and a second area 33 displaying the initial image; the defocus blur kernel list 31 includes an identification 311 of a circular defocus blur kernel, an identification 312 of a regular hexagonal defocus blur kernel, an identification 313 of a pentagram-shaped defocus blur kernel, an identification 314 of a diamond-shaped defocus blur kernel, and an identification 315 of a heart-shaped defocus blur kernel.
In this way, upon detecting that the user performs a dividing operation on the initial image in the second region 33 (i.e., the top-to-bottom dividing operation indicated by the dotted arrow in fig. 3), the mobile terminal divides the initial image into a plurality of image regions in response to the dividing operation, i.e., the first image region 331 and the second image region 332 in fig. 3. Next, the user may select a corresponding defocus blur kernel for each image region. For example, the user may first perform a first trigger operation on the first image region 331, and the mobile terminal may display the first image region 331 in a preset display mode (e.g., grayscale display, highlight display, etc.) in response to the first trigger operation; at this time, the user may perform a second trigger operation on an identifier of a defocus blur kernel in the first region 32, so that the mobile terminal acquires the defocus blur kernel of the first image region in response to the second trigger operation. The first trigger operation and the second trigger operation may be different operations or the same operation, for example, a long-press operation, a double-click operation, or a drag operation along a certain trajectory.
In other embodiments, the mobile terminal may output a plurality of candidate defocus blur kernels by voice while presenting the initial image. In this case, the selection operation may be the user inputting a defocus blur kernel by voice based on the plurality of candidate defocus blur kernels; alternatively, based on the plurality of candidate defocus blur kernels, the user may indicate a defocus blur kernel by touch on the interface showing the initial image, for example, by drawing the shape of the defocus blur kernel or writing its name, and so on.
Illustratively, fig. 4 shows an interface diagram of a mobile terminal. As shown in fig. 4, after the user hears a plurality of candidate defocus blur kernels output by voice, the shape of the selected defocus blur kernel may be drawn in the interface for displaying the initial image, and fig. 4 illustrates the selection of a circular defocus blur kernel, which is not limited in this application.
It should be noted that, in the first method, the size of the candidate defocus blur kernel is fixed; alternatively, the sizes of the candidate defocus blur kernels may be adjusted, for example, a size selection control may be provided at a position near each candidate defocus blur kernel, so that the user selects the size of the defocus blur kernel to be used.
In a second manner, a depth image corresponding to the initial image, coordinate information of a focusing target in the initial image, and a blurring degree are obtained; a blurring radius corresponding to the first pixel point is calculated according to the depth image, the coordinate information, and the blurring degree; and the defocus blur kernel corresponding to the first pixel point is determined according to the blurring radius.
It is understood that the blurring degree may be a parameter adjusted by the user and is proportional to the aperture value, that is, a parameter related to the aperture value; the focusing target may be the subject in the image, e.g., a portrait if the initial image is a portrait image.
In an alternative embodiment, the blurring radius may be calculated as follows: the depth value of the focusing target is acquired from the depth image according to the coordinate information of the focusing target, a target depth value is obtained according to the depth value of the focusing target, the absolute value of the difference between the depth value of each pixel point in the depth image and the target depth value is calculated, and the blurring radius of each pixel point in the depth image is calculated according to this absolute value and the blurring degree, where the blurring radius of each pixel point in the depth image is the blurring radius of the pixel point at the corresponding position in the initial image. The specific calculation is shown in Equation 1:
r(x_{i,j}) = k * max(0, |d(x_{i,j}) - d_1| - d_2)    (Equation 1)
where x_{i,j} denotes the pixel point in the i-th row and j-th column of the depth image; r(x_{i,j}) denotes the blurring radius of that pixel point; d(x_{i,j}) denotes its depth value; d_1 denotes the target depth value; d_2 denotes a preset depth compensation value, so that scenery whose depth lies within a certain range of the focusing target is not blurred; and k denotes the blurring degree, which is determined by the aperture value c, so that the user can select the blurring degree by adjusting the aperture value: the larger the aperture value, the higher the blurring degree, and conversely, the smaller the aperture value, the lower the blurring degree.
In an embodiment of the present application, obtaining the target depth value according to the depth value of the focusing target may include: taking the depth value of the focusing target as a target depth value; or, the depth values of other pixel points except for the focused target within the preset distance range of the focused target may be obtained, the depth mean value between the depth values of the other pixel points and the depth value of the focused target is calculated, and the depth mean value is taken as the target depth value.
After the blurring radius of the first pixel point in the initial image is obtained, the defocus blur kernel selected by the user can be determined, and the size of the defocus blur kernel corresponding to the first pixel point is determined according to the blurring radius of the first pixel point. Therefore, by the method, the corresponding light spot sizes can be different for the areas where different pixel points are located. The mobile terminal may display the defocus blur kernel of different types by the method described in the above first mode, so as to obtain the defocus blur kernel selected by the user, which is not described herein again.
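A minimal sketch of the blurring radius calculation of Equation 1 above follows; it assumes a floating-point depth map aligned with the initial image and scalar values for the target depth d_1, the depth compensation d_2, and the blurring degree k, all of which are placeholders here.

```python
import numpy as np

def blurring_radius(depth, target_depth, depth_comp, k):
    """Per-pixel blurring radius r = k * max(0, |d - d1| - d2) (Equation 1).

    depth        : (H, W) depth map aligned with the initial image
    target_depth : d1, depth value derived from the focusing target
    depth_comp   : d2, preset depth compensation keeping the focused range sharp
    k            : blurring degree, chosen from the aperture value
    """
    depth = np.asarray(depth, dtype=np.float32)
    return k * np.maximum(0.0, np.abs(depth - target_depth) - depth_comp)
```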
Note that, the present application does not limit the timing of determining the luminance stretching weight and the defocus blur kernel.
And S102, processing the initial image according to the brightness stretching weight and the type of the defocusing fuzzy core to obtain a target image.
Considering that the defocus blur kernel can be used for depth-of-field blurring of the initial image, the brightness stretching weight can be used for rendering the light spot of the initial image, and the type of the defocus blur kernel determines the shape of the light spot. Therefore, the target image in the application is an image obtained by performing blurring rendering and spot rendering on the initial image.
Considering that the mobile terminal can accelerate defocus blurring through an integral table or integral image (also called a summed-area table), thereby achieving the depth-of-field blurring effect, this approach is simpler to implement than methods such as ray tracing and reduces the load on the processor of the mobile terminal. Therefore, in the embodiments of the application, spot rendering and blurring rendering can be realized using the integral table, so that the efficiency of image processing is improved. In addition, considering that rendering the light spots separately increases the computation time and that separately rendered spot characteristics (such as the color, brightness, and shape of the spots and the transparency of the spot edges) cannot be naturally blended with the initial image, spot rendering and blurring rendering are combined so that the rendering effect is more natural.
In summary, the mapping relationship can be determined according to the brightness stretching weight and the type of the defocus blur kernel, and the initial image is processed through the mapping relationship and the defocus blur kernel to obtain the target image. The form of the mapping relationship is not limited in the present application; for example, it may be the above-mentioned integral table or integral image.
It is understood that a conventional integral table (or integral image) usually accumulates pixel values over rectangular areas of the initial image, whereas the defocus blur kernel used in this application may be a non-rectangular defocus blur kernel such as a circular, heart-shaped, pentagram, or diamond-shaped defocus blur kernel. Therefore, in the image processing procedure, after the discrete defocus kernel corresponding to the defocus blur kernel is obtained, the discrete defocus kernel is treated as a splice of a plurality of single-row or single-column rectangular regions, and the initial image is convolved through these single-row or single-column regions of the discrete defocus kernel. Accordingly, the mapping relationship in this application is obtained by accumulating along single rows or single columns of the initial image; refer specifically to Equation 2 and Equation 3 below.
In the embodiment of the application, the mobile terminal can determine the target integration direction according to the type of the defocus blur kernel, and determine the mapping relationship according to the brightness stretching weight in the target integration direction.
It will be appreciated that the target integration direction referred to in this application is the direction in which the initial image is integrated. If each row of the initial image is integrated separately, the target integration direction includes the row integration direction; if each column of the initial image is integrated separately, the target integration direction includes the column integration direction. Optionally, if the target integration direction is the row integration direction, the mapping relationship may be called a row-wise integration mapping relationship; if the target integration direction is the column integration direction, the mapping relationship may be called a column-wise integration mapping relationship.
It should also be understood that since the light spots are related to the brightness, the mapping relationship of the initial image is determined by the brightness stretching weight, and the light spot rendering processing is realized.
In some embodiments, the types of defocus blur kernels may include a circular defocus blur kernel, a diamond-shaped defocus blur kernel, a triangular defocus blur kernel, a regular-polygon defocus blur kernel (e.g., a regular hexagonal defocus blur kernel), a heart-shaped defocus blur kernel, a pentagram defocus blur kernel, and the like. In this case, the present application may obtain a preset integration direction correspondence, which includes the correspondences between different candidate defocus blur kernels and integration directions. In this way, the target integration direction corresponding to the defocus blur kernel can be obtained from the preset integration direction correspondence, the candidate defocus blur kernels including the defocus blur kernel.
For example, the preset integration direction correspondence may include: the target integration direction of the circular defocus blur kernel includes the row integration direction or the column integration direction; the target integration direction of the diamond-shaped defocus blur kernel includes the row integration direction or the column integration direction; the target integration direction of the heart-shaped defocus blur kernel includes the column integration direction; and the target integration direction of the pentagram defocus blur kernel includes the row integration direction on each side of the symmetry axis of the pentagram defocus blur kernel.
In still other embodiments, the types of defocus blur kernels may also include the first type, second type, third type, and fourth type of defocus blur kernels. In this case, if the type of the defocus blur kernel is the first type, the target integration direction is determined to include the row integration direction; if the type is the second type, the target integration direction includes the column integration direction; if the type is the third type, the target integration direction includes the row integration direction or the column integration direction; and if the type is the fourth type, the fourth type of defocus blur kernel is segmented along its symmetry axis, and the target integration direction is determined according to the segmented fourth type of defocus blur kernel.
Further, the defocus blur kernel type of each segmented part of the fourth type of defocus blur kernel can be determined, and the corresponding target integration direction is determined according to the defocus blur kernel type of each segmented part.
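The correspondence between kernel type and integration direction described above can be written as a small lookup. This is only an illustrative sketch; the type labels match the classifier sketched earlier, and how the fourth type is handled after splitting is left to the caller.

```python
def target_integration_direction(kernel_type):
    """Map a defocus blur kernel type to its target integration direction(s)."""
    if kernel_type == "first":
        return ["row"]
    if kernel_type == "second":
        return ["column"]
    if kernel_type == "third":
        return ["row"]  # row or column both work; row is chosen here
    if kernel_type == "fourth":
        # split the kernel along its symmetry axis and classify each part again
        return ["split-and-reclassify"]
    raise ValueError(f"unknown kernel type: {kernel_type}")
```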
Optionally, if the target integration direction is the row integration direction, the integral image value of the i-th row and j-th column in the mapping relationship is calculated as shown in Equation 2:
SUT(i, j) = SUT(i, j-1) + I(i, j) * W(i, j)    (Equation 2)
where SUT(i, j) denotes the integral image value of the i-th row and j-th column in the mapping relationship, SUT(i, j-1) denotes the integral image value of the i-th row and (j-1)-th column in the mapping relationship, I(i, j) denotes the pixel value of the pixel point in the i-th row and j-th column of the initial image, and W(i, j) denotes the brightness stretching weight of the pixel point in the i-th row and j-th column of the initial image.
Optionally, if the target integration direction is the column integration direction, the integral image value of the i-th row and j-th column in the mapping relationship is calculated as shown in Equation 3:
SUT(i, j) = SUT(i-1, j) + I(i, j) * W(i, j)    (Equation 3)
where SUT(i, j) denotes the integral image value of the i-th row and j-th column in the mapping relationship, SUT(i-1, j) denotes the integral image value of the (i-1)-th row and j-th column in the mapping relationship, I(i, j) denotes the pixel value of the pixel point in the i-th row and j-th column of the initial image, and W(i, j) denotes the brightness stretching weight of the pixel point in the i-th row and j-th column of the initial image.
It can be understood that I(i, j) may have three channels (i.e., the RGB channels) or four channels (i.e., the RGB channels plus a first brightness stretching channel, where each pixel point of the first brightness stretching channel is assigned the same value, for example, 1); SUT(i, j) may likewise have three channels (i.e., the RGB channels) or four channels (i.e., the RGB channels plus a second brightness stretching channel, where each pixel point of the second brightness stretching channel is assigned the accumulated value of the corresponding brightness stretching weights).
After the integral image value is obtained by the method, the mapping relation can be constructed according to the integral image value. In summary, in the target integral direction, the mapping relation corresponding to the initial image is determined according to the brightness stretching weight and the pixel value of each pixel point in the initial image.
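A minimal sketch of the row-wise table of Equation 2 follows (the column-wise table of Equation 3 is identical with the axes swapped). It assumes a single-channel float image and the weight map from the earlier sketch, and stores the accumulated weights as an extra channel, mirroring the brightness stretching channel described above.

```python
import numpy as np

def build_row_integral_table(image, weights):
    """Row-wise weighted prefix sums: SUT(i, j) = SUT(i, j-1) + I(i, j) * W(i, j)."""
    image = np.asarray(image, dtype=np.float32)
    weights = np.asarray(weights, dtype=np.float32)
    table = np.empty(image.shape + (2,), dtype=np.float32)
    table[..., 0] = np.cumsum(image * weights, axis=1)  # accumulated weighted pixel values
    table[..., 1] = np.cumsum(weights, axis=1)          # accumulated brightness stretching weights
    return table
```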
Optionally, the discrete defocus kernel includes first elements (i.e., elements with value 1), which are used for the convolution processing of the initial image, and second elements (i.e., elements with value 0), which do not need to be used. In order to quickly perform the convolution processing on the initial image using the first elements, in this embodiment of the present application, the unidirectional edge of the discrete defocus kernel corresponding to the defocus blur kernel may be obtained, and the initial image is processed through the elements enclosed inside the unidirectional edge of the discrete defocus kernel (i.e., the first elements) and the mapping relationship to obtain the target image.
It should be noted that, in order to distinguish the cells in both the image (i.e., the initial image and the target image in this application) and the convolution kernel (i.e., the discrete defocus kernel and the defocus blur kernel in this application), the cells in the image are referred to as pixel points in this application, and the cells in the convolution kernel are referred to as elements in this application.
It is understood that the target integration direction is the same as the direction corresponding to the unidirectional edge in this application. For example, if the direction corresponding to the unidirectional edge is a column direction, the target integration direction is a column integration direction; and if the direction corresponding to the unidirectional edge is the row direction, the target integration direction is the row integration direction.
The direction corresponding to the unidirectional edge can be understood as follows: differencing the values of two adjacent rows of the defocus blur kernel yields a column-direction edge, and differencing the values of two adjacent columns of the defocus blur kernel yields a row-direction edge.
The case where the defocus blur kernel is a circular defocus blur kernel is taken as an example. The circular discrete defocus kernel corresponding to the circular defocus blur kernel can be obtained, and difference processing is performed on the elements of two adjacent rows of the circular discrete defocus kernel to obtain its column-direction edge; alternatively, difference processing is performed on the elements of two adjacent columns of the circular discrete defocus kernel to obtain its row-direction edge.
Optionally, the difference between the element in the j-th column of the (n+1)-th row and the element in the j-th column of the n-th row of the circular discrete defocus kernel is calculated, where n and j are positive integers. If this difference is 0, it is determined that the element in the j-th column of the (n+1)-th row does not belong to the column-direction edge; if the difference is not 0, it is determined that the element in the j-th column of the (n+1)-th row belongs to the column-direction edge.
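The following sketch extracts, for every column of a 0/1 discrete defocus kernel, the top and bottom rows of its non-zero run from the non-zero differences between adjacent rows, which is one way to represent the column-direction unidirectional edge described above. It assumes each column of the non-zero region is a single contiguous run; the function name and the output format are illustrative choices.

```python
import numpy as np

def column_direction_edge(kernel):
    """Per-column (top, bottom) row indices of the non-zero run of a 0/1 kernel.

    The run boundaries are exactly where the difference between adjacent rows
    is non-zero; columns containing no 1s map to None.
    """
    k = (np.asarray(kernel) > 0).astype(np.int8)
    pad = np.zeros((1, k.shape[1]), dtype=np.int8)
    diff = np.diff(np.vstack([pad, k, pad]), axis=0)  # +1 where a run starts, -1 just past its end
    edges = []
    for j in range(k.shape[1]):
        starts = np.flatnonzero(diff[:, j] == 1)
        ends = np.flatnonzero(diff[:, j] == -1)
        edges.append((int(starts[0]), int(ends[0]) - 1) if starts.size else None)
    return edges
```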
For example, fig. 5 shows a schematic diagram of the processing procedure of the circular defocus blur kernel. Fig. 5 (a) shows a circular defocus blur kernel, fig. 5 (b) shows a circular discrete defocus kernel, and fig. 5 (c) shows a circular discrete defocus kernel marked with a column-direction edge.
The case where the defocus blur kernel is a heart-shaped defocus blur kernel is taken as an example. The heart-shaped discrete defocus kernel corresponding to the heart-shaped defocus blur kernel can be obtained. Since each column of the non-zero region in the heart-shaped discrete defocus kernel is a single rectangle while some rows of the non-zero region are not single rectangles, difference processing is performed on the elements of two adjacent rows of the heart-shaped discrete defocus kernel to obtain its column-direction edge.
For example, fig. 6 shows a schematic diagram of a process of heart-shaped defocus blur kernel. Fig. 6 (a) shows a heart-shaped discrete defocus core, and fig. 6 (b) shows a heart-shaped discrete defocus core marked with a column-direction edge.
The case where the defocus blur kernel is a pentagram defocus blur kernel is taken as an example. The pentagram discrete defocus kernel corresponding to the pentagram defocus blur kernel can be obtained. Since some columns of the non-zero region in the pentagram discrete defocus kernel are not single rectangles, some rows of the non-zero region are not single rectangles, and the pentagram discrete defocus kernel has an axisymmetric structure, the pentagram discrete defocus kernel is divided along its symmetry axis, and the corresponding row-direction unidirectional edges are obtained for each divided part of the pentagram discrete defocus kernel.
For example, fig. 7 shows a schematic diagram of the processing of a pentagram defocus blur kernel. Fig. 7 (a) shows the divided pentagram discrete defocus kernel, and fig. 7 (b) shows the divided pentagram discrete defocus kernel marked with the row-direction edge.
Optionally, the initial pixel value of the pixel point in the i-th row and j-th column of the target image is calculated as shown in Equation 4:
B(i, j) = Σ_Δi Σ_Δj I(i+Δi, j+Δj) * W(i+Δi, j+Δj) * K(Δi, Δj)    (Equation 4)
where B(i, j) denotes the initial pixel value of the pixel point in the i-th row and j-th column of the target image; I(i+Δi, j+Δj) denotes the pixel value of the pixel point in the (i+Δi)-th row and (j+Δj)-th column of the initial image; W(i+Δi, j+Δj) denotes the brightness stretching weight of the pixel point in the (i+Δi)-th row and (j+Δj)-th column of the initial image; K(Δi, Δj) denotes the value of the element in the Δi-th row and Δj-th column, taking the center of the elements inside the unidirectional edge of the discrete defocus kernel as the reference; Δi denotes the row offset of the pixel coordinates; Δj denotes the column offset of the pixel coordinates; and the ranges of Δi and Δj are related to the number of elements enclosed inside the unidirectional edge of the discrete defocus kernel.
It can be understood that the discrete defocus kernel is a convolution kernel obtained by binarizing the defocus blur kernel, so that each element of the discrete defocus kernel is 0 or 1; that is, the elements enclosed inside the unidirectional edge of the discrete defocus kernel are 1, and the elements outside the unidirectional edge are 0. Thus, the calculation of Equation 4 can be simplified to the region accumulated value of the products between the pixel values of the pixel points in the target region and their corresponding brightness stretching weights, where the target region is the region in which the pixel point in the i-th row and j-th column of the initial image is located and on which the convolution with the elements inside the unidirectional edge of the discrete defocus kernel is performed; therefore, the target region has the same shape as the region formed inside the unidirectional edge of the discrete defocus kernel. Then, the ratio of the region accumulated value to the brightness accumulated value corresponding to the target region is calculated to obtain the final pixel value of the pixel point in the i-th row and j-th column of the target image. Calculating this ratio normalizes the brightness stretching weights of the pixel points in the target image, avoiding problems such as excessive brightness in the target image.
It should also be understood that the target region described above may be decomposed into a plurality of single-row or single-column rectangular regions, wherein if the integration direction of the mapping relationship is the row integration direction, the target region may be decomposed into a plurality of single-row rectangular regions, and if the integration direction of the mapping relationship is the column integration direction, the target region may be decomposed into a plurality of single-column rectangular regions. Therefore, the integral image value of the pixel point in each rectangular area can be quickly searched and calculated by adopting a mapping relation, and the integral image values of all the rectangular areas are accumulated to obtain the area accumulated value of the target area.
It should also be understood that, when the integral image value in the mapping relationship is four channels, since the integral image value includes an RGB channel and a second luminance stretching channel, and each pixel point of the second luminance stretching channel is assigned as a luminance accumulated value of the corresponding luminance stretching weight, the application can directly and quickly obtain the luminance accumulated value corresponding to the target region from the mapping relationship; when the integral image value in the mapping relation is three channels, the accumulated value of the brightness stretching weight of each pixel point in the target area can be calculated to obtain the brightness accumulated value.
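Putting the pieces together, the sketch below renders one output pixel in the column integration direction: the column-wise table mirrors the row-wise sketch given earlier, each kernel column contributes the difference of two prefix-sum entries between the top and bottom of its non-zero run (taken from the unidirectional edge), and the weighted sum is finally divided by the accumulated weights, which is the normalization step described above. Helper names reuse the earlier sketches, and the code assumes the kernel lies entirely inside the image at the chosen pixel (no border handling).

```python
import numpy as np

def build_column_integral_table(image, weights):
    """Column-wise weighted prefix sums: SUT(i, j) = SUT(i-1, j) + I(i, j) * W(i, j)."""
    image = np.asarray(image, dtype=np.float32)
    weights = np.asarray(weights, dtype=np.float32)
    table = np.empty(image.shape + (2,), dtype=np.float32)
    table[..., 0] = np.cumsum(image * weights, axis=0)
    table[..., 1] = np.cumsum(weights, axis=0)
    return table

def render_pixel(table, edges, i, j, kernel_height, kernel_width):
    """Blurred, spot-rendered value of the output pixel at (i, j).

    table : (H, W, 2) column-wise integral table from build_column_integral_table
    edges : per-column (top, bottom) runs of the discrete kernel (its column-direction edge)
    """
    ci, cj = kernel_height // 2, kernel_width // 2   # kernel centre used as the reference
    value_sum, weight_sum = 0.0, 0.0
    for dj, run in enumerate(edges):
        if run is None:
            continue
        top, bottom = run
        col = j + dj - cj                        # image column covered by kernel column dj
        r0, r1 = i + top - ci, i + bottom - ci   # image rows covered by this column's run
        seg = table[r1, col] - (table[r0 - 1, col] if r0 > 0 else 0.0)
        value_sum += float(seg[0])
        weight_sum += float(seg[1])
    return value_sum / weight_sum if weight_sum > 0 else 0.0
```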
With this image processing method, the defocus blur kernel can be used to blur the depth of field of the image, and the brightness stretching weight is related to the light spots; therefore, the initial image can be processed using the defocus blur kernel and the brightness stretching weight to obtain effects such as spot rendering and blurring rendering. In addition, blurring rendering and spot rendering can be realized quickly through the mapping relationship, avoiding the long computation time and poor spot effect caused by rendering the light spots separately. Furthermore, since the type of the defocus blur kernel determines the shape of the light spots, various types of spot rendering are realized by selecting the defocus blur kernel, diversifying the spot effects.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Based on the image processing method provided by the above embodiment, the embodiment of the present invention further provides an embodiment of an apparatus for implementing the above method embodiment.
Referring to fig. 8, fig. 8 is a schematic diagram of an image processing apparatus according to an embodiment of the present disclosure. The modules included are used to perform the steps in the corresponding embodiment of fig. 1. Please specifically refer to the related description of the corresponding embodiment in fig. 1. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 8, the image processing apparatus 8 includes:
a determining module 81, configured to determine a brightness stretching weight corresponding to a first pixel point of an initial image and determine a type of a defocus blur kernel of the initial image;
and a processing module 82, configured to process the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
Optionally, the processing module 82 is further configured to determine a mapping relationship according to the brightness stretching weight and the type of the defocus blur kernel, and to process the initial image through the mapping relationship and the defocus blur kernel to obtain the target image.
Optionally, the determining module 81 is further configured to determine the brightness stretching weight corresponding to the first pixel point according to a magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold.
Optionally, the determining module 81 is further configured to determine the brightness stretching weight of the first pixel point as a first numerical value when the brightness value corresponding to the first pixel point is greater than the brightness threshold;
and to determine the brightness stretching weight of the first pixel point as a second numerical value when the brightness value corresponding to the first pixel point is less than or equal to the brightness threshold;
wherein the second numerical value is less than the first numerical value.
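For illustration only, this thresholding step might be sketched in Python as follows; the threshold and the two weight values used here are placeholder assumptions, not values taken from the original disclosure:

```python
import numpy as np

def brightness_stretch_weights(luma, threshold=0.9, first_value=4.0, second_value=1.0):
    """Pixels brighter than the threshold receive the larger first value as their
    brightness stretching weight; all other pixels receive the smaller second value."""
    return np.where(luma > threshold, first_value, second_value).astype(np.float64)
```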
Optionally, the determining module 81 is further configured to perform binarization processing on the defocus blur kernel to obtain a discrete defocus kernel, and to determine the type of the defocus blur kernel of the initial image according to the non-zero region in the discrete defocus kernel.
Optionally, the determining module 81 is further configured to determine that the type of the defocus blur kernel includes a first type of defocus blur kernel if each row of the non-zero region in the discrete defocus kernel is a single rectangle and any column of the non-zero region is not a single rectangle;
determining that the type of the defocus blur kernel comprises a second type of defocus blur kernel if each column of a non-zero region in the discrete defocus kernel is a single rectangle and any row of the non-zero region is not a single rectangle;
determining that the type of the defocus blur kernel comprises a third type of defocus blur kernel if each row of non-zero regions in the discrete defocus kernel is a single rectangle and each column of the non-zero regions is a single rectangle;
and determining that the type of the defocus blur kernel comprises a fourth type of defocus blur kernel when any row of the non-zero region in the discrete defocus kernel is not a single rectangle, any column of the non-zero region is not a single rectangle, and the discrete defocus kernel has an axisymmetric structure.
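For illustration only, the classification described above might be sketched as follows; the row/column contiguity test and the flip-based symmetry check are assumptions chosen for this example, not a statement of how the disclosure performs these checks:

```python
import numpy as np

def _is_single_run(vec):
    """True if the non-zero entries of a 0/1 vector form one contiguous run."""
    idx = np.flatnonzero(vec)
    return idx.size > 0 and idx[-1] - idx[0] + 1 == idx.size

def classify_kernel(discrete_kernel):
    """Sketch of the type decision for a binary (0/1) discrete defocus kernel."""
    rows_ok = all(_is_single_run(r) for r in discrete_kernel if r.any())
    cols_ok = all(_is_single_run(c) for c in discrete_kernel.T if c.any())
    if rows_ok and cols_ok:
        return "third"    # e.g. circular or diamond-shaped kernels
    if rows_ok:
        return "first"    # every row of the non-zero region is a single rectangle
    if cols_ok:
        return "second"   # every column of the non-zero region is a single rectangle
    symmetric = (np.array_equal(discrete_kernel, discrete_kernel[::-1]) or
                 np.array_equal(discrete_kernel, discrete_kernel[:, ::-1]))
    return "fourth" if symmetric else "unsupported"
```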
Optionally, the processing module 82 is further configured to determine a target integration direction according to the type of the defocus blur kernel, and to determine the mapping relationship according to the brightness stretching weight in the target integration direction.
Optionally, the processing module 82 is further configured to determine that the target integration direction includes a line integration direction if the type of the defocus blur kernel is the first type of defocus blur kernel;
determining that the target integration direction comprises a column integration direction if the type of the defocus blur kernel is the second type of defocus blur kernel;
determining that the target integration direction comprises a row integration direction or a column integration direction if the type of the defocus blur kernel is the third type of defocus blur kernel;
and under the condition that the type of the defocus blur kernel is the fourth type of defocus blur kernel, segmenting the fourth type of defocus blur kernel along a symmetry axis, and determining the target integration direction according to the segmented fourth type of defocus blur kernel.
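For illustration only, the direction selection might be sketched as follows; the string labels are the same illustrative ones used in the classification sketch above, and the arbitrary choice of the row direction for the third type is an assumption, not part of the original disclosure:

```python
def target_integration_direction(kernel_type):
    """Map the illustrative kernel-type label to an integration direction; fourth-type
    kernels would first be segmented along their symmetry axis and each half handled
    separately, which is not shown here."""
    if kernel_type == "first":
        return "row"
    if kernel_type == "second":
        return "column"
    if kernel_type == "third":
        return "row"  # either direction is valid; the row direction is chosen here
    raise ValueError("fourth-type kernels are handled after segmentation along the symmetry axis")
```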
Optionally, the processing module 82 is further configured to obtain a unidirectional edge of the discrete defocus kernel corresponding to the defocus blur kernel,
and to process the initial image through the elements enclosed inside the unidirectional edge of the discrete defocus kernel and the mapping relationship to obtain the target image.
Optionally, the determining module 81 is further configured to determine the defocus blur kernel in response to a selection operation by a user on the defocus blur kernel;
or,
acquiring a depth image corresponding to the initial image, coordinate information of a focusing target in the initial image, and a blurring degree, calculating a blurring radius according to the depth image, the coordinate information and the blurring degree, and determining the defocus blur kernel according to the blurring radius; wherein the blurring degree is proportional to the aperture value.
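For illustration only, a hypothetical sketch of deriving a blurring radius from the depth image is given below; the linear relation between the radius, the blurring degree and the depth difference to the focused target is an assumption made for this example, since the exact formula is not restated here:

```python
import numpy as np

def blur_radius_map(depth, focus_xy, blur_degree):
    """Assumed relation: the blurring radius of each pixel grows with the depth
    difference between that pixel and the focused target, scaled by the
    user-selected blurring degree (a larger degree, i.e. larger aperture, blurs more)."""
    fx, fy = focus_xy                      # coordinate information of the focusing target (x, y)
    focus_depth = depth[fy, fx]
    return blur_degree * np.abs(depth.astype(np.float64) - focus_depth)
```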
It should be noted that, because the information exchange between the above modules, their execution processes, and the like are based on the same concept as the method embodiments of the present application, their specific functions and technical effects can be found in the method embodiments and are not repeated here.
Fig. 9 is a schematic diagram of an image processing apparatus provided in an embodiment of the present application. As shown in fig. 9, the image processing apparatus 9 of this embodiment includes: a processor 90, a memory 91 and a computer program 92, such as an image processing program, stored in said memory 91 and executable on said processor 90. The processor 90, when executing the computer program 92, implements the steps in the various image processing method embodiments described above, such as S101 and S102 shown in fig. 1. Alternatively, the processor 90, when executing the computer program 92, implements the functions of the modules/units in the above-described device embodiments, such as the modules 81 and 82 shown in fig. 8.
Illustratively, the computer program 92 may be partitioned into one or more modules/units that are stored in the memory 91 and executed by the processor 90 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 92 in the image processing apparatus 9. For example, the computer program 92 may be divided into a determining module and a processing module, and the specific functions of each module are described in the embodiment corresponding to fig. 8, which are not repeated here.
The image processing device may include, but is not limited to, the processor 90 and the memory 91. Those skilled in the art will appreciate that fig. 9 is merely an example of the image processing apparatus 9 and does not constitute a limitation of the image processing apparatus 9, which may include more or fewer components than those shown, combine some components, or have different components; for example, the image processing apparatus may further include an input/output device, a network access device, a bus, and the like.
The Processor 90 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 91 may be an internal storage unit of the image processing apparatus 9, such as a hard disk or a memory of the image processing apparatus 9. The memory 91 may also be an external storage device of the image processing apparatus 9, such as a plug-in hard disk provided on the image processing apparatus 9, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 91 may also include both an internal storage unit and an external storage device of the image processing apparatus 9. The memory 91 is used for storing the computer program and other programs and data required by the image processing apparatus. The memory 91 may also be used to temporarily store data that has been output or is to be output.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the image processing method can be implemented.
An embodiment of the present application further provides a computer program product which, when run on an image processing device, causes the image processing device to implement the image processing method described above.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (13)
1. An image processing method, characterized by comprising:
determining a brightness stretching weight corresponding to a first pixel point of an initial image, and determining a type of a defocus blur kernel of the initial image;
and processing the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
2. The method of claim 1, wherein the processing the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image comprises:
determining a mapping relationship according to the brightness stretching weight and the type of the defocus blur kernel;
and processing the initial image through the mapping relationship and the defocus blur kernel to obtain the target image.
3. The method according to claim 1 or 2, wherein the determining the brightness stretching weight corresponding to the first pixel point of the initial image comprises:
determining the brightness stretching weight corresponding to the first pixel point according to a magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold.
4. The method according to claim 3, wherein the determining, according to a magnitude relationship between the brightness value of the first pixel point and a preset brightness threshold, a brightness stretching weight corresponding to the first pixel point comprises:
determining the brightness stretching weight of the first pixel point as a first numerical value when the brightness value of the first pixel point is greater than the brightness threshold;
and determining the brightness stretching weight of the first pixel point as a second numerical value when the brightness value of the first pixel point is less than or equal to the brightness threshold;
wherein the second numerical value is less than the first numerical value.
5. The method of claim 2, wherein the determining the type of the defocus blur kernel of the initial image comprises:
performing binarization processing on the defocus blur kernel to obtain a discrete defocus kernel;
and determining the type of the defocus blur kernel of the initial image according to a non-zero region in the discrete defocus kernel.
6. The method of claim 5, wherein the determining the type of the defocus blur kernel of the initial image according to the non-zero region in the discrete defocus kernel comprises:
determining that the type of the defocus blur kernel comprises a first type defocus blur kernel if each row of a non-zero region in the discrete defocus kernel is a single rectangle and any column of the non-zero region is not a single rectangle;
determining that the type of the defocus blur kernel comprises a second type of defocus blur kernel under the condition that each column of a non-zero region in the discrete defocus kernel is a single rectangle and any row of the non-zero region is not a single rectangle;
determining that the type of the defocus blur kernel comprises a third type of defocus blur kernel if each row of non-zero regions in the discrete defocus kernel is a single rectangle and each column of the non-zero regions is a single rectangle;
and determining that the type of the defocus blur kernel comprises a fourth type of defocus blur kernel when any row of the non-zero region in the discrete defocus kernel is not a single rectangle, any column of the non-zero region is not a single rectangle, and the discrete defocus kernel has an axisymmetric structure.
7. The method according to claim 6, wherein the determining a mapping relationship according to the brightness stretching weight and the type of the defocus blur kernel comprises:
determining a target integration direction according to the type of the defocus blur kernel;
and determining the mapping relationship according to the brightness stretching weight in the target integration direction.
8. The method of claim 7, wherein determining a target integration direction according to the type of the defocus blur kernel comprises:
determining that the target integration direction comprises a line integration direction if the type of the defocus blur kernel is the first type of defocus blur kernel;
determining that the target integration direction comprises a column integration direction if the type of the defocus blur kernel is the second type of defocus blur kernel;
determining that the target integration direction includes a row integration direction or a column integration direction in a case that the type of the defocus blur kernel is the third type of defocus blur kernel;
and under the condition that the type of the defocus blur kernel is the fourth type of defocus blur kernel, segmenting the fourth type of defocus blur kernel along a symmetry axis, and determining the target integration direction according to the segmented fourth type of defocus blur kernel.
9. The method of claim 5, wherein the processing the initial image through the mapping relationship and the defocus blur kernel to obtain the target image comprises:
acquiring a unidirectional edge of a discrete defocus kernel corresponding to the defocus blur kernel;
and processing the initial image through the elements enclosed inside the unidirectional edge of the discrete defocus kernel and the mapping relationship to obtain the target image.
10. The method according to claim 1 or 2, characterized in that the method further comprises:
determining the defocus blur kernel in response to a selection operation by a user on the defocus blur kernel;
or,
acquiring a depth image corresponding to the initial image, coordinate information of a focusing target in the initial image, and a blurring degree, calculating a blurring radius corresponding to the first pixel point according to the depth image, the coordinate information and the blurring degree, and determining a defocus blur kernel corresponding to the first pixel point according to the blurring radius; wherein the blurring degree is proportional to the aperture value.
11. An image processing apparatus characterized by comprising:
a determining module, configured to determine a brightness stretching weight corresponding to a first pixel point of an initial image and determine a type of a defocus blur kernel of the initial image;
and a processing module, configured to process the initial image according to the brightness stretching weight and the type of the defocus blur kernel to obtain a target image.
12. An image processing apparatus comprising a processor, a memory and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 10 when executing the computer program.
13. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110105672.9A CN114792283A (en) | 2021-01-26 | 2021-01-26 | Image processing method, device and equipment and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114792283A true CN114792283A (en) | 2022-07-26 |
Family
ID=82459453
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110105672.9A Pending CN114792283A (en) | 2021-01-26 | 2021-01-26 | Image processing method, device and equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114792283A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117764856A (en) * | 2022-12-05 | 2024-03-26 | 行吟信息科技(武汉)有限公司 | Image processing method and device, electronic equipment and computer readable storage medium |
WO2024193145A1 (en) * | 2023-03-20 | 2024-09-26 | 蔚来移动科技有限公司 | Image blurring processing method and apparatus, and medium and device |
CN117745727A (en) * | 2024-02-21 | 2024-03-22 | 北京科技大学 | Device and method for monitoring hardness of water stemming liquid filling bag |
CN117745727B (en) * | 2024-02-21 | 2024-04-26 | 北京科技大学 | Device and method for monitoring hardness of water stemming liquid filling bag |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |