CN109191428A - Full-reference image quality evaluating method based on masking textural characteristics - Google Patents
Full-reference image quality evaluating method based on masking textural characteristics
- Publication number
- CN109191428A (application CN201810834955.5A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The invention discloses a full-reference image quality evaluation method based on masking texture features, belonging to the technical fields of image processing and image quality evaluation. First, color space conversion is applied to the reference image and the distorted image. Next, gradient magnitude and gradient direction features are extracted from the reference and distorted images, and the gradient-information similarity is computed. The texture-feature similarity and the color difference are then calculated, and the mean and standard deviation of each are collected, forming a 6-D feature vector. A regression model is established with a random forest to fuse the feature vector with subjective MOS values, and is trained. Finally, the 6-D feature vector of the image under test is extracted and input into the trained regression model to complete the objective image quality evaluation. By combining three different similarity features and a random-forest regression model, the disclosed method achieves high-precision objective full-reference image quality evaluation and maintains high consistency with the characteristics of human vision.
Description
Technical field
The invention belongs to the technical fields of image processing and image quality evaluation, and relates to a full-reference image quality evaluation method based on masking texture features.
Background technique
With the arrival of the big-data era, more and more images are shared over networks. Digital images have become an important carrier through which people obtain information and communicate, and they are steadily changing the way people live. The sharp growth in data scale also brings great challenges: images may be distorted to some degree during acquisition, storage, transmission, and processing. How to process and transmit images effectively, and how to evaluate image quality accurately, have therefore become urgent research problems.
In recent years, full-reference image quality evaluation algorithms and related devices have been widely used to optimize parameters in all kinds of image processing systems, so full-reference image quality evaluation has become a research hotspot. Most existing full-reference image quality evaluation methods adopt frameworks based on the human visual system (HVS). W. Zhou et al. proposed an image evaluation method that first extracts three indices (luminance information, contrast information, and structural information) from the reference image and the corresponding distorted image; then computes the similarity of each index to obtain the luminance similarity, contrast similarity, and structural similarity; and finally takes a weighted average of the three similarities to obtain the quality score of the distorted image. Building on this theory, visual-feature weights are assigned according to image content. Other methods extract a single global feature from the whole image in the spatial domain to evaluate quality, but such methods cannot be used to evaluate color images. Some current studies describe image structure with frequency-domain features and further improve image quality evaluation models. However, most methods based on feature-similarity computation cannot accurately reflect the masking effect of human vision and ignore the fact that human vision is influenced by complex physiological and psychological factors, which lowers the precision of the evaluation results.
Summary of the invention
The object of the present invention is to provide a full-reference image quality evaluation method based on masking texture features, solving the problem that existing evaluation methods cannot accurately reflect the masking effect of human vision and ignore the influence of complex physiological and psychological factors on it. The invention establishes a model from the feature similarities between the reference image and the distorted image, achieving accurate quality evaluation of distorted images.
The technical scheme adopted by the invention is a full-reference image quality evaluation method based on masking texture features, whose specific operation comprises the following steps:
Step 1. Convert the reference images and distorted images in the database from the RGB color space to the Lab color space, separating the color information of each image from its luminance information.
Step 2. In the Lab color space obtained in step 1, extract the gradient magnitude and gradient direction features of the reference image and the distorted image in the L channel, and compute the gradient magnitude similarity and gradient direction similarity.
Step 3. After step 1, extract the Laws texture features of the reference image and the distorted image in the L channel, and compute the mean and standard deviation of the texture similarity between the reference image and the distorted image.
Step 4. In the Lab color space obtained in step 1, compute the color difference between the reference image and the distorted image over the L, a, and b channels, and compute the mean and standard deviation of the color difference.
Step 5. After steps 2, 3, and 4, fuse the acquired gradient magnitude similarity, gradient direction similarity, texture similarity mean and standard deviation, and color-difference mean and standard deviation in a random-forest regression model; the subjective evaluation scores (MOS values) are also input into the regression model for training, and the trained model is used directly to predict the quality of images to be evaluated accurately.
Further features of the invention are as follows.
The detailed process of step 1 is as follows:
Color space conversion is applied to the reference images and distorted images in the database according to formulas 1-3, converting them from the RGB color space to the Lab color space.
Here, R, G, and B denote the three channels of the color image; X, Y, and Z denote the tristimulus values of the color; X0 = 0.9505, Y0 = 1.000, and Z0 = 1.0890 are the tristimulus values under the D65 illuminant; L* denotes the lightness channel after the color space conversion, and a* and b* denote the chroma channels after the conversion. The RGB color space is converted to the Lab color space through formulas 1, 2, and 3; the image size after the conversion equals the image size before it, and the luminance channel L of the image is separated from the chroma channels a and b.
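Formulas 1-3 appear only as images in the source publication; a minimal sketch of the standard linear-RGB to XYZ to Lab conversion with the D65 white point cited above (X0 = 0.9505, Y0 = 1.000, Z0 = 1.0890) might look like the following. The conversion matrix and the omission of gamma correction are assumptions, since the patent does not reproduce the formulas in text.

```python
import numpy as np

# Assumed linear RGB -> XYZ matrix (sRGB primaries, D65 white point);
# the patent's formulas 1-3 are images, so this is an illustrative reconstruction.
RGB2XYZ = np.array([[0.4124, 0.3576, 0.1805],
                    [0.2126, 0.7152, 0.0722],
                    [0.0193, 0.1192, 0.9505]])
WHITE = np.array([0.9505, 1.000, 1.0890])  # X0, Y0, Z0 under D65

def rgb_to_lab(rgb):
    """Convert an H x W x 3 linear-RGB image (values in [0, 1]) to Lab."""
    xyz = rgb @ RGB2XYZ.T / WHITE          # normalized X/X0, Y/Y0, Z/Z0
    # CIE f(t): cube root above the linearity threshold, linear below it
    f = np.where(xyz > (6 / 29) ** 3,
                 np.cbrt(xyz),
                 xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16               # lightness channel L*
    a = 500 * (f[..., 0] - f[..., 1])      # chroma channel a*
    b = 200 * (f[..., 1] - f[..., 2])      # chroma channel b*
    return np.stack([L, a, b], axis=-1)
```

A pure-white pixel maps to L* = 100 with a* = b* = 0, and the output keeps the spatial size of the input, matching the statement above that the image size is unchanged by the conversion.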
The detailed process of step 2 is as follows:
Step 2.1. Convolve the reference image and the distorted image with the horizontal and vertical components of a Prewitt operator with a 3*3 window to extract the gradient magnitude and gradient direction features. For an image f(x), where x is the coordinate of a pixel in the image, the convolution is performed as shown in formula 4, in which Gx(x) denotes the horizontal gradient magnitude and Gy(x) the vertical gradient magnitude.
Step 2.2. After step 2.1, compute the gradient magnitude GM(x) and the gradient direction θ(x) of the reference image and the distorted image according to formulas 5 and 6.
Step 2.3. After step 2.2, compute the gradient magnitude similarity and the gradient direction similarity Sor(x) of the reference image and the distorted image according to formulas 7 and 8. In formula 7, m and n denote the width and height of the image, x denotes the pixel position, and Ir(x) and Id(x) denote the reference image and the distorted image respectively; in formula 8, θr and θd denote the gradient directions of the reference image and the distorted image, and C1 = 1.
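Formulas 4-8 are likewise reproduced only as images. The sketch below assumes the usual 3*3 Prewitt kernels and an SSIM-style similarity of the form (2xy + C)/(x^2 + y^2 + C), which is consistent with the stabilizing constant C1 = 1 mentioned for formula 8; the exact similarity pooling used in the patent may differ.

```python
import numpy as np

# Assumed 3x3 Prewitt components (horizontal and vertical)
PREWITT_X = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]]) / 3.0
PREWITT_Y = PREWITT_X.T

def conv2(img, kernel):
    """'Same'-size 2-D convolution with zero padding (no SciPy dependency)."""
    k = np.flipud(np.fliplr(kernel))
    p = k.shape[0] // 2
    padded = np.pad(img, p)
    out = np.zeros_like(img, dtype=float)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def gradient_features(l_channel):
    gx = conv2(l_channel, PREWITT_X)    # horizontal gradient Gx(x), formula 4
    gy = conv2(l_channel, PREWITT_Y)    # vertical gradient Gy(x)
    gm = np.sqrt(gx ** 2 + gy ** 2)     # gradient magnitude GM(x), formula 5
    theta = np.arctan2(gy, gx)          # gradient direction theta(x), formula 6
    return gm, theta

def similarity(x, y, c):
    """Assumed per-pixel similarity (2xy + c)/(x^2 + y^2 + c), averaged."""
    return np.mean((2 * x * y + c) / (x ** 2 + y ** 2 + c))
```

For identical reference and distorted images the similarity evaluates to exactly 1, the expected behavior of a full-reference similarity index.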
The detailed process of step 3 is as follows:
Step 3.1. Texture features are extracted by convolving the image with four two-dimensional Laws filters, shown in formula 9. For an image f(x), where x is the coordinate of a pixel, the image is convolved with each of the four templates in formula 9 and the maximum response is taken, as shown in formula 10:
Te = max(f(x) * i), i = (a), (b), (c), (d)   (10)
Step 3.2. After step 3.1, compute the texture similarity between the reference image and the distorted image. In formula 11, ter and ted denote the texture features of the reference image and the distorted image, and C2 = 100.
Step 3.3. After step 3.2, compute the mean and standard deviation of the convolution results. In formula 12, μSmte denotes the texture similarity mean, σSmte denotes the texture similarity standard deviation, and n denotes the total number of pixels.
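The four Laws filters of formula 9 are shown only as images in the source. The sketch below assumes they are 5x5 masks built from the classic Laws level (L5) and edge (E5) vectors; the specific masks, and the use of absolute responses before max-pooling, are assumptions made for illustration of formulas 10-12 with C2 = 100.

```python
import numpy as np

L5 = np.array([1, 4, 6, 4, 1], dtype=float)    # Laws level vector (assumed)
E5 = np.array([-1, -2, 0, 2, 1], dtype=float)  # Laws edge vector (assumed)
# Four assumed 2-D Laws masks standing in for the patent's formula 9 templates
LAWS = [np.outer(L5, E5), np.outer(E5, L5), np.outer(E5, E5), np.outer(L5, L5)]

def conv2(img, kernel):
    """'Same'-size 2-D convolution with zero padding."""
    k = np.flipud(np.fliplr(kernel))
    p = k.shape[0] // 2
    padded = np.pad(img, p)
    out = np.zeros_like(img, dtype=float)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def texture_feature(l_channel):
    """Te = max over the four Laws filter responses (formula 10)."""
    responses = [np.abs(conv2(l_channel, k)) for k in LAWS]
    return np.max(np.stack(responses), axis=0)

def texture_similarity(te_r, te_d, c2=100.0):
    """Per-pixel texture similarity (formula 11); return mean and std (formula 12)."""
    s = (2 * te_r * te_d + c2) / (te_r ** 2 + te_d ** 2 + c2)
    return s.mean(), s.std()
```

When reference and distorted images coincide, the similarity map is identically 1, so its mean is 1 and its standard deviation is 0.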
The detailed process of step 4 is as follows:
Step 4.1. In the Lab color space obtained in step 1, compute the color difference ΔE between the reference image and the distorted image over the L, a, and b channels, as shown in formula 13. In formula 13, the terms denote the values of the three Lab channels, where subscripts r and d denote the reference image and the distorted image respectively.
Step 4.2. Compute the mean and standard deviation of the color difference, as shown in formulas 14 and 15, where m and n denote the width and height of the color-difference map and (i, j) denotes the position of a pixel in the two-dimensional plane.
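Formula 13 reads naturally as the Euclidean Lab color difference; under that reading (the formula itself is an image in the source), ΔE and the statistics of formulas 14-15 can be sketched as:

```python
import numpy as np

def color_difference(lab_r, lab_d):
    """Per-pixel Delta E between two H x W x 3 Lab images (formula 13)."""
    return np.sqrt(np.sum((lab_r - lab_d) ** 2, axis=-1))

def color_difference_stats(delta_e):
    """Mean (formula 14) and standard deviation (formula 15) of the map."""
    return delta_e.mean(), delta_e.std()
```

A uniform shift of (3, 4, 0) across the L, a, and b channels, for example, yields a constant ΔE of 5 with zero standard deviation.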
The detailed process of step 5 is as follows:
Step 5.1. The six similarity features (gradient magnitude similarity, gradient direction similarity, texture similarity mean and standard deviation, and color-difference mean and standard deviation), together with the subjective mean opinion scores (MOS) of the distorted images in the database, are input jointly into a regression model built with a random forest for training; the number of decision trees is set to ntree = 500 and the number of candidate split variables per node to mtry = 2.
Step 5.2. Using the trained regression model, the similarity features of one or more distorted images to be detected and their corresponding reference images are extracted according to steps 2, 3, and 4; the features are then input into the trained random-forest regression model, and the predicted quality score it outputs completes the evaluation of the distorted image quality.
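With the six statistics assembled into one 6-D vector per distorted image, the training and prediction of step 5 can be sketched with scikit-learn's random-forest regressor; `n_estimators=500` and `max_features=2` mirror the patent's ntree and mtry settings, while the synthetic feature vectors and MOS values below are placeholders for a real IQA database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
# Placeholder 6-D feature vectors (one per distorted image) and MOS scores;
# in the method these come from steps 2-4 applied to a public image database.
X_train = rng.random((200, 6))
mos_train = X_train.mean(axis=1) * 100           # synthetic subjective scores

# ntree = 500 decision trees, mtry = 2 candidate split variables per node
model = RandomForestRegressor(n_estimators=500, max_features=2, random_state=0)
model.fit(X_train, mos_train)                    # step 5.1: train on features + MOS

X_test = rng.random((5, 6))                      # features of images under test
predicted_quality = model.predict(X_test)        # step 5.2: objective quality scores
```

Because random-forest predictions are averages over leaf targets, the predicted scores stay within the range of the training MOS values, one score per image under test.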
The invention has the advantages that the full-reference image quality evaluation method based on masking texture features extracts three different similarity features from images in a large public database, and the means and standard deviations of these similarity features complement one another in describing image information, solving the problem that traditional features have low consistency with subjective human perception. A regression model established with a random forest (RF) fuses the mean and standard deviation of each similarity feature and learns and predicts in combination with subjective MOS scores, improving the robustness of the model and broadening its applicability. In use, the method can substantially raise the precision of image quality prediction while remaining highly consistent with the human visual system.
Detailed description of the invention
Fig. 1 is a framework diagram of the full-reference image quality evaluation method based on masking texture features of the invention.
Specific embodiment
The following describes the present invention in detail with reference to the accompanying drawings and specific embodiments.
The full-reference image quality evaluation method based on masking texture features of the invention, as shown in Fig. 1, can be divided into two major parts: establishment of the RF model, and quality prediction for image evaluation. In the model-establishment part, the processing objects are the reference images and distorted images in the image database; the means and standard deviations of the three kinds of similarity features described in the present invention are extracted and, combined with the subjective MOS values in the database, a regression model is established using a random forest (RF). In the prediction part, the gradient magnitude similarity, gradient direction similarity, texture similarity mean, texture similarity standard deviation, color-difference mean, and color-difference standard deviation between the distorted image and its corresponding reference image are computed; these three kinds of similarity features form a 6-D feature vector, which is input to the RF regression model to predict the quality of the distorted image and complete the image quality evaluation.
The specific operation comprises the following steps:
Step 1. Convert the reference images and distorted images in the database from the RGB color space to the Lab color space, separating the color information of each image from its luminance information:
Color space conversion is applied to the reference images and distorted images in the database according to formulas 1-3, converting them from the RGB color space to the Lab color space, where R, G, and B denote the three channels of the color image; X, Y, and Z denote the tristimulus values of the color; X0 = 0.9505, Y0 = 1.000, and Z0 = 1.0890 are the tristimulus values under the D65 illuminant; L* denotes the lightness channel after the color space conversion, and a* and b* denote the chroma channels after the conversion. The RGB color space is converted to the Lab color space through formulas 1, 2, and 3; the image size after the conversion equals the image size before it, and the luminance channel L of the image is separated from the chroma channels a and b;
Step 2. In the Lab color space obtained in step 1, extract the gradient magnitude and gradient direction features of the reference image and the distorted image in the L channel, and compute the gradient magnitude similarity and gradient direction similarity:
Step 2.1. Convolve the reference image and the distorted image with the horizontal and vertical components of a Prewitt operator with a 3*3 window to extract the gradient magnitude and gradient direction features; for an image f(x), where x is the coordinate of a pixel in the image, the convolution is performed as shown in formula 4, in which Gx(x) denotes the horizontal gradient magnitude and Gy(x) the vertical gradient magnitude;
Step 2.2. After step 2.1, compute the gradient magnitude GM(x) and the gradient direction θ(x) of the reference image and the distorted image according to formulas 5 and 6;
Step 2.3. After step 2.2, compute the gradient magnitude similarity and the gradient direction similarity Sor(x) of the reference image and the distorted image according to formulas 7 and 8; in formula 7, m and n denote the width and height of the image, x denotes the pixel position, and Ir(x) and Id(x) denote the reference image and the distorted image respectively; in formula 8, θr and θd denote the gradient directions of the reference image and the distorted image, and C1 = 1 is used to stabilize formula 8 and prevent the denominator from being zero.
Step 3. After step 1, extract the Laws texture features of the reference image and the distorted image in the L channel, and compute the mean and standard deviation of the texture similarity between the reference image and the distorted image:
Step 3.1. Texture features are extracted by convolving the image with four two-dimensional Laws filters and taking the maximum response; the four filters are shown in formula 9. For an image f(x), where x is the coordinate of a pixel, the image is convolved with each of the four templates in formula 9 and the maximum is taken, as shown in formula 10:
Te = max(f(x) * i), i = (a), (b), (c), (d)   (10)
Step 3.2. After step 3.1, compute the texture similarity between the reference image and the distorted image; in formula 11, ter and ted denote the texture features of the reference image and the distorted image, and C2 = 100 is used to stabilize formula 11 and prevent the denominator from being zero;
Step 3.3. After step 3.2, compute the mean and standard deviation of the convolution results; in formula 12, μSmte denotes the texture similarity mean, σSmte denotes the texture similarity standard deviation, and n denotes the total number of pixels.
Step 4. In the Lab color space obtained in step 1, compute the color difference between the reference image and the distorted image over the L, a, and b channels, and compute the mean and standard deviation of the color difference:
Step 4.1. In the Lab color space obtained in step 1, compute the color difference ΔE between the reference image and the distorted image over the L, a, and b channels, as shown in formula 13; in formula 13, the terms denote the values of the three Lab channels, where subscripts r and d denote the reference image and the distorted image respectively;
Step 4.2. Compute the mean and standard deviation of the color difference, as shown in formulas 14 and 15; in formulas 14 and 15, m and n denote the width and height of the color-difference map, and (i, j) denotes the position of a pixel in the two-dimensional plane;
Step 5. After steps 2, 3, and 4, fuse the acquired gradient magnitude similarity, gradient direction similarity, texture similarity mean and standard deviation, and color-difference mean and standard deviation in a random-forest regression model, input the subjective evaluation scores (MOS values) into the regression model for training, and use the trained model directly to predict the quality of the image to be evaluated accurately:
Step 5.1. The six similarity features, together with the subjective mean opinion scores (MOS) of the distorted images in the database, are input jointly into a regression model built with a random forest for training; the number of decision trees is set to ntree = 500 and the number of candidate split variables per node to mtry = 2;
Step 5.2. Using the trained regression model, the similarity features of one or more distorted images to be detected and their corresponding reference images are extracted according to steps 2, 3, and 4; the features are then input into the trained random-forest regression model, and the predicted quality score it outputs completes the evaluation of the distorted image quality.
In the full-reference image quality evaluation method based on masking texture features of the invention, color space conversion is first applied to the reference and distorted images in the database; next, the spatial-domain gradient features of the reference image and the distorted image are extracted and the gradient similarity is computed; then the texture similarity and the spatial-domain color-feature similarity are computed, and together with the gradient similarity they form a 6-D feature vector; next, a random forest (RF) is trained on the feature vectors and MOS values to establish a regression model; finally, the 6-D feature vector of the image under test is extracted and used as the input of the RF regression model, so that the quality of the image under test is predicted with high precision and the image quality is evaluated.
The full-reference image quality evaluation method based on masking texture features of the invention makes full use of the means and standard deviations of three kinds of similarity features that are consistent with the characteristics of human vision. From the reference images and distorted images in a database, a random-forest (RF) regression model that fuses the similarity features can be established, trained, and used for prediction, yielding high-precision image quality evaluation that maintains high consistency with human visual perception.
Claims (6)
1. A full-reference image quality evaluation method based on masking texture features, characterized in that the specific operation comprises the following steps:
Step 1. Convert the reference images and distorted images in a database from the RGB color space to the Lab color space, so that the color information of each image is separated from its luminance information;
Step 2. In the Lab color space obtained in step 1, extract the gradient magnitude and gradient direction features of the reference image and the distorted image in the L channel, and compute the gradient magnitude similarity and gradient direction similarity;
Step 3. After step 1, extract the Laws texture features of the reference image and the distorted image in the L channel, and compute the mean and standard deviation of the texture similarity between the reference image and the distorted image;
Step 4. In the Lab color space obtained in step 1, compute the color difference between the reference image and the distorted image over the L, a, and b channels, and compute the mean and standard deviation of the color difference;
Step 5. After steps 2, 3, and 4, fuse the acquired gradient magnitude similarity, gradient direction similarity, texture similarity mean and standard deviation, and color-difference mean and standard deviation in a random-forest regression model, input the subjective evaluation scores (MOS values) into the regression model for training, and use the trained model directly to predict the quality of an image to be evaluated accurately.
2. The full-reference image quality evaluation method based on masking texture features according to claim 1, characterized in that the detailed process of step 1 is as follows:
Color space conversion is applied to the reference images and distorted images in the database according to formulas 1-3, converting them from the RGB color space to the Lab color space, wherein R, G, and B denote the three channels of the color image; X, Y, and Z denote the tristimulus values of the color; X0 = 0.9505, Y0 = 1.000, and Z0 = 1.0890 are the tristimulus values under the D65 illuminant; L* denotes the lightness channel after the color space conversion, and a* and b* denote the chroma channels after the conversion; the RGB color space is converted to the Lab color space through formulas 1, 2, and 3, the image size after the conversion equals the image size before it, and the luminance channel L of the image is separated from the chroma channels a and b.
3. The full-reference image quality evaluation method based on masking texture features according to claim 1, characterized in that the detailed process of step 2 is as follows:
Step 2.1. Convolve the reference image and the distorted image with the horizontal and vertical components of a Prewitt operator with a 3*3 window to extract the gradient magnitude and gradient direction features, wherein, for an image f(x), x is the coordinate of a pixel in the image, the convolution is performed as shown in formula 4, and Gx(x) denotes the horizontal gradient magnitude and Gy(x) the vertical gradient magnitude;
Step 2.2. After step 2.1, compute the gradient magnitude GM(x) and the gradient direction θ(x) of the reference image and the distorted image according to formulas 5 and 6;
Step 2.3. After step 2.2, compute the gradient magnitude similarity and the gradient direction similarity Sor(x) of the reference image and the distorted image according to formulas 7 and 8, wherein, in formula 7, m and n denote the width and height of the image, x denotes the pixel position, and Ir(x) and Id(x) denote the reference image and the distorted image respectively; in formula 8, θr and θd denote the gradient directions of the reference image and the distorted image, and C1 = 1.
4. The full-reference image quality evaluation method based on masking texture features according to claim 1, characterized in that the detailed process of step 3 is as follows:
Step 3.1. Texture features are extracted by convolving the image with four two-dimensional Laws filters, shown in formula 9; for an image f(x), where x is the coordinate of a pixel, the image is convolved with each of the four templates in formula 9 and the maximum is taken, as shown in formula 10:
Te = max(f(x) * i), i = (a), (b), (c), (d)   (10)
Step 3.2. After step 3.1, compute the texture similarity between the reference image and the distorted image, wherein, in formula 11, ter and ted denote the texture features of the reference image and the distorted image, and C2 = 100;
Step 3.3. After step 3.2, compute the mean and standard deviation of the convolution results, wherein, in formula 12, μSmte denotes the texture similarity mean, σSmte denotes the texture similarity standard deviation, and n denotes the total number of pixels.
5. The full-reference image quality evaluation method based on masking texture features according to claim 1, characterized in that the detailed process of step 4 is as follows:
Step 4.1. In the Lab color space obtained in step 1, compute the color difference ΔE between the reference image and the distorted image over the L, a, and b channels, as shown in formula 13, wherein, in formula 13, the terms denote the values of the three Lab channels and subscripts r and d denote the reference image and the distorted image respectively;
Step 4.2. Compute the mean and standard deviation of the color difference, as shown in formulas 14 and 15, wherein m and n denote the width and height of the color-difference map and (i, j) denotes the position of a pixel in the two-dimensional plane.
6. The full-reference image quality evaluation method based on masking texture features as claimed in claim 1, characterized in that the detailed process of step 5 is as follows:
Step 5.1: the six similarity features obtained above, together with the subjective mean opinion score (MOS) of each distorted image in the database, are jointly input to a random forest regression model for training, with the number of decision trees in the model set to ntree = 500 and the number of candidate split variables per node set to mtry = 2;
Step 5.2: using the trained regression model, the similarity features of one or more distorted images to be evaluated and their corresponding reference images are extracted according to steps 2, 3 and 4 and input to the trained random forest regression model; the predicted quality score that is output completes the evaluation of the distorted image quality.
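Step 5 can be sketched with scikit-learn's random forest as a stand-in regressor (the patent does not name an implementation): `n_estimators` plays the role of ntree = 500 and `max_features` that of mtry = 2, with each row of X holding the six similarity features of one distorted image.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_quality_model(X, mos):
    """Step 5.1: fit a random forest regressor mapping the six similarity
    features of each distorted image to its subjective MOS value."""
    model = RandomForestRegressor(n_estimators=500, max_features=2, random_state=0)
    model.fit(X, mos)
    return model

def predict_quality(model, features):
    """Step 5.2: predict the quality score for one feature vector (or a batch)."""
    return model.predict(np.atleast_2d(features))
```

Given a trained model, `predict_quality(model, feats)` returns one predicted quality score per input row.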
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810834955.5A CN109191428B (en) | 2018-07-26 | 2018-07-26 | Masking texture feature-based full-reference image quality evaluation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109191428A true CN109191428A (en) | 2019-01-11 |
CN109191428B CN109191428B (en) | 2021-08-06 |
Family
ID=64937628
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810834955.5A Expired - Fee Related CN109191428B (en) | 2018-07-26 | 2018-07-26 | Masking texture feature-based full-reference image quality evaluation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109191428B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109919920A (en) * | 2019-02-25 | 2019-06-21 | 厦门大学 | The full reference of unified structure and non-reference picture quality appraisement method |
CN110838119A (en) * | 2019-11-15 | 2020-02-25 | 珠海全志科技股份有限公司 | Human face image quality evaluation method, computer device and computer readable storage medium |
CN111598837A (en) * | 2020-04-21 | 2020-08-28 | 中山大学 | Full-reference image quality evaluation method and system suitable for visual two-dimensional code |
CN112118457A (en) * | 2019-06-20 | 2020-12-22 | 腾讯科技(深圳)有限公司 | Live broadcast data processing method and device, readable storage medium and computer equipment |
CN112381812A (en) * | 2020-11-20 | 2021-02-19 | 深圳市优象计算技术有限公司 | Simple and efficient image quality evaluation method and system |
CN112837319A (en) * | 2021-03-29 | 2021-05-25 | 深圳大学 | Intelligent evaluation method, device, equipment and medium for real distorted image quality |
CN112950597A (en) * | 2021-03-09 | 2021-06-11 | 深圳大学 | Distorted image quality evaluation method and device, computer equipment and storage medium |
CN113971675A (en) * | 2021-10-13 | 2022-01-25 | 南京邮电大学 | Full-reference image quality evaluation method based on split structure distortion and signal error |
CN115984283A (en) * | 2023-03-21 | 2023-04-18 | 山东中济鲁源机械有限公司 | Intelligent detection method for welding quality of reinforcement cage |
CN116188809A (en) * | 2023-05-04 | 2023-05-30 | 中国海洋大学 | Texture similarity judging method based on visual perception and sequencing driving |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102036098A (en) * | 2010-12-01 | 2011-04-27 | 北京航空航天大学 | Full-reference type image quality evaluation method based on visual information amount difference |
CN102750695A (en) * | 2012-06-04 | 2012-10-24 | 清华大学 | Machine learning-based stereoscopic image quality objective assessment method |
US20130182972A1 (en) * | 2012-01-12 | 2013-07-18 | Xiaochen JING | Image defect visibility predictor |
US20170140518A1 (en) * | 2015-11-12 | 2017-05-18 | University Of Virginia Patent Foundation D/B/A/ University Of Virginia Licensing & Ventures Group | System and method for comparison-based image quality assessment |
CN106780441A (en) * | 2016-11-30 | 2017-05-31 | 杭州电子科技大学 | A kind of stereo image quality objective measurement method based on dictionary learning and human-eye visual characteristic |
Non-Patent Citations (3)
Title |
---|
WESLEY GRIFFIN, ET AL: "Evaluating Texture Compression Masking Effects Using Objective Image Quality Assessment Metrics", IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS |
LIU Mingna: "Research on Objective Image Quality Evaluation Methods Based on the Visual System and Feature Extraction and Their Application", China Doctoral Dissertations Full-text Database, Information Science and Technology |
XIE Dehong, et al.: "Color Image Evaluation Algorithm Based on Optimal Color Space and Visual Masking", Packaging Engineering |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109191428A (en) | Full-reference image quality evaluating method based on masking textural characteristics | |
CN106447646A (en) | Quality blind evaluation method for unmanned aerial vehicle image | |
CN103402117B (en) | Based on the video image color cast detection method of Lab chrominance space | |
CN102663745B (en) | Color fusion image quality evaluation method based on vision task. | |
WO2017092431A1 (en) | Human hand detection method and device based on skin colour | |
CN101562675B (en) | No-reference image quality evaluation method based on Contourlet transform | |
CN103714349B (en) | Image recognition method based on color and texture features | |
CN109214298B (en) | Asian female color value scoring model method based on deep convolutional network | |
CN109191460B (en) | Quality evaluation method for tone mapping image | |
CN105741328A (en) | Shot image quality evaluation method based on visual perception | |
CN108830823A (en) | The full-reference image quality evaluating method of frequency-domain analysis is combined based on airspace | |
CN102800111B (en) | Color harmony based color fusion image color quality evaluation method | |
CN101146226A (en) | A highly-clear video image quality evaluation method and device based on self-adapted ST area | |
CN107705286A (en) | A kind of color image quality integrated evaluating method | |
CN109635636A (en) | The pedestrian that blocking characteristic based on attributive character and weighting blends recognition methods again | |
CN108961227A (en) | A kind of image quality evaluating method based on airspace and transform domain multiple features fusion | |
CN107154058A (en) | A kind of method for guiding user to reduce magic square | |
CN110120034A (en) | A kind of image quality evaluating method relevant to visual perception | |
CN107610093A (en) | Full-reference image quality evaluating method based on similarity feature fusion | |
CN109741285B (en) | Method and system for constructing underwater image data set | |
CN108960142A (en) | Pedestrian based on global characteristics loss function recognition methods again | |
CN106485266A (en) | A kind of ancient wall classifying identification method based on extraction color characteristic | |
CN111882516B (en) | Image quality evaluation method based on visual saliency and deep neural network | |
CN109345552A (en) | Stereo image quality evaluation method based on region weight | |
CN104050678A (en) | Underwater monitoring color image quality measurement method |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20210806 |