CN112381408A - Quality inspection method and device and electronic equipment - Google Patents
Quality inspection method and device and electronic equipment
- Publication number: CN112381408A (application CN202011276569.2A)
- Authority
- CN
- China
- Prior art keywords
- data
- result
- business data
- risk
- service data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06Q10/0635 — Risk analysis of enterprise or organisation activities
- G06Q10/06395 — Quality analysis or management (performance analysis of enterprise or organisation operations)
- G06Q10/103 — Workflow collaboration or project management
- G06T5/90 — Dynamic range modification of images or parts thereof (image enhancement or restoration)
- G06T2207/10024 — Color image (image acquisition modality, indexing scheme for image analysis or enhancement)
Abstract
The embodiments of this specification disclose a quality inspection method and apparatus and an electronic device. The quality inspection method comprises the following steps: acquiring an initial risk audit result of target business data; acquiring audited business data whose similarity to the target business data satisfies a first condition; calculating an audit consistency rate according to the initial risk audit results of the audited business data; and determining, according to the audit consistency rate, whether to perform quality inspection on the initial risk audit result of the target business data. The quality inspection method and apparatus and the electronic device in the embodiments of this specification can reduce risk omission in quality inspection.
Description
Technical Field
The embodiments of this specification relate to the field of computer technology, and in particular to a quality inspection method and apparatus and an electronic device.
Background
With the rapid development of the internet, risk audits need to be performed on business data to obtain risk audit results. To improve the accuracy of these audits, the risk audit results themselves need to be quality-inspected. Because the volume of business data is very large, cost considerations mean that only a certain proportion of it can be randomly sampled from the full volume for quality inspection of its risk audit results.
However, this random-sampling approach often results in missed risks.
Disclosure of Invention
The embodiments of this specification provide a quality inspection method and apparatus and an electronic device that reduce risk omission in quality inspection. The technical scheme of the embodiments is as follows.
In a first aspect of the embodiments of this specification, a quality inspection method is provided, comprising: acquiring an initial risk audit result of target business data; acquiring audited business data whose similarity to the target business data satisfies a first condition; calculating an audit consistency rate according to the initial risk audit results of the audited business data; and determining, according to the audit consistency rate, whether to perform quality inspection on the initial risk audit result of the target business data.
In a second aspect of the embodiments of this specification, a quality inspection apparatus is provided, comprising: a first acquisition unit for acquiring an initial risk audit result of target business data; a second acquisition unit for acquiring audited business data whose similarity to the target business data satisfies a first condition; a calculation unit for calculating an audit consistency rate according to the initial risk audit results of the audited business data; and a determining unit for determining, according to the audit consistency rate, whether to perform quality inspection on the initial risk audit result of the target business data.
In a third aspect of the embodiments of this specification, an electronic device is provided, comprising: at least one processor; and a memory storing program instructions configured to be executed by the at least one processor, the program instructions comprising instructions for performing the method of the first aspect.
According to the technical scheme provided by the embodiments of this specification, an initial risk audit result of target business data can be obtained; audited business data whose similarity to the target business data satisfies a first condition can be obtained; an audit consistency rate can be calculated according to the initial risk audit results of the audited business data; and whether to perform quality inspection on the initial risk audit result of the target business data can be determined according to the audit consistency rate. The audit consistency rate can indicate the degree of inconsistency when risk audits are performed on a plurality of similar contents. Determining whether to quality-inspect the initial risk audit result of the target business data according to the audit consistency rate both saves quality inspection cost and reduces risk omission.
Drawings
To illustrate the embodiments of this specification or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below are only some of the embodiments in this specification; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a quality inspection method in an embodiment of this specification;
FIG. 2 is a schematic diagram of a quality inspection process in an embodiment of this specification;
FIG. 3 is a schematic diagram of a flow for obtaining a feature string of text data in an embodiment of this specification;
FIG. 4 is a schematic diagram of a process for obtaining a feature string of text data in an embodiment of this specification;
FIG. 5 is a schematic diagram of a flow for obtaining a feature string of image data in an embodiment of this specification;
FIG. 6 is a schematic structural diagram of a quality inspection apparatus in an embodiment of this specification;
FIG. 7 is a schematic structural diagram of an electronic device in an embodiment of this specification.
Detailed Description
The technical solutions in the embodiments of this specification will be described clearly and completely below with reference to the drawings. The described embodiments are only a part of the embodiments of this specification, not all of them. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of protection of this specification.
In the related art, a certain proportion of business data is randomly sampled from a large volume of business data for quality inspection, which can cause the following situations: (1) the risk audit results of some business data are wrong, but that data is never sampled for quality inspection, so risks are missed; (2) among the sampled business data, most risk audit results are already accurate, so the quality inspection is not very effective. The embodiments of this specification therefore provide a quality inspection method. The method can be applied to a server, which may be a single server, a server cluster composed of multiple servers, or a server deployed in the cloud. Referring to FIG. 1 and FIG. 2, the quality inspection method may include the following steps.
Step S12: acquire an initial risk audit result of the target business data.
In some embodiments, the business data may include user-generated content (UGC), such as bullet-screen comments (danmaku) published by users in video-playing software, short videos published by users, and the like. In practice, business data may contain harmful content (e.g., advertising, abuse, or pornography), so risk audits of the business data are required. The target business data is the business data to be risk-audited; it may be text data, image data, video data, audio data, or any other type of data.
In some embodiments, an initial risk audit result is obtained by performing a risk audit on the target business data. The result is of either a first type or a second type: the first type of initial risk audit result may be "no risk" and the second type "at risk". In some embodiments, the server performs the risk audit itself. Specifically, the server may apply a mathematical model to the target business data to obtain the initial risk audit result. Alternatively, the server may present the target business data to an auditor, who enters the initial risk audit result into the server, which receives it. In other embodiments, the server sends the target business data to another device. The other device receives the target business data, risk-audits it to obtain an initial risk audit result, and sends that result back to the server, which receives it. The other device may perform its risk audit either by applying a mathematical model to the target business data, or by presenting the data to an auditor, receiving the initial risk audit result the auditor enters.
In some embodiments, the server may calculate a feature string of the target business data. The feature string can identify the target business data and can be used to calculate the similarity between the target business data and other business data. How the feature string is calculated depends on the type of the target business data; the calculation is described below taking text data and image data as examples.
Please refer to FIG. 3 and FIG. 4. If the target business data is text data, the feature string may be calculated by the following steps. The calculation is described here taking the SimHash algorithm as an example; the embodiments of this specification do not exclude other ways of calculating the feature string. For example, the server may instead obtain a word vector (word embedding) of the text data via a neural network model and use it as the feature string.
Step S302: extract keywords from the text data.
The server may perform word segmentation on the text data to obtain a plurality of words, and select one or more of them as keywords. For example, the server may select words that occur in the text data at least T1 times, where T1 is a preset threshold. As another example, the server may select the T2 most frequent words in the text data, where T2 is a preset threshold. Of course, the server may also treat every word as a keyword.
Step S304: acquire the weight of each keyword in the text data and the keyword's hash value.
The weight may be used to represent the degree of importance of a keyword in the text data. The magnitude of the weight is positively correlated with the degree of importance of the keyword in the text data. In practical applications, the server may use the number of times of occurrence of the keyword in the text data as a weight. Or, the server may also assign a weight to the keyword according to the number of times the keyword appears in the text data. For example, if the number of times of occurrence of a keyword in the text data is in a first numerical range, the server may assign a weight of 1 to the keyword; if the number of times of occurrence of the keyword in the text data is within a second numerical range, the server may assign a weight 2 to the keyword; if the number of times of occurrence of the keyword in the text data is in a third numerical range, the server may assign a weight 3 to the keyword.
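The range-based weight assignment described above can be sketched as follows; the range boundaries are hypothetical, since the text does not fix the three numeric ranges:

```python
# Hypothetical weight assignment from a keyword's occurrence count,
# following the three numeric ranges described above.
def keyword_weight(count):
    if count <= 2:        # first numeric range -> weight 1
        return 1
    elif count <= 5:      # second numeric range -> weight 2
        return 2
    else:                 # third numeric range -> weight 3
        return 3

print(keyword_weight(1), keyword_weight(4), keyword_weight(9))  # -> 1 2 3
```

Using the occurrence count directly as the weight, as the text also allows, would simply replace this function with the identity.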
The hash value may be computed with, for example, MD5 (Message-Digest Algorithm 5), SHA-1, SHA-256, SHA-384, or SHA-512 (the Secure Hash Algorithm family), or CRC32 (a cyclic redundancy check).
Step S306: weight the digit at each position of the hash value according to the weight, obtaining the keyword's weighting result.
A digit position refers to the place each digit occupies in a number. For example, 12345 has 5 digit positions: units, tens, hundreds, thousands, and ten-thousands. The digit in the units place is 5, in the tens place 4, in the hundreds place 3, in the thousands place 2, and in the ten-thousands place 1.
For each keyword, the server may weight the digit at each position of the keyword's hash value according to the keyword's weight to obtain the keyword's weighting result. In practice, for each position in the hash value: if the digit at that position is 1, the server uses the weight as that position's weighting result; if the digit is 0, the server uses the negative of the weight. The server may then take the sequence formed by the weighting results of all positions as the keyword's weighting result. For example, if a keyword's hash value is 100110 and its weight is 5, weighting the digits of 100110 by 5 gives the weighting result 5, -5, -5, 5, 5, -5. As another example, if a keyword's hash value is 110000 and its weight is 4, weighting the digits of 110000 by 4 gives the weighting result 4, 4, -4, -4, -4, -4.
Step S308: add up the numbers at the same position across the weighting results of the keywords to obtain an accumulation result.
The server may extract a plurality of keywords from the text data and obtain a weighting result for each. It can then add up the numbers at the same position across the weighting results to obtain an accumulation result. For example, suppose the server extracts keywords feature1, feature2, and feature3 from the text data, with weighting results w1, -w1, -w1, w1, w1, -w1 and w2, w2, -w2, -w2, -w2, -w2 and -w3, -w3, w3, -w3, -w3, w3 respectively. Adding the numbers at the same position gives the accumulation result w1+w2-w3, -w1+w2-w3, -w1-w2+w3, w1-w2-w3, w1-w2-w3, -w1-w2+w3. Concretely, if the three weighting results are 5, -5, -5, 5, 5, -5 and 4, 4, -4, -4, -4, -4 and -3, -3, 3, -3, -3, 3, the accumulation result is 6, -4, -6, -2, -2, -6.
Step S310: perform dimension reduction on the accumulation result to obtain the feature string of the text data.
In the accumulation result, the number at some positions may have more than one digit (e.g., a two- or three-digit number). The server may reduce the dimensionality of the accumulation result so that each position holds a single digit: for each position, if the number is greater than 0, the reduction result for that position is 1; if it is less than or equal to 0, the reduction result is 0. The server may then take the sequence formed by the reduction results as the feature string of the text data. For example, the accumulation result 6, -4, -6, -2, -2, -6 yields the feature string 100000; the accumulation result 13, 108, -22, -5, -32, 55 yields 110001. Of course, the server may also reduce the accumulation result in other ways.
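Steps S302 through S310 can be sketched in Python as follows. This is a minimal illustration, assuming toy 6-bit hashes and the weighted keywords from the worked example above; a real SimHash implementation would typically use 64-bit hashes (e.g., derived via hashlib), and the `simhash` name is illustrative:

```python
# Minimal SimHash sketch: weight each hash bit, accumulate per position,
# then reduce each position to a single bit.
def simhash(weighted_keywords, bits=6):
    # weighted_keywords: list of (hash_string, weight) pairs, where
    # hash_string is a fixed-length bit string such as "100110".
    totals = [0] * bits
    for hash_str, weight in weighted_keywords:
        for i, bit in enumerate(hash_str):
            # Step S306: +weight where the hash bit is 1, -weight where it is 0.
            totals[i] += weight if bit == "1" else -weight
    # Step S310: dimension reduction (positive -> 1, otherwise 0).
    return "".join("1" if t > 0 else "0" for t in totals)

# The three keywords from the worked example:
# hashes 100110 (w=5), 110000 (w=4), 001001 (w=3).
print(simhash([("100110", 5), ("110000", 4), ("001001", 3)]))  # -> 100000
```

The intermediate accumulation for this input is 6, -4, -6, -2, -2, -6, matching the example in the text.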
Referring to FIG. 5, if the target business data is image data, the feature string may be calculated by the following steps. The calculation is described here taking the DHash algorithm, a perceptual hashing algorithm, as an example; the embodiments of this specification do not exclude other ways of calculating the feature string. For example, the server may instead obtain the feature string of the image data with the PHash perceptual hashing algorithm.
Step S502: obtain a corresponding grayscale image from the image data.
The image data may be a color image, which the server converts to grayscale. Optionally, the server may first scale the color image and then convert the scaled image to grayscale. For example, the server may scale the color image to 9 × 8 pixels and then convert it to grayscale, so that the grayscale image contains 72 pixels.
Step S504: calculate the difference values between adjacent pixels in each row of the grayscale image.
For each row of pixels in the grayscale image, the server compares the values of adjacent pixels: if the left pixel's value is greater than the right pixel's, the server uses 1 as the difference value between them; otherwise it uses 0. For example, suppose a grayscale image has n × m pixels, i.e., m rows of n pixels each. Each row then contains n-1 pairs of adjacent pixels, so the server obtains n-1 difference values per row and m × (n-1) difference values for the whole image. Of course, the server may also calculate the difference values in other ways.
Step S506: construct the feature string of the image data from the difference values.
The server may combine the obtained difference values to obtain a characteristic character string of the image data. The embodiments of the present disclosure are not limited to specific combinations. For example, for each row of pixels in the grayscale map, the server may combine difference values between adjacent pixels in the row of pixels to obtain a character string of the row of pixels. Further, the server may combine character strings of each row of pixel points to obtain a feature character string of the image data.
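Steps S502 through S506 can be sketched as follows, assuming the grayscale image is already scaled and given as a row-major list of pixel rows; the `dhash` name and the toy 3×2 image are illustrative:

```python
# Minimal DHash sketch over a grayscale matrix (list of rows of pixel values).
def dhash(gray):
    bits = []
    for row in gray:
        for left, right in zip(row, row[1:]):
            # Step S504: 1 if the left pixel is brighter than the right, else 0.
            bits.append("1" if left > right else "0")
    # Step S506: concatenate the row-wise difference values into the feature string.
    return "".join(bits)

# A toy image with 2 rows of 3 pixels: each row yields 2 difference bits.
print(dhash([[10, 5, 7], [3, 3, 9]]))  # -> 1000
```

With the 9 × 8 scaling from step S502, each of the 8 rows would yield 8 difference bits, giving a 64-bit feature string.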
In some embodiments, the server may add the target business data to a first data set as audited business data. The first data set may contain at least one piece of audited business data, each with a corresponding initial risk audit result and feature string.
Step S14: acquire the audited business data whose similarity to the target business data satisfies a first condition.
In some embodiments, the audited business data is business data that has already been risk-audited. The server may select from the first data set the audited business data whose similarity to the target business data satisfies the first condition. Specifically, the server may determine the similarity between the target business data and each piece of audited business data in the first data set from their feature strings, and select those whose similarity satisfies the first condition. In this way the server obtains a plurality of pieces of audited business data whose content is similar to that of the target business data.
The server can calculate the Hamming distance, edit distance (minimum edit distance), or cosine similarity between the feature string of the target business data and that of the audited business data, and use it to represent the similarity between the two. The first condition depends on the chosen representation. For example, the server may use Hamming distance: the smaller the distance, the greater the similarity, and vice versa. The Hamming distance between two feature strings is the number of positions at which their characters differ; for example, the Hamming distance between 1011101 and 1001001 is 2. The first condition may then be: the Hamming distance is less than or equal to T3, where T3 is a preset threshold such as 3, 4, or 6.
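The Hamming-distance check can be sketched as follows, using the example strings from the text and a hypothetical threshold T3 = 3:

```python
# Hamming distance between two equal-length feature strings:
# the number of positions at which their characters differ.
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("1011101", "1001001"))       # -> 2
# First-condition check with the hypothetical threshold T3 = 3:
print(hamming_distance("1011101", "1001001") <= 3)  # -> True
```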
It is noted that, before step S14, the server may add the target business data as audited business data to the first data set. Thus, in step S14, the audited service data acquired by the server includes the target service data. Alternatively, after step S14, the server may add the target business data as audited business data to the first data set. Thus, in step S14, the audited service data acquired by the server does not include the target service data.
Step S16: calculate the audit consistency rate according to the initial risk audit results of the audited business data.
In some embodiments, the audit consistency rate indicates the degree of inconsistency among risk audits of multiple similar contents: the larger the rate, the smaller the inconsistency; the smaller the rate, the greater the inconsistency.
In some embodiments, the server may obtain a plurality of audited business data, each corresponding to an initial risk audit result, via step S14. The server may use a set formed by the initial risk audit results of the plurality of audited business data as a risk audit result set. That is, the risk review result set may include a plurality of initial risk review results, where each initial risk review result corresponds to one obtained reviewed service data. In the risk review result set, the server may count the number of first type initial risk review results as a first number and the number of second type initial risk review results as a second number; a difference between the first number and the second number may be calculated; and calculating the auditing consistency rate according to the difference and the number of the initial risk auditing results in the risk auditing result set.
For example, the server may calculate the audit consistency rate according to the formula

CheckSameResult = |Σ_type1 r + Σ_type2 r| / Σ |r|

where r denotes an initial risk audit result in the risk audit result set: r = 1 if the result is of the first type, and r = -1 if it is of the second type. Σ_type1 r is the first number, -Σ_type2 r is the second number, and |Σ_type1 r - (-Σ_type2 r)| = |Σ_type1 r + Σ_type2 r| is the difference between them, while Σ |r| is the number of initial risk audit results in the set. It follows from the formula that if the types of the initial risk audit results in the set are completely consistent, CheckSameResult = 1; if they are completely inconsistent (i.e., half of the initial risk audit results in the set are of the first type and the other half of the second type), CheckSameResult = 0.
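The formula above transcribes directly, encoding a first-type result as +1 and a second-type result as -1 (the function name is illustrative):

```python
# Audit consistency rate: |sum of r| / number of results,
# with r = +1 for a first-type (no-risk) result and r = -1
# for a second-type (at-risk) result.
def check_same_result(results):
    return abs(sum(results)) / sum(abs(r) for r in results)

print(check_same_result([1, 1, 1, 1]))    # all consistent -> 1.0
print(check_same_result([1, 1, -1, -1]))  # evenly split   -> 0.0
print(check_same_result([1, 1, 1, -1]))   # -> 0.5
```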
Step S18: determine, according to the audit consistency rate, whether to perform quality inspection on the initial risk audit result of the target business data.
In some embodiments, if the audit consistency rate satisfies a second condition, the server determines not to quality-inspect the initial risk audit result of the target business data; if it does not, the server determines to quality-inspect it. The second condition may be: the audit consistency rate is greater than or equal to σ, where σ is a preset threshold. In practice, the amount of business data sent for quality inspection can be controlled through σ: raising σ makes the second condition harder to satisfy, so more business data is quality-inspected; lowering σ makes it easier to satisfy, so less is. Thus, if quality inspectors are few and human resources are scarce, σ can be set smaller so that less business data is quality-inspected; if inspectors are many and resources are ample, σ can be set larger so that more business data is quality-inspected. A quality inspector can be understood as a trusted auditor.
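The second-condition check can be sketched as follows; σ = 0.6 here is purely an illustrative threshold, since the text leaves its value to be configured:

```python
# Quality inspection is needed when the audit consistency rate fails
# the second condition (rate >= sigma). sigma = 0.6 is illustrative.
def needs_quality_inspection(consistency_rate, sigma=0.6):
    return consistency_rate < sigma

print(needs_quality_inspection(0.5))  # -> True  (inconsistent audits: inspect)
print(needs_quality_inspection(0.9))  # -> False (consistent audits: skip)
```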
In some embodiments, in a case that it is determined that quality inspection is not performed on the initial risk review result of the target business data, the server may use the initial risk review result of the target business data as a final risk review result.
In some embodiments, the server may obtain a plurality of secondary risk audit results of the target service data when it is determined that the initial risk audit result of the target service data is to be subjected to quality inspection; the types of most of the secondary risk auditing results in the plurality of secondary risk auditing results can be obtained; the initial risk audit result of the target business data can be corrected according to the types of most of the secondary risk audit results, and a final risk audit result is obtained. It is worth noting that each secondary risk audit result can be determined by a quality inspector. By voting the type of the secondary risk auditing result, errors caused by cognitive difference of a single quality inspector can be avoided, and the final risk auditing result of the target business data can be more accurately obtained. The greater the number of quality testing personnel participating in the voting, the higher the credibility of the final risk auditing result. In practical application, the number of quality testing personnel participating in voting can be determined according to requirements.
In some embodiments, the server may provide the target business data to a plurality of quality inspection personnel. Each quality inspector can input a secondary risk audit result of the target business data in the server, and the server can receive the plurality of secondary risk audit results input by the plurality of quality inspectors. In other embodiments, the server may send the target business data to a plurality of other devices, each corresponding to a quality inspector. Each of the other devices may receive the target business data; the quality inspector corresponding to each device can input a secondary risk audit result of the target business data in that device. Each of the other devices can receive the secondary risk audit result input by its quality inspector and send it to the server. The server can receive the plurality of secondary risk audit results sent by the plurality of other devices.
The type of a secondary risk audit result may be the first type or the second type; that is, the secondary risk audit result may be a first-type secondary risk audit result or a second-type secondary risk audit result. A first-type secondary risk audit result may be risk-free, and a second-type secondary risk audit result may be risky. The server may count the number of first-type secondary risk audit results as a third number and the number of second-type secondary risk audit results as a fourth number; the third number and the fourth number may be compared to obtain the type of the majority of the secondary risk audit results. For example, the server may determine the type of the majority of the secondary risk audit results based on the formula Rx = ∑(i=1..p) Rqc,i. Here, p represents the number of secondary risk audit results, and Rqc represents a secondary risk audit result. If the secondary risk audit result is a first-type secondary risk audit result, Rqc is 1; if it is a second-type secondary risk audit result, Rqc is -1. Thus, if Rx > 0, the server may determine that the type of the majority of the secondary risk audit results is the first type; if Rx < 0, the server may determine that it is the second type.
The server can correct the initial risk audit result of the target service data according to the type of the majority of the secondary risk audit results, and may use the corrected initial risk audit result as the final risk audit result. Correcting the initial risk audit result of the target service data may include: if the type of the initial risk audit result is inconsistent with the type of the majority of the secondary risk audit results, modifying the type of the initial risk audit result; if the types are consistent, keeping the type of the initial risk audit result unchanged.
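Under the same +1/-1 encoding, the voting and correction steps above can be sketched as follows; how a tie (Rx = 0) is resolved is left open by the formula, so the tie branch here is an assumption:

```python
def majority_vote_type(secondary_results):
    """Rx = sum of quality-inspector votes (+1 first type, -1 second
    type). Rx > 0 -> majority is first type; Rx < 0 -> second type."""
    r_x = sum(secondary_results)
    if r_x > 0:
        return 1
    if r_x < 0:
        return -1
    return 0  # tie; the patent's formula leaves this case open (assumption)

def final_result(initial, secondary_results):
    """Keep the initial result if it matches the majority type;
    otherwise modify it to the majority type."""
    majority = majority_vote_type(secondary_results)
    return initial if majority in (0, initial) else majority

print(final_result(1, [-1, -1, 1]))   # majority says risk -> corrected to -1
print(final_result(-1, [-1, 1, -1]))  # consistent with votes -> stays -1
```

Using an odd number of voters avoids the tie branch entirely, which is one practical reason to fix the number of participating inspectors in advance.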
In some embodiments, the server may further add the target business data as quality checked business data to the second data set. The second data set may include at least one quality-checked service data, and each quality-checked service data corresponds to a final risk audit result and a feature string.
In some embodiments, the server may obtain quality-checked service data whose similarity with the target service data satisfies a third condition, and determine an initial risk audit result of the target service data according to the final risk audit result of that quality-checked service data. The quality-checked service data may be service data whose initial risk audit result has been quality-checked by quality inspection personnel. The server may select, from the second data set, the quality-checked service data whose similarity with the target service data satisfies the third condition. The third condition may be understood with reference to the first condition, and the process of obtaining the quality-checked service data may be understood with reference to the process of obtaining the audited service data in step S14. The obtained quality-checked service data can serve as a reference object to assist in determining an initial risk audit result of the target service data.
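Since the claims measure similarity between feature strings by Hamming distance, the retrieval step above can be sketched as follows; the `max_dist` threshold is an assumption standing in for the patent's unspecified "third condition":

```python
def hamming_distance(fp_a, fp_b):
    """Hamming distance between two equal-length feature strings."""
    if len(fp_a) != len(fp_b):
        raise ValueError("feature strings must have equal length")
    return sum(a != b for a, b in zip(fp_a, fp_b))

def select_quality_checked(target_fp, second_data_set, max_dist=3):
    """second_data_set: list of (feature_string, final_risk_result).
    max_dist is an assumed similarity threshold standing in for the
    patent's 'third condition'."""
    return [(fp, res) for fp, res in second_data_set
            if hamming_distance(target_fp, fp) <= max_dist]

data = [("10110011", -1), ("01001100", 1), ("10110111", -1)]
print(select_quality_checked("10110010", data))
# -> [('10110011', -1), ('10110111', -1)]
```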
In some embodiments, the server may provide the target business data, the quality-checked business data, and the final risk audit result of the quality-checked business data to the auditor. The auditor can determine an initial risk audit result of the target service data according to the quality checked service data and the final risk audit result corresponding to the quality checked service data; the initial risk audit result of the target business data can be input in the server. The server may receive an initial risk review result for the target business data. In other embodiments, the server may send the target service data, the quality-checked service data, and the final risk audit result of the quality-checked service data to other devices. The other equipment can receive the target service data, the quality-checked service data and the final risk auditing result of the quality-checked service data; the final risk auditing results of the target business data, the quality-inspected business data and the quality-inspected business data can be provided for the auditors. The auditor can determine an initial risk audit result of the target service data according to the quality checked service data and the final risk audit result corresponding to the quality checked service data; the initial risk review result of the target business data may be input in the other device. The other device may receive an initial risk review result; an initial risk review result may be sent to the server. The server may receive an initial risk review result.
According to the technical scheme provided by the embodiments of this specification, an initial risk audit result of the target service data can be obtained; audited service data whose similarity with the target service data satisfies a first condition can be obtained; the audit consistency rate can be calculated according to the initial risk audit results of the audited service data; and whether to perform quality inspection on the initial risk audit result of the target service data can be determined according to the audit consistency rate. The audit consistency rate may be used to indicate the degree of consistency when risk auditing is performed on a plurality of similar contents. Determining whether to quality-inspect the initial risk audit result according to the audit consistency rate reduces risk omission while saving quality inspection cost; in addition, among the service data extracted for quality inspection, the proportion of service data with erroneous initial risk audit results is higher, improving the effectiveness of quality inspection.
Please refer to fig. 6. An embodiment of the present specification further provides a quality inspection apparatus, including:
a first obtaining unit 62, configured to obtain an initial risk auditing result of the target service data;
a second obtaining unit 64, configured to obtain audited service data whose similarity with the target service data meets a first condition;
the calculating unit 66 is configured to calculate an audit consistency rate according to an initial risk audit result of the audited business data;
and the determining unit 68 is configured to determine whether to perform quality inspection on the initial risk inspection result of the target service data according to the inspection consistency rate.
An embodiment of an electronic device of the present description is described below. Fig. 7 is a schematic diagram of a hardware configuration of the electronic apparatus in this embodiment. As shown in fig. 7, the electronic device may include one or more processors (only one of which is shown), memory, and a transmission module. Of course, it is understood by those skilled in the art that the hardware structure shown in fig. 7 is only an illustration, and does not limit the hardware structure of the electronic device. In practice the electronic device may also comprise more or fewer component elements than those shown in fig. 7; or have a different configuration than that shown in fig. 7.
The memory may comprise high-speed random access memory; alternatively, it may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. Of course, the memory may also comprise remotely located network storage, which may be connected to the electronic device through a network such as the internet, an intranet, a local area network, or a mobile communication network. The memory may be used to store program instructions or modules of application software, such as the program instructions or modules used to implement the embodiment corresponding to fig. 1 of this specification.
The processor may be implemented in any suitable way. For example, the processor may take the form of, for example, a microprocessor or processor and a computer-readable medium that stores computer-readable program code (e.g., software or firmware) executable by the (micro) processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth. The processor may read and execute the program instructions or modules in the memory.
The transmission module may be used for data transmission via a network, for example via a network such as the internet, an intranet, a local area network, a mobile communication network, etc.
This specification also provides one embodiment of a computer storage medium. The computer storage medium includes, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Cache (Cache), a Hard Disk (HDD), a Memory Card (Memory Card), and the like. The computer storage medium stores computer program instructions. The computer program instructions when executed implement: the quality inspection method in the embodiment corresponding to fig. 1 in this specification.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, and the same or similar parts in each embodiment may be referred to each other, and each embodiment focuses on differences from other embodiments. In addition, it is understood that one skilled in the art, after reading this specification document, may conceive of any combination of some or all of the embodiments listed in this specification without the need for inventive faculty, which combinations are also within the scope of the disclosure and protection of this specification.
In the 1990s, an improvement in a technology could be clearly distinguished as either an improvement in hardware (e.g., an improvement in a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement in a process flow). However, as technology advances, many of today's process-flow improvements can be regarded as direct improvements in hardware circuit structure. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a process flow cannot be realized by hardware physical modules. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a PLD by programming, without requiring a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays such programming is mostly implemented with "logic compiler" software rather than by manually fabricating an integrated circuit chip; the logic compiler is similar to a software compiler used in program development, and the source code before compiling is written in a particular programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
From the above description of the embodiments, it is clear to those skilled in the art that the present specification can be implemented by software plus a necessary general hardware platform. Based on such understanding, the technical solutions of the present specification may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments of the present specification.
The description is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
While the specification has been described with examples, those skilled in the art will appreciate that there are numerous variations and permutations of the specification that do not depart from the spirit of the specification, and it is intended that the appended claims include such variations and modifications that do not depart from the spirit of the specification.
Claims (13)
1. A method of quality inspection comprising:
acquiring an initial risk auditing result of target service data;
acquiring audited service data with the similarity meeting a first condition with the target service data;
calculating an audit consistency rate according to an initial risk audit result of the audited business data;
and determining whether to perform quality inspection on the initial risk inspection result of the target service data according to the inspection consistency rate.
2. The method of claim 1, further comprising:
calculating a characteristic character string of the target service data;
and adding the target business data serving as audited business data into a first data set, wherein the first data set comprises at least one audited business data, and each audited business data corresponds to an initial risk audit result and a characteristic character string.
3. The method of claim 2, the target service data comprising text data;
the calculating the characteristic character string of the target service data comprises the following steps:
extracting key words from the text data;
acquiring the weight of the keyword in the text data and the hash value of the keyword;
according to the weight, weighting the digits on each digit in the hash value to obtain a weighting result of the keyword;
accumulating the numbers on the same digit in the weighted result of each keyword to obtain an accumulated result;
and performing dimension reduction processing on the accumulated result to obtain the characteristic character string of the text data.
4. The method of claim 2, the target service data comprising image data;
the calculating the characteristic character string of the target service data comprises the following steps:
acquiring a corresponding gray-scale image according to the image data;
aiming at each row of pixel points of the gray-scale image, calculating the difference value between adjacent pixel points in the row of pixel points;
and constructing a characteristic character string of the image data according to the difference value.
5. The method of claim 1, wherein the obtaining of the audited business data whose similarity to the target business data satisfies a first condition comprises:
determining similarity between target business data and audited business data in a first data set according to the characteristic character string of the target business data and the characteristic character string of the audited business data in the first data set, wherein the first data set comprises at least one audited business data, and each audited business data corresponds to a characteristic character string;
and selecting the audited business data with the similarity meeting a first condition from the first data set.
6. The method of claim 5, wherein determining the similarity between the target business data and the audited business data in the first data set comprises:
and calculating the hamming distance between the characteristic character string of the target business data and the characteristic character string of the audited business data, wherein the hamming distance is used for representing the similarity between the target business data and the audited business data.
7. The method of claim 1, the calculating an audit compliance rate, comprising:
in a risk auditing result set, counting the number of first type initial risk auditing results as a first number and the number of second type initial risk auditing results as a second number, wherein the risk auditing result set comprises a plurality of initial risk auditing results, and each initial risk auditing result corresponds to one acquired audited service data;
calculating a difference between the first number and the second number;
and calculating the auditing consistency rate according to the difference and the number of the initial risk auditing results in the risk auditing result set.
8. The method of claim 1, wherein the determining whether to perform quality inspection on the initial risk review result of the target business data comprises:
if the audit consistency rate does not meet the second condition, determining that the quality inspection is performed on the initial risk audit result of the target service data;
and if the audit consistency rate meets the second condition, determining that the quality inspection is not performed on the initial risk audit result of the target service data.
9. The method of claim 8, in the case that the quality inspection is determined to be performed on the initial risk audit result of the target business data, the method further comprises:
acquiring a plurality of secondary risk auditing results of target service data, wherein the secondary risk auditing results are determined by quality inspectors;
obtaining the types of most of the secondary risk auditing results in the plurality of secondary risk auditing results;
and correcting the initial risk auditing result of the target business data according to the types of the majority of secondary risk auditing results.
10. The method of claim 9, the method further comprising:
and adding the target business data serving as quality-checked business data into a second data set, wherein the second data set comprises at least one quality-checked business data, and each quality-checked business data corresponds to a final risk auditing result.
11. The method of claim 1, wherein obtaining an initial risk review result for the target business data comprises:
acquiring quality-checked service data of which the similarity with the target service data meets a third condition;
and determining an initial risk checking result of the target service data according to the final risk checking result of the quality checked service data.
12. A quality testing device comprising:
the first acquisition unit is used for acquiring an initial risk auditing result of the target service data;
the second acquisition unit is used for acquiring the audited service data of which the similarity with the target service data meets a first condition;
the calculation unit is used for calculating the auditing consistency rate according to the initial risk auditing result of the audited business data;
and the determining unit is used for determining whether to perform quality inspection on the initial risk inspection result of the target service data according to the inspection consistency rate.
13. An electronic device, comprising:
at least one processor;
a memory storing program instructions configured for execution by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011276569.2A CN112381408B (en) | 2020-11-16 | 2020-11-16 | Quality inspection method and device and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011276569.2A CN112381408B (en) | 2020-11-16 | 2020-11-16 | Quality inspection method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112381408A true CN112381408A (en) | 2021-02-19 |
CN112381408B CN112381408B (en) | 2022-10-14 |
Family
ID=74584634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011276569.2A Active CN112381408B (en) | 2020-11-16 | 2020-11-16 | Quality inspection method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112381408B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108470028A (en) * | 2017-02-23 | 2018-08-31 | 北京唱吧科技股份有限公司 | A kind of picture examination method and apparatus |
CN109191080A (en) * | 2018-09-17 | 2019-01-11 | 北京点网聚科技有限公司 | One quality testing method and device |
CN109284894A (en) * | 2018-08-10 | 2019-01-29 | 广州虎牙信息科技有限公司 | Picture examination method, apparatus, storage medium and computer equipment |
CN109756746A (en) * | 2018-12-28 | 2019-05-14 | 广州华多网络科技有限公司 | Video reviewing method, device, server and storage medium |
CN109803176A (en) * | 2018-12-28 | 2019-05-24 | 广州华多网络科技有限公司 | Audit monitoring method, device, electronic equipment and storage medium |
CN110149529A (en) * | 2018-11-01 | 2019-08-20 | 腾讯科技(深圳)有限公司 | Processing method, server and the storage medium of media information |
CN110674255A (en) * | 2019-09-24 | 2020-01-10 | 湖南快乐阳光互动娱乐传媒有限公司 | Text content auditing method and device |
CN110675269A (en) * | 2019-08-16 | 2020-01-10 | 阿里巴巴集团控股有限公司 | Text auditing method and device |
CN110765596A (en) * | 2019-10-10 | 2020-02-07 | 北京字节跳动网络技术有限公司 | Simulation model modeling method and device for auditing process and electronic equipment |
CN111314566A (en) * | 2020-01-20 | 2020-06-19 | 北京神州泰岳智能数据技术有限公司 | Voice quality inspection method, device and system |
- 2020-11-16 CN CN202011276569.2A patent/CN112381408B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108470028A (en) * | 2017-02-23 | 2018-08-31 | 北京唱吧科技股份有限公司 | A kind of picture examination method and apparatus |
CN109284894A (en) * | 2018-08-10 | 2019-01-29 | 广州虎牙信息科技有限公司 | Picture examination method, apparatus, storage medium and computer equipment |
CN109191080A (en) * | 2018-09-17 | 2019-01-11 | 北京点网聚科技有限公司 | One quality testing method and device |
CN110149529A (en) * | 2018-11-01 | 2019-08-20 | 腾讯科技(深圳)有限公司 | Processing method, server and the storage medium of media information |
CN109756746A (en) * | 2018-12-28 | 2019-05-14 | 广州华多网络科技有限公司 | Video reviewing method, device, server and storage medium |
CN109803176A (en) * | 2018-12-28 | 2019-05-24 | 广州华多网络科技有限公司 | Audit monitoring method, device, electronic equipment and storage medium |
CN110675269A (en) * | 2019-08-16 | 2020-01-10 | 阿里巴巴集团控股有限公司 | Text auditing method and device |
CN110674255A (en) * | 2019-09-24 | 2020-01-10 | 湖南快乐阳光互动娱乐传媒有限公司 | Text content auditing method and device |
CN110765596A (en) * | 2019-10-10 | 2020-02-07 | 北京字节跳动网络技术有限公司 | Simulation model modeling method and device for auditing process and electronic equipment |
CN111314566A (en) * | 2020-01-20 | 2020-06-19 | 北京神州泰岳智能数据技术有限公司 | Voice quality inspection method, device and system |
Also Published As
Publication number | Publication date |
---|---|
CN112381408B (en) | 2022-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108985066B (en) | Intelligent contract security vulnerability detection method, device, terminal and storage medium | |
US20130188863A1 (en) | Method for context aware text recognition | |
CN112784582A (en) | Error correction method and device and computing equipment | |
CN115795000A (en) | Joint similarity algorithm comparison-based enclosure identification method and device | |
CN114265740A (en) | Error information processing method, device, equipment and storage medium | |
CN110069594B (en) | Contract confirmation method, contract confirmation device, electronic equipment and storage medium | |
CN111079379A (en) | Shape and proximity character acquisition method and device, electronic equipment and storage medium | |
CN111008624A (en) | Optical character recognition method and method for generating training sample for optical character recognition | |
CN112381408B (en) | Quality inspection method and device and electronic equipment | |
CN113157583A (en) | Test method, device and equipment | |
JP2019537177A (en) | Method and apparatus for bar code identification | |
CN116127925B (en) | Text data enhancement method and device based on destruction processing of text | |
CN118396786A (en) | Contract document auditing method and device, electronic equipment and computer readable storage medium | |
CN112364630B (en) | License content error correction method, device and system | |
CN111881382B (en) | Information display method and device, system and medium implemented by computer system | |
CN115203364A (en) | Software fault feedback processing method, device, equipment and readable storage medium | |
CN113282837A (en) | Event analysis method and device, computer equipment and storage medium | |
CN114357978A (en) | Document comparison method and device, computer equipment and storage medium | |
CN113254787A (en) | Event analysis method and device, computer equipment and storage medium | |
CN113420699A (en) | Face matching method and device and electronic equipment | |
CN113239226A (en) | Image retrieval method, device, equipment and storage medium | |
CN113836297A (en) | Training method and device for text emotion analysis model | |
CN113743902A (en) | Information auditing method and device based on artificial intelligence, terminal equipment and medium | |
CN111914868A (en) | Model training method, abnormal data detection method and device and electronic equipment | |
CN114492413B (en) | Text proofreading method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||