
CN115017027A - Interface automation continuous integration test method, device, equipment and storage medium - Google Patents


Info

Publication number
CN115017027A
CN115017027A
Authority
CN
China
Prior art keywords
target
test
cases
flow data
test case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110240392.9A
Other languages
Chinese (zh)
Inventor
刘璐辰
林晓升
高玉军
邹意林
杨萍
曹紫光
卢凯旋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202110240392.9A
Publication of CN115017027A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/21Design, administration or maintenance of databases
    • G06F16/215Improving data quality; Data cleansing, e.g. de-duplication, removing invalid entries or correcting typographical errors

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Data Mining & Analysis (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application provides an interface automation continuous integration test method and apparatus, a computer device, and a storage medium, relates to the technical field of testing, and is used to realize automated interface testing. The method mainly comprises the following steps: collecting online traffic data, and cleaning the collected traffic data to obtain cleaned target traffic data; calling a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used to test a target interface or a target service; scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result; and asserting the execution result against the expected result corresponding to the target test case set to obtain an assertion result.

Description

Interface automation continuous integration test method, device, equipment and storage medium
Technical Field
The present application relates to the field of testing technologies, and in particular, to an interface automation continuous integration test method and apparatus, a computer device, and a storage medium.
Background
Application programming interface (API) testing is a software testing method in which indexes such as the functional integrity, reliability, and security of a service are verified by calling its APIs during integration testing.
In traditional interface testing, a user must manually capture packets and then manually write test cases, so the labor cost of interface testing is very high.
Disclosure of Invention
The embodiment of the application provides an interface automation continuous integration test method and apparatus, a computer device, and a storage medium, which are used to realize automated interface testing.
An embodiment of the invention provides an interface automation continuous integration test method, which comprises the following steps:
collecting online traffic data, and cleaning the collected traffic data to obtain cleaned target traffic data;
calling a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used to test a target interface or a target service;
scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result;
and asserting the execution result according to the expected result corresponding to the target test case set to obtain an assertion result.
An embodiment of the invention provides an interface automation continuous integration testing apparatus, which comprises:
an acquisition module, used to collect online traffic data and clean the collected traffic data to obtain cleaned target traffic data;
a generating module, used to call a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used to test a target interface or a target service;
an execution module, used to schedule, distribute, and execute the target test case set based on test machine information and acquire a corresponding execution result;
and a determining module, used to assert the execution result according to the expected result corresponding to the target test case set to obtain an assertion result.
A computer device comprises a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the computer program, the interface automation continuous integration test method above is realized.
A computer-readable storage medium stores a computer program which, when executed by a processor, carries out the interface automation continuous integration test method above.
The invention provides an interface automation continuous integration test method and apparatus, a computer device, and a storage medium. The method comprises: collecting online traffic data, and cleaning the collected traffic data to obtain cleaned target traffic data; calling a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used to test a target interface or a target service; scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result; and asserting the execution result according to the expected result corresponding to the target test case set to obtain an assertion result. The invention thereby realizes automated interface testing.
Drawings
FIG. 1 is a flowchart of a method for automated continuous integration testing of an interface according to an embodiment of the present application;
FIG. 2 is a flowchart of generating a target test case set according to an embodiment of the present application;
FIG. 3 is a flowchart of generating a target test case set according to an embodiment of the present application;
FIG. 4 is a flow chart of screening qualified use cases as test use cases according to an embodiment of the present application;
FIG. 5 is a flow chart of screening qualified use cases as test use cases according to an embodiment of the present application;
FIG. 6 is a block diagram of an interface automation continuous integration testing apparatus according to an embodiment of the present application;
FIG. 7 is a diagram illustrating a computer device according to an embodiment of the present application.
Detailed Description
To better understand the technical solutions described above, the technical solutions of the embodiments of the present application are described in detail below with reference to the drawings and specific embodiments. It should be understood that the specific features of the embodiments are detailed descriptions of, not limitations on, the technical solutions of the present application, and that the technical features of the embodiments may be combined with each other where there is no conflict.
Referring to FIG. 1, an interface automation continuous integration test method according to an embodiment of the present invention is shown; the method specifically includes steps S101 to S104.
S101, collecting flow data on a line, and cleaning the collected flow data to obtain cleaned target flow data.
In this embodiment, the traffic data may be collected at regular intervals or in real time, so that newly added interfaces and traffic increments are sensed and recorded.
To ensure the security of the data in the traffic, sensitive information in the collected traffic data must be masked, replaced, and desensitized. Specifically, the step of cleaning the collected traffic data to obtain cleaned target traffic data includes: when the collected traffic data includes first-type traffic data belonging to a specified service, replacing the website parameters contained in the first-type traffic data with preset desensitized website parameters to obtain cleaned target traffic data; and when the collected traffic data includes second-type traffic data containing specified-type information, desensitizing the specified-type information contained in the second-type traffic data to obtain cleaned target traffic data.
Specifically, the collected traffic data is cleaned according to a preset rule base to obtain cleaned target traffic data. According to different service requirements, the request data in the collected target traffic data is replaced and post-processed through self-defined rules: different fields in the URL parameters of a GET request and in the body parameters of a POST request can be custom-replaced. The purpose of the replacement is that each business party can decide which sensitive information its target traffic data may involve.
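The rule-based replacement described above can be sketched as follows. This is a minimal, illustrative Python sketch, not the patent's implementation; the rule table `DESENSITIZE_RULES` and the field names in it are assumptions.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical rule base: sensitive field names and the preset
# desensitized values substituted for them (assumed for illustration).
DESENSITIZE_RULES = {
    "token": "FAKE_TOKEN",
    "phone": "13800000000",
}

def clean_get_url(url: str) -> str:
    """Replace sensitive query parameters in the URL of a GET request."""
    parts = urlsplit(url)
    params = [(k, DESENSITIZE_RULES.get(k, v)) for k, v in parse_qsl(parts.query)]
    return urlunsplit(parts._replace(query=urlencode(params)))

def clean_post_body(body: dict) -> dict:
    """Replace sensitive fields in the body parameters of a POST request."""
    return {k: DESENSITIZE_RULES.get(k, v) for k, v in body.items()}
```

In practice the rule base would be supplied per business party, as the text notes, rather than hard-coded.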
In an embodiment provided by the present invention, parameters in the target traffic data that have no influence on the coverage rate and whose number of selectable values exceeds a preset number are deleted. For example, if a service carries a large number of common parameters such as user_id and device_id, whose selectable values are numerous but have no substantial influence on code coverage, these common parameters need to be deleted.
S102, calling a combined generation tool to generate a target test case set based on the target flow data; the target test case set is used for testing a target interface or a target service.
To generate the target test case set, a batch of public parameters is extracted from the target traffic data recorded for each interface (the parameters in the batch may belong to different requests); the parameters are then combined pairwise through a preset combination algorithm to expand the number of test cases, finally yielding the target test case set.
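The pairwise expansion described above can be sketched in Python. This is only an illustrative naive all-pairs enumeration under assumed inputs (a dict mapping parameter names to their recorded selectable values); the patent's "preset combination algorithm" is not specified.

```python
from itertools import combinations, product

def pairwise_cases(param_values: dict) -> list:
    """Generate test cases so that every pair of parameter values appears
    in at least one case. Each case fixes one pair of parameters to a
    specific value combination and fills the remaining parameters with
    their first recorded value (a naive sketch, not an optimized CIT)."""
    names = list(param_values)
    cases = []
    for a, b in combinations(names, 2):
        for va, vb in product(param_values[a], param_values[b]):
            case = {n: param_values[n][0] for n in names}
            case[a], case[b] = va, vb
            cases.append(case)
    return cases
```

A real covering-array generator would merge many pairs into one case; this sketch only shows how combining parameters pairwise expands the case count.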
S103, dispatching, distributing and executing the target test case set based on the test machine information, and acquiring a corresponding execution result.
In the embodiment of the invention, scheduling, distribution, and execution can poll the state of the current test cases and dispatch the requests in a decentralized manner, executing the target test case set and acquiring the corresponding execution results. This improves the efficiency of actually issuing the requests of a large number of test cases, while avoiding the impact on online services that executing a large number of requests simultaneously would cause.
And S104, performing assertion on the execution result according to the expected result corresponding to the target test case set to obtain an assertion result.
For the cases in the target test case set whose results match expectations, incremental calculation and common-parameter extraction are used to derive the common parameter part produced by combining the common parts of the cases. The common parameter part is the portion carried in the request header, request parameters, or request body of every request; the common assertion part is the set of fields contained in the response of every request, and is stored offline as the expected assertion result.
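Extracting the common assertion part can be sketched as follows — a minimal illustrative Python sketch assuming responses are flat dicts; the patent's incremental-calculation algorithm is not specified.

```python
def common_assertion_fields(responses: list) -> dict:
    """Return the fields present in every response, with their value
    types; this common part is what gets stored offline as the
    expected assertion result."""
    common = set(responses[0])
    for resp in responses[1:]:
        common &= set(resp)          # keep only fields shared by all responses
    return {k: type(responses[0][k]).__name__ for k in sorted(common)}
```

Nested response bodies would need a recursive variant; the flat case is enough to show the idea.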
If the interface information changes (detected by extracting parameters from the response returned after a request: when the number or types of the parameters no longer match the original assertion, the interface information may have changed), the previously generated expected assertion result is automatically recalculated, updated, and adjusted.
In the embodiment of the invention, the result obtained from scheduled execution is asserted: the expected result is pulled from the offline service and compared with the actual result, including its data type, to judge whether they are consistent. If they are consistent, the case execution is judged successful; otherwise it is judged failed. Then, according to the execution result of each test case, the interface test result and/or service test result for the tested interface and/or tested service is counted; the interface test result and/or service test result, together with detailed case execution information, is stored as a file in the cloud, and the specified user is notified of the stored file content.
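The consistency check above — comparing the actual result against the expected result, including data types — can be sketched like this (an illustrative Python sketch under the assumption that both results are flat dicts, not the patent's actual implementation):

```python
def assert_result(expected: dict, actual: dict) -> bool:
    """Judge case success: every expected field must be present in the
    actual result with the same data type and the same value."""
    if expected.keys() != actual.keys():
        return False                 # field set changed -> case failed
    return all(
        type(expected[k]) is type(actual[k]) and expected[k] == actual[k]
        for k in expected
    )
```

The explicit `type(...) is type(...)` check matters in Python, where e.g. `0 == False` would otherwise pass a value-only comparison.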
The invention provides an interface automation continuous integration test method comprising: collecting online traffic data, and cleaning the collected traffic data to obtain cleaned target traffic data; calling a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used to test a target interface or a target service; scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result; and asserting the execution result according to the expected result corresponding to the target test case set to obtain an assertion result. The invention thereby realizes automated interface testing.
As shown in FIG. 2 and FIG. 3, in an embodiment provided by the present invention, the target traffic data includes a plurality of parameters together with the selectable values, and number of selectable values, corresponding to each parameter; invoking the combination generation tool to generate the target test case set based on the target traffic data includes:
s201, determining whether the number of the selectable values of the parameter is larger than a first preset value.
The first preset value may be set according to the number of test cases to be generated, or may be determined according to the time required for generating the test cases, which is not specifically limited in the embodiment of the present invention. It can be understood that, the larger the number of test cases that need to be generated, or the longer the time required for generating the test cases, the larger the first preset value may be set accordingly.
S202, if the number of selectable values of the parameters is greater than the first preset value, a fixed number of candidate cases is generated from the target traffic data.
For example, suppose the first preset value is 100: if the number of selectable values in the target traffic data exceeds 100 and the number of parameters exceeds 5, a fixed number of candidate cases are generated from the target traffic data, so that test cases meeting the conditions can be screened from the generated candidates in the subsequent steps.
Specifically, this embodiment generates a fixed number of candidate cases based on Adaptive Random Testing (ART). Note that, because the target traffic data in this embodiment includes multiple parameters, determining the number of selectable values requires judging whether the selectable values of a preset number of parameters exceed the preset value; the preset number of parameters may specifically be 4, 5, 6, and so on, which the embodiment of the present invention does not specifically limit. For example, if the number of selectable values in the target traffic data exceeds 100 for more than 5 parameters, a fixed number of candidate cases are generated from the target traffic data through adaptive random testing.
S203, screening out the cases meeting the preset conditions from the candidate cases to be used as test cases.
The preset condition may specifically be selecting the candidate case with the largest difference from the already executed test cases, so that boundary cases with a higher probability of covering the input domain are screened from the candidates as test cases.
And S204, taking the test case as a case in a target test case set.
In this embodiment, it is first determined whether the number of selectable values of the parameters is greater than the first preset value; if so, a fixed number of candidate cases are generated from the target traffic data, and the cases meeting the preset condition are screened from the candidates as test cases. Compared with conventional combinatorial testing, when the number of selectable values of the parameters is greater than the first preset value, first generating a fixed number of candidate cases and then screening a preset number of qualifying cases from them avoids spending a large amount of time on test case generation, reduces the number of generated test cases, and improves test efficiency.
S205, if the number of selectable values of the parameters is less than or equal to the first preset value and greater than a second preset value, then for each parameter whose number of selectable values exceeds the second preset value, a second-preset-value number of selectable values is screened out from its selectable values.
Wherein the first preset value is greater than the second preset value.
In this embodiment, if the number of selectable values of a parameter is less than or equal to the first preset value and greater than the second preset value, conventional combinatorial interaction testing (CIT) is used after eliminating the parameters that have no influence on the coverage rate yet have many selectable values, and the number of selectable values per parameter is capped at the second preset value. Specifically, a certain number of selectable values may be randomly sampled from each parameter whose selectable values exceed the second preset value. For example, for parameters with more than 50 selectable values, the excess is handled by sampling uniformly across the whole input range; and if too many test cases are generated, test cases are discarded randomly rather than replacing old test cases with new ones, which ensures that the test cases still cover the input domain.
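The capping and random-discard steps above can be sketched as follows — an illustrative Python sketch with assumed function names and a fixed seed for reproducibility, not the patent's implementation:

```python
import random

def cap_selectable_values(param_values: dict, cap: int, seed: int = 0) -> dict:
    """For any parameter with more than `cap` selectable values, keep a
    random sample of `cap` values drawn from the whole input range."""
    rng = random.Random(seed)
    return {
        name: values if len(values) <= cap else sorted(rng.sample(values, cap))
        for name, values in param_values.items()
    }

def discard_excess(cases: list, limit: int, seed: int = 0) -> list:
    """If too many cases were generated, randomly discard down to `limit`
    rather than replacing old cases with new ones."""
    if len(cases) <= limit:
        return cases
    return random.Random(seed).sample(cases, limit)
```

Sampling (rather than truncating) preserves spread across the input range, which is the stated reason for discarding at random.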
In the embodiment of the present invention, if the number of selectable values of the parameters is less than or equal to the second preset value, test cases are generated for the target traffic data through combinatorial testing.
S206, the test cases are generated according to the parameters and the corresponding screened-out selectable values, and the generated test cases are taken as cases in the target test case set.
S207, if the number of generated test cases exceeds the preset case number, some of the generated test cases are discarded at random.
Further, if the number of the selectable values of the parameter is less than or equal to the second preset value, determining the use cases in the target test use case set through a combined test mode.
According to the interface automation continuous integration test method provided by the embodiment of the invention, the case-generation mode is chosen according to the number of selectable values of the parameters. When the number of selectable values is limited, test cases are generated with CIT to guarantee code coverage and error-detection rate. When the number of selectable values is large, test cases are generated with an optimized CIT: parameters that have no influence on the coverage rate yet have many selectable values are eliminated, the number of selectable values per parameter is capped, and excess values are sampled uniformly across the whole input range; if too many test cases are generated, cases are discarded randomly rather than replaced with new ones, ensuring the test cases cover the input domain. When the number of selectable values is very large, candidate cases are generated through ART, and test cases are then determined from the candidates by case difference, ensuring test case coverage.
Referring to fig. 4 and fig. 5, a method for testing an automatic continuous integration of an interface according to an embodiment of the present invention is shown, where the step of screening a case meeting a preset condition from the candidate cases as a test case includes:
s301, calculating the parameter difference between each candidate use case in the fixed number of candidate use cases and the executed use case.
Each time a new test case is to be generated, a group of a fixed number of candidate cases is first randomly generated, and the best candidate is selected as the test case according to a screening criterion. The selection criterion can be maxi-min, maxi-maxi, maxi-sum, etc.
maxi-min: for each candidate case, compute the distance (difference, for non-numeric input) to the closest (smallest-difference) already executed case, then select the candidate whose closest distance is largest as the test case;
maxi-maxi: for each candidate case, compute the distance to the farthest (largest-difference) already executed case, then select the candidate whose farthest distance is largest as the test case;
maxi-sum: for each candidate case, compute the sum of its distances to all executed cases, then select the candidate with the largest sum as the test case.
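The three criteria can be sketched with a Hamming-style distance suited to the non-numeric inputs mentioned above. This is an illustrative Python sketch; the function names and the choice of distance are assumptions, since the patent does not fix a concrete metric.

```python
def difference(case_a: dict, case_b: dict) -> int:
    """Distance between two cases: the number of parameters whose
    values differ (suited to non-numeric recorded traffic values)."""
    return sum(case_a[k] != case_b[k] for k in case_a)

def select_candidate(candidates, executed, criterion="maxi-min"):
    """Pick the best candidate case under one of the three criteria."""
    def score(c):
        dists = [difference(c, e) for e in executed]
        if criterion == "maxi-min":   # distance to the closest executed case
            return min(dists)
        if criterion == "maxi-maxi":  # distance to the farthest executed case
            return max(dists)
        return sum(dists)             # maxi-sum
    return max(candidates, key=score)
```

Under maxi-min, the selected candidate is the one farthest from its nearest executed neighbor, which is what pushes selection toward uncovered regions and boundaries.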
The overall results of ART do not differ much across the screening criteria above. Because the input domain is bounded and the data are discontinuous (only discretely recorded traffic), this selection method is well suited to case generation. Moreover, ART based on maxi-min has a greater probability of covering the input domain boundaries, and boundary values tend to be more error-prone.
In the embodiment of the invention, after the candidate cases are generated from the target traffic data, test cases are determined from the candidates based on the maxi-min criterion: because ART based on maxi-min has a higher probability of covering the boundary of the input domain, and boundary values are more prone to errors, the test cases screened from the candidates by this criterion better guarantee the coverage rate and error-detection rate of the test cases.
S302, selecting the candidate case with the maximum parameter difference with the executed case as the test case.
S303, judging whether the number of the selected test cases reaches a preset number.
The preset number is the number of the test cases needing to be generated.
S304, if the preset number is not reached, a fixed number of candidate cases is randomly generated again from the target traffic data, and cases meeting the preset condition are screened from them as test cases, until the number of generated test cases reaches the preset number.
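Steps S301 to S304 together form an ART selection loop, which can be sketched as follows — an illustrative Python sketch with assumed names, pool size, and seed, not the patent's implementation:

```python
import random

def art_generate(param_values: dict, total: int, pool_size: int = 10, seed: int = 0):
    """ART loop: each round, randomly generate a fixed-size pool of
    candidate cases, keep the one farthest (maxi-min) from every case
    selected so far, and repeat until `total` test cases exist."""
    rng = random.Random(seed)

    def random_case():
        return {k: rng.choice(v) for k, v in param_values.items()}

    def diff(a, b):
        # count of parameters whose values differ
        return sum(a[k] != b[k] for k in a)

    selected = [random_case()]
    while len(selected) < total:
        pool = [random_case() for _ in range(pool_size)]
        # maxi-min: maximize the distance to the closest selected case
        best = max(pool, key=lambda c: min(diff(c, s) for s in selected))
        selected.append(best)
    return selected
```

Each round discards the rest of the pool and regenerates it, matching S304's "randomly generate a fixed number of candidate cases again."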
The interface automation continuous integration test method provided by the embodiment of the invention first calculates the parameter difference between each of the fixed number of candidate cases and the executed cases, and selects the candidate with the largest parameter difference from the executed cases as a test case. It then judges whether the number of selected test cases has reached the preset number; if not, a fixed number of candidate cases is randomly generated again from the target traffic data and screened, until the number of generated test cases reaches the preset number. That is, coverage of the test cases is ensured by comparing their differences against the executed cases, which improves the test effect.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In an embodiment, an interface automation continuous integration testing apparatus is provided, which corresponds one-to-one to the interface automation continuous integration test method in the embodiments above. As shown in FIG. 6, the apparatus includes: an acquisition module 10, a generating module 20, an execution module 30, and a determining module 40. The functional modules are explained in detail as follows:
the acquisition module 10 is used for acquiring flow data on a line and cleaning the acquired flow data to obtain cleaned target flow data;
a generating module 20, configured to invoke a combination generation tool to generate a target test case set based on the target traffic data, where the target test case set is used for testing a target interface or a target service;
the execution module 30 is configured to schedule, distribute and execute the target test case set based on the test machine information, and obtain a corresponding execution result;
and the determining module 40 is configured to assert the execution result according to an expected result corresponding to the target test case set to obtain an asserted result.
Specifically, the target traffic data includes a plurality of parameters, and selectable values and selectable value numbers respectively corresponding to the parameters, and the generating module 20 includes:
a determining unit 21, configured to determine whether the number of selectable values of the parameter is greater than a first preset value;
a generating unit 22, configured to randomly generate a fixed number of candidate cases according to the target traffic data if the number of selectable values of the parameter is greater than a first preset value;
the screening unit 23 is configured to screen a use case meeting a preset condition from the candidate use cases as a test use case;
the generating unit 22 is further configured to use the test case as a case in a target test case set;
the screening unit 23 is further configured to, if the number of selectable values of the parameters is less than or equal to the first preset value and greater than a second preset value, screen out a second-preset-value number of selectable values from the corresponding selectable values for each parameter whose number of selectable values exceeds the second preset value;
the generating unit 22 is further configured to generate the test cases according to the parameters and the corresponding screened-out selectable values;
and a deleting unit 24, configured to randomly discard generated test cases if the number of generated test cases exceeds a preset case number.
The generating unit 22 is configured to determine, if the number of the selectable values of the parameter is less than or equal to the second preset value, a use case in the target test case set by a combined test mode.
Specifically, the screening unit 23 is configured to:
calculating the parameter difference between each candidate use case in the fixed number of candidate use cases and the executed use case;
selecting the candidate case with the maximum parameter difference with the executed case as the test case;
judging whether the number of the selected test cases reaches a preset number or not;
if the preset number is not reached, randomly generating a fixed number of candidate cases again according to the target flow data;
screening out cases meeting preset conditions from the candidate cases as test cases;
and if the preset number is reached, determining that the test case is generated completely.
Further, the apparatus further comprises:
the statistical module is used for counting the test result and/or the service test result of the interface to be tested and/or the service to be tested according to the execution result of each test case;
and the storage module is used for storing the interface test result and/or the service test result and related case execution detailed information in a file form at the cloud end and notifying the specified user of the stored file content in a notification form.
The acquisition module 10 is specifically configured to: when the acquired flow data contains first-class flow data belonging to a specified service, replace the website parameters contained in the first-class flow data with preset desensitized website parameters to obtain the cleaned target flow data;
and when the acquired flow data contains second-class flow data carrying specified-type information, perform desensitization processing on the specified-type information contained in the second-class flow data to obtain the cleaned target flow data.
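The two cleaning branches can be sketched as follows; the record schema, the placeholder website parameter, and the character-masking desensitization rule are all illustrative assumptions:

```python
import re


def cleanse_traffic(records, desensitized_url="https://example.test/placeholder"):
    """Cleanse captured flow data: replace the website parameter of records
    belonging to the specified service, and mask fields flagged as carrying
    specified-type (sensitive) information."""
    cleaned = []
    for rec in records:
        rec = dict(rec)  # shallow copy; never mutate the raw capture
        # branch 1: first-class flow data of the specified service
        if rec.get("service") == "specified":
            rec["url"] = desensitized_url
        # branch 2: second-class flow data with specified-type information
        for field in rec.get("sensitive_fields", []):
            value = str(rec.get(field, ""))
            rec[field] = re.sub(r".", "*", value)  # mask every character
        cleaned.append(rec)
    return cleaned
```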
The determining module 40 is specifically configured to:
determine whether the expected result of each test case in the target test case set is consistent with the corresponding execution result;
if they are consistent, determine that the test case is executed successfully;
and if not, determine that the test case fails to execute.
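A minimal sketch of the assertion step, assuming results are compared by deep equality keyed by case identifier; real assertions might ignore volatile fields such as timestamps:

```python
def assert_results(expected, actual):
    """Compare expected vs. actual results case by case and return a map
    of case id to "passed" or "failed"."""
    return {case_id: "passed" if expected[case_id] == actual.get(case_id)
            else "failed"
            for case_id in expected}
```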
For the specific definition of the device, reference may be made to the above definition of the interface automation continuous integration test method, which is not repeated here. The modules in the above device may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and whose internal structure may be as shown in fig. 7. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements an interface automation continuous integration test method.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
collecting online flow data, and cleaning the collected flow data to obtain cleaned target flow data;
calling a combination generation tool to generate a target test case set based on the target flow data; the target test case set is used for testing a target interface or a target service;
scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result;
and asserting the execution result according to an expected result corresponding to the target test case set to obtain an assertion result.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
collecting online flow data, and cleaning the collected flow data to obtain cleaned target flow data;
calling a combination generation tool to generate a target test case set based on the target flow data; the target test case set is used for testing a target interface or a target service;
scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result;
and asserting the execution result according to an expected result corresponding to the target test case set to obtain an assertion result.
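Taken together, the four steps can be sketched as a small pipeline; the injected step functions below are placeholders standing in for the acquisition, generating, execution, and determining modules, not the actual implementations:

```python
def run_pipeline(collect, generate, execute, assert_fn):
    """Orchestrate the method's four steps: collect and clean flow data,
    generate the target test case set (with its expected results), execute
    the set, then assert the execution results."""
    target_flow = collect()
    case_set, expected = generate(target_flow)
    results = execute(case_set)
    return assert_fn(expected, results)
```

Keeping each stage injectable mirrors the module split of the device: any stage (for example, the test-machine scheduler) can be replaced without touching the others.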
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above described functions.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them; although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not cause the corresponding technical solutions to depart in substance from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. An interface automation continuous integration test method is characterized by comprising the following steps:
collecting online flow data, and cleaning the collected flow data to obtain cleaned target flow data;
calling a combination generation tool to generate a target test case set based on the target flow data; the target test case set is used for testing a target interface or a target service;
scheduling, distributing, and executing the target test case set based on test machine information, and acquiring a corresponding execution result;
and asserting the execution result according to an expected result corresponding to the target test case set to obtain an assertion result.
2. The method of claim 1, wherein the target flow data comprises a plurality of parameters, and selectable values and numbers of selectable values respectively corresponding to the parameters, and wherein the calling a combination generation tool to generate a target test case set based on the target flow data comprises:
if the number of selectable values of a parameter is greater than a first preset value, randomly generating a fixed number of candidate cases according to the target flow data, screening out cases meeting preset conditions from the candidate cases as test cases, and taking the test cases as cases in the target test case set;
if the number of selectable values of the parameter is less than or equal to the first preset value and greater than a second preset value, screening out a second-preset-value number of selectable values from the corresponding selectable values, the first preset value being greater than the second preset value; generating test cases according to the parameters and the correspondingly screened selectable values, and taking the generated test cases as cases in the target test case set;
if the number of generated test cases is greater than a preset case number, randomly discarding generated test cases;
and if the number of selectable values of the parameter is less than or equal to the second preset value, determining the cases in the target test case set by a combined test mode.
3. The method according to claim 2, wherein the screening out cases meeting preset conditions from the candidate cases as test cases comprises:
calculating the parameter difference between each candidate case in the fixed number of candidate cases and the executed cases;
selecting the candidate case with the largest parameter difference from the executed cases as a test case;
judging whether the number of selected test cases reaches a preset number;
if the preset number is not reached, randomly generating a fixed number of candidate cases again according to the target flow data, and screening out cases meeting the preset conditions from the candidate cases as test cases;
and if the preset number is reached, determining that test case generation is complete.
4. The method of claim 1, further comprising:
counting interface test results and/or service test results for the interface under test and/or the service under test according to the execution result of each test case;
and storing the interface test results and/or service test results, together with detailed case execution information, in file form in the cloud, and notifying a specified user of the stored file content in the form of a notification.
5. The method of claim 1, wherein the cleaning the collected flow data to obtain cleaned target flow data comprises:
when the acquired flow data contains first-class flow data belonging to a specified service, replacing the website parameters contained in the first-class flow data with preset desensitized website parameters to obtain the cleaned target flow data;
and when the acquired flow data contains second-class flow data carrying specified-type information, performing desensitization processing on the specified-type information contained in the second-class flow data to obtain the cleaned target flow data.
6. The method of claim 1, wherein the asserting the execution result according to the expected result corresponding to the target test case set to obtain an assertion result comprises:
determining whether the expected result of each test case in the target test case set is consistent with the corresponding execution result;
if they are consistent, determining that the test case is executed successfully;
and if not, determining that the test case fails to execute.
7. An interface automation continuous integration testing device, characterized in that the device comprises:
an acquisition module, configured to collect online flow data and clean the collected flow data to obtain cleaned target flow data;
a generating module, configured to call a combination generation tool to generate a target test case set based on the target flow data; the target test case set is used for testing a target interface or a target service;
an execution module, configured to schedule, distribute, and execute the target test case set based on test machine information and acquire a corresponding execution result;
and a determining module, configured to assert the execution result according to an expected result corresponding to the target test case set to obtain an assertion result.
8. The apparatus of claim 7, wherein the target flow data comprises a plurality of parameters, and selectable values and numbers of selectable values respectively corresponding to the parameters, and the generating module comprises:
a generating unit, configured to randomly generate a fixed number of candidate cases according to the target flow data if the number of selectable values of a parameter is greater than a first preset value;
a screening unit, configured to screen out cases meeting preset conditions from the candidate cases as test cases;
the generating unit being further configured to take the test cases as cases in the target test case set;
the screening unit being further configured to screen out a second-preset-value number of selectable values from the corresponding selectable values if the number of selectable values of the parameter is less than or equal to the first preset value and greater than a second preset value, the first preset value being greater than the second preset value;
the generating unit being further configured to generate test cases according to the parameters and the correspondingly screened selectable values, and to take the generated test cases as cases in the target test case set;
a deleting unit, configured to randomly discard generated test cases if the number of generated test cases is greater than a preset case number;
and the generating unit being further configured to determine, if the number of selectable values of the parameter is less than or equal to the second preset value, the cases in the target test case set by a combined test mode.
9. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the interface automation continuous integration test method of any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, implements the interface automation continuous integration test method of any one of claims 1 to 6.
CN202110240392.9A 2021-03-04 2021-03-04 Interface automation continuous integration test method, device, equipment and storage medium Pending CN115017027A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110240392.9A CN115017027A (en) 2021-03-04 2021-03-04 Interface automation continuous integration test method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110240392.9A CN115017027A (en) 2021-03-04 2021-03-04 Interface automation continuous integration test method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115017027A true CN115017027A (en) 2022-09-06

Family

ID=83064456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110240392.9A Pending CN115017027A (en) 2021-03-04 2021-03-04 Interface automation continuous integration test method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115017027A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117971705A (en) * 2024-03-28 2024-05-03 成都九洲电子信息系统股份有限公司 Intelligent interface automatic test system and method based on customized flow insight


Similar Documents

Publication Publication Date Title
CN109634730B (en) Task scheduling method, device, computer equipment and storage medium
CN112631686A (en) Data processing method, data processing device, computer equipment and storage medium
CN110942190A (en) Queuing time prediction method and device, computer equipment and storage medium
CN110851159A (en) Business rule updating method and device, computer equipment and storage medium
CN111742309A (en) Automated database query load assessment and adaptive processing
CN116643906B (en) Cloud platform fault processing method and device, electronic equipment and storage medium
CN114185763A (en) Dynamic allocation method, device, storage medium and electronic equipment
CN111061610B (en) Generation method and device of cluster system performance test report and computer equipment
CN115017027A (en) Interface automation continuous integration test method, device, equipment and storage medium
CN112511384A (en) Flow data processing method and device, computer equipment and storage medium
CN112051771A (en) Multi-cloud data acquisition method and device, computer equipment and storage medium
CN115239450A (en) Financial data processing method and device, computer equipment and storage medium
CN114281260A (en) Storage method, device, equipment and medium applied to distributed storage system
CN114003339A (en) Detection method and device for zombie virtual machine, computer equipment and storage medium
CN111367782A (en) Method and device for automatic generation of regression test data
CN113079063A (en) Offline judgment method, system and device for charging device and computer storage medium
CN112954087A (en) Domain name connection method and device for SaaS (software as a service), computer equipment and storage medium
CN113014633B (en) Method and device for positioning preset equipment, computer equipment and storage medium
CN116342256A (en) Wind control strategy testing method and device, computer equipment and storage medium
CN114528213A (en) Automatic baffle plate testing method, device, equipment and storage medium
CN114281474A (en) Resource adjusting method and device
CN113127542A (en) Data anomaly analysis method and device
CN112286704A (en) Processing method and device of delay task, computer equipment and storage medium
CN115766481B (en) Micro-service treatment method and system
CN114020611B (en) Test data monitoring processing method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: Tiktok vision (Beijing) Co.,Ltd.