CN112306857A - Method and apparatus for testing applications - Google Patents
- Publication number
- CN112306857A (application number CN202010110804.2A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F11/3684 — Test management for test design, e.g. generating new test cases (G: Physics; G06: Computing, calculating or counting; G06F: Electric digital data processing; G06F11/36: Preventing errors by testing or debugging software)
- G06F11/3688 — Test management for test execution, e.g. scheduling of test suites (G06F11/36: Preventing errors by testing or debugging software)
- G06F8/61 — Installation (G06F8/60: Software deployment)
Abstract
Embodiments of the present disclosure disclose methods and apparatus for testing applications. In one embodiment, the method comprises: receiving an installation package of an application to be tested, a test task set, and test data for testing the application, wherein each test task in the set indicates a test target for the application; for each test task in the set, selecting terminals for completing the task from a terminal set as the test terminals corresponding to that task; installing the application on the corresponding test terminals using the installation package, and completing the task with the test data to obtain a test result corresponding to the task; and storing the test results corresponding to the test tasks in the set. This implementation helps improve testing efficiency and shorten the test cycle.
Description
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a method and an apparatus for testing an application.
Background
With the rapid development of the mobile internet, mobile terminals such as smartphones and tablet computers have become increasingly widespread, and a wide variety of terminal applications have emerged to provide various functions. Because mobile terminals vary widely (in terminal model, CPU model, screen resolution, operating system, and so on), an application must be tested on many types of mobile terminals during development to ensure its compatibility.
For development and testing personnel, acquiring a sufficient number and variety of mobile terminals is one of the key problems when testing applications. Currently, some cloud-based platforms provide users with various types of mobile terminals, and developers and testers can test applications through them. However, to complete the testing of an application, a tester usually still has to monitor and participate in the entire testing process, develop additional test scripts, and so on.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for testing applications.
In a first aspect, an embodiment of the present disclosure provides a method for testing an application, the method including: receiving an installation package of an application to be tested, a test task set, and test data for testing the application, wherein each test task in the set indicates a test target for the application; for each test task in the set, selecting terminals for completing the task from a terminal set as the test terminals corresponding to that task; installing the application on the corresponding test terminals using the installation package, and completing the task with the test data to obtain a test result corresponding to the task; and storing the test results corresponding to the test tasks in the set.
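The first-aspect flow can be sketched as follows. This is a minimal illustration only: the names (`TestTask`, `run_test_flow`, `select_terminal`) and the first-terminal selection policy are assumptions, since the patent leaves the concrete selection method and execution mechanism open.

```python
from dataclasses import dataclass

@dataclass
class TestTask:
    task_id: str
    target: str  # e.g. "compatibility" or "ui-display" (illustrative labels)

def select_terminal(task, terminal_set):
    # Placeholder policy: the patent does not fix a selection method.
    return terminal_set[0]

def run_test_flow(install_pkg, tasks, test_data, terminal_set, run_task):
    """Receive package/tasks/data, pick a test terminal per task,
    run the task, and store each result keyed by task id."""
    results = {}
    for task in tasks:
        terminal = select_terminal(task, terminal_set)
        # `run_task` stands in for installing `install_pkg` on `terminal`
        # and completing the task with `test_data`.
        results[task.task_id] = run_task(task, terminal, install_pkg, test_data)
    return results
```

A caller would supply a real `run_task` that drives the device; here it can be any callable, e.g. `run_test_flow("app.apk", [TestTask("t1", "compat")], {}, ["dev-a"], lambda t, term, p, d: "pass")`.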
In some embodiments, the above method further comprises: splitting a test task in the test task set to obtain a sub-test task set corresponding to the test task; and, for each sub-test task in that set, determining test data for completing the sub-test task from the test data.
In some embodiments, selecting terminals for completing a test task from the terminal set as the test terminals corresponding to the task, installing the application on those terminals using the installation package, and completing the task with the test data to obtain a corresponding test result includes: for each sub-test task in the sub-test task set corresponding to the test task, selecting terminals for completing the sub-test task from the terminal set as the test terminals corresponding to the sub-test task; and installing the application on those terminals using the installation package, and completing the sub-test task with the test data determined for it, to obtain a test result corresponding to the sub-test task.
In some embodiments, the application to be tested is implemented based on machine learning; and the test data includes training data for implementing machine learning.
In some embodiments, the terminals in the terminal set include cloud computers.
In some embodiments, after selecting a terminal for completing the test task from the terminal set as the test terminal corresponding to the task, the method further includes: establishing a communication connection with the selected test terminal.
In some embodiments, the above method further comprises: analyzing and summarizing the stored test results, generating a statistical analysis report for the application, and sending the report.
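The summarizing step above can be sketched as a simple aggregation. The "pass"/"fail" result labels and the report fields are illustrative assumptions; the patent does not specify the report format.

```python
from collections import Counter

def summarize_results(results):
    """Aggregate the stored per-task results into a simple statistical
    report; "pass"/"fail" labels are illustrative, not from the patent."""
    counts = Counter(results.values())
    total = len(results)
    return {
        "total": total,
        "passed": counts.get("pass", 0),
        "failed": counts.get("fail", 0),
        "pass_rate": counts.get("pass", 0) / total if total else 0.0,
    }
```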
In a second aspect, embodiments of the present disclosure provide an apparatus for testing an application, the apparatus comprising: a receiving unit configured to receive an installation package of an application to be tested, a set of test tasks, and test data for testing the application, wherein the test tasks in the set of test tasks are used to indicate a test target for the application; the testing unit is configured to select terminals for completing the testing task from the terminal set as testing terminals corresponding to the testing task for the testing task in the testing task set; installing an application on a test terminal corresponding to the test task by using an installation package, and completing the test task by using test data to obtain a test result corresponding to the test task; and the storage unit is configured to store the test results corresponding to the test tasks in the test task set.
In some embodiments, the apparatus for testing an application described above further comprises: the splitting unit is configured to split a test task in the test task set to obtain a sub-test task set corresponding to the test task; and for the sub-test tasks in the sub-test task set corresponding to the test task, determining test data for completing the sub-test tasks from the test data.
In some embodiments, the test unit is further configured to, for a sub-test task in the sub-test task set corresponding to the test task, select a terminal for completing the sub-test task from the terminal set as a test terminal corresponding to the sub-test task; and installing the application on the test terminal corresponding to the sub-test task by using the installation package, and completing the sub-test task by using the determined test data for completing the sub-test task to obtain a test result corresponding to the sub-test task.
In some embodiments, the application to be tested is implemented based on machine learning; and the test data includes training data for implementing machine learning.
In some embodiments, the terminals in the terminal set include cloud computers.
In some embodiments, the apparatus is further configured to establish a communication connection with the selected test terminal after that terminal is selected from the terminal set as the test terminal corresponding to the test task.
In some embodiments, the apparatus is further configured to analyze and summarize the stored test results, generate a statistical analysis report for the application, and send the report.
In a third aspect, an embodiment of the present disclosure provides a server, the server including: one or more processors; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, which computer program, when executed by a processor, implements the method as described in any of the implementations of the first aspect.
According to the method and apparatus for testing applications provided above, an installation package of the application to be tested, a test task set indicating test targets, and test data for testing the application are received; test terminals are selected from the terminal set to complete each test task in the set; and the test result of each task is stored. Because manual participation by technical personnel is largely unnecessary, and a suitable test terminal is selected for each test task, the test flow is accelerated, test efficiency is improved, and the test cycle is shortened.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for testing an application, according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for testing an application, in accordance with an embodiment of the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for testing an application according to the present disclosure;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for testing applications according to the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary architecture 100 to which embodiments of the disclosed method for testing an application or apparatus for testing an application may be applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 101, a server 102, and a set of terminal devices 103.
The terminal device 101 may interact with the server 102 via a network to receive or send messages. The network may include various connection types, such as wired links, wireless communication links, or fiber-optic cables. Various client applications, such as browser applications, testing applications, and communication applications, may be installed on the terminal device.
The terminal apparatus 101 may be hardware or software. When the terminal device 101 is hardware, it may be various electronic devices including, but not limited to, a smart phone, a tablet computer, an e-book reader, a laptop portable computer, a desktop computer, and the like. When the terminal apparatus 101 is software, it can be installed in the electronic apparatuses listed above. It may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules to provide distributed services) or as a single piece of software or software module. And is not particularly limited herein.
The terminal device set 103 may include a batch of various types of terminal devices. For example, different models of terminal devices, different versions of operating system terminal devices, etc. The terminal device set 103 may be communicatively connected with the server 102 to implement data interaction and the like.
The terminal devices in the terminal device set 103 may be hardware or software. When they are hardware, they may be various electronic devices, including but not limited to smartphones, tablet computers, e-book readers, laptop portable computers, desktop computers, and the like. When they are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. For example, the terminal devices in the terminal device set 103 may be cloud real machines (i.e., virtual phones or phone emulators) and the like.
The server 102 may be a server that provides various services. For example, it may receive, from the terminal device 101, an installation package of an application to be tested, a test task set indicating test targets for the application, and test data for testing the application, and complete each test task in the set according to the received data to obtain a test result for each task. Specifically, the server 102 may select a number of terminal devices from the terminal device set 103, install the application to be tested on the selected devices, and complete the test tasks in the test task set to obtain test results. Further, the server 102 may return the obtained test results to the terminal device 101.
It should be noted that the installation package of the application to be tested, the test task set indicating test targets for the application, and the test data for testing the application may also be stored locally on the server 102. In that case, the server 102 can directly extract and process the locally stored installation package, test task set, and test data, and the terminal device 101 may be omitted.
It should be noted that the method for testing the application provided by the embodiment of the present disclosure is generally executed by the server 102, and accordingly, the apparatus for testing the application is generally disposed in the server 102.
The server 102 may be hardware or software. When the server 102 is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or may be implemented as a single server. When the server 102 is software, it may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules used to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for testing an application in accordance with the present disclosure is shown. The method for testing an application comprises the following steps:
In the present embodiment, the application to be tested may be various applications developed by a technician. For example, applications for face recognition, applications for voice interaction, and so forth. An installation package for an application may refer to a file used to install the application. Generally, installation of an application can be achieved by running an installation package.
It should be understood that the type of the corresponding installation package may be different according to the development language or the operating environment of the application. For example, the installation Package may be an EXE (Executable) file, an APK (Android Application Package) file, or the like.
In this embodiment, the test tasks in the test task set may be used to indicate the test targets for the application to be tested, where a test target can be determined according to actual application requirements. For example, test targets may include the security or compatibility of the application, or whether the application's interface displays normally. As another example, in some cases the application to be tested is implemented by a number of algorithms; the test targets may then also include testing the accuracy of those algorithms.
In this embodiment, the test data may be used to test an application to be tested, so as to complete the test task in the test task set. It should be understood that the test data can be flexibly set according to the actual application requirements. Different applications may typically have different test data.
In some cases, the application to be tested may be implemented based on machine learning. For example, the application to be tested may be implemented based on a deep learning algorithm. At this time, the test data includes training data for realizing machine learning.
As an example, the application to be tested may be implemented based on a speech recognition model. At this time, the test target indicated by the test task includes continuously training the speech recognition model by using the training data to obtain an output result of the speech recognition model, and testing the accuracy of the output result.
In the present embodiment, an execution subject (e.g., server 102 shown in fig. 1) of the method for testing an application may receive an installation package, a test task set, and test data of the application to be tested from other terminal devices (e.g., terminal device 101 shown in fig. 1).
As an example, a development or testing technician may send the installation package, test task set, and test data of the application to be tested to the execution body from the terminal device they use; for instance, these may be uploaded to the server via a Web Service. The execution body can then select terminals from the terminal set according to the received installation package, test task set, and test data, and complete the test tasks in the set.
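A minimal sketch of bundling the three inputs for such an upload is shown below. The JSON field names (`installation_package`, `test_tasks`, `test_data`) and the base64 encoding of the package are assumptions for illustration; the patent only says the upload may be Web-Service based.

```python
import base64
import json

def build_upload_payload(apk_bytes, task_set, test_data):
    """Bundle the installation package, test task set, and test data into
    one JSON payload for upload. Field names are illustrative."""
    return json.dumps({
        # Binary package bytes are base64-encoded so they survive JSON.
        "installation_package": base64.b64encode(apk_bytes).decode("ascii"),
        "test_tasks": task_set,
        "test_data": test_data,
    })
```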
Step 2021: selecting the terminal for completing the test task from the terminal set as the test terminal corresponding to the test task.
In this embodiment, the terminal set may be composed of at least one terminal and may be predetermined according to actual application requirements or the specific application scenario. The terminals in the set may be hardware, such as various smartphones and tablet computers, or software or software modules, such as cloud real machines, virtual phones, or phone emulators. Of course, the terminal set may also include both hardware terminals and software terminals.
Alternatively, the terminal set may include various types of terminals. For example, different models of smartphones or tablets, different operating system versions of smartphones or tablets, etc. may be included.
For a given application, the installation environment is usually constrained: there are requirements on the operating system, CPU, and so on of any terminal on which the application can be installed, and only terminals meeting those requirements can install it normally. Therefore, enriching the types of terminals in the terminal set avoids the situation where no terminal capable of installing the application to be tested can be found, which would otherwise lead to erroneous test results or test tasks that cannot be completed.
In addition, many existing applications (such as applications developed based on the Android system) generally need to be tested on various types of terminals respectively in order to ensure compatibility of the applications. At this time, when the terminal set includes various types of terminals, it is more convenient to provide more types of terminals for testing the compatibility of the application and the like, so as to improve the test coverage.
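Compatibility-test coverage of the kind described above can be gauged with a simple set comparison. This is an illustrative sketch: the `(model, os)` keying and the dictionary shape of a terminal record are assumptions, not from the patent.

```python
def type_coverage(terminal_set, required_types):
    """Check which required (model, os_version) combinations the terminal
    set covers -- a simple compatibility-coverage check."""
    available = {(t["model"], t["os"]) for t in terminal_set}
    covered = [t for t in required_types if t in available]
    missing = [t for t in required_types if t not in available]
    return covered, missing
```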
It should be appreciated that the terminal set may include more than one terminal of the same type. In other words, there may be two or more terminals with the same configuration.
In this embodiment, the terminal set may be composed of a large number of terminals previously designated by the technician related to the execution subject described above.
Alternatively, the terminal set may be provided by an existing cloud real-machine platform, from which the execution body can conveniently obtain cloud terminals. Cloud real-machine platforms (e.g., Testin, DevEco) typically give users remote access to various terminals, on which they can install applications, view logs, debug, and so on. If testers had to prepare the various types of terminals required for testing themselves, the cost would usually be high; testing on a cloud real-machine platform can therefore reduce the test cost.
In this embodiment, the terminal for testing corresponding to each test task in the test task set can be flexibly selected from the terminal set by using various selection methods according to actual application requirements.
For example, according to the type of the application, the terminals capable of installing that type of application may first be selected from the terminal set to form a terminal subset, and the test terminal corresponding to each test task may then be selected from that subset.
For another example, for a test task in the test task set, a terminal may be randomly selected from the terminal set as a test terminal corresponding to the test task. In some cases, the correspondence between the test tasks and the terminals in the terminal set may be set in advance. At this time, for the test task in the test task set, the terminal corresponding to the test task may be selected from the terminal set as the test terminal for the test task.
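The two selection methods just mentioned (random choice, and a preset task-to-terminal correspondence) can be sketched as follows. The fallback-to-first-terminal behavior is an added assumption for when a mapping entry is absent; the patent does not specify it.

```python
import random

def select_random(terminal_set, rng=random):
    # Strategy 1: choose any terminal in the set at random.
    return rng.choice(terminal_set)

def select_by_mapping(task_id, terminal_set, mapping):
    # Strategy 2: follow a preset task-to-terminal correspondence,
    # falling back to the first terminal when no mapping exists (assumption).
    preferred = mapping.get(task_id)
    return preferred if preferred in terminal_set else terminal_set[0]
```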
In this embodiment, for a test task in the test task set, more than one test terminal may be selected for the task. When two or more test terminals are selected for a task, their types may differ according to actual application requirements or the specific test task.
In this embodiment, the test terminals corresponding to different test tasks may be the same or different.
In this embodiment, the application to be tested may be installed on the testing terminal according to the corresponding installation method according to the type of the installation package. Generally, the installation of the application to be tested can be achieved by running an installation package on the test terminal.
For a test task in the test task set, if only one test terminal corresponds to the task, the application to be tested is installed on that terminal. If two or more test terminals correspond to the task, the application is installed on each of them.
In the present embodiment, the processing logic for completing various test tasks may be developed in advance by a person skilled in the art related to the execution subject described above. For example, processing scripts for various test tasks may be written in advance, and the test tasks may be completed by running the written scripts.
For a test task in the test task set, after the execution body installs the application to be tested on the test terminals corresponding to the task, it can complete the task according to the test data, using the pre-developed processing logic for that task.
For a test task in the test task set, if one test terminal corresponds to the task, the task is completed on that terminal according to the test data, using the pre-developed processing logic for the task. If two or more test terminals correspond to the task, the task is completed in the same way on each of those terminals.
Optionally, for a test task in the test task set, after the terminal for completing the task is selected from the terminal set as the corresponding test terminal, the execution body may first establish a communication connection with the selected terminal. For example, when the application to be tested is developed for the Android system and the test terminal is a smartphone or a cloud phone, the connection between the execution body and the test terminal can be established over ADB (Android Debug Bridge).
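A minimal sketch of the ADB steps involved, expressed as command builders (in a real deployment these would be executed with `subprocess.run`; the host, port, and serial values are illustrative):

```python
def adb_connect_cmd(host, port=5555):
    """Command to connect ADB to a networked device such as a cloud phone."""
    return ["adb", "connect", f"{host}:{port}"]

def adb_install_cmd(serial, apk_path):
    """Command to install the application package on the terminal
    identified by `serial`; -r reinstalls while keeping app data."""
    return ["adb", "-s", serial, "install", "-r", apk_path]
```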
In the prior art, a tester usually has to establish the ADB connections to the terminals in the terminal set manually. Having the execution body establish these connections shields the terminal set from the tester, so the tester need not handle terminal-related work during testing, which reduces the tester's workload.
Optionally, the execution body may first send the installation package of the application to be tested and the test data to the test terminal, then install the application on that terminal using the installation package, and complete the test tasks in the test task set according to the test data.
In this embodiment, the test result may be different according to the test task. For example, if the test task indicates a test target for testing the accuracy of a certain algorithm implementing the application to be tested, the test result may be used to characterize the accuracy of the certain algorithm implementing the application to be tested. For another example, the test target indicated by the test task is used to test whether the application to be tested has a flash back or the like, and the test result may be used to characterize whether the application to be tested has a flash back or the like.
Optionally, for a test task in the test task set, the execution body may first determine, from the test data, the test data for completing that task. Then, after the application to be tested is installed on the test terminal corresponding to the task, the task can be completed using the test data corresponding to it.
Alternatively, the received test data may conform to a preset format. At this time, after receiving the test data, the execution main body may parse the test data according to a preset format to obtain test data corresponding to each test task in the test task set.
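Parsing the received test data into per-task portions can be sketched as below. The patent does not name the preset format; a JSON object keyed by task id is assumed here for illustration, as is the empty-list default for tasks without data.

```python
import json

def parse_test_data(raw, task_ids):
    """Parse test data received in a preset format (assumed: a JSON object
    keyed by task id) into per-task test data; tasks with no entry get an
    empty list, and keys for unknown tasks are dropped."""
    data = json.loads(raw)
    return {tid: data.get(tid, []) for tid in task_ids}
```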
Generally, the test data corresponding to the test task is determined according to a specific application scenario. Therefore, the test data corresponding to different test tasks in the test task set may be the same or different.
By splitting the test data, each test terminal can be prevented from receiving useless test data, so that resources are saved, and the test efficiency of the test task executed on each test terminal is improved.
Alternatively, test data may be received in the form of a file. In other words, the test data may be stored using a file. At this time, the file may store the test data according to a preset format, so as to facilitate the analysis of the test data by the execution subject.
For a test task in the test task set, after obtaining the test data corresponding to that test task, the execution body may send only that test data to the test terminal corresponding to the test task, so that the test task is completed on that terminal using only its own test data.
Optionally, for a test task in the test task set, the test task may be split to obtain a sub-test task set corresponding to the test task, and for a sub-test task in the sub-test task set corresponding to the test task, the test data for completing the sub-test task is determined from the test data.
The specific splitting method for the test task can be flexibly set according to different application requirements. For example, the corresponding relationship between the test task and the splitting manner may be preset. At this time, the test task may be split according to the splitting manner corresponding to the test task.
As an example, some repeat or loop tests may require performing the same operation N times. For this type of test task, each execution may be regarded as one sub-test task, so that N sub-test tasks are obtained by splitting.
Optionally, each sub-test task in the obtained sub-test task set may be executed in parallel.
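A minimal sketch of the repeat-test splitting described above, with the resulting sub-test tasks executed in parallel; the task-naming scheme and the stub runner are assumptions, not part of the disclosure:

```python
from concurrent.futures import ThreadPoolExecutor

def split_repeat_task(task_name, repeat_count):
    """Split a test task that repeats the same operation N times into
    N sub-test tasks, one per repetition (hypothetical naming scheme)."""
    return [f"{task_name}#run{i}" for i in range(repeat_count)]

def run_sub_task(sub_task):
    # Stand-in for dispatching the sub-task to a test terminal.
    return (sub_task, "passed")

sub_tasks = split_repeat_task("login-stress", 4)
# map() preserves submission order, so results line up with sub_tasks.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_sub_task, sub_tasks))
```

In the embodiment each sub-task would go to a real terminal rather than a local thread; the pool here only illustrates the parallelism.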
Optionally, for a test task in the test task set, in response to determining that the test task is a splittable task, the test task is split to obtain the sub-test task set corresponding to the test task.
Technicians can preset, according to actual application requirements, which test tasks need to be split. For example, in some cases, some test tasks cannot or need not be split and may be regarded as non-splittable tasks, while some complex or very large test tasks may be regarded as splittable tasks, so as to improve subsequent test efficiency.
For the test task in the test task set, after the test task is split to obtain the sub-test task set corresponding to the test task, for the sub-test task in the obtained sub-test task set, the test data for completing the sub-test task can be determined from the test data.
By further splitting the test task, the granularity of the task executed in the test process can be reduced, so that the test process can be more finely designed, and more detailed test information can be recorded.
Optionally, for a sub-test task, the test data corresponding to the target test task corresponding to the sub-test task may be determined from the test data. And the target test task is a test task corresponding to the sub-test task set to which the sub-test task belongs. Then, the test data used for completing the sub-test task may be determined from the test data corresponding to the target test task as the test data corresponding to the sub-test task.
For example, scripts for splitting test data according to sub-test tasks may be pre-written by a technician. At this time, the test data for completing the sub-test task may be determined from the test data corresponding to the target test task using a pre-written script.
It should be understood that the test data corresponding to each sub-test task in a sub-test task set may be the same or different.
By determining the test data corresponding to each sub-test task, the influence of unnecessary test data can be avoided in the actual execution process of the sub-test tasks, the completion time of each sub-test task can be shortened, and the test efficiency is improved.
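A pre-written splitting script of the kind mentioned above might, for example, give each sub-test task an even strided slice of the target test task's data; the even-slice rule and the field name are assumptions chosen for illustration only:

```python
def data_for_sub_task(target_task_data, sub_task_index, num_sub_tasks):
    """Give each sub-task an even (strided) slice of the target test
    task's input list; the real split rule is scenario-dependent."""
    inputs = target_task_data["inputs"]
    return inputs[sub_task_index::num_sub_tasks]

target = {"inputs": [10, 20, 30, 40, 50]}
slice0 = data_for_sub_task(target, 0, 2)  # sub-task 0's share
slice1 = data_for_sub_task(target, 1, 2)  # sub-task 1's share
```

Note the slices are disjoint and cover all inputs, matching the goal that each sub-task receives only the data it needs.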
In this embodiment, the obtained test results corresponding to the respective test tasks may be stored locally on the execution body.
Optionally, the obtained test results corresponding to the respective test tasks may be sent out. For example, they may be sent to the sender of the installation package of the application to be tested, the test task set, and the test data.
Optionally, the stored test results may be analyzed and summarized, and a statistical analysis report of the application to be tested may be generated and sent.
Wherein a statistical analysis report of the application to be tested can be used to record the analysis and summary results for the test results. According to different practical application requirements, various analysis methods can be adopted to analyze and summarize the stored test results.
For example, the test results may be stored in a preset format. And then, an analysis script for analyzing the test result to generate a statistical analysis report is written in advance according to the actual application requirement. After the test results are obtained, the stored test results may be processed using an analysis script to generate a statistical analysis report.
For example, the statistical analysis report may count the success rate of the test tasks in the test task set, the test time of each test task, the total test completion time, the model information of the test terminal corresponding to each test task, and the like.
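As an illustration only, an analysis script of the kind described could aggregate the stored results into exactly those report fields; every key and field name below is an assumption for the sketch:

```python
def build_report(results):
    """Aggregate per-task results into the report fields named in the
    text: success rate, per-task time, total time, terminal models."""
    total = len(results)
    passed = sum(1 for r in results if r["passed"])
    return {
        "success_rate": passed / total,
        "task_times": {r["task"]: r["seconds"] for r in results},
        "total_seconds": sum(r["seconds"] for r in results),
        "terminal_models": sorted({r["model"] for r in results}),
    }

stored_results = [
    {"task": "task1", "passed": True, "seconds": 12.0, "model": "PhoneA"},
    {"task": "task2", "passed": False, "seconds": 8.0, "model": "PhoneB"},
]
report = build_report(stored_results)
```

A report built this way can then be serialized and sent to the sender of the test data, as the embodiment describes.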
The tester and the like can know the test result of the application to be tested according to the received statistical analysis report, and further can improve the application to be tested according to the test result.
Optionally, according to actual application requirements, the execution body may further record and store log information and the like for the whole testing process. The execution body may then also send the stored log information to the sender of the installation package of the application to be tested, the test task set, and the test data, so that a tester can learn the test process of the application to be tested in detail. Meanwhile, the tester can also quickly locate problems in the application to be tested according to the log information, so that the application to be tested is further improved.
With continued reference to fig. 3, fig. 3 shows an exemplary application scenario 300 of the method for testing an application according to the present embodiment. In the application scenario of fig. 3, a developer or tester of an application to be tested may use a computer 301 to upload the installation package and test data of the application to be tested to a server 302 (the execution body) through a web page, and create test tasks to form a test task set. As shown in the figure, the test task set includes task 1 and task 2.
The server 302 may select cloud real machines from the connected cloud real machine platform 303 to complete task 1 and task 2, respectively. Taking the case where the server 302 selects one cloud real machine 304 from the connected cloud real machine platform 303 to complete task 1 as an example, the server 302 may establish a connection with the cloud real machine 304 based on the ADB technology, and then send the installation package and test data of the application to be tested to the cloud real machine 304. The server 302 may then run the installation package on the cloud real machine 304 to install the application to be tested, complete task 1 according to the test data using a pre-written processing script, obtain the test result, and store the test result locally on the server 302. Meanwhile, the server 302 may send the test result to the computer 301 for the developer or tester of the application to be tested to check.
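The server-side ADB flow in this scenario (connect, install, push data, launch the test) can be sketched as follows using standard `adb` commands; the device address, package name, and test runner are illustrative placeholders, not taken from the disclosure:

```python
def adb_commands(device_serial, apk_path, data_path):
    """Build the ADB command sequence the server would run: attach the
    cloud device, install the application under test, push its test
    data, and launch the test. Package and runner names are placeholders."""
    return [
        ["adb", "connect", device_serial],
        ["adb", "-s", device_serial, "install", "-r", apk_path],
        ["adb", "-s", device_serial, "push", data_path, "/data/local/tmp/testdata"],
        ["adb", "-s", device_serial, "shell", "am", "instrument", "-w",
         "com.example.app.test/androidx.test.runner.AndroidJUnitRunner"],
    ]

cmds = adb_commands("10.0.0.5:5555", "app-debug.apk", "task1_data.json")
# On the real server each command would then be executed, e.g. with
# subprocess.run(cmd, check=True) for each cmd in cmds.
```

Building the command list separately from dispatching it keeps the flow inspectable per test terminal before anything touches a device.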
Compared with the prior art, in which a test can be completed only if a tester specially develops various test scripts, prepares terminals for testing, or regulates the test terminals in real time during testing, the method provided by this embodiment of the disclosure receives the installation package, the test task set, and the test data of the application to be tested, and acquires terminals from the terminal set as test terminals to complete the test tasks in the test task set. After preparing the installation package, the test task set, and the test data, the tester does not need to participate in controlling the testing process; the work related to the terminal set is shielded from the tester. This reduces the tester's workload, realizes automatic testing of the application, and is beneficial to improving testing efficiency.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for testing an application is shown. The flow 400 of the method for testing an application includes the steps of:
Step 4022, for the sub-test tasks in the sub-test task set corresponding to the test task, determining the test data for completing the sub-test task from the test data.
Step 40231, selecting a terminal for completing the sub-test task from the terminal set as the test terminal corresponding to the sub-test task.
In this embodiment, according to actual application requirements, various selection methods can be flexibly adopted to select the test terminal corresponding to the sub-test task from the terminal set. This is not limited by the present application.
The test terminals corresponding to different sub-test tasks may be the same or different. For a sub-test task, the number of test terminals corresponding to it may be one, or may be more than two. When the number of test terminals corresponding to a sub-test task is more than two, the models of the selected test terminals may differ according to actual application requirements or the test task.
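One possible selection method (the embodiment deliberately leaves the strategy open) is a simple round-robin assignment of terminals to sub-test tasks; the names and the `per_task` parameter are illustrative assumptions:

```python
from itertools import cycle

def assign_terminals(sub_tasks, terminal_pool, per_task=1):
    """Round-robin assignment of test terminals to sub-test tasks;
    per_task > 1 models testing one sub-task on several device models."""
    pool = cycle(terminal_pool)
    return {t: [next(pool) for _ in range(per_task)] for t in sub_tasks}

assignment = assign_terminals(["s1", "s2", "s3"], ["devA", "devB"])
```

Round-robin spreads load evenly; a real platform might instead select by model, availability, or test-task requirements.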
Step 40232, installing the application on the test terminal corresponding to the sub-test task by using the installation package, and completing the sub-test task by using the determined test data for completing the sub-test task to obtain the test result corresponding to the sub-test task.
In this step, if the number of the testing terminals corresponding to the sub-testing task is one, the application to be tested may be installed on one testing terminal corresponding to the sub-testing task. If the number of the testing terminals corresponding to the sub-testing task is more than two, the applications to be tested can be respectively installed on the testing terminals corresponding to the sub-testing task.
Optionally, the execution body may first send the installation package of the application to be tested and the test data corresponding to the sub-test task to the test terminal corresponding to the sub-test task, then install the application to be tested on that test terminal from the installation package, and complete the sub-test task according to the corresponding test data.
Step 403, storing the test results corresponding to the respective test tasks in the test task set.
For the test tasks in the test task set, the test results corresponding to the test tasks may be composed of the test results corresponding to each sub-test task in the sub-test task set corresponding to the test tasks.
The parts not described in the steps 401-403 may refer to the related descriptions in the corresponding embodiment of fig. 2, and are not described herein again.
According to the method provided by the embodiment of the disclosure, the test tasks in the test task set are split, the corresponding terminals are allocated for each sub-test task obtained through splitting, and the sub-test tasks are completed according to the corresponding test data, so that each sub-test task can be completed in parallel by using a plurality of terminals, the test efficiency of each test task is further improved, the test efficiency of the whole application is further improved, and the test period is shortened. Particularly, for some applications with very high complexity or very large related data volume (such as some applications realized based on a deep learning algorithm), the task is split and distributed to a plurality of terminals to complete the task test, so that the test efficiency can be effectively improved, and the problem of overlong test period is avoided.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for testing applications, which corresponds to the embodiment of the method shown in fig. 2, and which may be applied in various electronic devices in particular.
As shown in fig. 5, the apparatus 500 for testing an application provided by the present embodiment includes a receiving unit 501, a testing unit 502, and a storing unit 503. Wherein the receiving unit 501 is configured to receive an installation package of an application to be tested, a set of test tasks, and test data for testing the application, wherein the test tasks in the set of test tasks are used to indicate a test target for the application; the test unit 502 is configured to, for a test task in the test task set, select a terminal for completing the test task from the terminal set as a test terminal corresponding to the test task; installing an application on a test terminal corresponding to the test task by using an installation package, and completing the test task by using test data to obtain a test result corresponding to the test task; the storage unit 503 is configured to store test results corresponding to respective test tasks in the test task set.
In the present embodiment, in the apparatus 500 for testing an application: the detailed processing of the receiving unit 501, the testing unit 502, and the storing unit 503 and the technical effects thereof can refer to the related descriptions of step 201, step 202, and step 203 in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of this embodiment, the apparatus for testing an application further includes: the splitting unit (not shown in the figure) is configured to split a test task in the test task set to obtain a sub-test task set corresponding to the test task; and for the sub-test tasks in the sub-test task set corresponding to the test task, determining test data for completing the sub-test tasks from the test data.
In some optional implementation manners of this embodiment, the test unit 502 is further configured to, for a sub-test task in the sub-test task set corresponding to the test task, select a terminal for completing the sub-test task from the terminal set as a test terminal corresponding to the sub-test task; and installing the application on the test terminal corresponding to the sub-test task by using the installation package, and completing the sub-test task by using the determined test data for completing the sub-test task to obtain a test result corresponding to the sub-test task.
In some optional implementations of this embodiment, the application to be tested is implemented based on machine learning; and the test data includes training data for implementing machine learning.
In some optional implementation manners of this embodiment, the terminals in the terminal set include a cloud terminal.
In some optional implementations of this embodiment, after the terminal for completing the test task is selected from the terminal set as the test terminal corresponding to the test task, a communication connection is established with the selected test terminal.
In some optional implementations of the present embodiment, the stored test results are analyzed and summarized, and a statistical analysis report of the application is generated and sent.
The device provided by the above embodiment of the present disclosure receives, through the receiving unit, the installation package of the application to be tested, the test task set, and the test data for testing the application, where a test task in the test task set indicates a test target for the application; for a test task in the test task set, the test unit selects a terminal for completing the test task from the terminal set as the test terminal corresponding to the test task, installs the application on that test terminal by using the installation package, and completes the test task by using the test data to obtain the corresponding test result; and the storage unit stores the test results corresponding to the respective test tasks in the test task set. Throughout the test process, no technician needs to monitor or participate, and no test script needs to be developed for the test, which greatly reduces the technicians' workload, realizes automatic testing of the application, speeds up the test flow, improves test efficiency, and shortens the test period.
Referring now to FIG. 6, a schematic diagram of an electronic device (e.g., the server of FIG. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the server; or may exist separately and not be assembled into the server. The computer readable medium carries one or more programs which, when executed by the server, cause the server to: receiving an installation package, a test task set and test data for testing an application to be tested, wherein the test task in the test task set is used for indicating a test target aiming at the application; for the test tasks in the test task set, selecting terminals for completing the test tasks from the terminal set as test terminals corresponding to the test tasks; installing an application on a test terminal corresponding to the test task by using an installation package, and completing the test task by using test data to obtain a test result corresponding to the test task; and storing the test results corresponding to the test tasks in the test task set.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a receiving unit, a testing unit, and a storage unit. Where the names of these units do not in some cases constitute a definition of the unit itself, for example, a receiving unit may also be described as a "unit that receives an installation package of an application to be tested, a set of test tasks, and test data for testing the application, where the test tasks in the set of test tasks are used to indicate a test target for the application".
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept as defined above. For example, the above features and (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure are mutually replaced to form the technical solution.
Claims (10)
1. A method for testing an application, comprising:
receiving an installation package of an application to be tested, a test task set, and test data for testing the application, wherein a test task in the test task set is used for indicating a test target for the application;
for the test tasks in the test task set, selecting terminals for completing the test tasks from the terminal set as test terminals corresponding to the test tasks; the application is installed on the testing terminal corresponding to the testing task by using the installation package, and the testing task is completed by using the testing data to obtain a testing result corresponding to the testing task;
and storing the test results corresponding to the test tasks in the test task set.
2. The method of claim 1, wherein the method further comprises:
splitting the test task in the test task set to obtain a sub-test task set corresponding to the test task;
and for the sub-test tasks in the sub-test task set corresponding to the test task, determining test data for completing the sub-test tasks from the test data.
3. The method according to claim 2, wherein the selecting a terminal for completing the test task from the terminal set as the test terminal corresponding to the test task, installing the application on the test terminal corresponding to the test task by using the installation package, and completing the test task by using the test data to obtain the test result corresponding to the test task comprises:
for the sub-test tasks in the sub-test task set corresponding to the test task, selecting terminals for completing the sub-test tasks from the terminal set as test terminals corresponding to the sub-test tasks;
and installing the application on the test terminal corresponding to the sub-test task by using the installation package, and completing the sub-test task by using the determined test data for completing the sub-test task to obtain a test result corresponding to the sub-test task.
4. The method of claim 1, wherein the application is implemented based on machine learning; and
the test data includes training data for implementing machine learning.
5. The method of claim 1, wherein the terminals in the terminal set comprise cloud terminals.
6. The method of claim 1, wherein after the selecting the terminal for completing the test task from the terminal set as the test terminal corresponding to the test task, the method further comprises:
and the communication connection is carried out with the selected test terminal.
7. The method according to one of claims 1-6, wherein the method further comprises:
and analyzing and summarizing the stored test results, generating a statistical analysis report of the application and sending the report.
8. An apparatus for testing an application, comprising:
a receiving unit configured to receive an installation package of an application to be tested, a set of test tasks, and test data for testing the application, wherein the test tasks in the set of test tasks are used to indicate a test target for the application;
the test unit is configured to select terminals for completing the test task from the terminal set as test terminals corresponding to the test task for the test task in the test task set; the application is installed on the testing terminal corresponding to the testing task by using the installation package, and the testing task is completed by using the testing data to obtain a testing result corresponding to the testing task;
and the storage unit is configured to store the test results corresponding to the test tasks in the test task set respectively.
9. A server, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010110804.2A CN112306857A (en) | 2020-02-24 | 2020-02-24 | Method and apparatus for testing applications |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112306857A true CN112306857A (en) | 2021-02-02 |
Family
ID=74336683
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010110804.2A Pending CN112306857A (en) | 2020-02-24 | 2020-02-24 | Method and apparatus for testing applications |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112306857A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113297094A (en) * | 2021-06-30 | 2021-08-24 | 中国银行股份有限公司 | Compatibility testing method and system for mobile phone bank |
CN113392027A (en) * | 2021-07-07 | 2021-09-14 | 北京智慧星光信息技术有限公司 | Compatibility testing method and system for mobile terminal application and electronic equipment |
CN114884931A (en) * | 2022-04-27 | 2022-08-09 | 京东科技控股股份有限公司 | Test system and construction method, device, equipment and medium thereof |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20030002999A (en) * | 2001-06-30 | 2003-01-09 | 주식회사 케이티 | Speech recognition system tester and method using script creation techniques
CN104035861A (en) * | 2013-03-07 | 2014-09-10 | 腾讯科技(深圳)有限公司 | Method, device and system for obtaining intelligent terminal software
CN104699616A (en) * | 2015-03-31 | 2015-06-10 | 北京奇虎科技有限公司 | Method, device and system for testing an application
CN104809055A (en) * | 2014-01-26 | 2015-07-29 | 腾讯科技(深圳)有限公司 | Application program testing method and device based on a cloud platform
CN106294102A (en) * | 2015-05-20 | 2017-01-04 | 腾讯科技(深圳)有限公司 | Application program testing method, client, server and system
CN106649103A (en) * | 2016-11-25 | 2017-05-10 | 深圳大学 | Automated black-box testing method and system for Android applications
CN106933729A (en) * | 2015-12-29 | 2017-07-07 | 苏宁云商集团股份有限公司 | Testing method and system based on a cloud platform
CN107193728A (en) * | 2016-03-15 | 2017-09-22 | 展讯通信(天津)有限公司 | Automated testing method and device for mobile terminals
CN108109633A (en) * | 2017-12-20 | 2018-06-01 | 北京声智科技有限公司 | System and method for unattended cloud speech library collection and intelligent product testing
CN109684218A (en) * | 2018-12-26 | 2019-04-26 | 世纪龙信息网络有限责任公司 | Test system and test method based on a cloud prototype
History
- 2020-02-24: Application CN202010110804.2A filed in China (CN); published as CN112306857A; status: Pending
Non-Patent Citations (1)
Title |
---|
CHANG Xiaorong et al.: "Design and Application of a Power Grid Cloud Testing Service Platform", Telecommunications Science, pages 179 - 181 *
Similar Documents
Publication | Title |
---|---|
CN108900776B (en) | Method and apparatus for determining response time |
CN109302522B (en) | Test method, test device, computer system, and computer medium |
CN105338110A (en) | Remote debugging method, platform and server |
CN109977012B (en) | Joint debugging test method, device, equipment and computer-readable storage medium for a system |
CN112306857A (en) | Method and apparatus for testing applications |
US11200155B2 (en) | System and method for automated application testing |
CN111917603A (en) | Client test method and device, computer equipment and storage medium |
CN107045475B (en) | Test method and device |
CN112463634A (en) | Software testing method and device under a microservice architecture |
CN107391362A (en) | Application testing method, mobile terminal and storage medium |
CN110609786B (en) | Software testing method, device, computer equipment and storage medium |
CN111767209A (en) | Code testing method, device, storage medium and terminal |
CN112181822A (en) | Test method and test device for application startup time consumption |
US10417116B2 (en) | System, method, and apparatus for crowd-sourced gathering of application execution events for automatic application testing and replay |
CN113360365B (en) | Flow test method and flow test system |
CN105339974B (en) | Analog sensor |
CN117112393A (en) | Application program debugging method and device, electronic equipment and storage medium |
CN113986263A (en) | Automated code testing method, device, electronic equipment and storage medium |
CN113268426A (en) | Application testing method and device, computer equipment and storage medium |
CN114564402A (en) | Task flow testing method and device, readable medium and electronic equipment |
CN114116497A (en) | Test method, device and server |
CN112597026B (en) | Test case generation method and device |
CN111831531B (en) | Test method and device |
CN118519920B (en) | Automated test method, device, equipment and storage medium |
CN115484200B (en) | Buried-point testing method and device, server, storage medium and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||