
CN115509913A - Software automation test method, device, machine readable medium and equipment - Google Patents

Software automation test method, device, machine readable medium and equipment

Info

Publication number
CN115509913A
CN115509913A (application CN202211184685.0A)
Authority
CN
China
Prior art keywords
test
target
tested
application
queue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211184685.0A
Other languages
Chinese (zh)
Inventor
钱建锋
孙斌
任兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHANGHAI INNOVATECH INFORMATION TECHNOLOGY CO LTD
Original Assignee
SHANGHAI INNOVATECH INFORMATION TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHANGHAI INNOVATECH INFORMATION TECHNOLOGY CO LTD filed Critical SHANGHAI INNOVATECH INFORMATION TECHNOLOGY CO LTD
Priority to CN202211184685.0A
Publication of CN115509913A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a software automation test method comprising the following steps: acquiring a test case corresponding to a target project from a test case database, wherein the test case includes a test script generated by recording; packaging the test case into a target test application installation package and loading the target test application onto the device to be tested; parsing the target test application installation package on the device to be tested to obtain one or more test scripts; creating a test queue and importing the one or more test scripts into the test queue; and running the test scripts in the order represented by the test queue to complete the test task and obtain the test results. With automated software testing in place, automated test tasks are easy to run, more tests can be run in less time, and tedious test tasks are automated, which improves testers' accuracy and motivation and allows test engineers to put more energy into designing better test cases.

Description

Software automation test method, device, machine readable medium and equipment
Technical Field
The invention relates to the field of software testing, and in particular to a software automation test method, apparatus, machine readable medium and equipment.
Background
Currently, the mainstream Android software automation test frameworks include the following:
1. Monkey is a testing tool bundled with the Android SDK. During testing it sends a pseudorandom stream of user events, such as key input, touch input and gesture input, to the system, implementing a stress test of the APP under development, and it can also output logs. In practice this tool can only be used for stress testing: the test events and data are random and cannot be customized, so it is very limited.
2. MonkeyRunner is also a test tool provided by the Android SDK. Strictly speaking, MonkeyRunner is an API toolkit. It is more powerful than Monkey, allowing test scripts to be written that define data and events. Its drawback is that the scripts are written in Python, which places high demands on testers and carries a high learning cost.
3. Robotium is a testing framework based on Instrumentation. It mainly automates testing of a given APK, with or without access to its source code, and is therefore quite powerful. Its drawbacks are that testers need some Java knowledge and familiarity with Android's basic components, and it cannot cross between apps.
4. Instrumentation is an early Android automated testing tool provided by Google. Although JUnit could also test Android at that time, Instrumentation allows more complex tests of an APP, even at the framework level. It is the basis of many other testing frameworks and allows the component under test to be loaded in the same process. It has many rich high-level wrappers, and users can adopt frameworks built on Instrumentation to avoid excessive secondary development. However, Instrumentation does not support crossing apps, so Instrumentation-based frameworks all inherit this shortcoming.
5. UIAutomator is a testing framework provided by Google for advanced UI testing of native Android apps and games. It supports essentially all Android event operations and, compared with Instrumentation, does not require testers to know the code implementation details (UIAutomatorViewer can be used to capture control attributes on an App page without reading the source code). Its drawbacks are that testing requires a USB connection, a certain amount of coding ability is still needed, and debugging test programs is cumbersome.
These mainstream Android software automation test frameworks share the following defects:
test trigger events are random and cannot be customized, which is very limiting; test scripts and test programs are hard to write and tedious to debug, imposing a substantial learning cost on software test engineers; the mainstream test frameworks require an online USB connection for testing, and testing stops once the USB is disconnected; the test scope is limited to a single APP, so system apps cannot be tested comprehensively; and the tool side must interact with the device to be tested over the network, which drives up system power consumption, prevents normal simulation of a user's usage scenario, and greatly reduces the accuracy of endurance tests.
Disclosure of Invention
In view of the above shortcomings of the prior art, an object of the present invention is to provide a software automation test method, apparatus, machine-readable medium and device that solve the problems of the prior art.
To achieve the above and other related objects, the present invention provides a software automation test method, including:
acquiring a test case corresponding to a target project from a test case database, wherein the test case comprises a test script generated in a recording mode;
packaging the test case into a target test application installation package, and loading the target test application into equipment to be tested;
analyzing the target test application installation package through equipment to be tested to obtain one or more test scripts;
creating a test queue and importing the one or more test scripts into the test queue;
and running the test script based on the test sequence represented by the test queue to complete the test task, thereby obtaining the test result.
In an embodiment of the present invention, the method further includes constructing the test case database on the PC side, wherein the step of constructing the test case database includes:
creating an empty database;
creating a test item table in the empty database;
creating a test case according to the test item table, and generating a test script by recording;
and storing the test cases and the test scripts in the database, thereby completing the construction of the test case database.
In an embodiment of the present invention, the method further includes: and after the target test application is installed in the equipment to be tested, disconnecting the equipment to be tested from the PC terminal.
In an embodiment of the present invention, the step of generating the test script by recording includes:
establishing connection between a PC (personal computer) end and equipment to be tested;
running a screen projection application at the PC end;
recording, through the screen projection application, each operation the user executes on the device to be tested for the application program, together with the interface element information displayed by the application program when each operation is executed;
and recording the test script through the PC terminal according to each recorded operation and the interface element information displayed by the application program when each operation was executed.
In an embodiment of the present invention, the step of packaging the test case into a target test application installation package and loading the target test application into a device to be tested includes:
generating a JSON file corresponding to the test case;
packaging the JSON file to a set directory of a target test application;
recompiling and re-signing the target test application;
and installing the re-signed target test application onto the device to be tested by the adb install method.
In an embodiment of the present invention, when the test scripts are run, a test thread is created for each test case, the test scripts are run on multiple threads to obtain test results, and the test results are stored in a test database.
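The per-case threading in this embodiment can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation; the function names, the simple case format, and the result dictionary are all assumptions.

```python
import threading

def run_cases_threaded(test_cases, run_case):
    """Create one test thread per test case (as in the embodiment above)
    and collect the results for later storage in a test database.
    All names here are illustrative assumptions."""
    results = {}
    lock = threading.Lock()

    def worker(case):
        outcome = run_case(case)
        with lock:  # the results dict is shared across test threads
            results[case["name"]] = outcome

    threads = [threading.Thread(target=worker, args=(c,)) for c in test_cases]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait until every test thread has finished
    return results
```

A caller would pass a `run_case` function that executes one recorded script and returns its result.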
In an embodiment of the present invention, when the test scripts are run, the test queue is traversed to determine whether it is empty; if the test queue is not empty, the test task continues to execute, and if the test queue is empty, the test task ends.
To achieve the above and other related objects, the present invention provides an automated software testing apparatus, comprising:
the test case acquisition module is used for acquiring a test case corresponding to a target project from a test case database, wherein the test case comprises a test script generated by recording;
the application loading module is used for packaging the test case into a target test application installation package and loading the target test application into equipment to be tested;
the analysis module is used for analyzing the target test application installation package through the equipment to be tested to obtain one or more test scripts;
the queue creating module is used for creating a test queue and importing the one or more test scripts into the test queue;
and the test module is used for running the test script based on the test sequence represented by the test queue to complete the test task so as to obtain a test result.
To achieve the above and other related objects, the present invention also provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described previously.
To achieve the above and other related objects, the present invention also provides one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described above.
As described above, the software automation test method, device, machine readable medium and apparatus provided by the present invention have the following advantages:
the invention discloses a software automation testing method, which comprises the following steps: acquiring a test case corresponding to a target project from a test case database, wherein the test case comprises a test script generated in a recording mode; packaging the test case into a target test application installation package, and loading the target test application into equipment to be tested; analyzing the target test application installation package through equipment to be tested to obtain one or more test scripts; creating a test queue and importing the one or more test scripts into the test queue; and running the test script based on the test sequence represented by the test queue to complete the test task, thereby obtaining the test result. The invention can record the operation steps of the PC terminal or/and the equipment to be tested by recording the test case; the method is used for recording the test script without compiling the test code, the test script is automatically generated through the operation of the equipment to be tested, and the PC end can be matched with the equipment to be tested to realize off-line and on-line automatic tests; in the actual test process, the PC end and the equipment to be tested do not interact, the whole process is completed by the equipment to be tested, the power consumption of the system is reduced, and the use scene of a user can be completely simulated.
Once automated software testing is in place, automated test tasks are easy to run, more tests can be run in less time, and tedious test tasks are automated, improving testers' accuracy and motivation and allowing test engineers to put more energy into designing better test cases. Meanwhile, testers can concentrate on the manual testing part, improving its efficiency. Automated testing is consistent and repeatable: because tests execute automatically, each run's results and executed content are guaranteed to be consistent, making tests repeatable. The approach applies to various test scenarios, such as case tests, stress tests, endurance tests and regression tests, and the number of test cycles can be increased as required; test reports are generated, effectively reducing the labor cost of complex tests in daily testing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without inventive labor.
FIG. 1 is a schematic diagram of an implementation environment of a software automation testing method according to an exemplary embodiment of the present application;
FIG. 2 is a flow chart illustrating a method for automated testing of software according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart illustrating the generation of a test script by way of recording according to an exemplary embodiment of the present application;
FIG. 4 is a flow diagram illustrating loading of a target test application into a device under test in accordance with an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a hardware configuration of a software automation test device according to an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a hardware structure of a device under test according to an exemplary embodiment of the present application.
Detailed Description
Other advantages and effects of the present invention will become apparent to those skilled in the art from the disclosure herein. The embodiments of the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. The invention is capable of other and different embodiments, and its details may be modified in various respects without departing from the spirit and scope of the present invention. It should be understood that the preferred embodiments only illustrate the present invention and are not intended to limit its scope.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
In the following description, numerous details are set forth to provide a more thorough explanation of embodiments of the present invention, however, it will be apparent to one skilled in the art that embodiments of the present invention may be practiced without these specific details, and in other embodiments, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
FIG. 1 is a schematic diagram of an implementation environment of the software automated testing method of the present application. Referring to FIG. 1, the implementation environment includes a device to be tested 101 and a PC terminal 102 that communicate with each other through a wired or wireless network. The PC terminal obtains a test case corresponding to a target project from a test case database, the test case including a test script generated by recording; packages the test case into a target test application installation package; and loads the target test application onto the device to be tested. The device to be tested parses the target test application installation package to obtain one or more test scripts, creates a test queue, imports the one or more test scripts into the test queue, and runs the test scripts in the order represented by the test queue to complete the test task and obtain a test result. By recording test cases, the invention can record the operation steps on the PC side and/or the device to be tested; test scripts are recorded rather than hand-coded, being generated automatically from operations on the device to be tested, and the PC side can cooperate with the device to be tested to achieve both offline and online automated testing. During the actual test, the PC side and the device to be tested do not interact; the whole process is completed by the device to be tested, which reduces system power consumption and fully simulates the user's usage scenario.
According to the invention, once automated software testing is in place, automated test tasks are easy to run, more tests can be run in less time, and tedious test tasks are automated, improving testers' accuracy and motivation and allowing test engineers to put more energy into designing better test cases. Meanwhile, testers can concentrate on the manual testing part, improving its efficiency. Automated testing is consistent and repeatable: because tests execute automatically, each run's results and executed content are guaranteed to be consistent, making tests repeatable. The approach applies to various test scenarios, such as case tests, stress tests, endurance tests and regression tests; the number of test cycles can be increased as required, and test reports are generated, effectively reducing the labor cost of complex tests in daily testing.
It should be understood that the number of devices under test 101 and PC terminals 102 in fig. 1 is merely illustrative. There may be any number of devices under test 101 and PC terminals 102 according to actual needs.
The device to be tested 101 corresponds to a client and may be any electronic device having a user input interface, including but not limited to a smartphone, tablet, notebook computer, vehicle-mounted computer and the like; the user input interface includes but is not limited to a touch screen, keyboard, physical keys, audio pickup device and the like.
The PC terminal 102 corresponds to a server side and may be a PC providing various services: an independent physical PC, a cluster or distributed system of multiple physical PCs, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms, which is not limited herein.
The device to be tested 101 may communicate with the PC terminal 102 through a wireless network such as 3G (third-generation mobile communication technology), 4G (fourth-generation mobile communication technology), 5G (fifth-generation mobile communication technology) and the like, which is not limited herein.
Embodiments of the present application respectively provide a software automation testing method, a software automation testing apparatus, an electronic device, and a computer-readable storage medium, and will be described in detail below.
Referring to fig. 2, fig. 2 is a flowchart illustrating a software automation testing method according to an exemplary embodiment of the present application. The method may be applied to the implementation environment shown in fig. 1 and specifically executed by the device under test 101 in the implementation environment. It should be understood that the method may be applied to other exemplary implementation environments and is specifically executed by devices in other implementation environments, and the embodiment does not limit the implementation environment to which the method is applied.
As shown in FIG. 2, the software automation testing method includes at least steps S210 to S250, which are described in detail below:
step S210, obtaining a test case corresponding to the target project from a test case database, wherein the test case comprises a test script generated in a recording mode;
step S220, packaging the test case into a target test application installation package, and loading the target test application into equipment to be tested;
step S230, analyzing the target test application installation package through the equipment to be tested to obtain one or more test scripts;
step S240, a test queue is established, and the one or more test scripts are led into the test queue;
and step S250, running the test script based on the test sequence represented by the test queue to complete the test task, thereby obtaining the test result.
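The device-side flow of steps S230 to S250 can be sketched as a queue-based runner. On a real device this logic would live inside the target test application (in Java or Kotlin); the Python sketch below, with hypothetical names and a simplified script format, only illustrates the queue semantics: scripts run in order while the queue is non-empty, and the test task ends when the queue drains.

```python
import json
from collections import deque

def load_test_scripts(json_paths):
    """Parse the test-case JSON files bundled with the target test
    application into a list of test scripts (step S230)."""
    scripts = []
    for path in json_paths:
        with open(path, encoding="utf-8") as f:
            scripts.append(json.load(f))
    return scripts

def run_test_queue(scripts, execute_step):
    """Create a test queue, import the scripts (step S240), and run
    them in queue order until the queue is empty (step S250)."""
    queue = deque(scripts)  # the test queue
    results = []
    while queue:            # queue non-empty: keep executing the test task
        script = queue.popleft()
        outcome = [execute_step(step) for step in script["steps"]]
        results.append({"case": script["name"], "passed": all(outcome)})
    return results          # queue empty: the test task ends
```

`execute_step` stands in for whatever replays one recorded operation on the device; its signature is an assumption for this sketch.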
By recording test cases, the invention can record the operation steps on the PC side and/or the device to be tested; test scripts are recorded rather than hand-coded, being generated automatically from operations on the device to be tested, and the PC side can cooperate with the device to be tested to achieve both offline and online automated testing. During the actual test, the PC side and the device to be tested do not interact; the whole process is completed by the device to be tested, which reduces system power consumption and fully simulates the user's usage scenario.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
The specific steps of this example are described in detail below.
In step S210, a test case corresponding to the target item is obtained from a test case database, where the test case includes a test script generated by recording;
it should be noted that the test case database may be pre-constructed, a plurality of test cases are pre-stored in the test case database, and the test cases include test scripts generated in a recording manner.
In this embodiment, the test case database is pre-constructed, and the step of constructing the test case database includes: creating an empty database; creating a test item table in the empty database; creating a test case according to the test item table, and generating a test script by recording; and storing the test cases and the test scripts in the database, thereby completing the construction of the test case database.
In this embodiment, the empty database is used to store the test cases of all projects, and the test item table is used to store the behavior data corresponding to the recorded test cases. The behavior data represents the user's action data on the PC side and/or the device to be tested, such as click data and drag data.
The test case database is built on the sqlite3 lightweight relational database module. The database is created and instantiated through the module's standard interface sqlite3.connect; a cursor object is obtained from the database instance, and the standard execute interface is then called with a table-creation statement to create the test item data table, adding a primary key constraint on the id column. This table stores all the test cases and all the software parameters of the corresponding project.
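The sqlite3 workflow just described (connect, obtain a cursor, execute a table-creation statement with a primary key on id) can be sketched as follows. The column layout is a guess for illustration only, since the patent does not list the table's fields.

```python
import sqlite3

def build_case_database(db_path="testcases.db"):
    """Create (or open) the test case database and the test item table,
    with a primary key constraint on id. Columns are illustrative."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()  # cursor object from the database instance
    cur.execute("""
        CREATE TABLE IF NOT EXISTS test_items (
            id INTEGER PRIMARY KEY,
            project TEXT NOT NULL,
            case_name TEXT NOT NULL,
            event_record TEXT       -- recorded behavior data (JSON text)
        )
    """)
    conn.commit()
    return conn

# Example usage: insert one recorded case and read it back
conn = build_case_database(":memory:")
conn.execute(
    "INSERT INTO test_items (project, case_name, event_record) VALUES (?, ?, ?)",
    ("demo", "login_case", '{"events": []}'),
)
rows = conn.execute("SELECT case_name FROM test_items").fetchall()
```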
Referring to fig. 3, fig. 3 is a flowchart illustrating a test script generated by a recording method according to an exemplary embodiment of the present application. As shown in fig. 3, the step of generating the test script by recording includes:
step S310, establishing connection between the PC end and the equipment to be tested;
specifically, the connection between the PC end and the device to be tested can be established through adb (adb is called Android Debug Bridge as a whole, and functions as a Debug Bridge). Android programs can be debugged by DDMS conveniently in Eclipse through adb.
Step S320, running a screen projection application at the PC end;
By running the scrcpy screen-mirroring application, the screen of the device to be tested can be mirrored onto the PC with a 1:1 display, and operations on the PC screen are synchronized to the device to be tested in real time. After the scrcpy mirroring application is run, the scrcpy behavior monitoring service is started; by acquiring the scrcpy window handle, it monitors and captures all operations on the mirrored screen and generates the related test data.
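As a minimal sketch (assuming scrcpy and adb are installed on the PC and available on PATH), the mirroring application can be launched from Python as follows; the function name and background-launch approach are illustrative, not taken from the patent.

```python
import shutil
import subprocess

def start_screen_mirror(serial=None):
    """Launch scrcpy to mirror the device-to-be-tested's screen onto
    the PC. Assumes scrcpy (and adb) are installed and on PATH."""
    if shutil.which("scrcpy") is None:
        raise RuntimeError("scrcpy not found on PATH")
    cmd = ["scrcpy"]
    if serial:
        cmd += ["--serial", serial]  # target a specific connected device
    # Run scrcpy in the background; the recorder then hooks its window.
    return subprocess.Popen(cmd)
```

The returned `Popen` handle lets the recorder terminate the mirror when recording finishes.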
Step S330, recording, through the screen projection application, each operation the user executes on the device to be tested for the application program, together with the interface element information displayed by the application program when each operation is executed;
specifically, after the PC starts the application program to be tested, that is, the target test application, the tester executes a series of operations based on the application program, and the application program responds to the operations executed by the tester, so that the PC identifies each operation executed by the tester based on the application program, and records the element information in the interface displayed when the application program responds to each operation.
Specifically, when recording a script, whether the current operation is a click or a swipe can be determined from the finger's movement. For a click, the information of the currently clicked element is recorded; for a swipe, two coordinate values are recorded: the start point and the end point of the swipe.
Step S340, recording, by the PC, the test script according to each recorded operation executed by the user for the application program and the interface element information displayed by the application program when each operation is executed.
In an embodiment, after the project is created, the PC establishes a connection with the device to be tested through adb. Clicking "record behavior events" calls os.popen to start the screen mirror, obtains the handle of the mirror window through win32gui.FindWindow, and obtains the device context of that window through win32gui.GetWindowDC, covering the entire window, including the non-client area, title bar, menu and frame. The pynput keyboard and mouse libraries are imported, and their Listener functions with on_press and on_release callbacks monitor key codes, mouse movement, clicks, mouse scrolling and other behaviors in the mirror window, obtaining the click coordinates, swipe start and end coordinates, input content, and key messages corresponding to the mirror window, as well as the interval time between triggered events. After a window action is detected, a corresponding operation record is generated in the format: event type {click coordinates, or swipe start and end coordinates, or key code value, or input text} separated from the interval since the previous event. To reduce database reads and writes, records are first temporarily stored in a tmpEventRecord dictionary and written to the database when recording finishes; the software interface can then directly modify the operation records a second time, completing the recording of the test script.
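The press/release discrimination described above (click vs. swipe, interval timing, temporary storage in a tmpEventRecord-style structure) can be sketched in Python. This is a minimal illustration, not the patent's code: the SWIPE_THRESHOLD value and all names are assumptions, and a real recorder would register these callbacks with pynput's mouse.Listener on the mirror window.

```python
import time

class EventRecorder:
    """Turns mouse press/release pairs on the mirrored window into
    click or swipe records with the interval since the previous event."""

    SWIPE_THRESHOLD = 10  # pixels; assumed discrimination threshold

    def __init__(self):
        self.tmp_event_record = []  # temporary store, written to DB later
        self._last_time = time.monotonic()
        self._press_pos = None

    def on_press(self, x, y):
        self._press_pos = (x, y)

    def on_release(self, x, y):
        now = time.monotonic()
        interval = now - self._last_time  # time since previous event
        self._last_time = now
        x0, y0 = self._press_pos
        # Small press-to-release movement => click; otherwise swipe
        if abs(x - x0) <= self.SWIPE_THRESHOLD and abs(y - y0) <= self.SWIPE_THRESHOLD:
            record = {"type": "click", "pos": (x0, y0), "interval": interval}
        else:
            record = {"type": "swipe", "start": (x0, y0), "end": (x, y),
                      "interval": interval}
        self.tmp_event_record.append(record)
```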
In step S220, packaging the test case into a target test application installation package, and loading the target test application into a device to be tested;
referring to fig. 4, fig. 4 is a flowchart illustrating loading of a target test application into a device under test according to an exemplary embodiment of the present application. As shown in fig. 4, the step of packaging the test case into a target test application installation package and loading the target test application into the device to be tested includes:
step S410, generating a JSON file corresponding to the test case;
step S420, packaging the JSON file into a set directory of a target test application;
step S430, performing decompiling and re-signing on the target test application;
step S440, installing the target test application subjected to re-signing into the device to be tested by an adb install method.
Specifically, all test cases of the project are taken out from the case database, and a corresponding JSON file is generated for each test case; the generated test case JSON files are packaged under the res/assets/ directory of the target test application (test APP), and the target test application is recompiled and re-signed; finally, the target test application is installed in the device to be tested through the adb install method, and the target test application in the device to be tested is opened through adb shell am start -n "com.
In this embodiment, the test case data is packaged into an APK (target test application installation package) so as to install an APP (target test application) on the device to be tested; apktool.jar is used for decompiling the test APP (target test application), and the generated test data JSON file is placed under the /res/assets directory; the test APP (target test application) is then recompiled with apktool.jar to regenerate the complete APP, the APP is platform-signed with the jarsigner tool, the APP is installed on the device to be tested with an adb install command, and the test interface is started through adb shell start com.
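A minimal PC-side sketch of the packaging flow above; the JSON field set follows the CaseBean attributes given later in the embodiment, while the helper names, keystore alias, and output file names are hypothetical, and the apktool/jarsigner/adb command lines are only assembled here, not executed:

```python
import json
import os

# Field names taken from the CaseBean attributes described in the embodiment
CASE_FIELDS = ("testId", "testName", "testEvent", "testDuration",
               "testCount", "testStatus", "testTime")

def export_case_json(case, assets_dir):
    # One JSON file per test case, placed under the APK's res/assets/ directory
    path = os.path.join(assets_dir, "%s.json" % case["testName"])
    with open(path, "w") as f:
        json.dump({k: case.get(k) for k in CASE_FIELDS}, f)
    return path

def repack_commands(apk, keystore):
    # Decompile -> rebuild -> platform-sign -> install, per the text
    return [
        ["apktool", "d", apk, "-o", "work"],
        ["apktool", "b", "work", "-o", "repacked.apk"],
        ["jarsigner", "-keystore", keystore, "repacked.apk", "platform"],
        ["adb", "install", "-r", "repacked.apk"],
    ]
```

Each command list would be handed to subprocess.run in sequence; keeping them as data makes the pipeline easy to log and retry.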
In step S230, the target test application installation package is analyzed by the device to be tested to obtain one or more test scripts;
after the test is started, the target test application is opened, the device to be tested automatically initializes the test JSON file under res/assets, analyzes the target test application installation package, generates a sqlite test database, and generates a corresponding test script for each test case.
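The device-side initialization can be sketched in Python (the production code in this embodiment is Android/Java); the testinfo table schema is an assumption based on the CaseBean attributes listed later in the embodiment:

```python
import json
import sqlite3

def init_test_db(json_text, db_path=":memory:"):
    """Parse the packaged test-case JSON and populate a sqlite test
    database; the testinfo schema is assumed from the CaseBean attributes."""
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS testinfo (
                    testId INTEGER PRIMARY KEY, testName TEXT,
                    testEvent TEXT, testDuration INTEGER,
                    testCount INTEGER, testStatus TEXT, testTime TEXT)""")
    for case in json.loads(json_text):
        db.execute("INSERT INTO testinfo VALUES (?,?,?,?,?,?,?)",
                   (case["testId"], case["testName"], case["testEvent"],
                    case["testDuration"], case["testCount"],
                    case["testStatus"], case["testTime"]))
    db.commit()
    return db
```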
In step S240, a test queue is created, and the one or more test scripts are imported into the test queue;
the test queue comprises a plurality of test tasks, each test task corresponds to one test case and one test script. The test queue may also represent the execution order of the test tasks, i.e., the order in which the test scripts run.
In step S250, the test script is run based on the test sequence represented by the test queue, and the test task is completed, so as to obtain a test result.
Specifically, when the test script is run, the test queue is traversed, and whether the test queue is empty is judged; if the test queue is not empty, the test task continues to be executed, and if the test queue is empty, the test task is ended.
More specifically, during test execution, the test queue is traversed first, and the tests are performed sequentially in queue order on a first-in first-out principle to obtain results. When the test scripts are run, a test thread is created for each test case, the test scripts are run on multiple threads to obtain the test results, and the test results are stored in the test database.
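The FIFO traversal and per-case threading described above can be sketched as follows; the function name and the in-memory result store are hypothetical, and run_script stands in for actually executing one test script:

```python
import queue
import threading

def run_tests(cases, run_script):
    """Traverse a FIFO test queue; while it is not empty, take the next
    case, run its script in a dedicated test thread, and collect results."""
    test_queue = queue.Queue()
    for case in cases:
        test_queue.put(case)
    results = {}
    lock = threading.Lock()
    threads = []

    def worker(case):
        outcome = run_script(case)      # run one test script
        with lock:                      # results store shared across threads
            results[case["testId"]] = outcome

    while not test_queue.empty():       # an empty queue ends the test task
        case = test_queue.get()         # first in, first out
        t = threading.Thread(target=worker, args=(case,))
        t.start()
        threads.append(t)
    for t in threads:
        t.join()
    return results
```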
When all the test cases in the test queue have been tested, the test database is read, all the test results are summarized, a test report is generated in the form of an excel table, and the test report is exported to the PC (personal computer) end through adb pull, completing the test of one project.
In one embodiment, in conducting the test, the following steps are performed:
Step one, after the test interface is started, the encapsulated AssetsUtil function is called in the onCreate method of MainActivity to read the test case under the res/assets directory, namely the JSON file named after the project name; an independent entity class CaseBean is constructed for each test case and added to an array List<CaseBean>;
Step two, a database instance db is created through the SQLiteDatabase of sqlite, and a db.execSQL create table statement is called in the SQLiteOpenHelper callback function onCreate to create the database testinfo.db and the data table fields according to the CaseBean attributes; a CaseBean object is composed of the following basic attributes, namely testId, testName, testEvent, testDuration, testCount, testStatus and testTime, and single-case data access is performed on the CaseBean object through set and get methods;
Step three, the List<CaseBean> is polled, each test case CaseBean object is stored into the database through a db.execSQL insert statement, and the test data is initialized to complete construction and import;
Step four, after the test is triggered through the PC or the device to be tested, a ForegroundService is first created: the test service InnovatechAutoTestRunner inherits ForegroundService as its parent class, and startForeground is called in the overridden onStartCommand method of the Service to start a notification service, so as to ensure that the InnovatechAutoTestRunner test service keeps running while the test executes in the background and is not recycled by the system. Then a test task queue LineUpTestTaskHelp instance is initialized, which has the following key methods: LinkedTestList is the queue container to which the test tasks are added; OnTaskTestListener monitors the execution state of the tasks, and when the first task has finished executing and a test task still exists in the queue, the next test task continues to be executed; addTestTask is used for task addition; checkTestTask checks whether a test task is executing in the list; consumerTestTask returns the next test task to be executed, and returns null if no test task can be executed; deleteTestTask deletes the test task from the queue after it has been executed successfully; exTestTask executes a task, receiving within the method the task object returned by consumerTestTask.
Step five, all test cases are obtained by calling db.query, a ConsumerTestTask object is generated for each test case, and all cases are added to the test queue through LineUpTestTaskHelp.addTestTask; a task listener is registered through OnTaskTestListener to monitor all test tasks in the queue container, and exTestTask is called; within the exTestTask method a Thread is created and the run method is overridden, and the test information in the ConsumerTestTask object is obtained in the run method, including: testId, testName, testEvent and testDuration; according to this information, an auto_test script is generated under /data/local/tmp/ through the IO operations of the File class;
Step six, the /data/local/tmp/auto_test script is executed for testing in the execCommand method of the encapsulated tool class ShellUtils, the script executing commands based on the Google native tool Monkey: after a single case test is finished, execCommand returns a test result, a db.execSQL insert statement inserts it into the database, deleteTestTask is called back to delete the task from the queue, and the tasks in the queue are checked through checkTestTask; if the returned result indicates the queue is not empty, the exTestTask method continues to be called to execute the next task, and if checkTestTask returns null, it indicates that no test task remains in the queue.
Step seven, the execution of the test tasks is completed; InnovatechAutoTestRunner calls startActivity to start the test interface and calls stopSelf to end the current service. The test result is displayed after the test interface is started, and the test result is stored on an SD storage medium in the form of an Excel table through a WritableSheet.
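Under the assumption that the queue helper behaves as steps four to six describe, its FIFO logic can be sketched in Python (the method names mirror the Java ones in the text; the sequential execution loop and callback wiring are assumptions):

```python
from collections import deque

class LineUpTestTaskHelp:
    """Sketch of the test-task queue helper described in the embodiment."""

    def __init__(self, on_task_done=None):
        self.linked_test_list = deque()   # LinkedTestList queue container
        self.on_task_done = on_task_done  # OnTaskTestListener equivalent

    def add_test_task(self, task):        # addTestTask: task addition
        self.linked_test_list.append(task)

    def check_test_task(self):            # checkTestTask: any task left?
        return len(self.linked_test_list) > 0

    def consumer_test_task(self):         # consumerTestTask: next or None
        return self.linked_test_list[0] if self.linked_test_list else None

    def delete_test_task(self):           # deleteTestTask: after success
        if self.linked_test_list:
            self.linked_test_list.popleft()

    def ex_test_task(self, run):          # exTestTask: execute in FIFO order
        while self.check_test_task():
            result = run(self.consumer_test_task())
            self.delete_test_task()
            if self.on_task_done:         # notify the listener per task
                self.on_task_done(result)
```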
The invention can record the operation steps on the PC end and/or the device to be tested by recording the test case; the test script is recorded without writing test code, the test script is automatically generated through the operation of the device to be tested, and the PC end can cooperate with the device to be tested to realize offline and online automatic tests. In the actual test process, the PC end and the device to be tested do not interact; the whole process is completed by the device to be tested, which reduces the power consumption of the system and can completely simulate the usage scenario of a user.
According to the invention, after software automation testing is realized, automatic test tasks can be run easily, and more tests can be run in less time. Automating tedious test tasks improves the accuracy and motivation of testers, and test technicians can put more energy into designing better test cases. Meanwhile, testers can concentrate more on the manual testing part, improving the efficiency of manual testing. Automated testing is consistent and repeatable: because the tests are executed automatically, the consistency of each test result and of the executed content can be ensured, achieving repeatability. The method can be applied to various test scenarios, such as case tests, pressure tests, endurance tests and regression tests, and the number of test cycles can be increased according to the test requirements; generating a test report effectively reduces the labor cost brought by some complex tests in daily testing.
Fig. 5 is a block diagram of a software automation test device shown in an exemplary embodiment of the present application. The device can be applied to the implementation environment shown in fig. 1 and is specifically configured in an intelligent terminal. The apparatus may also be applied to other exemplary implementation environments, and is specifically configured in other devices, and the embodiment does not limit the implementation environment to which the apparatus is applied.
As shown in fig. 5, a software automation test apparatus includes:
a test case obtaining module 510, configured to obtain a test case corresponding to the target item from a test case database, where the test case includes a test script generated in a recording manner;
an application loading module 520, configured to package the test case into a target test application installation package, and load the target test application into a device to be tested;
the analysis module 530 is configured to analyze the target test application installation package through the device to be tested to obtain one or more test scripts;
a queue creating module 540, configured to create a test queue and import the one or more test scripts into the test queue;
and the test module 550 is configured to run the test script based on the test sequence indicated by the test queue to complete a test task, thereby obtaining a test result.
It should be noted that the software automation test apparatus provided in the foregoing embodiment and the software automation test method provided in the foregoing embodiment belong to the same concept, and specific ways of performing operations by each module and unit have been described in detail in the method embodiment, and are not described herein again. In practical applications, the above functions may be distributed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions, which is not limited herein.
An embodiment of the present application further provides an electronic device, including: one or more processors; the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the electronic equipment is enabled to realize the software automation testing method provided in the above embodiments.
FIG. 6 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application. It should be noted that the computer system 600 of the electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes, such as executing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 602 or a program loaded from a storage portion 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for system operation are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An Input/Output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a speaker, and the like; a storage portion 608 including a hard disk and the like; and a communication portion 609 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication portion 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read therefrom is installed into the storage portion 608 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method illustrated in the flowchart of fig. 2. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 609, and/or installed from the removable medium 611. When the computer program is executed by the Central Processing Unit (CPU) 601, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable signal medium may comprise a propagated data signal with a computer-readable computer program embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
Another aspect of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor of a computer, causes the computer to execute the automated testing method for an Android software platform as described above. The computer-readable storage medium may be included in the electronic device described in the above embodiment, or may exist alone without being assembled into the electronic device.
Another aspect of the application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the automated testing method for the Android software platform provided in the above embodiments.
The foregoing embodiments are merely illustrative of the principles of the present invention and its efficacy, and are not to be construed as limiting the invention. Those skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention are covered by the claims of the present invention.

Claims (10)

1. A software automation test method is characterized by comprising the following steps:
acquiring a test case corresponding to a target project from a test case database, wherein the test case comprises a test script generated in a recording mode;
packaging the test case into a target test application installation package, and loading the target test application into equipment to be tested;
analyzing the target test application installation package through equipment to be tested to obtain one or more test scripts;
creating a test queue and importing the one or more test scripts into the test queue;
and running the test script based on the test sequence represented by the test queue to complete the test task, thereby obtaining a test result.
2. The automated software testing method of claim 1, further comprising: constructing a test case database at a PC (personal computer) end, wherein the step of constructing the test case database comprises the following steps:
creating an empty database;
creating a test item table in the empty database;
creating a test case according to the test project table, and generating a test script in a recording mode;
and storing the test cases and the tested scripts in the database so as to complete the construction of the test case database.
3. The automated software testing method of claim 2, wherein the step of generating the test script by recording comprises:
establishing connection between a PC (personal computer) end and equipment to be tested;
running a screen projection application at the PC end;
recording each operation executed by a user on the application program on the equipment to be tested through the screen projection application and interface element information displayed by the application program when each operation is executed;
and recording the test script by the PC terminal according to the recorded user aiming at each operation executed by the application program and the interface element information displayed by the application program when each operation is executed.
4. The method for automatically testing software according to claim 3, wherein the step of packaging the test case into a target test application installation package and loading the target test application into the device under test comprises:
generating a JSON file corresponding to the test case;
packaging the JSON file into a set directory of a target test application;
compiling back and re-signing the target test application;
and installing the target test application subjected to re-signing into the equipment to be tested by an adb install method.
5. The automated software testing method of claim 3, further comprising: and after the target test application is installed in the equipment to be tested, disconnecting the equipment to be tested from the PC terminal.
6. The automated software testing method according to claim 1, wherein when the test script is run, a test thread is created for each test case, the test script is run based on multiple threads to obtain a test result, and the test result is stored in a test database.
7. The automated software testing method of claim 1, wherein when the test script is running, a test queue is traversed to determine whether the test queue is empty; if not, the test task is continuously executed, and if so, the test task is ended.
8. An automated software testing device, comprising:
the test case acquisition module is used for acquiring a test case of a corresponding target project from a test case database, wherein the test case comprises a test script generated in a recording mode;
the application loading module is used for packaging the test case into a target test application installation package and loading the target test application into equipment to be tested;
the analysis module is used for analyzing the target test application installation package through the equipment to be tested to obtain one or more test scripts;
the queue creating module is used for creating a test queue and importing the one or more test scripts into the test queue;
and the test module is used for running the test script based on the test sequence represented by the test queue to complete the test task so as to obtain a test result.
9. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the device to perform the method recited by one or more of claims 1-7.
10. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method recited by one or more of claims 1-7.
CN202211184685.0A 2022-09-27 2022-09-27 Software automation test method, device, machine readable medium and equipment Pending CN115509913A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211184685.0A CN115509913A (en) 2022-09-27 2022-09-27 Software automation test method, device, machine readable medium and equipment

Publications (1)

Publication Number Publication Date
CN115509913A true CN115509913A (en) 2022-12-23

Family

ID=84505826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211184685.0A Pending CN115509913A (en) 2022-09-27 2022-09-27 Software automation test method, device, machine readable medium and equipment

Country Status (1)

Country Link
CN (1) CN115509913A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117149638A (en) * 2023-09-01 2023-12-01 镁佳(北京)科技有限公司 UI (user interface) automatic testing method and device, computer equipment and storage medium
CN117149638B (en) * 2023-09-01 2024-09-03 镁佳(北京)科技有限公司 UI (user interface) automatic testing method and device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
US20230367559A1 (en) Development environment for real-time dataflow programming language
US9697108B2 (en) System, method, and apparatus for automatic recording and replaying of application executions
US9436449B1 (en) Scenario-based code trimming and code reduction
US20130263090A1 (en) System and method for automated testing
CN110457211B (en) Script performance test method, device and equipment and computer storage medium
CN110013672B (en) Method, device, apparatus and computer-readable storage medium for automated testing of machine-run games
CN109815119B (en) APP link channel testing method and device
CN115422063A (en) Low-code interface automation system, electronic equipment and storage medium
CN113868126A (en) Application debugging method, device and storage medium of equipment
Tuovenen et al. MAuto: Automatic mobile game testing tool using image-matching based approach
CN117370203B (en) Automatic test method, system, electronic equipment and storage medium
CN109284222B (en) Software unit, project testing method, device and equipment in data processing system
CN111767209A (en) Code testing method, device, storage medium and terminal
CN115509913A (en) Software automation test method, device, machine readable medium and equipment
CN113419738A (en) Interface document generation method and device and interface management equipment
CN116028108B (en) Method, device, equipment and storage medium for analyzing dependent package installation time
CN113656044B (en) Android installation package compression method and device, computer equipment and storage medium
CN115827457A (en) Browser compatibility testing method and related equipment
CN111459547B (en) Method and device for displaying function call link
CN116414607A (en) Fault detection method and device for application program
EP3436948A1 (en) System for monitoring and reporting performance and correctness issues across design, compile and runtime
Park et al. Gesto: Mapping UI events to gestures and voice commands
CN111694729A (en) Application testing method and device, electronic equipment and computer readable medium
CN118519920B (en) Automatic test method, device, equipment and storage medium
CN117155628B (en) Method, system, device and readable storage medium for interactive security test of containerized application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination