CN110532056B - Control identification method and device applied to user interface - Google Patents

Info

Publication number: CN110532056B (granted from application CN201910838588.0A)
Authority: CN (China)
Legal status: Active
Other versions: CN110532056A (application publication)
Other languages: Chinese (zh)
Inventors: 王明星, 黄晶, 罗勇冠
Current and original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd

Classifications

    • G06F11/3684: Test management for test design, e.g. generating new test cases (under G06F11/36: Preventing errors by testing or debugging software; G06F11/3668: Software testing)
    • G06F11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F9/451: Execution arrangements for user interfaces (under G06F9/44: Arrangements for executing specific programs)

Abstract

The application discloses a control identification method and device applied to a user interface. A user interface and the control node information corresponding to it are acquired in real time, and an interface screenshot is determined from the user interface; the positions of a plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes to obtain a plurality of sub-screenshots; the interface control corresponding to each control node is determined so as to establish an association relationship between the interface control and the sub-screenshot at each moment; the interface controls in the current user interface are then identified according to the association relationship to obtain the interface control information of the current user interface, thereby improving the accuracy and efficiency of control identification.

Description

Control identification method and device applied to user interface
Technical Field
The present application relates to the field of computer technologies, and in particular, to a control identification method and device applied to a user interface.
Background
In the process of interaction between a user and a terminal device, interactive operations are usually performed through the user interface (UI), and these interactions are realized through UI controls such as buttons, text fields, positioning columns, check boxes, zoom buttons and switch buttons. In order to ensure that a control works correctly, UI automation tests need to be performed on the controls, comparing the actual UI display result with the expected result; before the comparison, the specific information of the current control must first be identified.
In general, UI controls are identified based on their attributes; for example, the Appium test tool views the attributes of a control through uiautomatorviewer.
However, because the attributes of a control are not always present, the attribute column is sometimes empty, and a tester has to screen which attributes can be used to locate and identify the control, which affects the identification efficiency of the control; moreover, because control attributes are not unique, the wrong control may be identified, which affects the identification accuracy of the control.
Disclosure of Invention
In view of the foregoing, a first aspect of the present application provides a control identification method applied to a user interface, which may be applied to a system or a program process of the user interface and specifically includes: acquiring a user interface and control node information corresponding to the user interface in real time, wherein the user interface is used for determining an interface screenshot;
determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots;
determining an interface control corresponding to the control node to establish an association relationship between the interface control and the sub-screenshot at each moment;
and identifying the interface control in the current user interface according to the association relation to obtain the interface control information of the current user interface.
Preferably, in some possible implementations of the present application, after determining the positions of a plurality of control nodes according to the control node information and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots, the method further includes:
acquiring attribute information of the sub-screenshot;
and determining a measurement value or a picture fingerprint of the sub-screenshot according to the attribute information, wherein the measurement value comprises a length, a width or an area.
Preferably, in some possible implementations of the present application, the identifying interface control information in the current user interface according to the association relationship includes:
acquiring a current user interface and current control node information corresponding to the current user interface, wherein the current user interface is used for determining a current interface screenshot;
determining the positions of a plurality of current control nodes according to the current control node information, and segmenting the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots;
Traversing the sub-screenshot with the similarity meeting the preset condition with the current sub-screenshot, and determining corresponding interface control information according to the association relation, wherein the preset condition is set based on the similarity of the attribute information.
Preferably, in some possible implementations of the present application, the traversing the sub-screenshot having a similarity with the current sub-screenshot satisfying a preset condition includes:
determining measurement information of the current sub-screenshot;
and if the measurement information of the current sub-screenshot is compared with the measurement information of the sub-screenshot to meet the preset condition, associating the current sub-screenshot with the sub-screenshot.
Preferably, in some possible implementations of the present application, the traversing the sub-screenshot having a similarity with the current sub-screenshot satisfying a preset condition includes:
Generating a hash value of the current sub-screenshot according to a hash function, wherein the hash value is used for indicating a picture fingerprint of the current sub-screenshot;
and if the picture fingerprint of the current sub-screenshot is compared with the picture fingerprint of the sub-screenshot to meet a preset condition, associating the current sub-screenshot with the sub-screenshot.
Preferably, in some possible implementations of the present application, the determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots includes:
acquiring weight information of the control node, wherein the weight information is used for indicating the identification priority of the control node;
determining the positions of a plurality of control nodes according to the control node information;
And sequentially segmenting the interface screenshot based on the control node according to the weight information to obtain a plurality of sub-screenshots.
Preferably, in some possible implementations of the present application, the identifying an interface control in the current user interface according to the association relationship includes:
Determining operation instruction information for the current user interface;
determining an interface control applied to the current user interface according to the operation instruction information;
And identifying interface control information of the interface control in the current user interface according to the association relation.
A second aspect of the present application provides an apparatus for control identification, comprising: an acquisition unit, configured to acquire a user interface and control node information corresponding to the user interface in real time, wherein the user interface is used for determining an interface screenshot;
The segmentation unit is used for determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots;
The association unit is used for determining an interface control corresponding to the control node so as to establish an association relationship between the interface control and the sub-screenshot at each moment;
and the identification unit is used for identifying the interface control in the current user interface according to the association relation so as to obtain the interface control information of the current user interface.
Preferably, in some possible implementations of the present application, the segmentation unit is further configured to obtain attribute information of the sub-screenshot;
and the segmentation unit is further used for determining a measurement value or a picture fingerprint of the sub-screenshot according to the attribute information, wherein the measurement value comprises a length, a width or an area.
Preferably, in some possible implementations of the present application, the identifying unit is specifically configured to obtain a current user interface and current control node information corresponding to the current user interface, where the current user interface is used to determine a current interface screenshot;
The identification unit is specifically configured to determine the positions of a plurality of current control nodes according to the current control node information, and segment the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots;
the identification unit is specifically configured to traverse the sub-screenshot with similarity to the current sub-screenshot satisfying a preset condition, and determine corresponding interface control information according to the association relationship, where the preset condition is set based on the similarity of the attribute information.
Preferably, in some possible implementations of the present application, the segmentation unit is specifically configured to determine metric information of the current sub-screenshot;
the segmentation unit is specifically configured to associate the current sub-screenshot with the sub-screenshot if the measurement information of the current sub-screenshot meets a preset condition compared with the measurement information of the sub-screenshot.
Preferably, in some possible implementations of the present application, the segmentation unit is specifically configured to generate a hash value of the current sub-screenshot according to a hash function, where the hash value is used to indicate a picture fingerprint of the current sub-screenshot;
The segmentation unit is specifically configured to associate the current sub-screenshot with the sub-screenshot if the picture fingerprint of the current sub-screenshot meets a preset condition compared with the picture fingerprint of the sub-screenshot.
Preferably, in some possible implementations of the present application, the segmentation unit is specifically configured to obtain weight information of the control node, where the weight information is used to indicate an identification priority of the control node;
The segmentation unit is specifically configured to determine positions of a plurality of control nodes according to the control node information;
The segmentation unit is specifically configured to sequentially segment the interface screenshot based on the control nodes according to the weight information, so as to obtain a plurality of sub-screenshots.
Preferably, in some possible implementations of the present application, the identifying unit is specifically configured to determine operation instruction information for the current user interface;
The identification unit is specifically used for determining an interface control applied to the current user interface according to the operation instruction information;
The identification unit is specifically configured to identify interface control information of an interface control in a current user interface according to the association relationship.
A third aspect of the present application provides a computer apparatus comprising: a memory, a processor, and a bus system; the memory is used for storing program code; the processor is configured to perform, according to instructions in the program code, the control identification method applied to a user interface according to the first aspect or any implementation of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the control identification method applied to a user interface according to the first aspect or any implementation of the first aspect.
From the above technical solutions, the embodiment of the present application has the following advantages:
A user interface and the control node information corresponding to the user interface are acquired in real time; the positions of a plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes to obtain a plurality of sub-screenshots; the interface control corresponding to each control node is determined so as to establish an association relationship between the interface control and the sub-screenshot at each moment, and the interface controls in the current user interface are then identified according to the association relationship to obtain the interface control information of the current user interface. Because the nodes are stable, the controls can be captured stably, which improves control identification efficiency; and because the control information of the current interface is determined through the correspondence between screenshots, inaccurate identification caused by interface changes is avoided, which improves control identification accuracy.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of control attributes of a user interface;
FIG. 2 is a flow framework diagram of control identification;
FIG. 3 is a flowchart of a control identification method according to an embodiment of the present application;
FIG. 4 is a flowchart of another control identification method according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of an application scenario according to an embodiment of the present application;
FIG. 6 is a schematic interface display diagram of a control identification program according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a control identification device according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of another control identification device according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a control identification method and a related device, which can be applied to the running process of an application supporting a user interface. Specifically, a user interface and the control node information corresponding to the user interface are acquired in real time; the positions of a plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes to obtain a plurality of sub-screenshots; the interface control corresponding to each control node is determined so as to establish an association relationship between the interface control and the sub-screenshot at each moment, and the interface controls in the current user interface are then identified according to the association relationship to obtain the interface control information of the current user interface. Because the nodes are stable, the controls can be captured stably, which improves control identification efficiency; and because the control information of the current interface is determined through the correspondence between screenshots, inaccurate identification caused by interface changes is avoided, which improves control identification accuracy.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "includes" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
In the process of interaction between a user and a terminal device, interactive operations are usually performed through the user interface (UI), and these interactions are realized through UI controls such as buttons, text fields, positioning columns, check boxes, zoom buttons and switch buttons. In order to ensure that a control works correctly, UI automation tests need to be performed on the controls, comparing the actual UI display result with the expected result; before the comparison, the specific information of the current control must first be identified.
In general, UI controls are identified based on their attributes. As shown in FIG. 1, which is a schematic diagram of control attributes of a user interface, the Appium test tool views the attributes of a control through uiautomatorviewer.
However, because the attributes of a control are not always present, the attribute column is sometimes empty, and a tester has to screen which attributes can be used to locate and identify the control, which affects the identification efficiency of the control; moreover, because control attributes are not unique, the wrong control may be identified, which affects the identification accuracy of the control.
In order to solve the above problems, the present application provides a control identification method, which is applied to the control identification flow framework shown in FIG. 2. The framework includes the processes of determining the interface, cutting the screenshot, recording labels, storing a history control database, and similarity-based identification. When a user interacts with the interface, the interface and the nodes on the interface are first determined, and a screenshot is generated; the control is then cut out of the screenshot according to the nodes, corresponding label information is generated to record the attributes of the sub-screenshot, and a correspondence with the related control is established; the labels and correspondences are recorded in the history control database, and when the related control information of an interface needs to be determined, the similarity between the screenshot of the interface and the screenshots in the history control database is compared to obtain the corresponding control information.
The method can be applied to all Android applications, including mobile QQ, WeChat and mobile browsers on Android devices, and can simulate on-device operations such as manually sliding up to unlock, skipping a guide page, logging in, entering a test page, clicking a control, and checking a UI control.
It will be appreciated that three controls are illustrated here as an example; the specific number of controls may be larger or smaller and is determined by the actual scenario, which is not limited herein.
It can be appreciated that the control identification system can run on a personal mobile terminal, a server, or a third-party device to provide automated interface monitoring for the terminal device.
It can be understood that the method provided by the application may be implemented as a program, which may run as processing logic in a hardware system or as a control identification device, where the processing logic is implemented in an integrated or external manner. As one implementation, the control identification device acquires a user interface and the control node information corresponding to the user interface in real time; it then determines the positions of a plurality of control nodes according to the control node information, and segments the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots; it determines the interface control corresponding to each control node so as to establish an association relationship between the interface control and the sub-screenshot at each moment, and then identifies the interface controls in the current user interface according to the association relationship to obtain the interface control information of the current user interface. Because the nodes are stable, the controls can be captured stably, which improves control identification efficiency; and because the control information of the current interface is determined through the correspondence between screenshots, inaccurate identification caused by interface changes is avoided, which improves control identification accuracy.
With reference to the foregoing flowchart, a method for identifying a control in the present application will be described, referring to fig. 3, and fig. 3 is a flowchart of a method for identifying a control provided in an embodiment of the present application, where the embodiment of the present application at least includes the following steps:
301. Acquiring the user interface and the control node information corresponding to the user interface in real time.
In this embodiment, the user interface is used to determine an interface screenshot; the control node information can be obtained from the UI.xml corresponding to the user interface, that is, an XML layout structure document exported by the uiautomator test tool provided by the Android SDK, which contains the attribute information of all control nodes of the current interface.
Alternatively, the process of acquiring the user interface in real time may be initiated when it is detected that the user needs to perform an interactive operation that generates an operation command. For example, if the operation command keyword is application (APP) start, the interface when the APP starts is acquired; if the operation command keyword is click control, the interface when the click operation is executed is acquired; if the operation command keyword is wait, the interface when the wait operation is executed is acquired; if the operation command keyword is input word, the interface when the word input operation is executed is acquired; if the operation command keyword is page up, the interface when the page-up operation is executed is acquired; if the operation command keyword is automatic check or check element, the interface when the automatic check operation is executed is acquired. The specific scenario depends on the actual situation and is not limited herein.
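The following is an illustrative sketch of this acquisition step, assuming an adb-connected Android device and the standard uiautomator dump command; the file names and helper function names are assumptions for illustration and are not taken from the patent.

```python
# Sketch: dump the current UI hierarchy XML (the UI.xml of the text) and parse
# the control-node attributes. Assumes adb and the Android SDK uiautomator tool.
import subprocess
import xml.etree.ElementTree as ET


def dump_ui_hierarchy(local_path="window_dump.xml"):
    """Dump the UI hierarchy on the device and pull the XML file locally."""
    subprocess.run(["adb", "shell", "uiautomator", "dump", "/sdcard/window_dump.xml"],
                   check=True)
    subprocess.run(["adb", "pull", "/sdcard/window_dump.xml", local_path], check=True)
    return local_path


def parse_control_nodes(xml_path):
    """Return a list of attribute dicts, one per <node> element (class, resource-id, bounds, ...)."""
    tree = ET.parse(xml_path)
    return [dict(node.attrib) for node in tree.iter("node")]


if __name__ == "__main__":
    ui_list = parse_control_nodes(dump_ui_hierarchy())
    print(f"found {len(ui_list)} control nodes")
```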
302. Determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots.
In this embodiment, the position of a control node may be obtained from the bounds field in UI.xml. The interface screenshot is segmented according to the positions of the control nodes: the position of a control is determined from its control node, the one or more nodes included in the control are determined, and the nodes are connected into a closed region to determine the screenshot area.
In one possible scenario, there may be multiple controls in one interface, and weight information may be set for the different controls, that is, weight information is set for the control nodes included in the controls, where the weight information indicates the identification priority of the control nodes; the positions of the plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes in order of the weight information to obtain a plurality of sub-screenshots. Cutting the interface in priority order ensures the integrity of the main controls and applicability in different scenes; for example, for a login interface, the account input node can be given the highest weight, so that during the screenshot process the completeness of the account input control is preserved first and its function is reflected. A combined sketch of this step is given below.
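Below is a minimal sketch of the bounds-based slicing and weight ordering described above, assuming bounds strings of the usual "[x1,y1][x2,y2]" form, a Pillow-readable full-interface screenshot, and a caller-supplied weight mapping; none of these details are fixed by the patent.

```python
# Sketch: crop one sub-screenshot per control node, higher-weight nodes first.
import re
from PIL import Image

BOUNDS_RE = re.compile(r"\[(\d+),(\d+)\]\[(\d+),(\d+)\]")


def parse_bounds(bounds_str):
    """'[0,210][1080,340]' -> (0, 210, 1080, 340)."""
    x1, y1, x2, y2 = map(int, BOUNDS_RE.match(bounds_str).groups())
    return x1, y1, x2, y2


def slice_screenshot(screenshot_path, ui_list, weights=None):
    """Return (node, cropped image) pairs, ordered by the node's identification priority."""
    weights = weights or {}  # e.g. {"com.app:id/account_input": 10}
    image = Image.open(screenshot_path)
    ordered = sorted(ui_list,
                     key=lambda n: weights.get(n.get("resource-id", ""), 0),
                     reverse=True)
    return [(node, image.crop(parse_bounds(node["bounds"]))) for node in ordered]
```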
Optionally, after the sub-screenshots are acquired according to the plurality of control nodes, the information of the sub-screenshots can be stored; specifically, the sub-screenshots can be classified, marked and named, so that the historical attribute information can be queried by entering the name when a sub-screenshot is called.
It will be appreciated that classifying, marking and naming sub-screenshots may be based on their attribute information, where the attribute information may include label information set manually or automatically, for example: setting the label "input" for a sub-screenshot of a login interface, or setting the label "unlock" for a sub-screenshot of an unlock interface. The specific label naming scheme may be a functional representation of the associated control and is not limited herein.
In addition, because similar controls may exist under similar labels, the attribute information of a sub-screenshot may further include a metric value or a picture fingerprint of the sub-screenshot for identifying the control, where the metric value may include the length, width, area, aspect ratio or area-to-interface ratio of the sub-screenshot; the specific metric is determined by the actual scenario and is not limited herein.
For the picture fingerprint of the sub-screenshot, the sub-screenshot is input into a hash function to generate a unique hash value, and when the sub-screenshot and its related information need to be called, the hash value is entered directly for searching.
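The sketch below computes both kinds of attribute information mentioned above: the metric values and a picture fingerprint. An 8x8 average hash is used as a stand-in for the unspecified hash function; this choice is an assumption, not the patent's.

```python
# Sketch: metric values and an average-hash picture fingerprint for a sub-screenshot.
from PIL import Image


def metric_values(sub_img, interface_size):
    """Length, width, area, aspect ratio and area-to-interface ratio of a sub-screenshot."""
    w, h = sub_img.size
    iw, ih = interface_size
    return {
        "width": w,
        "height": h,
        "area": w * h,
        "aspect_ratio": round(w / h, 3) if h else 0.0,
        "area_ratio": round((w * h) / (iw * ih), 4),
    }


def average_hash(sub_img, hash_size=8):
    """64-bit average hash serving as the sub-screenshot's picture fingerprint."""
    gray = sub_img.convert("L").resize((hash_size, hash_size), Image.LANCZOS)
    pixels = list(gray.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):016x}"
```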
303. Determining the interface control corresponding to the control node, so as to establish an association relationship between the interface control and the sub-screenshot at each moment.
In this embodiment, an interface control may include a plurality of control nodes, and the same label may be set for the control nodes within the same interface control.
In one possible scenario, if the background of the UI changes, or the identified control is a button control, a problem arises that the control cannot be identified by picture similarity when the button state changes. To solve this problem, sub-screenshots can be acquired in real time and, correspondingly, the association relationship between the interface control and the sub-screenshot is established at each moment; when the control needs to be identified, selecting association relationships from several moments ensures that the identification process runs normally.
Optionally, each correspondence obtained at each moment may be stored in a history control database and classified according to its label, or a naming rule may be set, for example: 0 represents an input control, 1 represents an unlock control, and so on.
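A toy sketch of such a history control database follows: one record is appended per label and moment, so a button whose appearance changes over time keeps several fingerprints. The JSON layout and field names are illustrative assumptions; the patent does not prescribe a storage schema.

```python
# Sketch: append one (label, timestamp) record per sub-screenshot to a JSON "database".
import json
import time


def record_association(db_path, label, node_attrs, metrics, fingerprint):
    """Store the association between an interface control (by label) and its sub-screenshot."""
    try:
        with open(db_path, "r", encoding="utf-8") as fh:
            db = json.load(fh)
    except FileNotFoundError:
        db = {}
    db.setdefault(label, []).append({
        "timestamp": time.time(),
        "node": node_attrs,      # control-node attributes from UI.xml
        "metrics": metrics,      # length, width, area, aspect ratio, area ratio
        "fingerprint": fingerprint,
    })
    with open(db_path, "w", encoding="utf-8") as fh:
        json.dump(db, fh, ensure_ascii=False, indent=2)
```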
304. Identifying the interface controls in the current user interface according to the association relationship, so as to obtain the interface control information of the current user interface.
In this embodiment, once the association relationships have been obtained and stored in step 303, the interface control information of the current interface can be retrieved directly through the similarity between screenshots when it needs to be identified: a screenshot of the current user interface is taken according to the control nodes, and the corresponding interface control information is then determined according to the similarity between this screenshot and the stored screenshots.
Optionally, since labels have been set for the sub-screenshots as described in the above steps, a preliminary screenshot screening may be performed according to the label of the current user interface; for example, if the current user interface is a user login interface, the keyword "input" may be traversed in the history control database, and the screenshot-similarity identification is then performed within the screening result. Specifically, the operation instruction information for the current user interface is first determined; the interface control applied to the current user interface is then determined according to the operation instruction information; and the interface control information of that interface control in the current user interface is identified according to the association relationship.
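A small sketch of the label pre-filter mentioned above, assuming the history control database is the label-keyed dict built in the previous sketch; the function name is hypothetical.

```python
# Sketch: restrict the history control database to labels matching the current
# operation keyword (e.g. "input" for a login interface) before any image comparison.
def candidate_records(history_db, label_keyword):
    """Return (label, record) pairs whose label contains the keyword."""
    hits = []
    for label, records in history_db.items():
        if label_keyword in label:
            hits.extend((label, rec) for rec in records)
    return hits
```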
Combining the above embodiments: a user interface and the control node information corresponding to the user interface are acquired in real time; the positions of a plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes to obtain a plurality of sub-screenshots; the interface control corresponding to each control node is determined so as to establish an association relationship between the interface control and the sub-screenshot at each moment, and the interface controls in the current user interface are then identified according to the association relationship to obtain the interface control information of the current user interface. Because the nodes are stable, the controls can be captured stably, which improves control identification efficiency; and because the control information of the current interface is determined through the correspondence between screenshots, inaccurate identification caused by interface changes is avoided, which improves control identification accuracy.
Based on the correspondence established between the sub-screenshots and the interface controls, the identification process can be described in more detail. As shown in fig. 4, fig. 4 is a flowchart of another control identification method provided in an embodiment of the present application, and the embodiment includes at least the following steps:
401. Acquiring the user interface and the control node information corresponding to the user interface in real time.
402. Determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots.
403. Determining the interface control corresponding to the control node, so as to establish an association relationship between the interface control and the sub-screenshot at each moment.
In this embodiment, steps 401-403 are similar to steps 301-303 in fig. 3; reference may be made to the related descriptions, which are not repeated here.
404. Establishing the association relationship between the interface control and the sub-screenshot, and setting label information, so as to establish a history control database.
In this embodiment, the history control database contains the association relationships between the interface controls and the sub-screenshots, that is, the corresponding interface control and its related information can be retrieved by determining the sub-screenshot; label information can also be set for the sub-screenshot, based on its function, metric value, picture fingerprint or another indicative feature of the sub-screenshot.
Optionally, the sub-screenshots in the history control database may be classified according to the label information, so as to facilitate screening for the search object when they are called.
405. Acquiring the current user interface and the current control node information corresponding to the current user interface.
In this embodiment, the information acquisition for the current user interface may be initiated when it is detected that the user needs to perform an interactive operation that generates an operation command. For example, if the operation command keyword is APP start, the interface when the APP starts is acquired; if the operation command keyword is click control, the interface when the click operation is executed is acquired; if the operation command keyword is wait, the interface when the wait operation is executed is acquired; if the operation command keyword is input word, the interface when the word input operation is executed is acquired; if the operation command keyword is page up, the interface when the page-up operation is executed is acquired; if the operation command keyword is automatic check or check element, the interface when the automatic check operation is executed is acquired. The specific scenario depends on the actual situation and is not limited herein.
406. Determining the positions of a plurality of current control nodes according to the current control node information, and segmenting the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots.
In this embodiment, segmenting the screenshot of the current interface may be performed on the main controls in the current interface; that is, when the current interface is to be unlocked, the plurality of nodes corresponding to the unlock control are determined and their screenshots are taken. It can be understood that the selection of controls may be preset or may be determined according to the identification information of the current interface.
407. Traversing the history control database for the sub-screenshot whose similarity with the current sub-screenshot satisfies the preset condition, and determining the interface control information.
In this embodiment, traversing the history control database for the interface control whose similarity with the current sub-screenshot satisfies the preset condition may be performed based on the following aspects.
First, the preset condition can be determined based on the metric information of the current sub-screenshot, that is, when the similarity between the metric information of the current sub-screenshot and that of a stored sub-screenshot reaches a certain threshold, the current sub-screenshot is associated with that similar sub-screenshot and the related control information is retrieved. The metric information may be the aspect ratio, for example: if the aspect ratio of the current sub-screenshot is 0.5, the sub-screenshots with an aspect ratio of 0.5 are traversed in the history control database, and the labels are then compared. The metric information may also be the area ratio, that is, the ratio of the area occupied by the current sub-screenshot to the interface, for example: if the area ratio of the current sub-screenshot is 0.1, the sub-screenshots with an area ratio of 0.1 are traversed in the history control database, and the labels are then compared.
Second, the preset condition can be determined based on the picture fingerprint of the current sub-screenshot, that is, when the comparison between the picture fingerprint of the current sub-screenshot and that of a stored sub-screenshot satisfies the preset condition, the current sub-screenshot is associated with that sub-screenshot and the related control information is retrieved.
It will be appreciated that the setting method for the preset condition may be one of the examples or a combination of a plurality of examples in practical application, and the specific form is determined by the practical scenario, which is not limited herein.
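The sketch below writes out the two example preset conditions described above: a tolerance on the metric information and a Hamming-distance threshold on the picture fingerprints. The threshold values and the record layout are illustrative assumptions, not values taken from the patent.

```python
# Sketch: the two preset conditions for associating a current sub-screenshot with a
# stored one, either by metric similarity or by picture-fingerprint similarity.
def metrics_match(cur, ref, tol=0.05):
    """Aspect ratio and area ratio agree within a tolerance."""
    return (abs(cur["aspect_ratio"] - ref["aspect_ratio"]) <= tol
            and abs(cur["area_ratio"] - ref["area_ratio"]) <= tol)


def fingerprints_match(cur_hex, ref_hex, max_distance=5):
    """Hamming distance between two hex-encoded hashes is below a threshold."""
    return bin(int(cur_hex, 16) ^ int(ref_hex, 16)).count("1") <= max_distance


def find_control(current_sub, candidates):
    """Return the first stored (label, record) whose sub-screenshot satisfies either condition."""
    for label, rec in candidates:
        if metrics_match(current_sub["metrics"], rec["metrics"]) or \
           fingerprints_match(current_sub["fingerprint"], rec["fingerprint"]):
            return label, rec
    return None
```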
The control identification process of the present application is described below with reference to a specific usage scenario. FIG. 5 is a schematic flow chart of an application scenario provided by an embodiment of the present application. First, an operation command is executed and the operation command keyword is acquired. Optionally, if the operation command keyword is start app, the operation of starting the app is executed; if the operation command keyword is click control, the operation of clicking the control is executed; if the operation command keyword is wait, the wait operation is executed; if the operation command keyword is input word, the word input operation is executed; if the operation command keyword is page up, the page-up operation is executed; if the operation command keyword is automatic check or check element, the automatic check operation is executed; otherwise, an error is reported that the execution command is not recognizable. These operations can be recorded and stored so as to facilitate storage when the corresponding operations are performed.
After the operation is determined, the current activity information, that is, the selection information containing the interface, is acquired; the UI.xml file is then acquired, the current page screenshot is obtained, the nodes in the UI.xml file are converted into a ui_list, the bounds value of each node is extracted from the ui_list, the control represented by the node is cut into a control screenshot by a cutting algorithm, three attribute dimensions (the area ratio, the aspect ratio and the picture fingerprint of the control) are added to the ui_list, the attributes are stored into a nodes.json file, and nodes.json is converted into nodes.xlsx.
For the sub-screenshot information, the nodes.json file of each page can be traversed, and an additional tag attribute alias and a weight attribute xx_weight are added to the control represented by each node. The summer.xlsx file is then generated: nodes.json of all pages in the data folder is traversed, and all the controls of all the pages are written into summer.xlsx, which is stored as the history control database.
When the current interface control needs to be identified, the summer.xlsx file is compared according to the current interface name page and the control label alias information, and the unique element on the current activity is found. Specifically, the node corresponding to the page and alias in the execution command parameters is first acquired from the summer.xlsx file; the node most similar to it is then found by comparison among the ui_list nodes of the current page, which is the control node to be identified, and UI automated monitoring is performed on the current interface according to the attributes of that node.
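A hedged sketch of this final lookup: given the current activity name (page) and control label (alias) from the execution command, fetch the stored node and pick the most similar node from the current page's ui_list. The in-memory summary dict stands in for the summer.xlsx file, and the attribute-overlap score is an illustrative similarity measure, not the one fixed by the patent.

```python
# Sketch: look up the recorded node by (page, alias), then choose the current-page
# node that shares the most attribute values with it.
def most_similar_node(stored_node, ui_list):
    """Pick the ui_list node with the largest attribute overlap with the stored node."""
    def score(node):
        return sum(1 for k, v in stored_node.items() if node.get(k) == v)
    return max(ui_list, key=score)


def locate_control(summary, page, alias, ui_list):
    """summary: {page: {alias: stored_node_attrs}} built from the history control database."""
    stored_node = summary[page][alias]
    return most_similar_node(stored_node, ui_list)
```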
In one possible display manner, the layout shown in fig. 6 may be adopted; fig. 6 is a schematic interface display diagram of a control identification program according to an embodiment of the present application. The interface runs in a UI automated monitoring program built on the basis of the above method and can show the matching of multiple controls across multiple interfaces. By clicking the detail button, a user can view the label information, aspect ratio and picture fingerprint of the corresponding control to verify the control information; on the one hand this improves the accuracy of the matching process, and on the other hand it allows the relevant personnel to monitor the control information, that is, to determine whether the control information is genuine.
It will be appreciated that the parameters or steps involved in the above embodiments may be displayed in the interface, which is not limited herein.
In order to better implement the above-described aspects of the embodiments of the present application, the following provides related apparatuses for implementing the above-described aspects. Referring to fig. 7, fig. 7 is a schematic structural diagram of a control identifying apparatus according to an embodiment of the present application, and a control identifying apparatus 700 includes:
the acquiring unit 701 is configured to acquire a user interface and control node information corresponding to the user interface in real time, where the user interface is used to determine an interface screenshot;
The segmentation unit 702 is configured to determine the positions of a plurality of control nodes according to the control node information, and segment the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots;
the association unit 703 is configured to determine an interface control corresponding to the control node, so as to establish an association relationship between the interface control and the sub-screenshot at each moment;
And the identifying unit 704 is configured to identify an interface control in the current user interface according to the association relationship, so as to obtain interface control information of the current user interface.
Preferably, in some possible implementations of the present application, the segmentation unit 702 is further configured to obtain attribute information of the sub-screenshot;
The segmentation unit 702 is further configured to determine a metric value or a picture fingerprint of the sub-screenshot according to the attribute information, where the metric value includes a length, a width, or an area.
Preferably, in some possible implementations of the present application, the identifying unit 704 is specifically configured to obtain a current user interface and current control node information corresponding to the current user interface, where the current user interface is used to determine a current interface screenshot;
The identifying unit 704 is specifically configured to determine the positions of a plurality of current control nodes according to the current control node information, and segment the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots;
The identifying unit 704 is specifically configured to traverse the sub-screenshot having a similarity with the current sub-screenshot satisfying a preset condition, and determine corresponding interface control information according to the association relationship, where the preset condition is set based on the similarity of the attribute information.
Preferably, in some possible implementations of the present application, the segmentation unit 702 is specifically configured to determine metric information of the current sub-screenshot;
The segmentation unit 702 is specifically configured to associate the current sub-screenshot with the sub-screenshot if the metric information of the current sub-screenshot meets a preset condition compared with the metric information of the sub-screenshot.
Preferably, in some possible implementations of the present application, the segmentation unit 702 is specifically configured to generate a hash value of the current sub-screenshot according to a hash function, where the hash value is used to indicate a picture fingerprint of the current sub-screenshot;
The segmentation unit 702 is specifically configured to associate the current sub-screenshot with the sub-screenshot if the picture fingerprint of the current sub-screenshot meets a preset condition compared with the picture fingerprint of the sub-screenshot.
Preferably, in some possible implementations of the present application, the segmentation unit 702 is specifically configured to obtain weight information of the control node, where the weight information is used to indicate an identification priority of the control node;
The segmentation unit 702 is specifically configured to determine the positions of a plurality of control nodes according to the control node information;
The segmentation unit 702 is specifically configured to sequentially segment the interface screenshot based on the control nodes according to the weight information, so as to obtain a plurality of sub-screenshots.
Preferably, in some possible implementations of the present application, the identifying unit 704 is specifically configured to determine operation instruction information for the current user interface;
the identifying unit 704 is specifically configured to determine an interface control applied to the current user interface according to the operation instruction information;
the identifying unit 704 is specifically configured to identify interface control information of an interface control in the current user interface according to the association relationship.
With the above apparatus, a user interface and the control node information corresponding to the user interface are acquired in real time; the positions of a plurality of control nodes are then determined according to the control node information, and the interface screenshot is segmented based on the control nodes to obtain a plurality of sub-screenshots; the interface control corresponding to each control node is determined so as to establish an association relationship between the interface control and the sub-screenshot at each moment, and the interface controls in the current user interface are then identified according to the association relationship to obtain the interface control information of the current user interface. Because the nodes are stable, the controls can be captured stably, which improves control identification efficiency; and because the control information of the current interface is determined through the correspondence between screenshots, inaccurate identification caused by interface changes is avoided, which improves control identification accuracy.
Referring to fig. 8, fig. 8 is a schematic structural diagram of another control identification device according to an embodiment of the present application. The control identification device 800 may vary considerably in configuration or performance and may include one or more central processing units (CPUs) 822 (e.g., one or more processors), a memory 832, and one or more storage media 830 (e.g., one or more mass storage devices) storing application programs 842 or data 844. The memory 832 and the storage media 830 may be transitory or persistent. The programs stored on the storage media 830 may include one or more modules (not shown), each of which may include a series of instruction operations in the control identification device. Furthermore, the central processor 822 may be configured to communicate with the storage media 830 to execute, on the control identification device 800, the series of instruction operations in the storage media 830.
The control identification device 800 may also include one or more power supplies 826, one or more wired or wireless network interfaces 850, one or more input/output interfaces 858, and/or one or more operating systems 841, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps performed by the control identifying means in the above-described embodiments may be based on the control identifying means structure shown in fig. 8.
Embodiments of the present application also provide a computer-readable storage medium having control identifying instructions stored therein, which when executed on a computer, cause the computer to perform the steps performed by the control identifying apparatus in the method described in the embodiments of fig. 2 to 6.
There is also provided in an embodiment of the application a computer program product comprising control identification instructions which, when run on a computer, cause the computer to perform the steps performed by the control identification means in the method described in the embodiments of figures 2 to 6 as described above.
The embodiment of the application also provides a control identification system, which can comprise the control identification device in the embodiment shown in fig. 7 or the control identification device shown in fig. 8.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a control identification device, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A control identification method applied to a user interface, comprising:
Acquiring a user interface and control node information corresponding to the user interface in real time, wherein the user interface is used for determining an interface screenshot;
determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots;
determining an interface control corresponding to the control node to establish an association relationship between the interface control and the sub-screenshot at each moment;
Identifying an interface control in a current user interface according to the association relation to obtain interface control information of the current user interface;
After determining the positions of a plurality of control nodes according to the control node information, and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots, the method further comprises:
acquiring attribute information of the sub-screenshot;
and determining a measurement value or a picture fingerprint of the sub-screenshot according to the attribute information, wherein the measurement value comprises a length, a width or an area.
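By way of illustration only, and not as part of the claimed subject matter, the following is a minimal Python sketch of the segmentation and measurement steps recited in claim 1. The ControlNode structure, the (left, top, right, bottom) bounds format, and the use of the Pillow library are assumptions made for this example, not requirements of the claim.

```python
# Illustrative sketch only; the node structure and Pillow usage are assumptions.
from dataclasses import dataclass
from typing import Dict, List, Tuple

from PIL import Image  # pip install pillow


@dataclass
class ControlNode:
    control_id: str
    bounds: Tuple[int, int, int, int]  # (left, top, right, bottom) in screen pixels


def slice_screenshot(screenshot: Image.Image,
                     nodes: List[ControlNode]) -> Dict[str, dict]:
    """Cut one sub-screenshot per control node and record its measurement values."""
    sub_screenshots = {}
    for node in nodes:
        crop = screenshot.crop(node.bounds)        # sub-screenshot for this node
        width, height = crop.size
        sub_screenshots[node.control_id] = {
            "image": crop,
            "length": height,                      # measurement values: length,
            "width": width,                        # width and area, as in claim 1
            "area": width * height,
        }
    return sub_screenshots
```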
2. The method of claim 1, wherein the identifying interface control information in the current user interface according to the association relationship comprises:
acquiring a current user interface and current control node information corresponding to the current user interface, wherein the current user interface is used for determining a current interface screenshot;
determining the positions of a plurality of current control nodes according to the current control node information, and segmenting the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots;
traversing the sub-screenshots to find a sub-screenshot whose similarity to the current sub-screenshot meets a preset condition, and determining corresponding interface control information according to the association relationship, wherein the preset condition is set based on the similarity of the attribute information.
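As an illustrative, non-limiting sketch of the traversal in claim 2, the lookup below walks a stored association table and returns the interface control information of the first stored sub-screenshot whose similarity to the current sub-screenshot meets the preset condition. The table layout and the pluggable similarity test are assumptions of this example.

```python
# Illustrative sketch; the association-table layout and similarity callback are assumed.
from typing import Callable, Dict, Optional


def find_interface_control(current_sub: dict,
                           association: Dict[str, dict],
                           is_similar: Callable[[dict, dict], bool]) -> Optional[dict]:
    """association maps control_id -> {"sub_screenshot": ..., "control_info": ...}."""
    for entry in association.values():
        if is_similar(current_sub, entry["sub_screenshot"]):  # preset condition met
            return entry["control_info"]
    return None  # no stored sub-screenshot satisfied the preset condition
```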
3. The method of claim 2, wherein the traversing the sub-screenshots to find a sub-screenshot whose similarity to the current sub-screenshot meets the preset condition comprises:
determining measurement information of the current sub-screenshot;
and if a comparison between the measurement information of the current sub-screenshot and the measurement information of the sub-screenshot meets the preset condition, associating the current sub-screenshot with the sub-screenshot.
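The measurement comparison in claim 3 is illustrated below with an assumed preset condition: each of width, length, and area must agree within a relative tolerance. The 5% tolerance is an assumption of the example, not a value fixed by the claim.

```python
# Illustrative sketch; the 5% relative tolerance is an assumed preset condition.
def measurements_match(current: dict, stored: dict, tolerance: float = 0.05) -> bool:
    def close(a: float, b: float) -> bool:
        return abs(a - b) <= tolerance * max(a, b, 1)

    return (close(current["width"], stored["width"])
            and close(current["length"], stored["length"])
            and close(current["area"], stored["area"]))
```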
4. The method of claim 2, wherein the traversing the sub-screenshots to find a sub-screenshot whose similarity to the current sub-screenshot meets the preset condition comprises:
generating a hash value of the current sub-screenshot according to a hash function, wherein the hash value is used for indicating a picture fingerprint of the current sub-screenshot;
and if a comparison between the picture fingerprint of the current sub-screenshot and the picture fingerprint of the sub-screenshot meets the preset condition, associating the current sub-screenshot with the sub-screenshot.
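Claim 4 leaves the hash function open; the sketch below uses a simple average hash as one possible picture fingerprint and a Hamming-distance threshold as the preset condition. Both choices are assumptions of this example rather than requirements of the claim.

```python
# Illustrative sketch; average hash and the Hamming threshold are assumed choices.
from PIL import Image  # pip install pillow


def picture_fingerprint(img: Image.Image, hash_size: int = 8) -> int:
    """Average hash: one bit per pixel of a grayscale thumbnail."""
    small = img.convert("L").resize((hash_size, hash_size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:                       # bit = 1 if pixel is at least the mean
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def fingerprints_match(fp_a: int, fp_b: int, max_distance: int = 5) -> bool:
    """Preset condition: Hamming distance between fingerprints is small enough."""
    return bin(fp_a ^ fp_b).count("1") <= max_distance
```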
5. The method according to any one of claims 1-4, wherein the determining the positions of a plurality of control nodes according to the control node information and segmenting the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots comprises:
acquiring weight information of the control node, wherein the weight information is used for indicating the identification priority of the control node;
determining the positions of a plurality of control nodes according to the control node information;
and segmenting the interface screenshot based on the control nodes in sequence according to the weight information to obtain the plurality of sub-screenshots.
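For claim 5, the sketch below assumes the weight information is a numeric weight attached to each control node and that a higher weight means a higher identification priority; the screenshot would then be sliced in descending weight order. The field name and the descending order are assumptions of this example.

```python
# Illustrative sketch; a numeric `weight` field and descending order are assumptions.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class WeightedControlNode:
    control_id: str
    bounds: Tuple[int, int, int, int]  # (left, top, right, bottom)
    weight: int                        # higher weight = higher identification priority


def order_nodes_for_slicing(nodes: List[WeightedControlNode]) -> List[WeightedControlNode]:
    """Return the nodes in the order in which their sub-screenshots are cut."""
    return sorted(nodes, key=lambda n: n.weight, reverse=True)
```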
6. The method according to any one of claims 1-4, wherein the identifying an interface control in the current user interface according to the association relationship comprises:
determining operation instruction information for the current user interface;
determining an interface control applied to the current user interface according to the operation instruction information;
and identifying interface control information of the interface control in the current user interface according to the association relationship.
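Claim 6 ties an operation instruction to a control. The sketch below assumes the instruction is a tap at screen coordinates, resolves it to the control node whose bounds contain that point, and then reads the associated interface control information; the tap form and the data layout are assumptions of this example.

```python
# Illustrative sketch; a tap at (x, y) is an assumed form of operation instruction.
from typing import Iterable, Optional, Tuple


def control_info_for_tap(tap: Tuple[int, int],
                         nodes: Iterable,       # objects with .control_id and .bounds
                         association: dict) -> Optional[dict]:
    x, y = tap
    for node in nodes:
        left, top, right, bottom = node.bounds
        if left <= x < right and top <= y < bottom:   # tap falls inside this control
            entry = association.get(node.control_id)
            return entry["control_info"] if entry else None
    return None
```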
7. A control identification device applied to a user interface, comprising:
an acquisition unit, configured to acquire a user interface and control node information corresponding to the user interface in real time, wherein the user interface is used for determining an interface screenshot;
a segmentation unit, configured to determine the positions of a plurality of control nodes according to the control node information, and segment the interface screenshot based on the control nodes to obtain a plurality of sub-screenshots;
an association unit, configured to determine an interface control corresponding to the control node so as to establish an association relationship between the interface control and the sub-screenshot;
an identification unit, configured to identify an interface control in a current user interface according to the association relationship so as to obtain interface control information of the current user interface;
wherein the segmentation unit is further configured to acquire attribute information of the sub-screenshot;
and the segmentation unit is further configured to determine a measurement value or a picture fingerprint of the sub-screenshot according to the attribute information, wherein the measurement value comprises a length, a width or an area.
8. The device according to claim 7, wherein the identification unit is specifically configured to obtain a current user interface and current control node information corresponding to the current user interface, wherein the current user interface is used for determining a current interface screenshot;
the identification unit is specifically configured to determine the positions of a plurality of current control nodes according to the current control node information, and segment the current interface screenshot based on the current control nodes to obtain a plurality of current sub-screenshots;
the identification unit is specifically configured to traverse the sub-screenshots to find a sub-screenshot whose similarity to the current sub-screenshot meets a preset condition, and determine corresponding interface control information according to the association relationship, wherein the preset condition is set based on the similarity of the attribute information.
9. The device according to claim 8, wherein the segmentation unit is specifically configured to determine measurement information of the current sub-screenshot;
the segmentation unit is specifically configured to associate the current sub-screenshot with the sub-screenshot if a comparison between the measurement information of the current sub-screenshot and the measurement information of the sub-screenshot meets the preset condition.
10. The device according to claim 8, wherein the segmentation unit is specifically configured to generate a hash value of the current sub-screenshot according to a hash function, wherein the hash value is used for indicating a picture fingerprint of the current sub-screenshot;
the segmentation unit is specifically configured to associate the current sub-screenshot with the sub-screenshot if a comparison between the picture fingerprint of the current sub-screenshot and the picture fingerprint of the sub-screenshot meets the preset condition.
11. The device according to any one of claims 8-10, wherein the segmentation unit is specifically configured to acquire weight information of the control node, wherein the weight information is used for indicating an identification priority of the control node;
the segmentation unit is specifically configured to determine the positions of a plurality of control nodes according to the control node information;
the segmentation unit is specifically configured to segment the interface screenshot based on the control nodes in sequence according to the weight information, so as to obtain a plurality of sub-screenshots.
12. The device according to any one of claims 8-10, wherein the identification unit is specifically configured to determine operation instruction information for the current user interface;
the identification unit is specifically configured to determine an interface control applied to the current user interface according to the operation instruction information;
the identification unit is specifically configured to identify interface control information of the interface control in the current user interface according to the association relationship.
13. A computer device, comprising a processor and a memory, wherein:
the memory is configured to store program code; and the processor is configured to perform the control identification method of any one of claims 1 to 6 according to instructions in the program code.
14. A computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the control identification method of any one of claims 1 to 6.
15. A computer program product comprising instructions which, when run on a computer device, cause the computer device to perform the method of any of claims 1 to 6.
CN201910838588.0A 2019-09-05 2019-09-05 Control identification method and device applied to user interface Active CN110532056B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910838588.0A CN110532056B (en) 2019-09-05 2019-09-05 Control identification method and device applied to user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910838588.0A CN110532056B (en) 2019-09-05 2019-09-05 Control identification method and device applied to user interface

Publications (2)

Publication Number Publication Date
CN110532056A CN110532056A (en) 2019-12-03
CN110532056B true CN110532056B (en) 2024-04-26

Family

ID=68667064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910838588.0A Active CN110532056B (en) 2019-09-05 2019-09-05 Control identification method and device applied to user interface

Country Status (1)

Country Link
CN (1) CN110532056B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111078552A (en) * 2019-12-16 2020-04-28 腾讯科技(深圳)有限公司 Method and device for detecting page display abnormity and storage medium
CN112162930B (en) * 2020-10-21 2022-02-08 腾讯科技(深圳)有限公司 Control identification method, related device, equipment and storage medium
CN115048309B (en) * 2022-06-27 2023-03-07 广州掌动智能科技有限公司 Non-intrusive APP software performance test method and system
CN117033239B (en) * 2023-09-04 2024-06-11 镁佳(北京)科技有限公司 Control matching method and device, computer equipment and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105843494A (en) * 2015-01-15 2016-08-10 中兴通讯股份有限公司 Method and device for realizing region screen capture, and terminal
CN108496150A (en) * 2016-10-18 2018-09-04 华为技术有限公司 A kind of method and terminal of screenshot capture and reading

Also Published As

Publication number Publication date
CN110532056A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN110532056B (en) Control identification method and device applied to user interface
CN110069463B (en) User behavior processing method, device, electronic equipment and storage medium
CN109710508B (en) Test method, test device, test apparatus, and computer-readable storage medium
US9292311B2 (en) Method and apparatus for providing software problem solutions
CN107133165B (en) Browser compatibility detection method and device
CN106095866B (en) The optimization method and device of application program recommended method, program starting speed
US11106916B2 (en) Identifying segment starting locations in video compilations
CN112817866A (en) Recording playback method, device, system, computer equipment and storage medium
CN106959919B (en) Software testing method and device based on testing path diagram
US10346450B2 (en) Automatic datacenter state summarization
JPWO2010064317A1 (en) Operation management support program, recording medium recording the program, operation management support device, and operation management support method
CN112395189A (en) Method, device and equipment for automatically identifying test video and storage medium
CN109359042B (en) Automatic testing method based on path search algorithm
CN106227502A (en) A kind of method and device obtaining hard disk firmware version
US20230221847A1 (en) Cognitive detection of user interface errors
CN109101297B (en) Page identification method and device
CN115981901A (en) Fault positioning method, equipment and medium for automatic test of switch
CN115481025A (en) Script recording method and device for automatic test, computer equipment and medium
CN112988457B (en) Data backup method, device, system and computer equipment
CN113378525A (en) PDF document paragraph presentation method, device, storage medium and equipment
CN112882937A (en) Test case processing method and device, computer equipment and storage medium
CN106293897B (en) Automatic scheduling system of subassembly
CN105359111A (en) User-interface review method, device, and program
CN112417252B (en) Crawler path determination method and device, storage medium and electronic equipment
CN114048147B (en) Test case generation method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant