
US20070005299A1 - Systems and methods providing a declarative screen model for automated testing - Google Patents

Systems and methods providing a declarative screen model for automated testing

Info

Publication number
US20070005299A1
Authority
US
United States
Prior art keywords
screen
test
navigation map
under test
device under
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/421,407
Inventor
David Haggerty
Alex Elkin
Scott Opitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bsquare Corp
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US11/421,407
Assigned to TESTQUEST, INC. (assignment of assignors interest). Assignors: HAGGERTY, DAVID; ELKIN, ALEX; OPITZ, SCOTT
Publication of US20070005299A1
Priority to US12/272,652, published as US20090125826A1
Assigned to BSQUARE CORPORATION (assignment of assignors interest). Assignor: TESTQUEST, INC.
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/22 Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F 11/2273 Test methods
    • G06F 11/26 Functional testing
    • G06F 11/263 Generation of test inputs, e.g. test vectors, patterns or sequences; with adaptation of the tested hardware for testability with external testers
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3664 Environments for testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3696 Methods or tools to render software testable

Definitions

  • An information-processing system is tested several times over the course of its life cycle, starting with its initial design and being repeated every time the product is modified.
  • Typical information-processing systems include personal and laptop computers, personal data assistants (PDAs), cellular phones, medical devices, washing machines, wristwatches, pagers, and automobile information displays.
  • testing is conducted by a test engineer who identifies defects by manually running the product through a defined series of steps and observing the result after each step. Because the series of steps is intended to both thoroughly exercise product functions as well as re-execute scenarios that have identified problems in the past, the testing process can be rather lengthy and time consuming. Add on the multiplicity of tests that must be executed due to system size, platform and configuration requirements, and language requirements, and one will see that testing has become a time consuming and extremely expensive process.
  • test automation has begun replacing manual testing procedures.
  • the benefits of test automation include reduced test personnel costs, better test coverage, and quicker time to market.
  • an effective automated testing product can be costly and time consuming to implement.
  • the software methods, interfaces and procedures required to thoroughly test an information processing system can be nearly as complicated as the information processing system itself. For example, many information processing systems provide user interfaces that require navigation through a series of screens, with each screen potentially requiring input data. In previous systems, each test method required the test developer to provide code to navigate to the desired screen. If the interface changes in subsequent versions of the information processing system, the test procedure also typically must be modified to reflect the change. Such changes can be costly and time consuming to implement.
  • Some embodiments of the invention provide a runtime environment providing canonical definitions for commonly found GUI components and other man-machine interfaces (physical buttons, audio input/output, touch screens, etc.).
  • Some embodiments of the invention provide a run-time environment providing navigation maps using a declarative model defined at design time by recording actual manipulation of a physical device through a virtual device interface.
  • the navigation maps may be used to automatically navigate to application screens, eliminating the need for a test engineer to provide navigation code.
  • FIG. 1A is a block diagram illustrating a system incorporating embodiments of the invention.
  • FIG. 1B is a block diagram providing further details of a system incorporating embodiments of the invention.
  • FIG. 1C is a block diagram showing the logical relationship of various components in a system according to embodiments of the invention.
  • FIG. 2 is a block diagram illustrating object management components according to embodiments of the invention.
  • FIG. 3 illustrates an example main screen of a test development environment according to embodiments of the invention.
  • FIG. 4 provides a flow illustrating a test design process according to embodiments of the invention.
  • FIGS. 5A-5D illustrate example navigation map panels of a test development environment according to embodiments of the invention.
  • FIG. 6 illustrates an example test case panel of a test development environment according to embodiments of the invention.
  • FIG. 7 is a flowchart illustrating methods according to embodiments of the invention.
  • the functions or algorithms described herein are implemented in hardware, and/or software in embodiments.
  • the software comprises computer executable instructions stored on computer readable media such as memory or other types of storage devices.
  • computer readable media is also used to represent software-transmitted carrier waves.
  • modules which are software, hardware, firmware, or any combination thereof. Multiple functions are performed in one or more modules as desired, and the embodiments described are merely examples.
  • a digital signal processor, ASIC, microprocessor, or any other type of processor operating on a system, such as a personal computer, server, a router, or any other device capable of processing data including network interconnection devices executes the software.
  • Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
  • the example process flow is applicable to software, firmware, and hardware implementations.
  • "design time" and "run time" may be used to describe aspects of the operation of various embodiments.
  • design time refers to activities and/or operations that take place during the design of a particular test case or test set.
  • run time refers to activities and/or operations that take place during the execution of a test case or a test set of test cases.
  • FIG. 1A is a block diagram illustrating a system 100 incorporating embodiments of the invention.
  • system 100 includes a test development environment 102, development repository 120, test management repository 150, and a device under test (DUT) 140.
  • DUT 140 may be any type of device incorporating processing logic. Such devices include but are not limited to personal digital assistants (PDAs), cellular telephones, mobile computing devices, laptop computers, handheld computers, personal computers, server computers, mainframe computers, workstation computers, and combinations of the above. DUT 140 may include one or more systems under test, including applications that may be tested using embodiments of the invention.
  • Test interface 142 provides an interface between the DUT 140 and the test development environment 102 .
  • the test interface communicates commands and stimulus from the test development environment 102 to the DUT 140 and communicates the DUT's response to the commands and stimulus back to the test development environment 102 .
  • the test interface is a minimally invasive software test agent that resides on the DUT.
  • the minimally invasive software test agent provides stimulus to the device and provides a bitmap comprising a screen on the device displayed in response to the stimulus. Further details on such a minimally invasive software test agent may be found in U.S. patent application Ser. No. 10/322,824 entitled “Software Test Agents”, which is hereby incorporated by reference herein for all purposes.
  • test interface 142 may be a more invasive test interface that examines the user interface code and data structures resident on a DUT and uses the current state of the code and data to determine how to stimulate the DUT and how to interpret responses from the DUT.
  • the DUT may be a physical device that is communicably coupled to the test system either directly or through a network path.
  • the DUT may be an emulated device in which the characteristics of a physical device are emulated by software operating on a computing system that is communicably coupled to the test system either directly or through a network path.
  • Development repository 120 comprises a database of various objects that may be created, read, updated and/or deleted using the test development environment 102 .
  • development repository 120 may be a relational database such as the Microsoft SQL Server database.
  • repository 120 may be a set of files on a file system, an object oriented database, a hierarchical database, or an XML database.
  • Development repository 120 may contain a variety of objects, including one or more of test sets 121, automated test cases 122, test verbs 123, screen definitions 124, navigation maps 125, platform data 126, virtual device definitions 127, device data 128, resource sets 129, global variables 130, data sources 131, platform roles 132, and component templates 134.
  • Various embodiments will maintain varying combinations of the above-named components, and no embodiment need necessarily contain all of the above-named components.
  • Test sets 121 comprise sets of one or more automated test cases 122 along with other logic to control the execution or invocation of the automated test cases 122 .
  • a test set references test cases only, and does not reference other types of objects in the repository 120 .
  • Test sets provide a convenient way to manage groups of test cases that may be applicable for a particular type of DUT, or groups of DUTs.
  • Automated test case 122 comprises logic and data that provides a discrete test unit for a DUT, or group of DUTs.
  • An automated test case 122 is a series of one or more test steps and may utilize or reference one or more test verbs 123 , screen definitions 124 and external code 150 . The steps in an automated test case may be defined using a test case editor as described below.
  • Test verbs 123 define logic and actions that may be performed on the DUT or group of DUTs. Further details on test verbs may be found in U.S. patent application Ser. No. 10/323,095 entitled “Method and Apparatus for Making and Using Test Verbs” and in U.S. patent application Ser. No. 10/323,595 entitled “Method and Apparatus for Making and Using Wireless Test Verbs”, each of which are hereby incorporated by reference herein for all purposes.
  • Screen definitions 124 comprise data that may be used to define one or more screens displayed by software executing on the DUT.
  • the screen definitions may comprise application screens for email applications, contact manager applications, calendar applications, etc. that execute on a PDA.
  • the data for a screen definition may include a bitmap of all or a portion of the screen, and references to screen components 133.
  • a screen may be identified by the presence of a screen component 133 which is a unique identifier for a particular screen.
  • Screen components 133 define buttons, menus, dialog boxes, icons, and other user interface elements that may appear on a screen.
  • a navigation map 125 defines screen transitions describing the device interactions that may be provided to cause the DUT to move from screen to screen.
  • a navigation map may include the commands needed to move from one screen to another, or from one screen to many other screens and vice versa. Further details on navigation maps are described below.
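  • As a concrete illustration, a navigation map of this kind can be thought of as a small graph of screens joined by recorded transitions. The following C# sketch is hypothetical (the class and member names are illustrative, not taken from the patent) and only shows one plausible shape for such a structure:

```csharp
// Hypothetical sketch of a navigation map: screens connected by recorded
// transitions, i.e. the stimulus steps that move the DUT from one screen
// to another. Names are illustrative only.
using System.Collections.Generic;

public class ScreenDefinition
{
    public string Name { get; set; }                                  // e.g. "MainScreen"
    public List<string> IdentifierComponents { get; } = new List<string>();
}

public class ScreenTransition
{
    public string FromScreen { get; set; }
    public string ToScreen { get; set; }
    public List<string> RecordedSteps { get; } = new List<string>();  // e.g. "Tap(Start)", "SelectMenu(Notes)"
}

public class NavigationMap
{
    public Dictionary<string, ScreenDefinition> Screens { get; } =
        new Dictionary<string, ScreenDefinition>();
    public List<ScreenTransition> Transitions { get; } = new List<ScreenTransition>();

    // All transitions whose source is the given screen; a runtime can chain
    // these to reach screens that are not directly connected.
    public IEnumerable<ScreenTransition> TransitionsFrom(string screen)
    {
        foreach (var t in Transitions)
            if (t.FromScreen == screen) yield return t;
    }
}
```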
  • Platform data 126 comprises data that defines properties for abstractions related to platforms, including platform types, platform groups, and platform definitions.
  • a platform type definition describes properties of a platform that are independent of, or generic to, devices that execute the platform system. Examples of various platform types include Windows Operating System (OS) platforms, Symbian OS platforms, PocketPC OS platforms, and SmartPhone OS platforms.
  • a platform group describes properties of a platform within a platform type that are generic to groups of operating systems within the type.
  • a PocketPC OS platform type may include a group defining properties for a Pocket PC 2003 operating system and a group defining properties for a Pocket PC 2003 SE operating system.
  • a platform definition describes the properties of a platform within a platform group that are generic to one or more devices within the platform group.
  • the PocketPC 2003 operating system group may include a platform definition that defines the properties of one or more devices utilizing the PocketPC 2003 operating system.
  • platform types, groups, and definitions now exist and may be developed in the future, and such platforms are within the scope of the inventive subject matter.
  • the boundaries and data that define a type or group may vary in various embodiments.
  • the data provided as part of a platform group definition may include one or more of platform name, platform OS, platform GUI (Graphical User Interface) style, platform language, platform screen resolution, platform screen color depth, and other device attributes needed to characterize devices. Identification of the type of virtual device to be used to represent the platform may also be specified.
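  • To make the type/group/definition layering concrete, the following C# sketch models the hierarchy with hypothetical names and a handful of the properties listed above; it is an illustration of the abstraction, not the patent's actual data model:

```csharp
// Hypothetical model of the platform hierarchy: a platform type (e.g.
// "PocketPC OS") contains platform groups (e.g. "Pocket PC 2003"), and a
// group contains platform definitions that individual devices reference.
using System.Collections.Generic;

public class PlatformDefinition
{
    public string Name;
    public string Language;            // e.g. "en-US"
    public string ScreenResolution;    // e.g. "240x320"
    public int ColorDepth;             // bits per pixel
}

public class PlatformGroup
{
    public string Name;                // e.g. "Pocket PC 2003 SE"
    public List<PlatformDefinition> Definitions = new List<PlatformDefinition>();
}

public class PlatformType
{
    public string Name;                // e.g. "PocketPC OS"
    public List<PlatformGroup> Groups = new List<PlatformGroup>();
}
```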
  • Virtual Device 127 comprises data that defines properties of an abstraction of a type of device.
  • a virtual device may comprise an iPaq type device, or a Treo type device.
  • the virtual device data in some embodiments includes data that is common or generic to all of the devices of a particular platform definition.
  • the data defining a virtual device may include one or more of a device skin, keyboard area, hotspots (including possible button states), keyboard properties, touch screen properties, glyph drawing properties, and screen capture parameters such as the screen capture poll rate.
  • Device 128 comprises data that defines the properties of a particular DUT.
  • these properties include a device name, a device OS version, a device GUI style, a device language, a device screen resolution, a device screen color depth, and one or more virtual devices that may be associated with the device 128 .
  • device properties may include connection related properties such as a host name identifying a host in a network where the device is connected, a connection type (e.g. serial, USB, network etc.), a device connection address, and other connection related properties such as message timeout values and command response timeout values.
  • the DUT may be a physical device or an emulated device.
  • Resource sets 129 comprise data that may be used by the screen components described above.
  • text strings may be defined in various languages.
  • a resource set for the text strings for a particular language may be used by the above-described components to provide language independence or to customize the system for a particular language.
  • Global variables 130 comprise data that may be used within each logical unit of a test set or test case.
  • Data sources 131 comprise data that may be used as a source to provide input data for a test case or test set. For example, assume that a test case requires the input of multiple contact names and addresses. A data source 131 may be used to provide the data for the names and addresses. A data source 131 may be a table or tables in an external RDBMS, or it may be an external file such as a text file or a spreadsheet file.
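  • The contact-list example above can be sketched as follows; the helper below is hypothetical (the file format, method names, and the enterContact callback are assumptions) and simply shows rows of an external text file being fed into a test step:

```csharp
// Hypothetical data-source helper: each line of an external CSV file supplies
// one contact (name, address) that the test case enters on the DUT through a
// caller-supplied action.
using System;
using System.IO;

public static class ContactDataSource
{
    public static void Run(string csvPath, Action<string, string> enterContact)
    {
        foreach (var line in File.ReadLines(csvPath))
        {
            var fields = line.Split(',');
            if (fields.Length < 2) continue;              // skip malformed rows
            enterContact(fields[0].Trim(), fields[1].Trim());
        }
    }
}
```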
  • Platform roles 132 represent unique instances of a platform definition. Platform roles may have several functions. In some embodiments, they make it possible to invoke more than one instance of a platform type at one time. Additionally, they may make it possible to identify which platform definitions are the same device and which are different devices across one or more test sets, test cases and test verbs. Further, they may provide an indication of the test role of a particular platform definition or device in a test. In some embodiments, a DUT may be assigned roles comprising “primary” and “secondary”. In alternative embodiments, the user may declare the platform roles. In these embodiments there may be an arbitrary number of roles, and roles may be arbitrarily labeled providing a user a great deal of flexibility in defining automated tests.
  • Component Templates 134 comprise standard (or canonical) definitions for various user interface elements for a platform type, group, or definition, such as menus, icons, buttons, text fields, etc.
  • the standard definitions may be used as a basis for defining specialized instances of the user interface elements for a particular platform within a platform type, group, or definition.
  • component templates automate the process of identifying the component instance on the screen. As an example, the component template is dragged onto the screen and the component is automatically recognized by the template.
  • Test development environment 102 is a set of one or more software applications that provide a user interface for managing the components in repository 120 and for designing and managing automated test cases.
  • test development environment 102 includes a user interface 104 , a code generator 110 , and a debugger 112 .
  • the test development environment 102 reads one or more objects from the repository 120 that become resident objects 114 .
  • Resident objects 114 may be created, read, updated or deleted using user interface 104 . Resident objects may be saved back to the repository 120 .
  • user interface 104 is a graphical user interface that may be used to manipulate objects in repository 120. Further details on a user interface according to embodiments of the invention are provided below with reference to FIGS. 3-6.
  • a virtual device interface 106 is part of user interface 104 .
  • Virtual device interface 106 provides a representation of a DUT (i.e. a skin) that may be used to provide a graphical image of a DUT.
  • the virtual device may have hot spots that correspond to buttons on the DUT, and may also emulate other aspects of the user interface of a DUT.
  • the virtual device 106 may thus be used to provide stimulus that is relayed to the DUT through the test interface 142. Responses from the DUT to the stimulus may then be provided through the test interface 142, and any screen changes may be shown on the virtual device.
  • the response may include an updated screen bitmap that reflects changes on a screen of the DUT.
  • Virtual devices provide a mechanism to develop tests on devices located at remote locations, and are designed to provide an interface that may be more convenient or easier to use than the actual device.
  • Debugger 112 provides an interface to debug automated test cases 122 .
  • debugger 112 provides an interface that allows breakpoints to be set at certain points in the execution of a test case.
  • debugger 112 may provide the ability to single step through one or more steps in a test case, including stepping into or stepping over a sub-component of a test case (e.g. a test verb).
  • debugger 112 may provide an interface allowing a user to watch for changes in the local, interface, and global variables used by a test case.
  • Code generator 110 operates to generate executable code for a test case.
  • a user utilizes user interface 104 to develop and debug a test case 122 .
  • the code generator automatically and transparently generates and compiles test logic code, for example when the test code is ready to be executed or debugged.
  • the code generator reads the test case logic and data (including navigation maps, screen definitions, test verbs, and calls to external code) and generates and compiles executable code that performs the logic of the test case.
  • code is generated in the C# programming language and is targeted for the Microsoft .NET Framework.
  • Test management repository 150 may be used in some embodiments to store previously generated tests that are deployed or otherwise readied for operational use.
  • the execution module or modules 152 are generated and compiled by code generator 110 and linked into a framework.
  • test management repository 150 may manage test session data 154 . Session data 154 tracks which DUTs are in use, and which are available for testing.
  • FIG. 1B is a block diagram providing further details of a system incorporating embodiments of the invention.
  • the system includes test development environment 102 , repository service 165 , test execution service 170 , test management service 175 , web service 180 , and target service 160 , which in some embodiments may be communicably coupled via a network 185 .
  • Network 185 may be a corporate LAN, intranet, WAN or other network. Additionally, network 185 may include the Internet. It should be noted that some or all of the services shown in FIG. 1B may be included on a single system, in which case a network may not be required in order for such collocated services to communicate.
  • Repository service 165 is a repository manager that manages development repository 120 and test management repository 150.
  • repository service 165 may include a database management service such as Microsoft SQL Server.
  • Repository service 165 provides interfaces for other services to create, read, update, and delete data in the repositories.
  • test development environment 102 may be used to design and test a test application.
  • Web service 180 provides a web-based interface to a browser to allow users to control test execution for tests that have been deployed. Tests may be invoked immediately, or may be scheduled for future execution using a browser in communication with web service 180.
  • Test management service 175 controls test scheduling. This may include time-of-day and day-of-week scheduling, and may also include scheduling tests when matching devices and platforms become available. When a test is scheduled to run, test management service 175 sends a test execution command to test execution service 170.
  • Test execution service 170 receives commands to execute one or more tests.
  • Test execution service 170 selects an appropriate target service based on test parameters, including searching for a matching test platform, and causes a test to be executed on a device under test 140 that includes a matching platform.
  • FIG. 1C is a block diagram showing the logical relationship of the various components described above in a system according to embodiments of the invention.
  • FIG. 2 is a block diagram illustrating object management components according to embodiments of the invention.
  • the object management components include editor 202 , layout object 210 and data objects 220 .
  • data objects 220 include repository objects 121 - 131 above, such as automated test cases 122 , test verbs 123 , navigation maps 125 , and screen definitions 124 (including screen component and component type definition).
  • Editor 202 comprises any user interface component that may be used to visualize or change the properties of a data object 220.
  • Editor 202 may include a view 204 and/or a model 206 .
  • Model 206 is a data object that defines the business logic of editor 202 .
  • View 204 is a visual representation of editor 202 that is presented to the user.
  • Layout object 210 comprises a stored description of how the data object 220 is shown in the given editor 202 . Not every editor needs a layout object 210 .
  • FIG. 3 illustrates an example main screen 300 of a test development environment 102 according to embodiments of the invention.
  • screen 300 includes one or more of an explorer pane 302 , a properties pane 306 and a detail pane 304 .
  • Explorer pane 302 in some embodiments provides a list of categories and objects that may exist in a repository, and provides an interface to select categories and items for editing.
  • explorer pane 302 includes user interface elements for test sets, test cases, test verbs, navigation maps, global variables, data sources, resource tables, custom code, and deployment modules. Selection of a user interface element will typically cause the explorer pane to show expanded or further details or a listing of particular data objects that may be selected for view, execution and/or editing.
  • a Pocket PC 2003 platform definition has been selected.
  • property pane 306 has been displayed to show property values for various properties of the platform definition.
  • Properties pane 306 in some embodiments displays a list of properties associated with a selected item.
  • the properties may be selected, and values associated with the properties may be changed. Any manner of editing a property value may be used, including direct text entry, selection from a drop-down box, selection from a check box, or other user interface editing device known in the art.
  • Detail pane 304 is typically used to provide a graphical representation of an interface element selected from explorer pane 302 . Further details on such graphical representations are illustrated below with reference to FIGS. 5-6 .
  • FIG. 4 illustrates an example flow diagram for test design.
  • a typical design flow 400 starts at 402 by providing templates for components used in defining a screen.
  • the example flow 400 for a test design is a bottom-up design flow and is not the only flow possible. Top-down flows and inside-out flows are possible and productive as well. An example of a top-down flow would be to define a test case first followed by navigation maps and test verbs.
  • the purpose of templates is to automate the process of creating screen components.
  • the templates may provide standard property values and/or bitmap definitions for various user interface elements of a device under test.
  • the templates may include buttons, boxes, menus, icons, tables, text fields, track bars, and keyboard elements.
  • a component instance is created.
  • the component instance may be created from a template provided at 402 by dragging the template over or near the relevant portion of a screen image as will be further described below.
  • components are automatically recognized without user action when the screen is added to the navigation map.
  • instances of components are related to the template used to create the component instance. This is desirable, because a change in the template may be automatically propagated to all of the component instances that refer to the template. Templates may significantly reduce the user workload when creating components by automatically recognizing the component on the screen, and by automatically setting all the properties of the component.
  • a screen is defined using components defined at block 404 .
  • a screen is defined using one or more components, some or all of which must be present in order for the screen to be recognized and operate properly.
  • a user interface for defining a screen according to embodiments of the invention is described below.
  • a navigation map is defined using one or more screens.
  • a navigation map comprises a set of one or more screens as defined at block 406 , together with commands and data that define the transitions from one screen to another.
  • a user interface for creating and maintaining a navigation map is provided below.
  • a test verb may be defined using test logic and the navigation maps defined at block 408 .
  • a test case may be defined using test logic and the navigation maps defined at block 408.
  • the test logic for a test case may invoke a test verb.
  • a test set may be defined using one or more test cases.
  • a test sequence may be defined using one or more test cases, one or more test sets, or a combination of test cases and test sets.
  • a test sequence may include an order of execution for the test cases and test sets in the test sequence.
  • a test session defines the global variables and other test parameters for a particular execution of a test sequence, test set, or test case.
  • FIGS. 5A-5C illustrate example navigation map panels and screen editor panels of a test development environment according to embodiments of the invention that may be used in the test design process illustrated in FIG. 4 .
  • FIG. 5A illustrates example screen interfaces 502 and 504 for selecting a platform definition and device for a navigation map.
  • a platform definition is selected in the case that multiple polymorphic versions of the navigation map have been defined.
  • Screen interface 502 provides an example platform definition selection screen.
  • a list of available platform definitions is provided. The desired platform definition may be selected and used to provide a platform definition context for the navigation map.
  • Screen interface 504 provides a device selection interface.
  • Screen 504 provides a list of available devices defined within the repository. Those devices having attributes that match the currently selected platform definition may be highlighted in one manner, while those devices whose attributes do not match the currently selected platform type may be highlighted differently or not highlighted at all. For example, in some embodiments, devices having attributes that match the currently selected platform definition are displayed using green text, while devices that do not match the currently selected platform are displayed using red text.
  • FIG. 5B illustrates an example main navigation map screen 506 according to embodiments of the invention.
  • a navigation map has been selected from explorer pane 302 , and detail pane 304 is updated to display the selected navigation map 508 .
  • the navigation map 508 includes data for navigating between seven screens 510.1-510.7.
  • Screen 510.1 comprises a main screen on the DUT, screen 510.2 comprises a tasks screen, screen 510.3 comprises a clock screen, screen 510.4 comprises a main notes screen, screen 510.5 comprises an owner information screen, screen 510.6 comprises an email screen, and screen 510.7 comprises an add notes screen.
  • Each of screens 510.2-510.6 is reachable from the main screen 510.1, and each screen 510.2-510.6 can return to the main screen. However, screen 510.7 is reachable only through main notes screen 510.4.
  • Connecting lines 512 illustrate the navigation between the screens. In some embodiments, arrows indicate the direction in which navigation is permitted. As shown in FIG. 5B, navigation between screens is not limited to screens that are directly connected. For example, to navigate from screen 510.1 to screen 510.7, the system automatically recognizes that screen 510.4 may be used as an intermediate screen to get from screen 510.1 to 510.7.
  • recording buttons 514 may be used to add screens to a navigation map.
  • the recording buttons may include a record button, pause button, and stop button.
  • stimulus originating from a virtual device 519 is sent to the DUT and is recorded.
  • the user may press the add screen button 518 .
  • when the add screen button 518 is pressed, the screen is added to the navigation map, displayed on navigation map 506, and may be saved.
  • a connecting line 512 called a screen transition is placed on navigation map 506 connecting the new screen to the originating screen.
  • the connecting line 512 represents the stimulus (transition steps) that caused the screen transition.
  • the user may view or edit the transition step by double-clicking on the line, if for any reason the steps need to be changed.
  • selecting a screen 510 from navigation map 506 causes the system to issue the appropriate commands to navigate to the selected screen on the DUT.
  • the current screen may be highlighted on the navigation map 506 using any mechanism for highlighting known in the art.
  • the screen border may be displayed in a different or brighter color than non-selected screens, or the screen 510 may be made brighter or made to blink, etc.
  • a screen may be designated as the anchored screen.
  • the anchored screen is designated by an anchor icon 517 .
  • a screen anchor is a set of input steps that will cause the DUT to go to the anchored screen regardless of what screen the DUT is currently on.
  • the system uses screen anchors when/if it cannot recognize the current screen on the DUT. For example, most systems have a mechanism for returning a screen to a home or initial screen, no matter what screen a DUT may currently be displaying.
  • a screen anchor is useful in placing the system in a known initial state when a test case begins.
  • an anchor screen may be used to place the system at a known screen in the event that the system cannot locate a screen during a screen navigation operation. This capability makes it possible to continue test execution even though a DUT failure has occurred.
  • a visual indication is shown for screens which have a screen identifier defined. In the example, screens with identifiers are shown with a green checkmark icon 515.
  • some embodiments implement a declarative screen model for automated testing.
  • declarative models describe what something is like, rather than how to create it.
  • a declarative model describes the properties of the objects in the model.
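  • The difference can be illustrated with a short, hypothetical C# fragment (the screen and component names are invented): the declarative entry states only what identifies a screen, while the imperative alternative it replaces hard-codes the steps needed to reach that screen:

```csharp
// Declarative vs. imperative, illustrated with hypothetical names.
using System.Collections.Generic;

public static class DeclarativeExample
{
    // Declarative: each screen is described by the components that identify it;
    // there is no navigation logic here at all.
    public static readonly Dictionary<string, string[]> Screens =
        new Dictionary<string, string[]>
        {
            ["AddNote"] = new[] { "NoteEditField", "DoneButton" },
            ["Tasks"]   = new[] { "TaskList", "NewTaskMenuItem" },
        };

    // Imperative style the declarative model avoids: without a navigation map,
    // every test would embed a fragile, device-specific step list like this.
    public static readonly List<string> HardCodedPathToAddNote =
        new List<string> { "Tap(Start)", "Tap(Notes)", "Tap(New)" };
}
```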
  • FIG. 5C illustrates a screen editor 520 according to embodiments of the invention.
  • Screen editor 520 may be entered upon selecting a screen from navigation map 506 when the test environment is not in record mode.
  • Screen editor 520 includes a component explorer 528 , screen bitmap pane 522 , and a screen component list 524 .
  • Screen bitmap pane 522 displays a bitmap of the currently selected screen.
  • Component explorer 528 contains a list of templates for each type of component that may be used to define a screen of a DUT, and which may be automatically recognized by the system.
  • Screen component list 524 displays a list of components defined for the screen.
  • the components may be recognized components or not recognized components.
  • a recognized component is one that the system has successfully detected as being present on the screen bitmap 522 .
  • recognized components are highlighted on the screen bitmap 522 .
  • a recognized component such as a button may be surrounded by a highlighted box.
  • owner button 526 is among the components that have been recognized as present in screen bitmap 522 .
  • Unrecognized components are components that have been defined as possibly being present, but are not currently found on screen bitmap 522.
  • the screen component list 524 lists the recognized components and the unrecognized components defined for a particular screen.
  • Components in the list may be designated as screen identifiers through the use of a right-click menu. Components that have been designated as identifiers for the screen are shown by using bolded type for the component name 529.
  • a screen identifier is a component or set of components that may be used to uniquely recognize a screen. The system then uses the screen identifiers to determine which screen a DUT is currently on, and may also use the screen identifiers to determine that the system has navigated to the correct screen.
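  • A minimal sketch of that check, with hypothetical types and names, is shown below: a screen is reported as the current screen when all of its identifier components are among the components recognized on the captured bitmap:

```csharp
// Hypothetical screen identification: for each screen definition, test whether
// all of its identifier components were recognized on the current bitmap.
using System.Collections.Generic;
using System.Linq;

public static class ScreenRecognizer
{
    public static string IdentifyCurrentScreen(
        IDictionary<string, IList<string>> identifiersByScreen,
        ISet<string> recognizedComponents)
    {
        foreach (var screen in identifiersByScreen)
        {
            if (screen.Value.All(recognizedComponents.Contains))
                return screen.Key;        // all identifiers present on the bitmap
        }
        return null;                      // unknown screen; an anchor may be used
    }
}
```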
  • a user selects the desired component type from component explorer pane 528 .
  • a recognition rule editor is then invoked that allows a user to define how the component will be recognized on a screen. Or, the user may use a component template that is listed under the component type. When a component template is used the component is automatically created, and therefore the recognition rule editor is not invoked.
  • FIG. 5D illustrates a recognition rule editor 530 according to embodiments of the invention.
  • a user has indicated that a button is to be recognized.
  • the recognition rule editor 530 includes a current screen bitmap 522 , a recognition mode pane 532 , and a bitmap 534 .
  • Recognition mode 532 controls how the button will be recognized.
  • recognition may be position based, text based, or icon based. In position based recognition, a particular bit pattern is expected at a particular position on the screen. The position may be defined using four corners, a border search, a line based search, or an absolute position. Additionally, recognized components may be highlighted on the screen. In some embodiments, recognized components are highlighted by placing a colored rectangle around the recognized portion.
  • text on a button label may be used to recognize the button.
  • the user may select a bitmap for the text, upon which the system may perform optical character recognition in order to determine the actual text contained in the bitmap. The user may then verify the correct text for the button.
  • a bitmap area for the button is selected from screen bitmap 522 .
  • the user may define bits within the bitmap area as being significant to the button recognition, or as “don't care” bits in which the system will ignore the value for purposes of button recognition. For example, a square area may be selected that contains a circular button. Pixels in the four corners of the selected area that are outside of the circular button bitmap may be designated as “don't care” bits because they are not significant in detecting the presence of the button.
  • bits that may be set as part of a background color may be designated as “don't care” bits in order to allow for the same button to be detected no matter what background a user has selected.
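  • One plausible implementation of this comparison, sketched in C# with hypothetical parameters (row-major pixel arrays and a boolean mask), ignores the masked "don't care" pixels and requires every significant pixel to match:

```csharp
// Hypothetical position-based matcher: a component is recognized at (x, y)
// when every significant pixel of its reference bitmap equals the captured
// screen pixel; mask[i] == false marks a "don't care" pixel (e.g. corners
// outside a round button, or background-colored pixels).
public static class BitmapMatcher
{
    public static bool MatchesAt(int[] screen, int screenWidth,
                                 int[] reference, bool[] mask,
                                 int refWidth, int refHeight,
                                 int x, int y)
    {
        for (int row = 0; row < refHeight; row++)
        {
            for (int col = 0; col < refWidth; col++)
            {
                int i = row * refWidth + col;
                if (!mask[i]) continue;                                // don't care
                int screenIndex = (y + row) * screenWidth + (x + col);
                if (screen[screenIndex] != reference[i]) return false; // significant mismatch
            }
        }
        return true;                                                   // all significant pixels match
    }
}
```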
  • a screen recognition component may be designated as a template component.
  • a template is defined for recognizing the component.
  • the template may be applied to multiple screens where the same button may appear. Thus a user does not have to redefine a recognition rule for each screen a button may appear on.
  • After a component template has been defined, it appears under the parent component type in the component explorer list 528.
  • FIG. 6A illustrates an example test design panel 600 of a test development environment according to embodiments of the invention.
  • the test design panel is used to create, edit, debug, and run test cases and test verbs.
  • the test design panel has three basic modes of operation. They are design, debug, and run modes.
  • the design mode is used to create and edit test cases and verbs through recording on the virtual device 519 , or by drag and drop construction using test logic blocks.
  • test panel 600 includes a test case flow chart 602 and a test logic block list 604 .
  • Test logic block list 604 provides a list of the differing test symbols and operations that may be used to define a test case or test verb. Each symbol may comprise one or more steps in the test logic.
  • the list includes test logic blocks to assign values to variables, invoke other test cases, catch exceptions, test conditions or values, invoke custom written code, fill in variables or screen input areas with data from a data source, execute steps in the test case as part of a loop, execute an expression, go to a screen on a platform, log text to a log file, or end the test case.
  • a user selects the desired test symbol from symbol list 604 and drags the symbol to flow chart 602 .
  • a new symbol may be connected to a previously existing symbol either based on the relative positions of the symbols, or a user may explicitly drag a connecting line from one symbol to another.
  • Connection points on the symbols define the entry point for a test step and exit points for the test step. In some embodiments, two exit connection points may be provided. One connection point may be used to proceed to the next symbol if the current step successfully exits. A second connection point may be used to proceed to a differing symbol if the current step results in an error condition.
  • the virtual device 519 may be used to add or insert test logic blocks into the test logic flow.
  • DUT stimulus commands such as click on button or select menu item may be recorded into the test case or verb.
  • components defined on the current screen are outlined to give the user feedback as to the presence of the screen component. The user can then use a right-click menu to record stimulus, read, or verify commands associated with the highlighted component. After command selection the DUT is stimulated, if the command is a stimulation command, and the corresponding test logic block is inserted into the test case or test verb.
  • GoTo Screen and GoTo Platform commands may be recorded as well.
  • the system When a GoTo screen command is selected/recorded the system automatically navigates to the selected screen on the DUT by using the associated navigation map. The corresponding test logic block is inserted into the test case or test verb.
  • a GoTo Platform command When a GoTo Platform command is selected the system automatically switches the DUT context to the DUT associated with the selected platform. Bubbles or clouds are used as background to show the test logic commands that are within the context of GoTo Screen and GoTo Platform commands.
  • Within a GoTo Screen context bubble all the test logic blocks contained within are related to, or are in the context of, the selected screen.
  • Within a GoTo Platform context bubble all the test logic blocks contained within are related to, or are in the context of, the selected platform, including GoTo Screen blocks.
  • icons within the symbols in test case flow chart 602 indicate the type of symbol (loop, assignment etc.) and may also be selected to set break points at the symbol to be used for debugging purposes.
  • variable window 606 may be displayed and used to examine the values of variables at various points in the execution of a test case.
  • FIG. 7 illustrates methods 700 for creating and using navigation maps according to embodiments of the invention.
  • the method begins in some embodiments at block 702 , where a virtual device representing a device under test may be displayed.
  • the system receives a specification of a source screen, where the source screen may be an application screen (tasks, e-mail, owner, etc.) of a device under test.
  • the specification may be made by selecting the screen from the virtual device.
  • the source screen may be selected from a screen image already present on a navigation map.
  • the system receives a specification of a target screen.
  • the specification will be in the form of a selection from a virtual device.
  • the system records the transition commands that cause the transition from the source screen to the target screen.
  • a graphical rendition of the navigation map is displayed, reflecting the addition of the target screen to the navigation map. Additionally, an indication of the transition may be displayed. In some embodiments, the indication is a line connecting the source screen and the target screen.
  • Blocks 704-708 may be repeated as desired to record and add multiple source and target screens, along with their associated transitions, to a navigation map.
  • the navigation map may be stored in a repository for later use or editing.
  • the actions at blocks 702 - 712 may be referred to as design-time activities. That is, they take place when an automated test procedure is being created or edited. After the design-time activities have taken place, the run-time actions (blocks 714 - 718 ) using the previously created navigation map may take place.
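  • The design-time portion of this flow might be sketched as follows; the recorder below is hypothetical (the names and the tuple result are assumptions) and shows stimulus from the virtual device being captured and turned into a transition when the target screen is added:

```csharp
// Hypothetical design-time recorder: while recording, every stimulus step
// sent through the virtual device is captured; adding the target screen turns
// the captured steps into a source-to-target transition for the navigation map.
using System.Collections.Generic;

public class TransitionRecorder
{
    private readonly List<string> _capturedSteps = new List<string>();
    public bool Recording { get; private set; }

    public void Start()
    {
        Recording = true;
        _capturedSteps.Clear();
    }

    public void OnVirtualDeviceStimulus(string step)
    {
        if (Recording) _capturedSteps.Add(step);   // e.g. "Tap(Notes)"
    }

    // Invoked by the "add screen" action.
    public (string From, string To, List<string> Steps) AddScreen(
        string sourceScreen, string targetScreen)
    {
        Recording = false;
        return (sourceScreen, targetScreen, new List<string>(_capturedSteps));
    }
}
```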
  • a system using the navigation map receives a command specifying a target screen.
  • the command may be part of test logic for an automated test procedure such as a test case, or a test verb procedure within a test case.
  • the command may be received as a selection from a graphical user interface indicating the user wishes to work with a different screen than is currently being displayed, for example, by a navigation map editor or a test case editor.
  • the system determines the current screen.
  • the system reads transition commands from the navigation map to determine which commands or stimulus must be applied to the device under test to cause a transition from the current screen to the target screen. It should be noted that there may be one or more intermediate screens that will be transitioned through to reach the target screen. The commands to cause the transition may then be issued to the device under test.
  • the system may verify the target screen has been reached, and may verify intermediate screens to ensure that navigation has not been lost. Further, in some embodiments, if navigation is lost while attempting to transition from a source screen to a target screen, the system may issue commands to the device under test to cause the device to return to a previously specified anchor screen. The system can then treat the anchor screen as a source screen and issue commands to cause the device under test to transition to the target screen.
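  • The run-time lookup of a route through the map can be sketched as a breadth-first search over the recorded transitions; the C# below is hypothetical (the types, names, and flat transition list are assumptions) and returns the sequence of transitions to replay, or null when the caller should fall back to the anchor screen:

```csharp
// Hypothetical run-time navigation: breadth-first search over the navigation
// map finds the shortest chain of recorded transitions from the current
// screen to the target, possibly passing through intermediate screens.
using System.Collections.Generic;

public class Transition
{
    public string From;
    public string To;
    public List<string> Steps = new List<string>();   // recorded stimulus to replay
}

public static class Navigator
{
    public static List<Transition> FindPath(
        List<Transition> map, string currentScreen, string targetScreen)
    {
        var visited = new HashSet<string> { currentScreen };
        var screens = new Queue<string>();
        var paths = new Queue<List<Transition>>();
        screens.Enqueue(currentScreen);
        paths.Enqueue(new List<Transition>());

        while (screens.Count > 0)
        {
            var screen = screens.Dequeue();
            var path = paths.Dequeue();
            if (screen == targetScreen) return path;

            foreach (var t in map)
            {
                if (t.From == screen && visited.Add(t.To))
                {
                    screens.Enqueue(t.To);
                    paths.Enqueue(new List<Transition>(path) { t });
                }
            }
        }
        return null;   // no route found; caller may return to the anchor screen and retry
    }
}
```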
  • a system may include multiple target services 160, each of which may manage multiple devices under test 140. It should be noted that the number of devices under test 140 can grow quite large. In addition, many different types and versions of devices under test may be coupled to the system via target services 160, and many different software environments may exist on the devices under test. As a result, the number of platform types, platform groups, and platform definitions may be quite large and vary from system to system.
  • an automated test procedure may need to deal with more than one device under test during the design, debugging, and execution of the automated test procedure.
  • the system manages both pools of similar devices under test and groups of disparate devices under test.
  • platform definitions, platform types, and platform roles as described above may be used to determine which of the many devices under test that may be available within a system should be selected when designing, debugging, or executing an automated test procedure.
  • a system selects a device under test based on whether the candidate device is available and whether the candidate device attributes match the parameters defined by the platform definition, the platform group, the platform type, and/or any platform roles utilized by an automated test procedure.
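  • A simplified version of that selection, with hypothetical attribute names, is shown below: the first available candidate whose attributes match the required platform parameters is chosen:

```csharp
// Hypothetical device selection: pick the first available device whose
// attributes match the platform definition required by the test procedure.
using System.Collections.Generic;
using System.Linq;

public class CandidateDevice
{
    public string Name;
    public string OsVersion;
    public string GuiStyle;
    public string Language;
    public bool Available;
}

public static class DeviceSelector
{
    public static CandidateDevice Select(IEnumerable<CandidateDevice> devices,
                                         string osVersion, string guiStyle, string language)
    {
        return devices.FirstOrDefault(d => d.Available
                                        && d.OsVersion == osVersion
                                        && d.GuiStyle == guiStyle
                                        && d.Language == language);
    }
}
```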
  • a further aspect of a multi-device system of some embodiments includes polymorphism for various aspects of the system.
  • polymorphism refers to the ability to process objects differently depending on their platform type, platform group, platform definition, or role.
  • the system provides polymorphic navigation maps and polymorphic test verbs.
  • the embodiments are not limited to applying polymorphic operation to any particular element of the system, and in alternative embodiments, polymorphism may be applied to other elements.
  • some embodiments may provide polymorphic component templates.
  • a polymorphic navigation map comprises a navigation map that may be reused across all of the platforms within a platform group.
  • a polymorphic navigation map provides an abstraction of the presentation layer for the platforms within a group.
  • the navigation map may remain constant (and be reused) for platforms within a platform group because the underlying behavior of the operating systems and applications within a platform group does not change across devices in a platform group.
  • a polymorphic test verb comprises a test verb that may be reused across platform types.
  • a polymorphic test verb provides an abstraction of the behavior of devices. For example, two devices may implement a task list in very different ways, using different device behaviors to add tasks to the task list.
  • a single polymorphic test verb labeled “CreateTask” may be called from an automated test procedure.
  • the interface to the test verb may remain constant; however, different versions of the test verb may be implemented for various platform types.
  • the particular instance of the test verb called may vary depending on the platform type for a current device under test.
  • the automated test procedure need only call the polymorphic test verb; the system determines the particular instance of the test verb depending on the platform context. Thus it may not be necessary to alter the test logic of an automated test procedure when a new platform type is added; a designer may only need to supply an instance of the polymorphic test verb for the new platform type.
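  • One way to picture this dispatch, using hypothetical interface and class names, is a single verb interface with one registered implementation per platform type; the test logic names only the verb, and the platform context of the current DUT selects the concrete instance:

```csharp
// Hypothetical polymorphic test verb: one "CreateTask" interface, with a
// separate implementation per platform type; the runtime resolves the
// instance from the platform context of the current device under test.
using System.Collections.Generic;

public interface ICreateTaskVerb
{
    void CreateTask(string title);
}

public class PocketPcCreateTask : ICreateTaskVerb
{
    public void CreateTask(string title) { /* PocketPC-specific screen steps */ }
}

public class SmartphoneCreateTask : ICreateTaskVerb
{
    public void CreateTask(string title) { /* Smartphone-specific screen steps */ }
}

public class VerbRegistry
{
    private readonly Dictionary<string, ICreateTaskVerb> _byPlatformType =
        new Dictionary<string, ICreateTaskVerb>
        {
            ["PocketPC OS"]   = new PocketPcCreateTask(),
            ["SmartPhone OS"] = new SmartphoneCreateTask(),
        };

    // Adding a new platform type only requires registering a new implementation;
    // the test logic that calls CreateTask does not change.
    public ICreateTaskVerb Resolve(string platformType) => _byPlatformType[platformType];
}
```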
  • a further aspect of various embodiments includes managing test sessions in which multiple test cases may need to access multiple devices. This can occur because a single test case may access more than one device under test, or different test cases may access different devices under test.
  • the test cases may specify a platform role.
  • the platform role may be associated with particular platform type, platform group, and/or platform definition.
  • the system aggregates all of the roles within the potentially many test cases within a test session, and determines how many unique roles exist for the test session.
  • each test case that refers to the same platform role will access the same device under test. This is desirable, because it allows a test designer to avoid having to specify particular devices under test for each of what may be many test cases.
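  • A compact sketch of that aggregation, with hypothetical names, collects the distinct roles declared across the session's test cases and binds each role to one available device, so every test case naming the same role drives the same device:

```csharp
// Hypothetical session planning: gather the unique platform roles used by the
// test cases in a session and bind each role to a single available device.
using System.Collections.Generic;
using System.Linq;

public static class SessionPlanner
{
    public static Dictionary<string, string> BindRolesToDevices(
        IEnumerable<IEnumerable<string>> rolesPerTestCase,
        IList<string> availableDevices)
    {
        var uniqueRoles = rolesPerTestCase.SelectMany(roles => roles).Distinct().ToList();
        var binding = new Dictionary<string, string>();

        for (int i = 0; i < uniqueRoles.Count && i < availableDevices.Count; i++)
            binding[uniqueRoles[i]] = availableDevices[i];   // e.g. "primary" -> "iPaq-01"

        return binding;
    }
}
```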
  • Machine-readable media includes any mechanism that provides (i.e., stores and/or transmits) information in a form readable by a machine (e.g. a computer).
  • tangible machine-readable media includes read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.
  • Machine-readable media also includes any media suitable for transmitting software over a network.
  • Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems.
  • the elements, materials, geometries, dimensions, and sequence of operations can all be varied to suit particular packaging requirements.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Systems and methods for a design and runtime environment provide navigation maps using a declarative model defined by recording actual manipulation of a physical or virtual device. The systems and methods may be used to automatically navigate to application screens, eliminating the need for a test engineer to provide navigation code.

Description

    RELATED APPLICATIONS
  • This application claims the priority benefit of U.S. Provisional Application Ser. No. 60/685,958 filed May 31, 2005, the contents of which are incorporated herein by reference in their entirety.
  • This application is related to application serial number ______, filed even date herewith, entitled “SYSTEMS AND METHODS PROVIDING A NORMALIZED GUI FOR TESTING DISPARATE DEVICES” (Attorney Docket No.: 1642.006US1), application serial number ______, filed even date herewith, entitled “SYSTEMS AND METHODS FOR GRAPHICALLY DEFINING AUTOMATED TEST PROCEDURES” (Attorney Docket No.: 1642.008US1), application serial number ______, filed even date herewith, entitled “SYSTEMS AND METHODS PROVIDING REUSABLE TEST LOGIC”, (Attorney Docket No.: 1642.009US1), and application serial number ______, filed even date herewith, entitled “SYSTEMS AND METHODS FOR MANAGING MULTI-DEVICE TEST SESSIONS” (Attorney Docket No.: 1642.011US1), all of the above of which are hereby incorporated by reference in their entirety.
  • LIMITED COPYRIGHT WAIVER
  • A portion of the disclosure of this patent document contains material to which the claim of copyright protection is made. The copyright owner has no objection to the facsimile reproduction by any person of the patent document or the patent disclosure, as it appears in the U.S. Patent and Trademark Office file or records, but reserves all other rights whatsoever. Copyright 2006, TestQuest, Inc.
  • BACKGROUND
  • An information-processing system is tested several times over the course of its life cycle, starting with its initial design and being repeated every time the product is modified. Typical information-processing systems include personal and laptop computers, personal data assistants (PDAs), cellular phones, medical devices, washing machines, wristwatches, pagers, and automobile information displays.
  • Many of these information-processing systems operate with minimal amounts of memory, storage, and processing capability. Because products today commonly go through a sizable number of revisions and because testing typically becomes more sophisticated over time, this task becomes a larger and larger proposition. Additionally, the testing of such information-processing systems is becoming more complex and time consuming because an information-processing system may run on several different platforms with different configurations, and in different languages. Because of this, the testing requirements in today's information-processing system development environment continue to grow.
  • For some organizations, testing is conducted by a test engineer who identifies defects by manually running the product through a defined series of steps and observing the result after each step. Because the series of steps is intended to both thoroughly exercise product functions as well as re-execute scenarios that have identified problems in the past, the testing process can be rather lengthy and time consuming. Add on the multiplicity of tests that must be executed due to system size, platform and configuration requirements, and language requirements, and one will see that testing has become a time consuming and extremely expensive process.
  • In today's economy, manufacturers of technology solutions are facing new competitive pressures that are forcing them to change the way they bring products to market. Being first-to-market with the latest technology is more important than ever before. But customers require that defects be uncovered and corrected before new products get to market. Additionally, there is pressure to improve profitability by cutting costs anywhere possible.
  • Product testing has become the focal point where these conflicting demands collide. Manual testing procedures, long viewed as the only way to uncover product defects, effectively delay delivery of new products to the market, and the expense involved puts tremendous pressure on profitability margins. Additionally, by their nature, manual testing procedures often fail to uncover all defects.
  • Automated testing of information-processing system products has begun replacing manual testing procedures. The benefits of test automation include reduced test personnel costs, better test coverage, and quicker time to market. However, an effective automated testing product can be costly and time consuming to implement. The software methods, interfaces and procedures required to thoroughly test an information processing system can be nearly as complicated as the information processing system itself. For example, many information processing systems provide user interfaces that require navigation through a series of screens, with each screen potentially requiring input data. In previous systems, each test method required the test developer to provide code to navigate to the desired screen. If the interface changes in subsequent versions of the information processing system, the test procedure also typically must be modified to reflect the change. Such changes can be costly and time consuming to implement.
  • It is common for independent software developers to write software programs that must operate on a diverse set of computing devices. An example of this is software developed for mobile phones. Mobile phones are very heterogeneous with different operating systems, form factors, input mechanisms, screen sizes and color, and GUI styles. This causes applications to look different and to some extent operate differently on each mobile platform, even though the basic function of the software is preserved.
  • In most software development companies the technicians and engineers responsible for testing software applications are not software engineers. This is because testing technicians and engineers have historically worked in organizations where manual testing methods have been the principal method of test. Consequently, the level of software engineering skill in a software testing organization is typically low.
  • In view of the above problems and issues, there is a need in the art for the present invention.
  • SUMMARY
  • Some embodiments of the invention provide a runtime environment providing canonical definitions for commonly found GUI components and other man-machine interfaces (physical buttons, audio input/output, touch screens, etc.).
  • Some embodiments of the invention provide a run-time environment providing navigation maps using a declarative model defined at design time by recording actual manipulation of a physical device through a virtual device interface. The navigation maps may be used to automatically navigate to application screens, eliminating the need for a test engineer to provide navigation code.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a block diagram illustrating a system incorporating embodiments of the invention.
  • FIG. 1B is a block diagram providing further details of a system incorporating embodiments of the invention.
  • FIG. 1C is a block diagram showing the logical relationship of various components in a system according to embodiments of the invention.
  • FIG. 2 is a block diagram illustrating object management components according to embodiments of the invention.
  • FIG. 3 illustrates an example main screen of a test development environment according to embodiments of the invention.
  • FIG. 4 provides a flow illustrating a test design process according to embodiments of the invention.
  • FIGS. 5A-5D illustrate example navigation map panels of a test development environment according to embodiments of the invention.
  • FIG. 6 illustrates an example test case panel of a test development environment according to embodiments of the invention.
  • FIG. 7 is a flowchart illustrating methods according to embodiments of the invention.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration, specific embodiments in which the inventive subject matter may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice them, and it is to be understood that other embodiments may be utilized and that structural, logical, and electrical changes may be made without departing from the scope of the inventive subject matter. Such embodiments of the inventive subject matter may be referred to, individually and/or collectively, herein by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.
  • The following description is, therefore, not to be taken in a limited sense, and the scope of the inventive subject matter is defined by the appended claims of the invention.
  • In the Figures, the same reference number is used throughout to refer to an identical component which appears in multiple Figures. Signals and connections may be referred to by the same reference number or label, and the actual meaning will be clear from its use in the context of the description.
  • The functions or algorithms described herein are implemented in hardware and/or software in various embodiments. The software comprises computer executable instructions stored on computer readable media such as memory or other types of storage devices. The term “computer readable media” is also used to represent software-transmitted carrier waves. Further, such functions correspond to modules, which are software, hardware, firmware, or any combination thereof. Multiple functions are performed in one or more modules as desired, and the embodiments described are merely examples. A digital signal processor, ASIC, microprocessor, or any other type of processor operating on a system, such as a personal computer, a server, a router, or any other device capable of processing data (including network interconnection devices), executes the software.
  • Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example process flow is applicable to software, firmware, and hardware implementations.
  • In the discussion below, the terms “design time” and “run time” may be used to describe aspects of the operation of various embodiments. In general, the term “design time” refers to activities and/or operations that take place during the design of a particular test case or test set. In general, the term “run time” refers to activities and/or operations that take place during the execution of a test case or a test set of test cases.
  • SYSTEM OVERVIEW
  • As noted above, it is common for independent software developers to write software programs that must operate on a diverse set of heterogeneous computing devices, where applications look different and to some extent operate differently on each mobile platform, even though the basic function of the software is preserved. In these types of software development environments it is essential for an automated test system to facilitate and maximize the reuse of test logic. Without a systematic and automated approach for managing the complexity of test logic reuse, automated software testing is not economically viable.
  • Further, as noted above, in most software development companies the technicians and engineers responsible for testing software applications are not software engineers. Given this situation, it is desirable that a productive and viable tool for automated testing be simple to understand, intuitive in its operation, and hide and manage as much complexity as possible.
  • FIG. 1A is a block diagram illustrating a system 100 incorporating embodiments of the invention. In some embodiments, system 100 includes a test development environment 102, development repository 120, test management repository 130 and a device under test (DUT) 140.
  • DUT 140 may be any type of device incorporating processing logic. Such devices include but are not limited to personal digital assistants (PDAs), cellular telephones, mobile computing devices, laptop computers, handheld computers, personal computers, server computers, mainframe computers, workstation computers, and combinations of the above. DUT 140 may include one or more systems under test, including applications that may be tested using embodiments of the invention.
  • Test interface 142 provides an interface between the DUT 140 and the test development environment 102. The test interface communicates commands and stimulus from the test development environment 102 to the DUT 140 and communicates the DUT's response to the commands and stimulus back to the test development environment 102. In some embodiments, the test interface is a minimally invasive software test agent that resides on the DUT. In general, the minimally invasive software test agent provides stimulus to the device and provides a bitmap of a screen displayed on the device in response to the stimulus. Further details on such a minimally invasive software test agent may be found in U.S. patent application Ser. No. 10/322,824 entitled “Software Test Agents”, which is hereby incorporated by reference herein for all purposes.
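  • Purely as an illustrative sketch, and not as a description of any particular agent implementation, the narrow contract such a test interface exposes can be pictured as a pair of operations: stimulus in, screen bitmap out. The interface and member names below are hypothetical.

```csharp
// Hypothetical sketch only: the kind of minimal contract a test interface
// such as test interface 142 might expose to the test development environment.
public interface ITestInterface
{
    // Sends a single stimulus command (e.g., a key press or screen tap) to the DUT.
    void SendStimulus(string command);

    // Returns a bitmap of the screen the DUT displays after the stimulus.
    byte[] CaptureScreenBitmap();
}
```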
  • In alternative embodiments, test interface 142 may be a more invasive test interface that examines the user interface code and data structures resident on a DUT and uses the current state of the code and data to determine how to stimulate the DUT and how to interpret responses from the DUT.
  • The DUT may be a physical device that is communicably coupled to the test system either directly or through a network path. Alternatively, the DUT may be an emulated device in which the characteristics of a physical device are emulated by software operating on a computing system that is communicably coupled to the test system either directly or through a network path.
  • Development repository 120 comprises a database of various objects that may be created, read, updated and/or deleted using the test development environment 102. In some embodiments, development repository 120 may be a relational database such as the Microsoft SQL Server database. In alternative embodiments, repository 120 may be a set of files on a file system, an object oriented database, a hierarchical database, or an XML database. Development repository 120 may contain a variety of objects, including one or more of test sets 121, automated test cases 122, test verbs 123, screen definitions 124, navigation maps 125, platform data 126, virtual devices 127, devices 128, resource sets 129, global variables 130, data sources 131, platform roles 132, screen components 133, and component templates 134. Various embodiments will maintain varying combinations of the above-named components, and no embodiment need necessarily contain all of the above-named components.
  • Test sets 121 comprise sets of one or more automated test cases 122 along with other logic to control the execution or invocation of the automated test cases 122. In some embodiments of the invention, a test set references test cases only, and does not reference other types of objects in the repository 120. Test sets provide a convenient way to manage groups of test cases that may be applicable for a particular type of DUT, or groups of DUTs.
  • Automated test case 122 comprises logic and data that provides a discrete test unit for a DUT, or group of DUTs. An automated test case 122 is a series of one or more test steps and may utilize or reference one or more test verbs 123, screen definitions 124 and external code 150. The steps in an automated test case may be defined using a test case editor as described below.
  • Test verbs 123 define logic and actions that may be performed on the DUT or group of DUTs. Further details on test verbs may be found in U.S. patent application Ser. No. 10/323,095 entitled “Method and Apparatus for Making and Using Test Verbs” and in U.S. patent application Ser. No. 10/323,595 entitled “Method and Apparatus for Making and Using Wireless Test Verbs”, each of which are hereby incorporated by reference herein for all purposes.
  • Screen definitions 124 comprise data that may be used to define one or more screens displayed by software executing on the DUT. For example, the screen definitions may comprise application screens for email applications, contact manager applications, calendar applications, etc. that execute on a PDA. The data for a screen definition may include a bitmap of all or a portion of the screen, and references to screen components 133. In some embodiments, a screen may be identified by the presence of a screen component 133 which serves as a unique identifier for a particular screen.
  • Screen components 133 define buttons, menus, dialog boxes, icons, and other user interface elements that may appear on a screen.
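  • As a purely illustrative sketch of the declarative data described in the two preceding paragraphs, a screen definition might be held as a record listing its reference bitmap, its screen components, and which components serve as identifiers. The class and member names below are hypothetical and are not taken from any particular embodiment.

```csharp
using System.Collections.Generic;

// Hypothetical sketch: a screen described by its properties rather than by
// procedural code. Names are illustrative only.
public class ScreenComponentRef
{
    public string Name { get; set; }             // e.g., "OwnerButton"
    public string TemplateName { get; set; }      // component template it was created from
    public bool IsScreenIdentifier { get; set; }  // true if it helps uniquely identify the screen
}

public class ScreenDefinition
{
    public string Name { get; set; }                       // e.g., "MainScreen"
    public byte[] ReferenceBitmap { get; set; }             // bitmap of all or part of the screen
    public List<ScreenComponentRef> Components { get; } = new List<ScreenComponentRef>();
}
```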
  • A navigation map 125 defines screen transitions describing the device interactions that may be provided to cause the DUT to move from screen to screen. A navigation map may include the commands needed to move from one screen to another, or from one screen to many other screens and vice versa. Further details on navigation maps are described below.
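  • Again purely as an illustrative sketch, a navigation map can be pictured as declarative data: a set of screens, the recorded transitions between them, and optionally an anchor screen. The types and names below are hypothetical.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of navigation map data: each transition records the
// stimulus commands that move the DUT from one screen to another.
public class ScreenTransition
{
    public string FromScreen { get; set; }                       // e.g., "MainScreen"
    public string ToScreen { get; set; }                         // e.g., "MainNotesScreen"
    public List<string> Commands { get; } = new List<string>();  // recorded transition steps
}

public class NavigationMap
{
    public string Name { get; set; }
    public List<string> Screens { get; } = new List<string>();
    public List<ScreenTransition> Transitions { get; } = new List<ScreenTransition>();
    public string AnchorScreen { get; set; }   // screen reachable regardless of current state
}
```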
  • Platform data 126 comprises data that defines properties for abstractions related to platforms, including platform types, platform groups, and platform definitions. For example, a platform type definition describes properties of a platform that are independent of, or generic to, devices that execute the platform system. Examples of various platform types include Windows Operating System (OS) platforms, Symbian OS platforms, PocketPC OS platforms, and SmartPhone OS platforms. A platform group describes properties of a platform within a platform type that are generic to groups of operating systems within the type. For example, a PocketPC OS platform type may include a group defining properties for a Pocket PC 2003 operating system and a group defining properties for a Pocket PC 2003 SE operating system. A platform definition describes the properties of a platform within a platform group that are generic to one or more devices within the platform group. For example, the PocketPC 2003 operating system group may include a platform definition that defines the properties of one or more devices utilizing the PocketPC 2003 operating system. Those of skill in the art will appreciate that various platform types, groups, and definitions now exist and may be developed in the future, and that such platforms are within the scope of the inventive subject matter. Further, the boundaries and data that define a type or group may vary in various embodiments. In some embodiments, the data provided as part of a platform group definition may include one or more of platform name, platform OS, platform GUI (Graphical User Interface) style, platform language, platform screen resolution, platform screen color depth and other device attributes needed to characterize devices. Identification of the type of virtual device to be used to represent the platform may also be specified.
  • Virtual Device 127 comprises data that defines properties of an abstraction of a type of device. For example, a virtual device may comprise an iPaq type device, or a Treo type device. The virtual device data in some embodiments includes data that is common or generic to all of the devices of a particular platform definition. In some embodiments, the data defining a virtual device may include one or more of a device skin, keyboard area, hotspots (including possible button states), keyboard properties, touch screen properties, glyph drawing properties, and screen capture parameters such as the screen capture poll rate.
  • Device 128 comprises data that defines the properties of a particular DUT. In some embodiments, these properties include a device name, a device OS version, a device GUI style, a device language, a device screen resolution, a device screen color depth, and one or more virtual devices that may be associated with the device 128. In addition, device properties may include connection related properties such as a host name identifying a host in a network where the device is connected, a connection type (e.g. serial, USB, network etc.), a device connection address, and other connection related properties such as message timeout values and command response timeout values. As noted above, the DUT may be a physical device or an emulated device.
  • Resource sets 129 comprise data that may be used by the screen components described above. For example, in some embodiments text strings may be defined in various languages. A resource set for the text strings for a particular language may be used by the above-described components to provide language independence or to customize the system for a particular language.
  • Global variables 130 comprise data that may be used within each logical unit of a test set or test case.
  • Data sources 131 comprise data that may be used as a source to provide input data for a test case or test set. For example, assume that a test case requires the input of multiple contact names and addresses. A data source 131 may be used to provide the data for the names and addresses. A data source 131 may be a table or tables in an external RDBMS, or it may be an external file such as a text file or a spreadsheet file.
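  • As one hedged illustration of such a data source, a delimited text file could supply one record of input values per test iteration; the helper below is hypothetical and merely stands in for whatever data source mechanism an embodiment provides.

```csharp
using System.Collections.Generic;
using System.IO;

// Hypothetical sketch: a file-backed data source yielding one record per line,
// e.g. "name,street,city" for a contact-entry test case.
public static class TextFileDataSource
{
    public static IEnumerable<string[]> ReadRecords(string path, char delimiter = ',')
    {
        foreach (string line in File.ReadLines(path))
        {
            if (line.Trim().Length == 0) continue;   // skip blank lines
            yield return line.Split(delimiter);      // one record per non-blank line
        }
    }
}
```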
  • Platform roles 132 represent unique instances of a platform definition. Platform roles may have several functions. In some embodiments, they make it possible to invoke more than one instance of a platform type at one time. Additionally, they may make it possible to identify which platform definitions are the same device and which are different devices across one or more test sets, test cases and test verbs. Further, they may provide an indication of the test role of a particular platform definition or device in a test. In some embodiments, a DUT may be assigned roles comprising “primary” and “secondary”. In alternative embodiments, the user may declare the platform roles. In these embodiments there may be an arbitrary number of roles, and roles may be arbitrarily labeled providing a user a great deal of flexibility in defining automated tests.
  • Component templates 134 comprise standard (or canonical) definitions for various user interface elements for a platform type, group, or definition, such as menus, icons, buttons, and text fields. The standard definitions may be used as a basis for defining specialized instances of the user interface elements for a particular platform within a platform type, group, or definition. In addition to setting the properties for a screen component 133 instance, component templates automate the process of identifying the component instance on the screen. As an example, the component template is dragged onto the screen and the component is automatically recognized by the template.
  • Test development environment 102 is a set of one or more software applications that provide a user interface for managing the components in repository 120 and for designing and managing automated test cases. In some embodiments, test development environment 102 includes a user interface 104, a code generator 110, and a debugger 112. In operation, the test development environment 102 reads one or more objects from the repository 120 that become resident objects 114. Resident objects 114 may be created, read, updated or deleted using user interface 104. Resident objects may be saved back to the repository 120. In some embodiments, user interface 104 is a graphical user interface that may be used to manipulate objects in repository 120. Further details on a user interface according to embodiments of the invention are provided below with reference to FIGS. 3-6.
  • In some embodiments, a virtual device interface 106 is part of user interface 104. Virtual device interface 106 provides a representation of a DUT (i.e. a skin) that may be used to provide a graphical image of a DUT. The virtual device may have hot spots that correspond to buttons on the DUT, and may also emulate other aspects of the user interface of a DUT. The virtual device 106 may thus be used to provide stimulus that is relayed to the DUT through the test interface 142. Responses from the DUT to the stimulus may then be provided through the test interface 142, and any screen changes may be shown on the virtual device. The response may include an updated screen bitmap that reflects changes on a screen of the DUT. Virtual devices provide a mechanism to develop tests on devices located at remote locations, and are designed to provide an interface that may be more convenient or easy to use than the actual device.
  • Debugger 112 provides an interface to debug automated test cases 122. In some embodiments, debugger 112 provides an interface that allows breakpoints to be set at certain points in the execution of a test case. In addition, debugger 112 may provide the ability to single step through one or more steps in a test case, including stepping into or stepping over a sub-component of a test case (e.g. a test verb). Additionally, debugger 112 may provide an interface allowing a user to watch for changes in the local, interface, and global variables used by a test case.
  • Code generator 110 operates to generate executable code for a test case. In some embodiments, a user utilizes user interface 104 to develop and debug a test case 122. In some embodiments, the code generator automatically and transparently generates and compiles test logic code, for example when the test code is ready to be executed or debugged. The code generator reads the test case logic and data (including navigation maps, screen definitions, test verbs, and calls to external code) and generates and compiles executable code that performs the logic of the test case. In some embodiments, code is generated in the C# programming language and is targeted for the Microsoft .NET framework.
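  • The form of the generated code is not prescribed by the embodiments; purely as a hedged illustration, generated test logic for a simple test case might resemble the following, where the framework interface and its methods (GoToScreen, ClickButton, EnterText, VerifyComponent, Log) are assumed names rather than an actual generated API.

```csharp
// Hypothetical illustration only: what generated C# test logic for a simple
// "add a note" test case might resemble. All type and member names are assumptions.
public interface ITestFramework
{
    void GoToScreen(string screenName);          // navigates using the navigation map
    void ClickButton(string componentName);
    void EnterText(string componentName, string text);
    bool VerifyComponent(string componentName);  // is the component present on the current screen?
    void Log(string message);
}

public static class AddNoteTestCase
{
    public static bool Run(ITestFramework t)
    {
        t.GoToScreen("MainNotesScreen");          // no hand-written navigation code required
        t.ClickButton("NewNoteButton");
        t.EnterText("NoteBody", "Automated note");
        t.ClickButton("OkButton");
        bool passed = t.VerifyComponent("NoteListEntry");
        t.Log(passed ? "AddNote passed" : "AddNote failed");
        return passed;
    }
}
```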
  • Test management repository 130 may be used in some embodiments to store previously generated tests that are deployed or otherwise readied for operational use. In some embodiments, the execution module or modules 132 are generated and compiled by code generator 110 and linked into a framework. In addition, test management repository 150 may manage test session data 154. Session data 154 tracks which DUTs are in use, and which are available for testing.
  • FIG. 1B is a block diagram providing further details of a system incorporating embodiments of the invention. In some embodiments, the system includes test development environment 102, repository service 165, test execution service 170, test management service 175, web service 180, and target service 160, which in some embodiments may be communicably coupled via a network 185. Network 185 may be a corporate LAN, intranet, WAN or other network. Additionally, network 185 may include the Internet. It should be noted that some or all of the services shown in FIG. 1B may be included on a single system, in which case a network may not be required in order for such collocated services to communicate.
  • Repository service 165 is a repository manager that manages development repository 120 and test management repository 150. In some embodiments, repository service 165 may include a database management service such as Microsoft SQL Server. Repository service 165 provides interfaces for other services to create, read, update and delete data in the repositories.
  • As noted above, test development environment 102 may be used to design and test a test application.
  • Web service 180 provides a web-based interface to a browser to allow users to control test execution for tests that have been deployed. Tests may be invoked immediately, or may be scheduled for future execution using a browser in communication with web service 180.
  • Test management service 175 controls test scheduling. This may include time-of-day and day-of-week scheduling, and may also include scheduling tests as matching devices and platforms become available. When a test is scheduled to run, test management service 175 sends a test execution command to test execution service 170.
  • Test execution service 170 receives commands to execute one or more tests. The test execution service selects an appropriate target service based on test parameters, including searching for a matching test platform, and causes a test to be executed on a device under test 140 that includes a matching platform.
  • FIG. 1C is a block diagram showing the logical relationship of the various components described above in a system according to embodiments of the invention.
  • FIG. 2 is a block diagram illustrating object management components according to embodiments of the invention. In some embodiments, the object management components include editor 202, layout object 210, and data objects 220. Data objects 220 include the repository objects 121-131 described above, such as automated test cases 122, test verbs 123, navigation maps 125, and screen definitions 124 (including screen component and component type definitions).
  • Editor 202 comprises any user interface component that may be used to visualize or change the properties of a data object 220. Editor 202 may include a view 204 and/or a model 206. Model 206 is a data object that defines the business logic of editor 202. View 204 is a visual representation of editor 202 that is presented to the user.
  • Layout object 210 comprises a stored description of how the data object 220 is shown in the given editor 202. Not every editor needs a layout object 210.
  • FIG. 3 illustrates an example main screen 300 of a test development environment 102 according to embodiments of the invention. In some embodiments, screen 300 includes one or more of an explorer pane 302, a properties pane 306 and a detail pane 304. Explorer pane 302 in some embodiments provides a list of categories and objects that may exist in a repository, and provides an interface to select categories and items for editing. In the example shown, explorer pane 302 includes user interface elements for test sets, test cases, test verbs, navigation maps, global variables, data sources, resource tables, custom code, and deployment modules. Selection of a user interface element will typically cause the explorer pane to show expanded or further details or a listing of particular data objects that may be selected for view, execution and/or editing. In the example shown, a Pocket PC 2003 platform definition has been selected. As a result of the selection, property pane 306 has been displayed to show property values for various properties of the platform definition.
  • Properties pane 306 in some embodiments displays a list of properties associated with a selected item. The properties may be selected, and values associated with the properties may be changed. Any manner of editing a property value may be used, including direct text entry, selection from a drop-down box, selection from a check box, or other user interface editing device known in the art.
  • Detail pane 304 is typically used to provide a graphical representation of an interface element selected from explorer pane 302. Further details on such graphical representations are illustrated below with reference to FIGS. 5-6.
  • FIG. 4 illustrates an example flow diagram for test design. A typical design flow 400 starts at 402 by providing templates for components used in defining a screen. The example flow 400 for a test design is a bottom-up design flow and is not the only flow possible. Top-down flows and inside-out flows are possible and productive as well. An example of a top-down flow would be to define a test case first followed by navigation maps and test verbs. The purpose of templates is to automate the process of creating screen components. The templates may provide standard property values and/or bitmap definitions for various user interface elements of a device under test. In some embodiments, the templates may include buttons, boxes, menus, icons, tables, text fields, track bars, and keyboard elements.
  • Next, at 404, a component instance is created. The component instance may be created from a template provided at 402 by dragging the template over or near the relevant portion of a screen image as will be further described below. In some embodiments components are automatically recognized without user action when the screen is added to the navigation map. In general, instances of components are related to the template used to create the component instance. This is desirable, because a change in the template may be automatically propagated to all of the component instances that refer to the template. Templates may significantly reduce the user workload when creating components by automatically recognizing the component on the screen, and by automatically setting all the properties of the component.
  • At block 406, a screen is defined using components defined at block 404. In some embodiments, a screen is defined using one or more components, some or all of which must be present in order for the screen to be recognized and operate properly. A user interface for defining a screen according to embodiments of the invention is described below.
  • At block 408, a navigation map is defined using one or more screens. In general, a navigation map comprises a set of one or more screens as defined at block 406, together with commands and data that define the transitions from one screen to another. A user interface for creating and maintaining a navigation map is provided below.
  • At block 410, a test verb may be defined using test logic and the navigation maps defined at block 408. Similarly, at block 412 a test case may be defined using test logic and the navigation maps defined at block 408. The test logic for a test case may invoke a test verb. An interface screen for defining test logic and using navigation maps is described below.
  • At block 414, a test set may be defined using one or more test cases. Similarly, at block 416 a test sequence may be defined using one or more test cases, one or more test sets, or a combination of test cases and test sets. A test sequence may include an order of execution for the test cases and test sets in the test sequence.
  • At block 418, a test session is defined. A test session defines the global variables, and other test parameters for a particular execution of a test sequence, test set, or test case.
  • FIGS. 5A-5D illustrate example navigation map panels and screen editor panels of a test development environment according to embodiments of the invention that may be used in the test design process illustrated in FIG. 4.
  • FIG. 5A illustrates example screen interfaces 502 and 504 for selecting a platform definition and device for a navigation map. In some embodiments, a platform definition is selected in the case that multiple polymorphic versions of the navigation map have been defined. Screen interface 502 provides an example platform definition selection screen. In some embodiments, the available platform definitions are provided in a list. The desired platform definition may be selected and used to provide a platform definition context for the navigation map.
  • Screen interface 504 provides a device selection interface. Screen 504 provides a list of available devices defined within the repository. Those devices having attributes that match the currently selected platform definition may be highlighted in one manner, while those devices whose attributes do not match the currently selected platform type may be highlighted differently or not highlighted at all. For example, in some embodiments, devices having attributes that match the currently selected platform definition are displayed using green text, while devices that do not match the currently selected platform are displayed using red text.
  • FIG. 5B illustrates an example main navigation map screen 506 according to embodiments of the invention. In the example shown, a navigation map has been selected from explorer pane 302, and detail pane 304 is updated to display the selected navigation map 508. In the example, the navigation map 508 includes data for navigating between seven screens 510.1-510.7. Screen 510.1 comprises a main screen on the DUT, screen 510.2 comprises a tasks screen on the DUT, screen 510.3 comprises a clock screen, screen 510.4 comprises a main notes screen on the DUT, screen 510.5 comprises an owner information screen on the DUT, screen 510.6 comprises an email screen, and screen 510.7 comprises an add notes screen. Each of screens 510.2-510.6 is reachable from the main screen 510.1, and each screen 510.2-510.6 can return to the main screen. However, screen 510.7 is reachable only through main notes screen 510.4. Connecting lines 512 illustrate the navigation between the screens. In some embodiments, arrows indicate the direction in which navigation is permitted. As shown in FIG. 5B, navigation between screens is not limited to screens that are directly connected. For example, to navigate from screen 510.1 to screen 510.7, the system automatically recognizes that screen 510.4 may be used as an intermediate screen to get from screen 510.1 to 510.7.
  • Screens may be added to and deleted from the navigation map. In some embodiments, recording buttons 514 may be used to add screens to a navigation map. The recording buttons may include a record button, pause button, and stop button. Upon pressing the record button, stimulus originating from a virtual device 519 is sent to the DUT and is recorded. After the DUT has transitioned to a new screen, as seen on the virtual device 519, the user may press the add screen button 518. After the add screen button 518 is pressed, the screen is added to the navigation map, displayed on navigation map screen 506, and may be saved. A connecting line 512, called a screen transition, is placed on the navigation map 508 connecting the new screen to the originating screen. The connecting line 512 represents the stimulus (transition steps) that caused the screen transition. The user may view or edit the transition steps by double-clicking on the line if for any reason the steps need to be changed.
  • When not in record mode, selecting a screen 510 from navigation map 508 causes the system to issue the appropriate commands to navigate to the selected screen on the DUT. In addition, the current screen may be highlighted on the navigation map 506 using any mechanism for highlighting known in the art. For example, the screen border may be displayed in a different or brighter color than non-selected screens, or the screen 510 may be made brighter or made to blink, etc.
  • In some embodiments, a screen may be designated as the anchored screen. In this example, the anchored screen is designated by an anchor icon 517. A screen anchor is a set of input steps that will cause the DUT to go to the anchored screen regardless of what screen the DUT is currently on. The system uses screen anchors when/if it cannot recognize the current screen on the DUT. For example, most systems have a mechanism for returning a screen to a home or initial screen, no matter what screen a DUT may currently be displaying. A screen anchor is useful in placing the system in a known initial state when a test case begins. In addition, an anchor screen may be used to place the system at a known screen in the event that the system cannot locate a screen during a screen navigation operation. This capability makes it possible to continue test execution even though a DUT failure has occurred. A visual indication is shown for screens which have a screen identifier defined. In the example, screens with identifiers are shown with a green checkmark icon 515.
  • As can be seen from the above, some embodiments implement a declarative screen model for automated testing. In general, declarative models describe what something is like, rather than how to create it. For example, a declarative model describes the properties of the objects in the model.
  • FIG. 5C illustrates a screen editor 520 according to embodiments of the invention. Screen editor 520 may be entered upon selecting a screen from navigation map 506 when the test environment is not in record mode. Screen editor 520 includes a component explorer 528, screen bitmap pane 522, and a screen component list 524. Screen bitmap pane 522 displays a bitmap of the currently selected screen.
  • Component explorer 528 contains a list of templates for each type of component that may be used to define a screen of a DUT, and which may be automatically recognized by the system.
  • Screen component list 524 displays a list of components defined for the screen. The components may be recognized components or not recognized components. A recognized component is one that the system has successfully detected as being present on the screen bitmap 522. In some embodiments, recognized components are highlighted on the screen bitmap 522. For example, a recognized component such as a button may be surrounded by a highlighted box. In the example shown, owner button 526 is among the components that have been recognized as present in screen bitmap 522. Not recognized components are components that have been defined as possibly being present, but are not currently found on screen bitmap 522. In some embodiments, screen component list 524 lists the recognized components and the unrecognized components defined for a particular screen. Components in the list may be designated as screen identifiers through the use of a right-click menu. Components that have been designated as identifiers for the screen are shown by using bolded type for the component name 529. A screen identifier is a component or set of components that may be used to uniquely recognize a screen. The system then uses the screen identifiers to determine which screen a DUT is currently on, and may also use the screen identifiers to determine that the system has navigated to the correct screen.
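  • Purely as an illustrative sketch of identifier-based screen recognition, the routine below walks the defined screens and reports the first one whose identifier components are all found on the captured bitmap; the names are hypothetical, and componentFound merely stands in for whichever position, text, or icon based recognition routine is in use.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: determine the current screen by checking each screen's
// identifier components against the captured screen bitmap.
public static class ScreenRecognizer
{
    public static string IdentifyCurrentScreen(
        IEnumerable<(string ScreenName, List<string> IdentifierComponents)> screens,
        byte[] screenBitmap,
        Func<byte[], string, bool> componentFound)
    {
        foreach (var screen in screens)
        {
            if (screen.IdentifierComponents.All(c => componentFound(screenBitmap, c)))
                return screen.ScreenName;   // all identifiers matched this screen
        }
        return null;                        // current screen not recognized
    }
}
```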
  • In order to add a component to screen component list 524, a user selects the desired component type from component explorer pane 528. A recognition rule editor is then invoked that allows a user to define how the component will be recognized on a screen. Or, the user may use a component template that is listed under the component type. When a component template is used the component is automatically created, and therefore the recognition rule editor is not invoked.
  • FIG. 5D illustrates a recognition rule editor 530 according to embodiments of the invention. In the example shown, a user has indicated that a button is to be recognized. The recognition rule editor 530 includes a current screen bitmap 522, a recognition mode pane 532, and a bitmap 534. Recognition mode 532 controls how the button will be recognized. In some embodiments, recognition may be position based, text based, or icon based. In position based recognition, a particular bit pattern is expected at a particular position on the screen. The position may be defined using four corners, a border search, a line based search, or an absolute position. Additionally, recognized components may be highlighted on the screen. In some embodiments, recognized components are highlighted by placing a colored rectangle around the recognized portion.
  • In text based recognition, text on a button label may be used to recognize the button. The user may select a bitmap for the text, upon which the system may perform optical character recognition in order to determine the actual text contained in the bitmap. The user may then verify the correct text for the button.
  • In icon based recognition, a bitmap area for the button is selected from screen bitmap 522. In some embodiments, the user may define bits within the bitmap area as being significant to the button recognition, or as “don't care” bits whose values the system will ignore for purposes of button recognition. For example, a square area may be selected that contains a circular button. Pixels in the four corners of the selected area that are outside of the circular button bitmap may be designated as “don't care” bits because they are not significant in detecting the presence of the button. In addition, bits that may be set as part of a background color may be designated as “don't care” bits in order to allow for the same button to be detected no matter what background a user has selected.
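  • As a hedged sketch of this kind of icon matching, a candidate region can be compared to a reference bitmap pixel by pixel while skipping positions marked as “don't care”; the helper below is illustrative only.

```csharp
// Hypothetical sketch: pixel comparison with a "don't care" mask. A position
// marked true in the mask is ignored (e.g., corner pixels outside a circular
// button, or pixels that take on the background color).
public static class IconMatcher
{
    public static bool Matches(int[] reference, int[] candidate, bool[] dontCare)
    {
        if (reference.Length != candidate.Length || reference.Length != dontCare.Length)
            return false;                         // regions must be the same size
        for (int i = 0; i < reference.Length; i++)
        {
            if (dontCare[i]) continue;            // ignored for recognition purposes
            if (reference[i] != candidate[i]) return false;
        }
        return true;
    }
}
```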
  • A screen recognition component may be designated as a template component. In this case, a template is defined for recognizing the component. Once defined, the template may be applied to multiple screens where the same button may appear. Thus a user does not have to redefine a recognition rule for each screen a button may appear on. After a component template has been defined it appears under the parent component type in the component explorer list 528.
  • FIG. 6 illustrates an example test design panel 600 of a test development environment according to embodiments of the invention. In general, the test design panel is used to create, edit, debug, and run test cases and test verbs. The test design panel has three basic modes of operation. They are design, debug, and run modes. The design mode is used to create and edit test cases and verbs through recording on the virtual device 519, or by drag and drop construction using test logic blocks. In the example shown, test panel 600 includes a test case flow chart 602 and a test logic block list 604. Test logic block list 604 provides a list of the differing test symbols and operations that may be used to define a test case or test verb. Each symbol may comprise one or more steps in the test logic. In some embodiments, the list includes test logic blocks to assign values to variables, invoke other test cases, catch exceptions, test conditions or values, invoke custom written code, fill in variables or screen input areas with data from a data source, execute steps in the test case as part of a loop, execute an expression, go to a screen on a platform, log text to a log file, or end the test case. Those of skill in the art will appreciate that other symbols representing test logic may be used and are within the scope of the inventive subject matter.
  • To create or edit a test case or test verb, a user selects the desired test symbol from symbol list 604 and drags the symbol to flow chart 602. A new symbol may be connected to a previously existing symbol either based on the relative positions of the symbols, or a user may explicitly drag a connecting line from one symbol to another. Connection points on the symbols define the entry point for a test step and exit points for the test step. In some embodiments, two exit connection points may be provided. One connection point may be used to proceed to the next symbol if the current step successfully exits. A second connection point may be used to proceed to a differing symbol if the current step results in an error condition. In addition to using drag and drop operations to create and edit test cases and test verbs, the virtual device 519 may be used to add or insert test logic blocks into the test logic flow. DUT stimulus commands such as click on button or select menu item may be recorded into the test case or verb. When in design-record mode, components defined on the current screen are outlined to give the user feedback as to the presence of the screen component. The user can then use a right-click menu to record stimulus, read, or verify commands associated with the highlighted component. After command selection the DUT is stimulated, if the command is a stimulation command, and the corresponding test logic block is inserted into the test case or test verb. In addition to recording commands associated with screen components, GoTo Screen and GoTo Platform commands may be recorded as well. When a GoTo Screen command is selected/recorded the system automatically navigates to the selected screen on the DUT by using the associated navigation map. The corresponding test logic block is inserted into the test case or test verb. When a GoTo Platform command is selected the system automatically switches the DUT context to the DUT associated with the selected platform. Bubbles or clouds are used as background to show the test logic commands that are within the context of GoTo Screen and GoTo Platform commands. Within a GoTo Screen context bubble all the test logic blocks contained within are related to, or are in the context of, the selected screen. Within a GoTo Platform context bubble all the test logic blocks contained within are related to, or are in the context of, the selected platform, including GoTo Screen blocks.
  • In some embodiments, icons within the symbols in test case flow chart 602 indicate the type of symbol (loop, assignment etc.) and may also be selected to set break points at the symbol to be used for debugging purposes.
  • Further, in some embodiments, a variable window 606 may be displayed and used to examine the values of variables at various points in the execution of a test case.
  • FIG. 7 illustrates methods 700 for creating and using navigation maps according to embodiments of the invention. The method begins in some embodiments at block 702, where a virtual device representing a device under test may be displayed.
  • At block 704, the system receives a specification of a source screen, where the source screen may be an application screen (tasks, e-mail, owner, etc.) of a device under test. In some embodiments, the specification may be made by selecting the screen from the virtual device. Alternatively, the source screen may be selected from a screen image already present on a navigation map.
  • At block 706, the system receives a specification of a target screen. Typically the specification will be in the form of a selection from a virtual device. At block 708, the system records the transition commands that cause the transition from the source screen to the target screen.
  • At block 710 a graphical rendition of the navigation map is displayed, reflecting the addition of the target screen to the navigation map. Additionally, an indication of the transition may be displayed. In some embodiments, the indication is a line connecting the source screen and the target screen.
  • Blocks 704-708 may be repeated as desired to record and add multiple source and target screens, along with their associated transitions, to a navigation map.
  • At block 712 the navigation map may be stored in a repository for later use or editing.
  • The actions at blocks 702-712 may be referred to as design-time activities. That is, they take place when an automated test procedure is being created or edited. After the design-time activities have taken place, the run-time actions (blocks 714-718) using the previously created navigation map may take place.
  • At block 714 a system using the navigation map receives a command specifying a target screen. The command may be part of test logic for an automated test procedure such as a test case, or a test verb procedure within a test case. Alternatively, the command may be received as a selection from a graphical user interface indicating the user wishes to work with a different screen than is currently being displayed, for example, by a navigation map editor or a test case editor.
  • At block 716 the system determines the current screen. At block 718, the system reads transition commands from the navigation map to determine which commands or stimulus must be applied to the device under test to cause a transition from the current screen to the target screen. It should be noted that there may be one or more intermediate screens that will be transitioned through to reach the target screen. The commands to cause the transition may then be issued to the device under test.
  • In some embodiments, the system may verify that the target screen has been reached, and may verify intermediate screens to ensure that navigation has not been lost. Further, in some embodiments, if navigation is lost while attempting to transition from a source screen to a target screen, the system may issue commands to the device under test to cause the device to return to a previously specified anchor screen. The system can then treat the anchor screen as a source screen and issue commands to cause the device under test to transition to the target screen.
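  • The embodiments do not prescribe how a path through the navigation map is computed; purely as an illustrative sketch, a breadth-first search over the recorded transitions is one straightforward way to assemble the command sequence, including any intermediate hops. The types and names below are assumptions.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: find the recorded commands needed to move from the
// current screen to the target screen, possibly through intermediate screens.
public static class MapNavigator
{
    // transitions[from] -> list of (toScreen, recorded commands for that hop)
    public static List<string> FindCommandPath(
        Dictionary<string, List<(string To, List<string> Commands)>> transitions,
        string currentScreen, string targetScreen)
    {
        var previous = new Dictionary<string, (string From, List<string> Commands)>();
        var visited = new HashSet<string> { currentScreen };
        var queue = new Queue<string>();
        queue.Enqueue(currentScreen);

        while (queue.Count > 0)
        {
            string screen = queue.Dequeue();
            if (screen == targetScreen) break;
            if (!transitions.TryGetValue(screen, out var hops)) continue;
            foreach (var hop in hops)
            {
                if (visited.Add(hop.To))
                {
                    previous[hop.To] = (screen, hop.Commands);
                    queue.Enqueue(hop.To);
                }
            }
        }

        if (currentScreen != targetScreen && !previous.ContainsKey(targetScreen))
            throw new InvalidOperationException("No recorded path to " + targetScreen);

        // Walk back from the target, prepending each hop's commands to keep forward order.
        var commands = new List<string>();
        for (string s = targetScreen; s != currentScreen; s = previous[s].From)
            commands.InsertRange(0, previous[s].Commands);
        return commands;
    }
}
```

  • Consistent with the anchor-screen behavior described above, if the current screen cannot be recognized at all, the anchor screen's recorded input steps could be replayed first and the search restarted with the anchor screen as the source.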
  • Multiple Device Environments
  • As discussed above, a system may include multiple target services 160, each of which may manage multiple devices under test 140. It should be noted that the number of devices under test 140 can grow quite large. In addition, many different types and versions of devices under test may be coupled to the system via target services 160, and many different software environments may exist on the devices under test. As a result, the number of platform types, platform groups, and platform definitions may be quite large and vary from system to system.
  • Additionally, an automated test procedure may need to deal with more than one device under test during the design, debugging, and execution of the automated test procedure.
  • Thus, in some embodiments, the system manages both pools of similar devices under test and groups of disparate devices under test. In some embodiments, platform definitions, platform types, and platform roles as described above may be used to determine which of the many devices under test available within a system should be selected when designing, debugging, or executing an automated test procedure.
  • In some embodiments, a system selects a device under test based on whether the candidate device is available and whether the candidate device attributes match the parameters defined by the platform definition, the platform group, the platform type, and/or any platform roles utilized by an automated test procedure.
  • A further aspect of a multi-device system of some embodiments includes polymorphism for various aspects of the system. In general, polymorphism refers to the ability to process objects differently depending on their platform type, platform group, platform definition, or role. In some embodiments, the system provides polymorphic navigation maps and polymorphic test verbs. However, the embodiments are not limited to applying polymorphic operation to any particular element of the system, and in alternative embodiments, polymorphism may be applied to other elements. For example, some embodiments may provide polymorphic component templates.
  • A polymorphic navigation map comprises a navigation map that may be reused across all of the platforms within a platform group. In general a polymorphic navigation map provides an abstraction of the presentation layer for the platforms within a group. Thus the navigation map may remain constant (and be reused) for platforms within a platform group because the underlying behavior of the operating systems and applications within a platform group does not change across devices in a platform group.
  • A polymorphic test verb comprises a test verb that may be reused across platform types. In general, a polymorphic test verb provides an abstraction of the behavior of devices. For example, two devices may implement a task list in very different ways, using different device behaviors to add tasks to the task list. However, a single polymorphic test verb labeled “CreateTask” may be called from an automated test procedure. The interface to the test verb may remain constant; however, different versions of the test verb may be implemented for various platform types. The particular instance of the test verb called may vary depending on the platform type for a current device under test. The automated test procedure need only call the polymorphic test verb; the system determines the particular instance of the test verb depending on the platform context. Thus it may not be necessary to alter the test logic of an automated test procedure when a new platform type is added; a designer may only need to supply an instance of the polymorphic test verb for the new platform type.
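  • Purely as a hedged sketch of polymorphic test verb dispatch, the fragment below keys verb instances by platform type so that test logic calls a single “CreateTask” interface while the platform context selects the implementation; all type names are hypothetical.

```csharp
using System.Collections.Generic;

// Hypothetical sketch: one verb interface, one implementation per platform type,
// and a registry that resolves the instance from the current platform context.
public interface ICreateTaskVerb
{
    void CreateTask(string title);
}

public class PocketPcCreateTask : ICreateTaskVerb
{
    public void CreateTask(string title) { /* PocketPC-specific input steps */ }
}

public class SmartPhoneCreateTask : ICreateTaskVerb
{
    public void CreateTask(string title) { /* SmartPhone-specific input steps */ }
}

public class TestVerbRegistry
{
    private readonly Dictionary<string, ICreateTaskVerb> _byPlatformType =
        new Dictionary<string, ICreateTaskVerb>
        {
            ["PocketPC OS"] = new PocketPcCreateTask(),
            ["SmartPhone OS"] = new SmartPhoneCreateTask(),
        };

    // Test logic supplies only the platform context; the matching verb instance is returned.
    public ICreateTaskVerb Resolve(string platformType) => _byPlatformType[platformType];
}
```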
  • A further aspect of various embodiments includes managing test sessions in which multiple test cases may need to access multiple devices. This can occur because a single test case may access more than one device under test, or different test cases may access different devices under test. In some embodiments, the test cases may specify a platform role. During a test session, the platform role may be associated with a particular platform type, platform group, and/or platform definition. The system aggregates all of the roles within the potentially many test cases within a test session, and determines how many unique roles exist for the test session. Thus within a test session, each test case that refers to the same platform role will access the same device under test. This is desirable, because it allows a test designer to avoid having to specify particular devices under test for each of what may be many test cases.
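  • As an illustrative sketch of the role aggregation just described (with hypothetical names), the helper below collects the unique platform roles referenced across a session's test cases and binds each role to one available device, so that every test case naming the same role exercises the same device under test.

```csharp
using System.Collections.Generic;

// Hypothetical sketch: aggregate the roles used by all test cases in a session
// and bind each unique role to a single device for the duration of the session.
public static class SessionRoleBinder
{
    public static Dictionary<string, string> BindRolesToDevices(
        IEnumerable<IEnumerable<string>> rolesPerTestCase,
        Queue<string> availableDevices)
    {
        var binding = new Dictionary<string, string>();
        foreach (var roles in rolesPerTestCase)
            foreach (var role in roles)
                if (!binding.ContainsKey(role))
                    binding[role] = availableDevices.Dequeue();   // one device per unique role
        return binding;                                           // role name -> device name
    }
}
```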
  • The systems and methods described above can include hardware, firmware, and/or software for performing the operations described herein. Furthermore, any of the components can include machine-readable media including instructions for causing a machine to perform the operations described herein. Machine-readable media includes any mechanism that provides (i.e., stores and/or transmits) information in a form readable by a machine (e.g., a computer). For example, tangible machine-readable media includes read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc. Machine-readable media also includes any media suitable for transmitting software over a network.
The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced.
Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. The elements, materials, geometries, dimensions, and sequence of operations can all be varied to suit particular packaging requirements.
Embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense.
The Abstract is provided to comply with 37 C.F.R. § 1.72(b) to allow the reader to quickly ascertain the nature and gist of the technical disclosure. The Abstract is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
In the foregoing Detailed Description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that any embodiment have more features than are expressly recited in a claim. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (19)

1. A method comprising:
providing an interface to specify at least a source screen and a target screen for a device under test; and
storing in a navigation map a set of one or more commands that specify a transition from the source screen to the target screen.
2. The method of claim 1, further comprising recording the set of one or more commands that specify the transition from the source screen to the target screen.
3. The method of claim 2, further comprising:
displaying a virtual device representing the device under test; and
recording stimulus from the virtual device to form the set of one or more commands.
4. The method of claim 1, wherein specifying the source screen and the target screen comprises selecting a graphical screen image of the source screen and a graphical image of the target screen.
5. The method of claim 1, wherein the navigation map comprises a plurality of transitions between one or more source screens and one or more target screens.
6. The method of claim 5, wherein the navigation map comprises a declarative model specifying the transitions.
7. The method of claim 1, further comprising displaying a graphical representation of the navigation map, the graphical representation including screen images of a set of screens in the navigation map.
8. The method of claim 1, wherein the graphical representation includes an indication of a current screen of the device under test.
9. The method of claim 1, wherein the graphical representation includes graphical indicators of the transitions between the set of screens in the navigation map.
10. A method comprising:
receiving a command that specifies a target screen for a device under test;
determining a current screen for the device under test; and
interpreting a navigation map comprising a set of one or more stored commands that specify a transition from the current screen to the target screen.
11. The method of claim 10, wherein the command is received as a result of executing test logic.
12. The method of claim 10, wherein the command is received as a result of a screen selection within a navigation map editor.
13. The method of claim 10, wherein the set of one or more stored commands specify a transition to at least one intermediate screen between the current screen and the target screen.
14. The method of claim 10, further comprising issuing the commands to the device under test.
15. The method of claim 10, further comprising displaying a graphical representation of the navigation map, wherein the graphical representation includes an indication of the current screen of the device under test.
16. A system comprising:
a test development environment;
a repository communicably coupled to the test development environment;
a navigation map user interface operable to:
receive a specification of at least a source screen and a target screen for a device under test; and
store in the repository a navigation map including a set of one or more commands that specify a transition from the source screen to the target screen; and
a device under test communicably coupled to the test development environment.
17. The system of claim 16, further comprising a virtual device module to display a representation of a device under test, the representation including a current screen of the device under test.
18. The system of claim 17, wherein the virtual device is operable to receive stimulus that is forwarded to the device under test.
19. The system of claim 17, wherein the device under test is an emulated device.
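For illustration only, the following sketch suggests how a navigation map of the kind recited in claims 1 and 10 might be represented and interpreted; the NavigationMap and Transition classes, the breadth-first search, and the example commands are hypothetical assumptions rather than the claimed implementation.

```java
import java.util.*;

class Transition {
    final String source, target;
    final List<String> commands; // stored commands that drive the transition
    Transition(String source, String target, List<String> commands) {
        this.source = source; this.target = target; this.commands = commands;
    }
}

class NavigationMap {
    private final Map<String, List<Transition>> bySource = new HashMap<>();

    void addTransition(String source, String target, String... commands) {
        bySource.computeIfAbsent(source, s -> new ArrayList<>())
                .add(new Transition(source, target, Arrays.asList(commands)));
    }

    // Interpret the map: collect the stored commands that move the device
    // under test from its current screen to the requested target screen,
    // possibly passing through intermediate screens.
    List<String> commandsTo(String currentScreen, String targetScreen) {
        Map<String, Transition> cameFrom = new HashMap<>();
        Deque<String> queue = new ArrayDeque<>(List.of(currentScreen));
        Set<String> seen = new HashSet<>(List.of(currentScreen));
        while (!queue.isEmpty()) {
            String screen = queue.poll();
            if (screen.equals(targetScreen)) break;
            for (Transition t : bySource.getOrDefault(screen, List.of())) {
                if (seen.add(t.target)) { cameFrom.put(t.target, t); queue.add(t.target); }
            }
        }
        LinkedList<String> commands = new LinkedList<>();
        for (String s = targetScreen; cameFrom.containsKey(s); s = cameFrom.get(s).source) {
            commands.addAll(0, cameFrom.get(s).commands);
        }
        return commands; // empty if the target is unreachable or already current
    }
}

public class NavigationMapDemo {
    public static void main(String[] args) {
        NavigationMap map = new NavigationMap();
        map.addTransition("Home", "Menu", "press SOFTKEY_LEFT");
        map.addTransition("Menu", "Tasks", "press DOWN", "press SELECT");

        // Commands that would be issued to reach "Tasks" from "Home".
        System.out.println(map.commandsTo("Home", "Tasks"));
    }
}
```

Interpreting the map in this way would let test logic, or a navigation map editor, request a target screen without enumerating the intermediate screens, in the manner contemplated by claim 13.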
US11/421,407 2005-05-31 2006-05-31 Systems and methods providing a declarative screen model for automated testing Abandoned US20070005299A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/421,407 US20070005299A1 (en) 2005-05-31 2006-05-31 Systems and methods providing a declarative screen model for automated testing
US12/272,652 US20090125826A1 (en) 2005-05-31 2008-11-17 Systems and methods providing a declarative screen model for automated testing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US68595805P 2005-05-31 2005-05-31
US11/421,407 US20070005299A1 (en) 2005-05-31 2006-05-31 Systems and methods providing a declarative screen model for automated testing

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/272,652 Continuation US20090125826A1 (en) 2005-05-31 2008-11-17 Systems and methods providing a declarative screen model for automated testing

Publications (1)

Publication Number Publication Date
US20070005299A1 true US20070005299A1 (en) 2007-01-04

Family

ID=37482262

Family Applications (6)

Application Number Title Priority Date Filing Date
US11/421,453 Abandoned US20060271322A1 (en) 2005-05-31 2006-05-31 Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices
US11/421,476 Expired - Fee Related US7529990B2 (en) 2005-05-31 2006-05-31 Systems and methods for managing multi-device test sessions
US11/421,407 Abandoned US20070005299A1 (en) 2005-05-31 2006-05-31 Systems and methods providing a declarative screen model for automated testing
US11/421,464 Abandoned US20070005300A1 (en) 2005-05-31 2006-05-31 Systems and methods for graphically defining automated test procedures
US11/421,468 Abandoned US20070005281A1 (en) 2005-05-31 2006-05-31 Systems and Methods Providing Reusable Test Logic
US12/272,652 Abandoned US20090125826A1 (en) 2005-05-31 2008-11-17 Systems and methods providing a declarative screen model for automated testing

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US11/421,453 Abandoned US20060271322A1 (en) 2005-05-31 2006-05-31 Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices
US11/421,476 Expired - Fee Related US7529990B2 (en) 2005-05-31 2006-05-31 Systems and methods for managing multi-device test sessions

Family Applications After (3)

Application Number Title Priority Date Filing Date
US11/421,464 Abandoned US20070005300A1 (en) 2005-05-31 2006-05-31 Systems and methods for graphically defining automated test procedures
US11/421,468 Abandoned US20070005281A1 (en) 2005-05-31 2006-05-31 Systems and Methods Providing Reusable Test Logic
US12/272,652 Abandoned US20090125826A1 (en) 2005-05-31 2008-11-17 Systems and methods providing a declarative screen model for automated testing

Country Status (2)

Country Link
US (6) US20060271322A1 (en)
WO (1) WO2006130684A2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060271322A1 (en) * 2005-05-31 2006-11-30 David Haggerty Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices
US20080016519A1 (en) * 2006-04-19 2008-01-17 Nec Corporation Screen transition program generating method and device
US20090006063A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Tuning and optimizing distributed systems with declarative models
US20090113292A1 (en) * 2007-10-26 2009-04-30 Microsoft Corporation Flexibly editing heterogeneous documents
US20090113407A1 (en) * 2007-10-26 2009-04-30 Microsoft Corporation Managing software lifecycle
US7814198B2 (en) 2007-10-26 2010-10-12 Microsoft Corporation Model-driven, repository-based application monitoring system
US7926070B2 (en) 2007-10-26 2011-04-12 Microsoft Corporation Performing requested commands for model-based applications
US7974939B2 (en) 2007-10-26 2011-07-05 Microsoft Corporation Processing model-based commands for distributed applications
US8024396B2 (en) 2007-04-26 2011-09-20 Microsoft Corporation Distributed behavior controlled execution of modeled applications
US8099720B2 (en) 2007-10-26 2012-01-17 Microsoft Corporation Translating declarative models
US8181151B2 (en) 2007-10-26 2012-05-15 Microsoft Corporation Modeling and managing heterogeneous applications
US8230386B2 (en) 2007-08-23 2012-07-24 Microsoft Corporation Monitoring distributed applications
US8239505B2 (en) 2007-06-29 2012-08-07 Microsoft Corporation Progressively implementing declarative models in distributed systems
CN103324452A (en) * 2012-03-23 2013-09-25 百度在线网络技术(北京)有限公司 Control device and method for displaying content of terminal screen synchronously

Families Citing this family (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8141043B2 (en) 2005-01-11 2012-03-20 Worksoft, Inc. Automated business process testing that spans multiple platforms or applications
US7600220B2 (en) 2005-01-11 2009-10-06 Worksoft, Inc. Extensible execution language
US8589140B1 (en) 2005-06-10 2013-11-19 Wapp Tech Corp. System and method for emulating and profiling a frame-based application playing on a mobile device
US7813910B1 (en) 2005-06-10 2010-10-12 Thinkvillage-Kiwi, Llc System and method for developing an application playing on a mobile device emulated on a personal computer
US7676783B2 (en) * 2005-06-27 2010-03-09 Ikoa Corporation Apparatus for performing computational transformations as applied to in-memory processing of stateful, transaction oriented systems
JP4479664B2 (en) * 2006-01-24 2010-06-09 株式会社豊田中央研究所 Multiple test system
JP4148527B2 (en) * 2006-06-05 2008-09-10 インターナショナル・ビジネス・マシーンズ・コーポレーション Functional test script generator
US7559001B2 (en) * 2006-06-14 2009-07-07 Sapphire Infotech Inc. Method and apparatus for executing commands and generation of automation scripts and test cases
US7801050B2 (en) * 2006-12-12 2010-09-21 Cisco Technology, Inc. Remote testing of an electronic device via network connection
US8595784B2 (en) * 2007-01-05 2013-11-26 Verizon Patent And Licensing Inc. System for testing set-top boxes and content distribution networks and associated methods
US8538414B1 (en) 2007-07-17 2013-09-17 Google Inc. Mobile interaction with software test cases
US8707269B1 (en) * 2007-11-19 2014-04-22 The Mathworks, Inc. Dynamic test generation based on entities in a graphical environment
US7984335B2 (en) * 2008-03-20 2011-07-19 Microsoft Corporation Test amplification for datacenter applications via model checking
US9009666B1 (en) 2008-05-30 2015-04-14 United Services Automobile Association (Usaa) Systems and methods for testing software and for storing and tracking test assets with the software
CN101645983A (en) * 2008-08-08 2010-02-10 鸿富锦精密工业(深圳)有限公司 Network management system and method using same for testing network equipment
US20100115444A1 (en) * 2008-11-03 2010-05-06 James Adam Cataldo Plot-Driven Measurement
US8838819B2 (en) 2009-04-17 2014-09-16 Empirix Inc. Method for embedding meta-commands in normal network packets
US9298686B2 (en) * 2009-05-14 2016-03-29 Golub Capital, Llc System and method for simulating discrete financial forecast calculations
TW201101170A (en) * 2009-06-26 2011-01-01 Ibm Computer apparatus and method for processing graphic user interface (GUI) objects
US8402446B2 (en) * 2009-11-30 2013-03-19 International Business Machines Corporation Associating probes with test cases
US8386207B2 (en) * 2009-11-30 2013-02-26 International Business Machines Corporation Open-service based test execution frameworks
US8745727B2 (en) * 2010-04-23 2014-06-03 Verizon Patent And Licensing Inc. Graphical user interface tester
US8301937B2 (en) * 2010-05-26 2012-10-30 Ncr Corporation Heartbeat system
US9329908B2 (en) 2010-09-29 2016-05-03 International Business Machines Corporation Proactive identification of hotspots in a cloud computing environment
CN103502952B (en) * 2011-03-08 2017-06-09 惠普发展公司,有限责任合伙企业 Create test case
US20120246609A1 (en) 2011-03-24 2012-09-27 International Business Machines Corporation Automatic generation of user stories for software products via a product content space
AU2012203333A1 (en) 2011-06-15 2013-01-10 Agile Software Pty Limited Method and apparatus for testing data warehouses
AU2013200887B2 (en) * 2012-02-18 2015-09-17 Tata Consultancy Services Limited Multi-entity test case execution workflow
US8949673B2 (en) * 2012-05-23 2015-02-03 Sap Se Software systems testing interface
JP5833502B2 (en) * 2012-06-04 2015-12-16 株式会社アドバンテスト Test program
JP2013250250A (en) 2012-06-04 2013-12-12 Advantest Corp Tester hardware and test system using the same
JP2013250252A (en) * 2012-06-04 2013-12-12 Advantest Corp Test program
USD759704S1 (en) 2012-12-05 2016-06-21 Ivoclar Vivadent Ag Display screen or a portion thereof having an animated graphical user interface
US9104813B2 (en) * 2012-12-15 2015-08-11 International Business Machines Corporation Software installation method, apparatus and program product
US9111040B2 (en) 2013-01-15 2015-08-18 International Business Machines Corporation Integration of a software content space with test planning and test case generation
US9087155B2 (en) 2013-01-15 2015-07-21 International Business Machines Corporation Automated data collection, computation and reporting of content space coverage metrics for software products
US9396342B2 (en) 2013-01-15 2016-07-19 International Business Machines Corporation Role based authorization based on product content space
US9218161B2 (en) 2013-01-15 2015-12-22 International Business Machines Corporation Embedding a software content space for run-time implementation
US9063809B2 (en) 2013-01-15 2015-06-23 International Business Machines Corporation Content space environment representation
US9081645B2 (en) 2013-01-15 2015-07-14 International Business Machines Corporation Software product licensing based on a content space
US9075544B2 (en) 2013-01-15 2015-07-07 International Business Machines Corporation Integration and user story generation and requirements management
US9659053B2 (en) * 2013-01-15 2017-05-23 International Business Machines Corporation Graphical user interface streamlining implementing a content space
US9141379B2 (en) 2013-01-15 2015-09-22 International Business Machines Corporation Automated code coverage measurement and tracking per user story and requirement
US9069647B2 (en) 2013-01-15 2015-06-30 International Business Machines Corporation Logging and profiling content space data and coverage metric self-reporting
KR20140095882A (en) * 2013-01-25 2014-08-04 삼성전자주식회사 Test system for evaluating mobile device and driving method thereof
US9785542B2 (en) * 2013-04-16 2017-10-10 Advantest Corporation Implementing edit and update functionality within a development environment used to compile test plans for automated semiconductor device testing
CA2910977A1 (en) * 2013-05-02 2014-11-06 Amazon Technologies, Inc. Program testing service
JP2014235127A (en) * 2013-06-04 2014-12-15 株式会社アドバンテスト Test system, control program, and configuration data write method
US9529699B2 (en) * 2013-06-11 2016-12-27 Wipro Limited System and method for test data generation and optimization for data driven testing
EP3021225B1 (en) * 2014-11-14 2020-07-01 Mastercard International, Inc. Automated configuration code based selection of test cases for payment terminals
US9838295B2 (en) 2015-11-23 2017-12-05 Contec, Llc Wireless routers under test
US10320651B2 (en) 2015-10-30 2019-06-11 Contec, Llc Hardware architecture for universal testing system: wireless router test
US20170126536A1 (en) 2015-10-30 2017-05-04 Contec, Llc Hardware Architecture for Universal Testing System: Cable Modem Test
US9960989B2 (en) * 2015-09-25 2018-05-01 Contec, Llc Universal device testing system
US10277497B2 (en) 2015-09-25 2019-04-30 Contec, Llc Systems and methods for testing electronic devices using master-slave test architectures
US10122611B2 (en) 2015-09-25 2018-11-06 Contec, Llc Universal device testing interface
US9900116B2 (en) 2016-01-04 2018-02-20 Contec, Llc Test sequences using universal testing system
US9810735B2 (en) 2015-09-25 2017-11-07 Contec, Llc Core testing machine
US9992084B2 (en) 2015-11-20 2018-06-05 Contec, Llc Cable modems/eMTAs under test
US10291959B2 (en) 2015-09-25 2019-05-14 Contec, Llc Set top boxes under test
US9900113B2 (en) 2016-02-29 2018-02-20 Contec, Llc Universal tester hardware
US10649024B2 (en) * 2017-03-03 2020-05-12 Pioneer Decisive Solutions, Inc. System for providing ATE test programming by utilizing drag-and-drop workflow editing in a time domain environment
US10938687B2 (en) * 2017-03-29 2021-03-02 Accenture Global Solutions Limited Enabling device under test conferencing via a collaboration platform
US10820274B2 (en) * 2017-06-19 2020-10-27 T-Mobile Usa, Inc. Systems and methods for testing power consumption of electronic devices
US10725890B1 (en) 2017-07-12 2020-07-28 Amazon Technologies, Inc. Program testing service
US11550708B1 (en) 2018-05-17 2023-01-10 Konark Research, Inc. System, method and apparatus for selection of hardware and software for optimal implementation of one or more functionality or algorithm
US11080168B1 (en) 2018-05-17 2021-08-03 Konark Research, Inc. System, method and apparatus for selection of hardware and software for optimal implementation of one or more functionality or algorithm
CN109446074B (en) * 2018-09-28 2022-04-01 深圳市网心科技有限公司 Pressure testing method and device for on-demand distribution system, computer device and computer storage medium
US10824541B1 (en) 2018-10-18 2020-11-03 State Farm Mutual Automobile Insurance Company System and method for test data fabrication
CN109298984A (en) * 2018-10-30 2019-02-01 天津津航计算技术研究所 A kind of one-to-many test macro and test method based on Ethernet
US10353804B1 (en) * 2019-01-22 2019-07-16 Capital One Services, Llc Performance engineering platform and metric management
USD947219S1 (en) * 2019-09-12 2022-03-29 Lenovo (Beijing) Co., Ltd. Display screen or portion thereof with graphical user interface
TWI724742B (en) * 2020-01-09 2021-04-11 華碩電腦股份有限公司 Diagnostic system

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3082374A (en) * 1959-06-12 1963-03-19 Itt Automatic testing system and timing device therefor
US5956665A (en) * 1996-11-15 1999-09-21 Digital Equipment Corporation Automatic mapping, monitoring, and control of computer room components
US5973692A (en) * 1997-03-10 1999-10-26 Knowlton; Kenneth Charles System for the capture and indexing of graphical representations of files, information sources and the like
US6331864B1 (en) * 1997-09-23 2001-12-18 Onadime, Inc. Real-time multimedia visual programming system
US6449741B1 (en) * 1998-10-30 2002-09-10 Ltx Corporation Single platform electronic tester
US20030084429A1 (en) * 2001-10-26 2003-05-01 Schaefer James S. Systems and methods for table driven automation testing of software programs
US20030101025A1 (en) * 2001-08-15 2003-05-29 National Instruments Corporation Generating a configuration diagram based on user specification of a task
US20030208712A1 (en) * 2002-05-01 2003-11-06 Michael Louden Method and apparatus for making and using wireless test verbs
US20030208288A1 (en) * 2002-05-01 2003-11-06 Testquest, Inc. Method and apparatus for making and using test verbs
US20030208542A1 (en) * 2002-05-01 2003-11-06 Testquest, Inc. Software test agents
US20040027373A1 (en) * 2002-08-07 2004-02-12 Jacquot Bryan Joseph Linked screen demonstration program for computer application programs
US20040124860A1 (en) * 2001-12-20 2004-07-01 Hamdan Fadi Adel Automated test sequence editor and engine for transformer testing
US6948152B2 (en) * 2001-09-14 2005-09-20 Siemens Communications, Inc. Data structures for use with environment based data driven automated test engine for GUI applications
US6961873B2 (en) * 2001-09-14 2005-11-01 Siemens Communications, Inc. Environment based data driven automated test engine for GUI applications
US20060100991A1 (en) * 2004-10-21 2006-05-11 International Business Machines Corporation Method for dynamical determination of actions to perform on a selected item in a web portal GUI environment
US20060271322A1 (en) * 2005-05-31 2006-11-30 David Haggerty Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2996666A (en) * 1957-07-12 1961-08-15 Gen Motors Corp Automatic test apparatus
US4057847A (en) * 1976-06-14 1977-11-08 Sperry Rand Corporation Remote controlled test interface unit
US5291587A (en) * 1986-04-14 1994-03-01 National Instruments, Inc. Graphical system for executing a process and for programming a computer to execute a process, including graphical variable inputs and variable outputs
US4914568A (en) * 1986-10-24 1990-04-03 National Instruments, Inc. Graphical system for modelling a process and associated method
US4894829A (en) * 1988-04-21 1990-01-16 Honeywell Inc. Comprehensive design and maintenance environment for test program sets
US5390129A (en) * 1992-07-06 1995-02-14 Motay Electronics, Inc. Universal burn-in driver system and method therefor
JPH08241185A (en) * 1994-11-03 1996-09-17 Motorola Inc Integrated testing and measuring means as well as method foradoption of graphical user interface
US5953009A (en) * 1997-05-27 1999-09-14 Hewlett-Packard Company Graphical system and method for invoking measurements in a signal measurement system
JPH11118884A (en) * 1997-10-10 1999-04-30 Advantest Corp Testing system and method for controlling the same
US6526566B1 (en) * 1997-11-14 2003-02-25 National Instruments Corporation Graphical programming system and method including nodes for programmatically accessing data sources and targets
US6128759A (en) * 1998-03-20 2000-10-03 Teradyne, Inc. Flexible test environment for automatic test equipment
US6332211B1 (en) * 1998-12-28 2001-12-18 International Business Machines Corporation System and method for developing test cases using a test object library
US6624830B1 (en) * 1999-10-29 2003-09-23 Agilent Technologies, Inc. System and method for defining and grouping signals and buses of a signal measurement system using selection lists on a graphical user interface
US6606721B1 (en) * 1999-11-12 2003-08-12 Obsidian Software Method and apparatus that tracks processor resources in a dynamic pseudo-random test program generator
US6829733B2 (en) * 2001-05-07 2004-12-07 National Instruments Corporation System and method for graphically detecting differences between test executive sequence files
US7162387B2 (en) * 2001-06-29 2007-01-09 National Instruments Corporation Measurement system graphical user interface for easily configuring measurement applications
US8290762B2 (en) * 2001-08-14 2012-10-16 National Instruments Corporation Graphically configuring program invocation relationships by creating or modifying links among program icons in a configuration diagram
US7367028B2 (en) * 2001-08-14 2008-04-29 National Instruments Corporation Graphically deploying programs on devices in a system
US7594220B2 (en) * 2001-08-14 2009-09-22 National Instruments Corporation Configuration diagram with context sensitive connectivity
US6984152B2 (en) * 2001-10-30 2006-01-10 Texas Instruments Incorporated Multifunction passive socket for flash media cards
US7047442B2 (en) * 2002-04-23 2006-05-16 Agilent Technologies, Inc. Electronic test program that can distinguish results
US20040081346A1 (en) * 2002-05-01 2004-04-29 Testquest, Inc. Non-intrusive testing system and method
US7139979B2 (en) * 2002-06-10 2006-11-21 National Instruments Corporation Displaying operations in an application using a graphical programming representation
US20040027378A1 (en) * 2002-08-06 2004-02-12 Hays Grace L. Creation of user interfaces for multiple devices
KR100496861B1 (en) * 2002-09-26 2005-06-22 삼성전자주식회사 Test apparatus having two test boards to one handler and the test method
US7143361B2 (en) * 2002-12-16 2006-11-28 National Instruments Corporation Operator interface controls for creating a run-time operator interface application for a test executive sequence
US7460988B2 (en) * 2003-03-31 2008-12-02 Advantest Corporation Test emulator, test module emulator, and record medium storing program therein
US7552024B2 (en) * 2004-03-08 2009-06-23 Kelbon Richard G Circuit board diagnostic operating center

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3082374A (en) * 1959-06-12 1963-03-19 Itt Automatic testing system and timing device therefor
US6188973B1 (en) * 1996-11-15 2001-02-13 Compaq Computer Corporation Automatic mapping, monitoring, and control of computer room components
US5956665A (en) * 1996-11-15 1999-09-21 Digital Equipment Corporation Automatic mapping, monitoring, and control of computer room components
US5973692A (en) * 1997-03-10 1999-10-26 Knowlton; Kenneth Charles System for the capture and indexing of graphical representations of files, information sources and the like
US6944825B2 (en) * 1997-09-23 2005-09-13 Onadime, Inc. Real-time multimedia visual programming system
US6331864B1 (en) * 1997-09-23 2001-12-18 Onadime, Inc. Real-time multimedia visual programming system
US20020105538A1 (en) * 1997-09-23 2002-08-08 Onadime, Inc. Real-time multimedia visual programming system
US20060010384A1 (en) * 1997-09-23 2006-01-12 Onadime, Inc. Real-time multimedia visual programming system
US6449741B1 (en) * 1998-10-30 2002-09-10 Ltx Corporation Single platform electronic tester
US7191368B1 (en) * 1998-10-30 2007-03-13 Ltx Corporation Single platform electronic tester
US20030101025A1 (en) * 2001-08-15 2003-05-29 National Instruments Corporation Generating a configuration diagram based on user specification of a task
US6961873B2 (en) * 2001-09-14 2005-11-01 Siemens Communications, Inc. Environment based data driven automated test engine for GUI applications
US6948152B2 (en) * 2001-09-14 2005-09-20 Siemens Communications, Inc. Data structures for use with environment based data driven automated test engine for GUI applications
US6993748B2 (en) * 2001-10-26 2006-01-31 Capital One Financial Corporation Systems and methods for table driven automation testing of software programs
US20030084429A1 (en) * 2001-10-26 2003-05-01 Schaefer James S. Systems and methods for table driven automation testing of software programs
US20040124860A1 (en) * 2001-12-20 2004-07-01 Hamdan Fadi Adel Automated test sequence editor and engine for transformer testing
US6788077B2 (en) * 2001-12-20 2004-09-07 Abb Inc. Automated test sequence editor and engine for transformer testing
US20030208288A1 (en) * 2002-05-01 2003-11-06 Testquest, Inc. Method and apparatus for making and using test verbs
US20030208542A1 (en) * 2002-05-01 2003-11-06 Testquest, Inc. Software test agents
US6898704B2 (en) * 2002-05-01 2005-05-24 Test Quest, Inc. Method and apparatus for making and using test verbs
US20030208712A1 (en) * 2002-05-01 2003-11-06 Michael Louden Method and apparatus for making and using wireless test verbs
US6862682B2 (en) * 2002-05-01 2005-03-01 Testquest, Inc. Method and apparatus for making and using wireless test verbs
US20040027373A1 (en) * 2002-08-07 2004-02-12 Jacquot Bryan Joseph Linked screen demonstration program for computer application programs
US20060100991A1 (en) * 2004-10-21 2006-05-11 International Business Machines Corporation Method for dynamical determination of actions to perform on a selected item in a web portal GUI environment
US20060271322A1 (en) * 2005-05-31 2006-11-30 David Haggerty Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices
US20060271327A1 (en) * 2005-05-31 2006-11-30 David Haggerty Systems and methods for managing multi-device test sessions
US20070005300A1 (en) * 2005-05-31 2007-01-04 David Haggerty Systems and methods for graphically defining automated test procedures
US20070005281A1 (en) * 2005-05-31 2007-01-04 David Haggerty Systems and Methods Providing Reusable Test Logic

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090125826A1 (en) * 2005-05-31 2009-05-14 David Haggerty Systems and methods providing a declarative screen model for automated testing
US20060271322A1 (en) * 2005-05-31 2006-11-30 David Haggerty Systems and Methods Providing A Normalized Graphical User Interface For Testing Disparate Devices
US20080016519A1 (en) * 2006-04-19 2008-01-17 Nec Corporation Screen transition program generating method and device
US8024396B2 (en) 2007-04-26 2011-09-20 Microsoft Corporation Distributed behavior controlled execution of modeled applications
US20110179151A1 (en) * 2007-06-29 2011-07-21 Microsoft Corporation Tuning and optimizing distributed systems with declarative models
US20090006063A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Tuning and optimizing distributed systems with declarative models
US8239505B2 (en) 2007-06-29 2012-08-07 Microsoft Corporation Progressively implementing declarative models in distributed systems
US8099494B2 (en) 2007-06-29 2012-01-17 Microsoft Corporation Tuning and optimizing distributed systems with declarative models
US7970892B2 (en) 2007-06-29 2011-06-28 Microsoft Corporation Tuning and optimizing distributed systems with declarative models
US8230386B2 (en) 2007-08-23 2012-07-24 Microsoft Corporation Monitoring distributed applications
US20090113292A1 (en) * 2007-10-26 2009-04-30 Microsoft Corporation Flexibly editing heterogeneous documents
US20110219383A1 (en) * 2007-10-26 2011-09-08 Microsoft Corporation Processing model-based commands for distributed applications
US7974939B2 (en) 2007-10-26 2011-07-05 Microsoft Corporation Processing model-based commands for distributed applications
US7926070B2 (en) 2007-10-26 2011-04-12 Microsoft Corporation Performing requested commands for model-based applications
US8099720B2 (en) 2007-10-26 2012-01-17 Microsoft Corporation Translating declarative models
US8181151B2 (en) 2007-10-26 2012-05-15 Microsoft Corporation Modeling and managing heterogeneous applications
US8225308B2 (en) 2007-10-26 2012-07-17 Microsoft Corporation Managing software lifecycle
US20090113407A1 (en) * 2007-10-26 2009-04-30 Microsoft Corporation Managing software lifecycle
US7814198B2 (en) 2007-10-26 2010-10-12 Microsoft Corporation Model-driven, repository-based application monitoring system
US8306996B2 (en) 2007-10-26 2012-11-06 Microsoft Corporation Processing model-based commands for distributed applications
US8443347B2 (en) 2007-10-26 2013-05-14 Microsoft Corporation Translating declarative models
CN103324452A (en) * 2012-03-23 2013-09-25 百度在线网络技术(北京)有限公司 Control device and method for displaying content of terminal screen synchronously

Also Published As

Publication number Publication date
US20070005300A1 (en) 2007-01-04
US7529990B2 (en) 2009-05-05
US20060271322A1 (en) 2006-11-30
US20090125826A1 (en) 2009-05-14
WO2006130684A2 (en) 2006-12-07
US20070005281A1 (en) 2007-01-04
WO2006130684A3 (en) 2007-11-08
US20060271327A1 (en) 2006-11-30

Similar Documents

Publication Publication Date Title
US7529990B2 (en) Systems and methods for managing multi-device test sessions
US11126543B2 (en) Software test automation system and method
US7979849B2 (en) Automatic model-based testing
US7856619B2 (en) Method and system for automated testing of a graphic-based programming tool
US7913230B2 (en) Computer-implemented methods and systems for generating software testing documentation and test results management system using same
US6754850B2 (en) System and method for performing batch synchronization for a test sequence
US20060074737A1 (en) Interactive composition of workflow activities
US20020124241A1 (en) System and method for synchronizing execution of a batch of threads
CN111190598B (en) Gas turbine monitoring software picture configuration method based on control library drag-type development
US20090031226A1 (en) Method and System for Extending Task Models for Use In User-Interface Design
Canny et al. Engineering model-based software testing of WIMP interactive applications: a process based on formal models and the SQUAMATA tool
CN117215574A (en) Low code development system and method integrating flow design and UI design
US20080066005A1 (en) Systems and Methods of Interfacing with Enterprise Resource Planning Systems
Namaz Refactoring test automation framework using optical character recognition
CN115587036A (en) Test system and test case generation method
Yin Intelligent agent based automatic operating model
Jiang A new approach in GUI testing
JP5251863B2 (en) Program for recording user interface component information and recording / reproducing user interface operations using a tree structure
Sun Statecharts based GUI design
Bennett et al. Learning Objective-C and Xcode
Kyllönen Evaluating model based testing using robot platform for Nokia product software testing
Téllez Sánchez Application of user-centered design in the development of an atomated/assisted testing system for self-service devices
Chittora CAMPUS SELECTION PROCEDURE
Wallman et al. D5. 6.3. 1-How to Handle SEAMLESS-IF Prototype 1, A short how-to description
Kuver Testing Object-Based Applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: TESTQUEST, INC., MINNESOTA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAGGERTY, DAVID;ELKIN, ALEX;OPITZ, SCOTT;REEL/FRAME:018226/0992;SIGNING DATES FROM 20060807 TO 20060829

AS Assignment

Owner name: BSQUARE CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TESTQUEST, INC.;REEL/FRAME:021924/0563

Effective date: 20081118

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION