Test Strategy Lesson 2
Prepared by:
Company Name
Project Management Office (PMO)
Contents
1. Revision History
2. Approvals
3. Definitions
4. Reference Documents
5. Project Overview
6. Test Plan Purpose
6.1 Testing Objectives
7. Scope
7.1 In Scope Requirements
7.2 Out of Scope Requirements
8. Integrations and Intersystem Interfaces
9. Test Estimate
10. Testing Schedule
10.1 Unit
10.2 QA & SIT
10.3 UAT
11. Test Environments
11.1 Hardware Configuration
12. Test Data
13. Test Users
14. Testing Responsibilities
14.1 V-Model with Accountability
14.2 Roles and Responsibilities for Major Test Events
15. Test Types
15.1 Unit Testing
15.2 Functional Integration Test (QA)
15.3 System Test
15.4 User Acceptance Test (UAT)
16. Performance Testing
16.1 Performance Test Approach
16.1.1 Test Objective
16.2 Scope and Assumptions
16.2.1 In Scope
16.2.2 Out of Scope
16.3 Business Volume Metrics
16.3.1 Transaction SLAs
16.4 Performance Testing Methodology
16.4.1 Test Scripting Entrance Criteria
16.4.2 Test Execution Entrance Criteria
16.4.3 Performance Test Runs
16.4.4 Test Description
16.4.5 Workload Mix Distribution
16.5 Test Schedule
16.6 Key Milestones
16.7 Environment Setup
16.7.1 Application Environment Setup
16.7.2 Test Tool Setup
16.8 Tools Used in the Project
16.8.1 Silk Performer – Controller and Agent Details
16.8.2 Tool Settings for Load Test
16.9 Performance Test Data
16.9.1 Test Data Definitions
16.10 Performance Metrics
16.10.1 Client-Side Metrics
16.10.2 Server-Side Metrics
16.10.3 Server-Side Applications
16.10.4 Server-Side Monitoring Counters
16.11 Test Deliverables
16.12 Status and Issue Reporting
17. Project Testing Related Tools
18. Defect Management
19. Objectives of the Defect Review Meetings
19.1 Purpose of the Defect Review Meeting
19.2 Defect Reporting and Resolution Process
19.3 Defect Escalation Procedure
19.4 Defect Severity Definitions
19.5 Defect Life Cycle Stages
20. Results and Metrics Reporting
21. Communication and Escalation
22. ASSUMPTIONS/CONSTRAINTS/RISKS/ISSUES
22.1 Assumptions
22.2 Constraints
22.3 Issues
22.4 Risks
1. Revision History
Version No. | Date | Revised By | Description of Change
2. Approvals
The undersigned acknowledge that they have reviewed the Master Test Plan and agree with the
information presented within this document. Changes to this plan will be coordinated with, and
approved by, the undersigned, or their designated representatives. The Project Sponsor will be notified
when approvals occur.
Signature: Date:
Print Name: Janina Johnson
Title: Deputy PMO Director
Role: Program PMO Director
Signature: Date:
Print Name:
Title:
Role: Program Director
Signature: Date:
Print Name:
Title:
Role: Test Manager
Signature: Date:
Print Name:
Title:
Role: PMO TCOE Auditor
3. Definitions
Term | Meaning
4. Reference Documents
Document | Repository Path
5. Project Overview
6. Test Strategy Purpose
7. Application Scope
7.1 In Scope Applications
<List the applications or technologies that are covered by this Test Strategy>
Application Name | Application Owner | Technology
9. Test Environments
The following diagrams identify the environments used for testing compared to production, and identify the environments to be used for production fixes versus development.
● Production Environment
<Insert Diagram Here>
● QA Environment
<Insert Diagram Here>
● Test Environment
<Insert Diagram Here>
● Stress Environment
<Insert Diagram Here>
10. Test Data
Test Lead(s) will define high-level data requirements as needed for testing in the key areas. Identify unique data required by the application, such as user IDs and passwords. Details such as drop-down box values and user-entered information will need to be passed along to the performance tester. Also determine whether the database needs to be seeded with additional data in order to more closely replicate production; for example, if the production database has 13 million rows, the same amount should be used in test.
11. Test Users
Each test case will require one or more test users. Test users must be created to replicate real business users, allowing defects related to authorization profiles and delegation of duties to be identified. Using unrealistic role assignments for test users can invalidate the results of functional tests.
A catalog of test users should be maintained. Automatic provisioning of test users needs to be
established as part of the setup of the test environment.
12. Test Types
12.1 Unit testing
Purpose: This preliminary test is performed by the development team to test individual configuration, custom programs and/or technical services (e.g. fax, EDI) and to ensure that they function according to the detailed technical specification. Unit testing is a white-box test and should exercise all possible flows; both positive and negative conditions should be tested.
Development Phase: Development and Testing
Test Scope: All configurations, code validation, memory testing, integration, code complexity, etc.
Test Environment: Development Environment
Test Data: Manual data created by developers
Interface Requirements: N/A
Role: Developer
Entry Criteria:
● Formal reviews for process models, functional specs and technical specifications have been completed
● All inspection-related defects have been corrected
● Test plans and test cases are reviewed and signed off
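A minimal example of a unit test covering both positive and negative conditions, as required above, might look like this. The function under test (a discount calculator) is a made-up illustration, not part of the project.

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount, rejecting invalid inputs."""
    if price < 0:
        raise ValueError("price must be non-negative")
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Positive conditions: valid inputs produce the specified results.
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    # Negative conditions: invalid inputs are rejected.
    def test_invalid_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

    def test_invalid_price(self):
        with self.assertRaises(ValueError):
            apply_discount(-1.0, 10)

suite = unittest.TestLoader().loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

Covering both valid and invalid flows in the same suite is what makes the unit test a white-box test of all possible paths.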
12.2 System Integration Test (SIT)
Purpose: SIT validates a set of business processes that define a business scenario in a comprehensive and self-contained manner at a macro level.
This is an end-to-end test of the business process. Typically a business scenario will involve testing the test cases of multiple SAP modules together. The primary objective of this testing is to discover errors in the integration between different modules and to verify that the modules work together correctly as one function. The end-to-end test validates the integration within SAP and between SAP and legacy (all non-SAP) applications. All testing related to validation of the data interchange between SAP and legacy applications is categorized as interface testing.
Security role-based authorization testing is performed to ensure that all security profiles and roles are implemented as designed. A security profile is designed and built based on the job role (i.e. position) of the end users. Security roles are assigned at the business transaction level.
The objectives of security testing are:
● Ensure that the user has access to the transactions required to perform their job
● Ensure that the user does not have access to transactions other than what is required for the role
● Ensure that access to critical system administration transactions is controlled
● Ensure that only authorized persons have the right to view the information on screens and reports
● Ensure that delegation in SAP (wherever the system allows a user to delegate authority to other users) is tested from the viewpoint of both the delegator and the delegate
Test Scope:
● Full end-to-end business process
● Performance testing
● Regression (full regression)
● Test plans and test cases are reviewed and signed off
● No new defects have been discovered for a week prior to System Testing
Test Environment: Pre-Prod or Implementation
Test Data: Mock cutover or Test Data Management tool
Interface Requirements: Interface connectivity required for all interfacing systems
Role: Process Team & Business Users
Entry Criteria:
● The application works functionally as defined in the specifications
● All areas have had testing started on them, unless pre-agreed by the UAT stakeholder and the Test and Project Managers
● The entire system is functioning and all new components are available, unless previously agreed between the UAT stakeholder, Test Manager and Project Managers
● All test cases are documented and reviewed prior to the commencement of UAT
Exit Criteria:
● The acceptance tests must be completed with a pass rate of not less than 98%
● No new defects have been discovered for a week prior to production implementation
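The 98% pass-rate criterion is a straightforward calculation; the executed/passed counts below are illustrative, not project figures.

```python
# Quick check of the 98% acceptance-test pass-rate exit criterion.
# The counts are made-up example numbers.
executed = 500
passed = 491
pass_rate = passed / executed * 100  # 98.2%
print(f"{pass_rate:.1f}%", pass_rate >= 98.0)
```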
15. Defect Management
Defect review meetings will be held daily with SME leads, test leads from all locations, test managers, business leads and the integration manager. The goal of these meetings is to ensure that defects are being resolved in a timely fashion and that any issues or questions are addressed. It is at these meetings that progress tracking of defect resolution and closure is communicated.
The diagram below illustrates the defect life cycle process. Defect severity definitions and recommended SLAs are available in the appendix, as are the various defect stages and their subsequent transitions.
Defect life cycle
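The life cycle can also be expressed as a set of allowed status transitions, which makes invalid moves easy to detect in tooling. The states below follow a common defect workflow and are an assumption; the authoritative stages are those listed in the appendix.

```python
# Hypothetical defect life cycle as allowed status transitions.
# State names are a common convention, not the project's official stages.
ALLOWED_TRANSITIONS = {
    "New": {"Open", "Rejected"},
    "Open": {"Fixed", "Deferred"},
    "Fixed": {"Retest"},
    "Retest": {"Closed", "Reopened"},
    "Reopened": {"Open"},
    "Deferred": {"Open"},
    "Rejected": set(),   # terminal
    "Closed": set(),     # terminal
}

def is_valid_path(path):
    """Check that every consecutive status change is an allowed transition."""
    return all(nxt in ALLOWED_TRANSITIONS[cur] for cur, nxt in zip(path, path[1:]))

print(is_valid_path(["New", "Open", "Fixed", "Retest", "Closed"]))  # → True
print(is_valid_path(["New", "Closed"]))                             # → False
```

Encoding the transitions this way lets the defect tool reject shortcuts such as closing a defect that was never retested.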
Daily status review meeting: Along with the test execution status discussion, all outstanding defects would be discussed in this meeting. The development team, business team, Basis team, QA management and other stakeholders as appropriate would join the meeting. Defect details and the estimated time of fix would be documented in Quality Center accordingly.
Defect disposition meeting: This twice-weekly meeting discusses in detail only the high-impact defects identified as candidates for escalation. Development team management and QA team management, along with the respective leads, would discuss the finer details and put together an action plan to resolve them.
Escalation email to development team/SME team manager: The QA Manager would send an email with details of defects that need immediate attention to the development team/SME team manager, and on a need basis a triage call involving senior management would be organized to discuss associated risks, agree on a resolution plan, and review the status.
Note: The escalation criteria mentioned above can be adjusted during execution based on the number of days left before the release go/no-go decision.
Severity | Definition | Resolution SLA
 | Potential show stopper for the Go/No-Go decision to enter the next stage, or cutover without closure |
Major | The software system, subsystem, or software unit (program or module) produces incorrect, incomplete, or inconsistent results | 2 business days
Daily Status Report (issued every working day during the test execution phase): test execution status of all tracks. This report contains:
● Test execution planned vs. completed
● Test case pass/fail numbers
● Defects open/closed, age and severity
● Risks and issues
● Milestone achievements
● Action tracker
Daily status reporting (to project QA stakeholders, via email): daily reporting of tasks and their progress against the IBM QA team plan.
Escalation hierarchy:
<Insert Diagram Here>
19. ASSUMPTIONS/CONSTRAINTS/RISKS/ISSUES
Assumptions, constraints, risks and issues are external factors over which the decision maker has little or no control; thus they inhibit the decision-making process.
19.1 Assumptions
Assumptions are entered using the Microsoft® SharePoint Decision Data Entry Form, which populates the Decision Log.
19.2 Constraints
Constraints are entered using the Microsoft® SharePoint Decision Data Entry Form, which populates the Decision Log.
19.3 Issues
Issues are entered using the Microsoft® SharePoint Issue Data Entry Form, which populates the Issue Register/Log.
19.4 Risks
Risks are entered using the Microsoft® SharePoint Risk Data Entry Form, which populates the Risk Register/Log.