Test Plan
Revision History
Version | Date | Author(s) | Reviewer(s) | Change Description
Copyright Information
This document is the exclusive property of XXX Corporation (XXX); the recipient agrees that he/she may not copy, transmit, use, or disclose the confidential and proprietary information set forth herein by any means without the express written consent of XXX. By accepting a copy hereof, the recipient agrees to adhere to and be bound by these conditions regarding the confidentiality of XXX's practices and procedures, and to use these documents solely for responding to XXX's operations methodology. All rights reserved, XXX Corporation, 2000. XXX IT reserves the right to revisit Business Requirements and Functional Specifications Documents if approval to proceed is not received within 90 days of the issue date.
Table of Contents

1 INTRODUCTION
2 TEST PLAN APPROVAL PROCESS
3 ASSOCIATED & REFERENCED DOCUMENTATION
3.1 Framework, Elements, Events, & User Flows
3.2 Testing Project Plans
3.3 Test Plans and Test Scripts
4 TESTING STRATEGY
4.1 Scope
4.2 Testing Approach
4.3 Test Setup
4.4 Test Environment Details
4.5 Responsibilities
4.6 Risks that impact testing
4.7 Test Suspension/Resumption Criteria
4.8 Test Stop Criteria
5 TEST CASE DESIGN AND DEVELOPMENT
5.1 Test Case Design Instructions
5.2 Test Case Design Deliverables
6 TESTING TEAM
6.1 Core Team
6.2 Technical Support Team
6.3 Assumptions, Constraints, and Exclusions
7 TESTING TOOLS
7.1 Testing Tools
7.2 Assumptions, Constraints, and Exclusions
8 KEY EXTERNAL DEPENDENCIES
8.1 Assumptions, Constraints, and Exclusions
9 METRICS COLLECTION
10 CLASSIFICATION OF ISSUES
10.1 Assumptions, Constraints, and Exclusions
11 UNIT TESTING
11.1 Purpose
11.2 Responsibility
11.3 Environment
11.4 Exit Criteria
11.5 Assumptions, Constraints, and Exclusions
12 INTEGRATION TESTING
12.1 Purpose
12.2 Responsibility
12.3 Environment
12.4 Test Data
12.4.1 Mainframe test data
12.4.2 Local test data
12.5 Test Execution Process
12.6 Exit Criteria
12.7 Additional Information
12.8 Assumptions, Constraints, and Exclusions
13 SYSTEM (QA) TESTING
13.1 Purpose
13.2 Responsibility
13.3 Environment
13.4 Test Data
13.4.1 Mainframe test data
13.4.2 Local test data
13.5 Test Execution Process
13.6 Exit Criteria
13.7 Additional Information
13.8 Assumptions, Constraints, and Exclusions
14 MAINFRAME TESTING
14.1 Purpose
14.2 Responsibility
14.3 Environment
14.4 Test Execution Process
15 LOAD TESTING
15.1 Purpose
15.2 Scope
15.3 Responsibility
15.4 Environment
15.5 Testing Methodology
15.5.1 Load testing
15.5.2 Endurance testing
15.5.3 Planned testing cycles
15.6 Metrics to be measured
15.7 Test Deliverables
15.8 Assumptions, Limitations and Constraints
15.9 Exit Criteria
16 REGRESSION TESTING
16.1 Purpose
16.2 Responsibility
16.3 Environment
16.4 Exit Criteria
17 USER ACCEPTANCE TESTING
17.1 Purpose
17.2 Responsibility
17.3 Environment
17.4 Test Data
17.5 Test Execution Process
17.6 Exit Criteria
17.7 Additional Information
17.8 Assumptions, Constraints, and Exclusions
18 SOFT LAUNCH
1 Introduction
XXX is in the process of re-engineering and re-designing the XXX.com website. XXX has been working with XYZ Private Limited to complete the architectural and detailed design of the new XXX.com site. The development phase of the project is now underway, and the site launch is planned for March 2005. Quality Assurance Testing is the joint responsibility of XYZ and the Business team from Budget. The purpose of this document is to provide an overview of the testing process for the XXX.com project. This document will be distributed to all Project Managers for review. It is the responsibility of the Project Managers to distribute this document to the appropriate team members for review where necessary.
4 Testing Strategy
4.1 Scope
The following types of testing will be conducted for this project:
- Unit
- Integration
- System
- Regression
- Load
- User Acceptance
Participation from all development areas will be required. Each Development Project Plan should account for development participation in each phase of testing. It is anticipated that the level of developer involvement will decrease as the testing progresses.
Iteration: One (1) complete end-to-end A-Pass and one (1) complete end-to-end B-Pass across all modules.
- Number of iterations for Unit Testing to be determined by the Development Project Manager(s).
- Unit test script creation and execution is the responsibility of the development staff(s).
- Number of iterations for Integration Testing will be on an as-needed basis within the Integration Testing Cycle. This will be determined by the Testing Project Manager and Customer Application Project Manager(s) during the Integration Testing Cycle.
- Integration Testing will include all (N) test scripts during the A-Pass and (E) test scripts during the B-Pass.
- Number of iterations for System Testing will consist of up to three. Should issues arise that justify additional iterations, the testing timeline will increase by five (5) days per iteration.
- System Testing will include all (N) & (DN) test scripts during the A-Pass and (E) & (DE) test scripts during the B-Pass.
- User Acceptance Testing: Test scripts to be created by the QA team with the help of the Business Analyst and User Acceptance Group (Business).
- Load Testing will consist of a select group of (N) scripts that accurately represent a cross section of functionality against a predetermined load.
- Regression Testing will be created from the (N), (DN), (E), & (DE) test scripts.
Test environment components: Open Deploy, Host 2 app server (WebLogic 8.1), Personalization Engine, MUX app, Person DB (Oracle 9i).
Phase | Budget.com | Mainframe | Dates
Unit Testing | Local Box | Budget Highway Test | 11/15 to 11/30
Integration Testing | Integration Environment | Budget Highway ATR | 11/15 to 3/30
Bug fixes | Local Box | Budget Highway ATR | 12/1 to 1/14
QA Testing | QA Environment | Budget Highway ATR | 12/13 to 1/14
QA Testing | QA Environment | Budget Highway ATR | 1/17 to 2/11
QA Testing | Production Environment | Budget Highway ATR | 1/17 to 2/11
Limited UAT | QA Environment | Budget Highway ATR | 2/14 to 3/4
Final UAT | | Budget Production | 3/7 to 3/18
Soft Launch | Limited testing | |
4.5 Responsibilities
Note: The Lead System Tester (XYZZ) will be "accountable" for completion of the testing activities listed below; however, responsibility for executing these tests rests with the individuals and groups identified below.

No. | Area/Section | Responsible for testing | Next Steps / Important Notes
1 | Functionality based on use cases -- booking engine, non-booking engine, BCD admin tool, non-functional requirements | | Ensure that test cases cover all requirements listed in the signed-off documents - Booking Engine, Non-booking Engine, and NFRs.
2 | External data feeds | | Steven to produce the necessary documentation covering all aspects of the data feeds and their processing (e.g., CRON job names, the servers they run on, their schedules, etc.). This needs to be done for both QA and production.
3 | Splash pages hosted under XXX.com for partners | | Review the list of splash pages and include them in the overall testing project plan.
4 | Requests from other sites with specific URL parameters to the XXX.com website | | 1. Check with the business whether there is a master list of the external websites from which the XXX.com website is invoked. 2. Discuss with the technical team and the business the list of URL parameters that will be supported in the new XXX.com website. Business needs to complete this list. This activity will start in the month of November.
5 | Static content pages | | Review the list of static content pages prepared by the business and include it in the overall testing project plan.
6 | Fast Break front end application and the admin tool | | Amit to have a preliminary discussion with Hans to understand the functionalities. Request Hans to create basic test scenarios. Amit to include a "load test" of Fast Break in test planning.
7 | Indigio-managed admin tools -- affiliate management tool, location correction admin tool | | Ask the Indigio team to come up with a test plan and test cases and include them in the overall testing project plan. Plan to get test results and updates during the testing phase.
8 | Testing of modified transactions on the new mainframe | | Alfredo confirmed that the mainframe team will perform the unit testing and QA for all mainframe transaction changes (PSR items).
9 | Regression testing of all the mainframe transactions | | Alfredo confirmed that the mainframe group will perform the regression testing of all mainframe transactions that are used in Budget.com.
10 | Sending emails and email campaign management | E-Dialog | Amit to pass on relevant XXX.com test cases to E-Dialog (e.g., Reservation Confirmation and Reservation Reconfirmation emails being sent) and get their validation.
11 | Reporting -- basic testing | | Review the tagging requirements from Indigio and also include them in the master testing project plan.
12 | Reporting -- extensive testing including analytics reports | | Ask the Indigio team to come up with a test plan and test cases and include them in the overall testing project plan. Plan to get test results and updates during the testing phase.
13 | Outage component | | Need to discuss further with the technical team and with IBM regarding this testing. Plan to get test results and updates during the testing phase.

Additional next steps:
- Ask the Indigio team to come up with a test plan and test cases and include them in the overall testing project plan. Plan to get test results and updates during the testing phase.
- Get a detailed plan from the business outlining test scenarios, test data, the group responsible for testing, and the schedule. Include it in the overall testing project plan.
- TBD
4.6 Risks that impact testing

Probability | Impact | Contingency plan
Medium | High | Testing time may be lost; to compensate for this, additional resources would be added appropriately.
Medium | Low | A database backup strategy should be in place so that loss of data can be prevented.
Medium | Low | Test the functionality not involving data feeds until migration.
Low | Medium | A local test setup should be in place. Except for scenarios involving mainframes, all others can be executed locally.
Splash Pages list

Test case design instructions:
- Generate test cases using the Equivalence Partitioning and Boundary Value Analysis techniques (a minimal sketch of these techniques follows this list).
- Mention traceability for the test cases back to the Use Case document.
- The number of steps to execute a test case should not be more than 20; such scenarios can be split into several test cases for better tracking.
- Test case execution steps need to be detailed enough that any tester can complete the test case without any prior system knowledge.
- Wherever required, mention the business rules and formulas under the expected result column for reference.
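The following is a minimal Python sketch of the Equivalence Partitioning and Boundary Value Analysis techniques referred to in the list above. The field and its valid range (a rental duration of 1-30 days) are hypothetical examples chosen for illustration; they are not taken from the XXX.com requirements.

# Minimal sketch of equivalence partitioning and boundary value analysis
# for a single numeric input field. The field name and its valid range
# (rental duration of 1-30 days) are hypothetical, for illustration only.

def equivalence_partitions(low, high):
    """Return one representative value per partition:
    below range (invalid), inside range (valid), above range (invalid)."""
    return {
        "invalid_low": low - 1,
        "valid": (low + high) // 2,
        "invalid_high": high + 1,
    }

def boundary_values(low, high):
    """Return the classic boundary values around both edges of the range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

if __name__ == "__main__":
    LOW, HIGH = 1, 30  # hypothetical valid range for "rental duration (days)"
    print("Equivalence partition representatives:", equivalence_partitions(LOW, HIGH))
    print("Boundary values:", boundary_values(LOW, HIGH))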
6 Testing Team
6.1 Core Team
Testing Project Manager:
Lead System Tester:
System Testers:
User Acceptance Group Coordinator:
7 Testing Tools
7.1 Testing Tools
Manual: Test cases will be created using Excel templates. Refer to the Templates section under SharePoint for the test case template.
Automated: Automated scripts will be created/executed using the following:
- Quick Test Pro by Mercury
- Load Runner by Mercury
Defect Tracker: PVCS Tracker will be used as the defect tracking tool.
The testing tool decisions are pending budget approval; the tools listed above are the proposed testing tools.
8 Key External Dependencies

- Completion of Development & Unit Testing
- Completion of HTML
- Integration of HTML into development code
- Completion of Informative Pages
- Data migration for testing purposes
- Data feed documentation from Casey Miller for validating nightly feed data in the application
9 Metrics collection
Detailed defect analysis shall be done for the reported defects, and test case execution status shall be reported for each module. The metrics to be collected during the test life cycle are:
1. Defect Location Metrics: Defects raised against each module shall be plotted on a graph to indicate the affected module.
2. Severity Metrics: Each defect has an associated severity (Critical, High, Medium, or Low), which reflects how much adverse impact the defect has or how important the affected functionality is. The number of issues raised per severity shall be plotted on a graph. By examining the severity of a project's issues, the discrepancies can be identified.
3. Defect Closure Metrics: To indicate progress, the number of raised and closed defects over time shall be plotted on a graph.
4. Defect Status Metrics: These indicate the number of defects in the various states, such as new, assigned, resolved, and verified.
5. Re-opened Bugs: The number of defects re-opened by the testing team after they have been fixed by the development team shall be reported, and the percentage shall be calculated with respect to the total number of defects logged.
6. Test Case Progression Trend: This trend shall indicate the progress of test execution module-wise. It shall state the number of test cases planned, executed, passed, and failed.
These metrics shall be collected and presented as a test summary report after each test cycle. They shall also be part of the weekly status report. Refer to the Templates section under SharePoint for the Metrics Analysis template.
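As an illustration only, the following minimal Python sketch shows how several of the metrics above (defect location, severity, status, and re-opened percentage) could be tallied from an exported defect list. The record layout and field names are assumptions made for the sketch and do not reflect the actual PVCS Tracker export format.

# Minimal sketch of the defect metrics described above.
# The defect records and field names are hypothetical examples;
# they do not reflect the actual PVCS Tracker export format.
from collections import Counter

defects = [
    {"module": "Reservations", "severity": "High",     "status": "Resolved", "reopened": False},
    {"module": "Rates",        "severity": "Critical", "status": "New",      "reopened": False},
    {"module": "Reservations", "severity": "Medium",   "status": "Verified", "reopened": True},
]

by_module   = Counter(d["module"]   for d in defects)   # 1. Defect location metrics
by_severity = Counter(d["severity"] for d in defects)   # 2. Severity metrics
by_status   = Counter(d["status"]   for d in defects)   # 4. Defect status metrics

# 5. Re-opened bugs as a percentage of all logged defects
reopen_pct = 100.0 * sum(d["reopened"] for d in defects) / len(defects)

print("Defects per module:  ", dict(by_module))
print("Defects per severity:", dict(by_severity))
print("Defects per status:  ", dict(by_status))
print("Re-opened defects:    %.1f%%" % reopen_pct)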
10 Classification of Issues
The following standard will be used by all involved parties to classify issues found during testing:
Severity 1 (Critical Issues): The application crashes, returns erroneous results, or hangs in a major area of functionality, and there is no workaround. Examples include the inability to navigate to/from a function, application timeout, and incorrect application of business rules.
Severity 2 (High Functional Issues): Functionality is significantly impaired. Either a task cannot be accomplished or a major workaround is necessary. Examples include erroneous error handling, partial results returned, and form pre-population errors.
Severity 3 (Medium Functional Issues): Functionality is somewhat impaired. A minor workaround is necessary to complete the task. Examples include inconsistent keyboard actions (e.g. tabbing), dropdown list sort errors, navigational inconsistencies, and serious format errors causing usage issues (e.g. incorrect grouping of buttons).
Severity 4 (Low Functional Issues): Functionality can be accomplished, but either an annoyance is present or efficiency can be improved. Cosmetic or appearance modifications to improve usability fall into this category. Examples include spelling errors, format errors, and confusing error messages.
Examples of each severity level will be delivered to participants prior to Integration Testing.
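As a minimal sketch only, the severity classification above could be captured as follows for consistent defect logging; the enum and helper names are illustrative and are not part of the PVCS Tracker configuration.

# Minimal sketch of the severity classification above, for consistent
# defect logging. The enum and helper are illustrative only.
from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 1  # crash, erroneous results, or hang in a major area; no workaround
    HIGH     = 2  # functionality significantly impaired; major workaround needed
    MEDIUM   = 3  # functionality somewhat impaired; minor workaround needed
    LOW      = 4  # annoyance or cosmetic issue; task can still be accomplished

def label(sev: Severity) -> str:
    return {1: "Critical", 2: "High", 3: "Medium", 4: "Low"}[sev]

print(Severity.HIGH, label(Severity.HIGH))  # Severity.HIGH High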
11 Unit Testing
11.1 Purpose
The purpose of Unit Testing is to deliver code that has been tested for end-to-end functionality within a given module and normal interfacing between dependent modules in the development environment.
11.2 Responsibility
Testing will be the responsibility of the individual developers. Ultimate signoff for promotion into Integration Testing will be the responsibility of the Development Project Manager(s). Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the direction of the Development Project Manager.
11.3 Environment
Refer to Section 4.4 Test Environment Details.
Specifically excluded from the Unit Test exit criteria are:
- Comprehensive data validation
- Exhaustive test of various entry and exit points across modules
- Comprehensive testing for abnormal situations and error handling combinations
12 Integration Testing
12.1 Purpose
The purpose of Integration Testing is to deliver code that has been comprehensively tested for Normal (N) and Exception (E) conditions across all modules in the Development environment.
12.2 Responsibility
The Testing Team holds the primary responsibility for the execution of Normal (N) and Exception (E) test scripts. All N & E type test scripts will be completed prior to the start of Integration Testing. The N & E test scripts will be executed for the following modules:
- Homepage
- Reservations
- Rates
- Analytics
- Profile Management
- Locations
- Personalization
- Search
- Admin tools
- Visitor Management
- Content Management
- Administration
Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the direction of, and with the agreement of, the Development Project Managers and the Testing Project Manager. Ultimate sign-off of Integration Testing and promotion into System Testing resides with the Testing Project Manager.
12.3 Environment
Refer to Section 4.4 Test Environment Details.
12.4.2 Local test data
The Test Lead will identify test data for the local database (Oracle 9i). With the help of the development team, the Test Lead will ensure the test data is set up before the start of Integration Testing. A Testing Data Repository Document will be delivered on or before Integration Testing. Specific reference will be made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.
13.2 Responsibility
The Testing Team holds the primary responsibility for the execution of Normal (N), Exception (E), Data Normal (DN), and Data Exception (DE) test scripts. Test scripts will include field form validation and display rules as stated in the Elements section of the Page Specifications for eCommerce XXX.com Redesign. The N, E, DN, & DE test scripts will be executed for the following modules:
1. Homepage
2. Reservations
3. Rates
4. Analytics
5. Profile Management
6. Locations
7. Personalization
8. Search
9. Admin tools
10. Visitor Management
11. Content Management
12. Administration
The System Testing Team will be comprised of individuals from the Testing Staff. Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the direction of the Development Project Managers and requires the agreement of the Testing Project Manager. Ultimate sign-off of System Testing and promotion into User Acceptance Testing resides with the Testing Project Manager.
13.3 Environment
Refer to Section 4.4 Test Environment Details.
System Test Script execution will be completed as per the following Operating System/Browser matrix:
Browsers: IE 6.0, IE 5.5, IE 5.0, Mozilla 1.7.2, Netscape 7.1, AOL 5.0
Operating systems: Windows XP, Windows 2000, Windows 98, Mac OS/9
Legend:
- C: The complete test case suite will be executed on the OS/browser combination.
- U: Only critical functionality and UI test cases will be executed on the OS/browser combination.
- Blank: The OS/browser combination will not be tested.
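One possible machine-readable representation of the OS/browser execution matrix is sketched below. The individual C/U assignments shown are placeholders for illustration only, since the per-combination values could not be recovered for this copy of the matrix.

# Sketch of a machine-readable OS/browser execution matrix.
# "C" = complete suite, "U" = critical functionality and UI only,
# None = combination not tested. The assignments below are placeholders
# for illustration; they are not the actual project matrix.
MATRIX = {
    "Windows XP":   {"IE 6.0": "C", "Mozilla 1.7.2": "C", "AOL 5.0": "U"},
    "Windows 2000": {"IE 6.0": "C"},
    "Windows 98":   {"IE 5.5": "U", "IE 5.0": "U"},
    "Mac OS/9":     {"Netscape 7.1": "U", "IE 5.0": "U"},
}

def coverage(os_name, browser):
    """Return "C", "U", or None for an OS/browser combination."""
    return MATRIX.get(os_name, {}).get(browser)

print(coverage("Windows XP", "IE 6.0"))   # C
print(coverage("Windows 98", "AOL 5.0"))  # None -> not tested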
13.4.2 Local test data
The Test Lead will identify test data for the local database (Oracle 9i). With the help of the development team, the Test Lead will ensure the test data is set up before the start of Integration Testing. A Testing Data Repository Document will be delivered on or before System Testing. Specific reference will be made in the N, E, DN, & DE Test Scripts to the data types listed in the Testing Data Repository Document. Additional specific data may be required; should this be the case, the data will be listed on the corresponding test script.
Issue Closure: The Testing Team will review resolved issues to verify them and to close or re-instate the issues and their resolution priority. Every effort will be made to close resolved issues as soon as possible. Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.
Specifically excluded from the System Test exit criteria are:
- Security Testing
- System Crash/Restart Testing
  o DB crash
  o iPlanet crash
  o Hardware
  o DB capacity/resources
Testing of the Personalization engine will be limited to the business rules created by the developer. Visitor tracking details will be verified only at the JSP level via View Source, as the Reporting tool has not been finalized.
14 Mainframe testing
14.1 Purpose
The purpose of Mainframe testing is to deliver stable code for the new Rate Shop functionality and to verify that existing functionality works correctly with the new XXX.com application.
14.2 Responsibility
Mainframe testing will be carried out by the Mainframe QA team at XXX. It will be scheduled and coordinated by the XYZ Test team according to the test execution dates for System Testing, UAT, and PAT.
14.3 Environment
Mainframe modules will reside in the Budget Highway Acceptance Test Region (ATR).
15 Load Testing
15.1 Purpose
The purpose of Load Testing is to deliver code that has been load tested and is ready for promotion into the Production Environment.
15.2 Scope
Load Testing will consist of a select group of (N) scripts that accurately represent a cross section of functionality. Scripts will be executed to generate a peak load of up to 1500 concurrent users, stepping through load levels of 50, 100, 200, 500, 1000, and 1500 users. Test execution will be complete when the 1500-user load has been ramped up or when a failure condition necessitates stopping the test. The team will monitor the test execution and record the timings and errors for report preparation.
15.3 Responsibility
The creation and execution of the Load Testing Scripts is the responsibility of the Testing Team. Ultimate authority rests with the Testing Project Manager, who will be in close contact with User Acceptance Group.
15.4 Environment
The Mercury Load Runner tool will be physically located on a server in Denver. For the purpose of test execution, the Load Runner tool will be pointed to the System Testing Environment, which will become the Production Environment upon implementation. The XYZ Test team will access Load Runner using a remote client tool to execute the scripts. The offshore Test team will be allocated one VU Gen license to create scripts offline.

Description | IP Address
Budget Application Servers |
Controller |
Load Generator |
DB Server |
Web Server |

Details pertaining to the network:
- Network card setting: 100 Mbps Duplex
- Bandwidth of LAN: 100 Mbps
15.5.1.1 Approach

Serviceability
- Determine the serviceability of the system for a volume of 1500 concurrent users.
- Measure response times for users.

Steps
1. Virtual user estimation: Arrive at the maximum number of concurrent users hitting the system for which the system response time is within the response-time threshold and the system is stable. This number is the virtual user count and should be higher than the average load by a factor of x.
2. Virtual user profiles and their distribution for client operations:
   - User profiles: UB Program member, Unaffiliated consumer, XXX Partner, PD member, FB member, XXX Employee, Travel Agent
   - Client operations: Homepage load, Login/Logout, Rate Request response, Rate Request response for multi-BCD rate shop, Create a booking, Modify/Cancel booking, One-click booking
3. Load simulation schedule: Schedule for concurrent user testing with a mix of user scenarios and the acceptable response times, as shown in the table below.
Operation | On Dial-up (56 Kbps) | On Broadband
Homepage load | 19 seconds | NA - Dial-up is considered more relevant.
Log-in/Log-out | NA - Broadband is considered more relevant. | Sub-second
Rate Request-Response | NA - as above | Sub 20 seconds
Rate Request-Response for multi-BCD rate shop | NA - as above | Sub 20 seconds
Create a booking | NA - as above | 1 min 30 seconds (inclusive of mandatory intermediate steps)
Modify/Cancel booking | NA - as above | 1 min 3 seconds (inclusive of mandatory intermediate steps)
One-click booking | NA - as above | 30 seconds
Statistics: A graph with the y-axis representing response times and the x-axis representing concurrent users will depict the capability of the system to service concurrent users. The response times for slow users will provide worst-case response times.
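A minimal sketch of how such a graph could be produced is shown below; the response time figures are hypothetical placeholders, not measured results.

# Minimal sketch of the graph described above: x-axis = concurrent users,
# y-axis = response time. The data points are hypothetical placeholders.
import matplotlib.pyplot as plt

users          = [50, 100, 200, 500, 1000, 1500]   # load levels
avg_response_s = [1.2, 1.4, 1.9, 3.5, 6.8, 11.0]   # hypothetical averages
worst_case_s   = [2.0, 2.5, 3.8, 7.0, 13.5, 22.0]  # hypothetical slow users

plt.plot(users, avg_response_s, marker="o", label="Average response time")
plt.plot(users, worst_case_s, marker="s", label="Worst case (slow users)")
plt.xlabel("Concurrent users")
plt.ylabel("Response time (seconds)")
plt.title("Serviceability: response time vs. concurrent users")
plt.legend()
plt.savefig("serviceability.png")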
15.5.2 Endurance testing
Validate the system's behavior for continuous hours of operation under the projected load conditions. The number of continuous hours of operation is to be discussed with the Business.

Approach
- Endurance testing checks resource usage and release, namely CPU, memory, disk I/O, and network (TCP/IP sockets) congestion, over continuous hours of operation.
- Determine the robustness: check for breakages in the web server, application server, and database server under CHO (continuous hours of operation) conditions.

Steps
1. Arrive at a baseline configuration of the web server and application server resources (CPU, RAM, and hard disk) for the endurance and reliability test.
2. The test will be stopped when one of the components breaks. A root cause analysis is to be carried out based on the data collection described under the server side monitoring section.

Client side monitoring
- Failure rate: web server responses/timeouts/exceptions and incomplete page downloads
- Response time degradation under peak load numbers (concurrent users)
Server side monitoring
- Collect CPU, disk, and memory usage for analysis
- Check for application server slow down/freeze/crash
- Check for resource contention/deadlocks
- Check for database server load and slow down
- Check for web server crashes
- Collect data for analysis to tune the performance of the web server, application server, and database server
- If there is alarm support in the tool through an agent, check for alerts when the activity level exceeds preset limits.
- If a load balancing configuration is deployed, check whether it is able to distribute the requests.

Result
The result of this test will be a proof of confidence for continuous hours of operation. The data collected in this phase will give pointers to improve the reliability of the system and to fix any configuration and component parameters for reliable performance.
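A minimal sketch of the server-side resource sampling described above is shown below, assuming the psutil library is available on the monitored server; the sampling interval, duration, and output file name are arbitrary choices for illustration.

# Minimal sketch of server-side resource sampling during an endurance run.
# Requires the psutil library; interval, duration, and file name are arbitrary.
import csv, time
import psutil

INTERVAL_S = 60            # sample once a minute (arbitrary choice)
DURATION_S = 8 * 60 * 60   # placeholder for the agreed CHO window
OUTPUT     = "endurance_samples.csv"

end_time = time.time() + DURATION_S
with open(OUTPUT, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_pct", "mem_pct", "disk_read_bytes",
                     "disk_write_bytes", "net_bytes_sent", "net_bytes_recv"])
    while time.time() < end_time:
        cpu  = psutil.cpu_percent(interval=1)    # CPU usage sampled over 1 second
        mem  = psutil.virtual_memory().percent   # memory usage
        disk = psutil.disk_io_counters()         # cumulative disk I/O counters
        net  = psutil.net_io_counters()          # cumulative network I/O counters
        writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), cpu, mem,
                         disk.read_bytes, disk.write_bytes,
                         net.bytes_sent, net.bytes_recv])
        f.flush()
        time.sleep(INTERVAL_S)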
15.5.3 Planned testing cycles
Load testing will be done on the XXX.com web application against a range of operational conditions and factors, including network bandwidth, data volumes, and transaction frequency. The test cycles shall be run on the network, measuring performance from 50 users to 1500 users. The test cycle shall be run for 50 users initially (incrementing, say, 5 users every 5 seconds until it reaches 50 concurrent users). The test shall be stopped if the application crashes before reaching 50 users, and the issue shall be reported to the development team. The response time shall be noted for 50 concurrent users before stopping the test. If the response time exceeds the benchmark limit, load testing shall be stopped until the development team fixes the issue. If the response time is well within the benchmark limit, a fresh test cycle shall be run with the aim of reaching 100 concurrent users. The same process shall be used until the 1500 concurrent user target is met within acceptable response times. The response times will be noted for the following user loads within the same test cycle:
- 50 users
- 100 users
- 200 users
- 500 users
- 1000 users
- 1500 users

The first cycle of Load testing will be carried out in the QA environment and the second cycle in the Production environment during the System testing phase.
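The stepped ramp-up described above can be summarized in the following planning sketch. It only computes the ramp schedule and the continue/stop decision; actual load generation is performed with Load Runner, and the benchmark response time used here is a placeholder rather than the agreed project benchmark.

# Planning sketch of the stepped load-test cycles described above.
# It only computes the ramp-up schedule and the stop/continue decision;
# actual load generation is done with Load Runner. The benchmark response
# time used here is a placeholder, not the agreed project benchmark.
TARGET_LEVELS   = [50, 100, 200, 500, 1000, 1500]  # concurrent users per cycle
RAMP_STEP_USERS = 5                                # add 5 users ...
RAMP_STEP_SECS  = 5                                # ... every 5 seconds
BENCHMARK_SECS  = 20.0                             # placeholder benchmark

def ramp_schedule(target_users):
    """Yield (elapsed_seconds, active_users) pairs for one test cycle."""
    users, elapsed = 0, 0
    while users < target_users:
        users = min(users + RAMP_STEP_USERS, target_users)
        elapsed += RAMP_STEP_SECS
        yield elapsed, users

def next_action(measured_response_secs):
    """Decide whether to continue to the next load level or stop for fixes."""
    if measured_response_secs > BENCHMARK_SECS:
        return "stop: report to development team and wait for a fix"
    return "continue: start a fresh cycle at the next load level"

for t, u in ramp_schedule(TARGET_LEVELS[0]):
    print("t=%3ds users=%d" % (t, u))
print(next_action(12.5))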
OS-level primary metrics:
- Processor usage
- Memory usage
- Disk I/O rates
Constraints: If the load test scripts are executed from offshore, network delay will add to the measured response times.
- Testing Project Manager sign-off
- Meet the exit criteria for the phase in which the Load Test is executed
16 Regression Testing
16.1 Purpose
Deliver code that has been regression tested and is ready for promotion into the Production Environment. Regression Testing will consist of a majority of the (N), (E), (DN), and (DE) type test scripts.
16.2 Responsibility
The creation of the Regression Testing Scripts is the responsibility of the Testing Team. Regression Test Scripts will be created and executed using Mercury Quick Test Pro. The execution of the Regression Testing Scripts is the responsibility of the Testing Team.
16.3 Environment
The Mercury Quick Test Pro software will be physically located in XYZ, Bangalore. For the purpose of test execution, QTP will be pointed to the System Testing Environment.
17.2 Responsibility
User Acceptance Testing is to be executed by the User Acceptance Group (Business). Management of the User Acceptance Testing phase will be the responsibility of the Testing Project Manager via the User Acceptance Group Coordinator. The test scripts used during User Acceptance Testing are to be created by the Test Team with the help of the Business Analyst and the User Acceptance Group. Test scripts should accurately reflect the functionality documented in the Booking Engine Use Cases EBR - Budget.com, the Non-Booking Engine Use Cases XXX.com Redesign EBR, and the Page Specifications for Ecommerce XXX.com Redesign. Ultimate authority rests with the Testing Project Manager, who will be in close contact with the User Acceptance Group Coordinator. Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the direction of the Development Project Manager and requires the agreement of the Testing Project Manager.
17.3 Environment
Refer to Section 4.4 Test Environment Details.
Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.
The UAT defect flow is as follows:
1. The UAT group tester logs the defect in PVCS Tracker and assigns it to the UAT coordinator.
2. The Test team verifies the defect.
3. A new application version is released into the UAT environment with Release Notes.
4. The UAT group tester verifies the fixed defect.
5. Defect passed? If yes, the defect is closed.
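As an illustration only, the defect flow above can be expressed as an allowed-transition table; the state names are shorthand chosen for this sketch, not the exact PVCS Tracker states.

# Minimal sketch of the UAT defect flow above as an allowed-transition table.
# The state names are illustrative shorthand, not the exact PVCS Tracker states.
ALLOWED = {
    "logged":                  ["assigned_to_coordinator"],
    "assigned_to_coordinator": ["verified_by_test_team"],
    "verified_by_test_team":   ["fix_released_to_uat"],
    "fix_released_to_uat":     ["verified_by_uat_tester"],
    "verified_by_uat_tester":  ["closed", "reopened"],
}

def advance(state, next_state):
    """Move a defect to the next state, rejecting transitions the flow does not allow."""
    if next_state not in ALLOWED.get(state, []):
        raise ValueError("illegal transition: %s -> %s" % (state, next_state))
    return next_state

s = "logged"
for step in ["assigned_to_coordinator", "verified_by_test_team",
             "fix_released_to_uat", "verified_by_uat_tester", "closed"]:
    s = advance(s, step)
print("final state:", s)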
Specifically excluded from the User Acceptance Test exit criteria are:
- Security Testing
- System Crash/Restart Testing
  o DB crash
18 Soft Launch
TBD (details will be entered after discussion with the Business).