Software Quality Assurance Plan
<Project Name>
<Department Name>
<Version Number>
Document history
Date    Version    Author    Reviewed by    Approved by    Description
Table of Contents
1. Introduction
1.1. Background
1.2. Purpose/Objective
2. Scope of Work
3. Test Strategy
3.3 Tools
3.4 Defect Management
3.5 Environments
3.5.1. Environment Details
4. Resource
5. Project Milestones
6. Deliverables
7. Communication
8. Assumptions
9. Risk & Mitigation
10. Glossary of Terms
List of Tables
Table 1: User Interface Testing Details
Table 2: Functional Testing Details
Table 3: Integration Testing Details
Table 4: Regression Testing Details
Table 5: User Acceptance Testing Details
Table 6: List of Tools for Testing Activities
Table 7: Environment Details
Table 8: Roles/Responsibilities
Table 9: Assumptions
Table 10: Risk & Mitigation
Table 11: Glossary of Terms
1. Introduction
1.1. Background
1.2. Purpose/Objective
The main purpose of this Test Strategy document for the proposed <Application Name> system is to define and document the Software Quality Assurance Plan with regard to the overall project scope, including the following:
- Identification of the various testing phases, types of testing, testing environments, testing guidelines and issue resolution
- High-level business process scenarios that will be covered during testing
The testing activity will be carried out across the key phases identified below:
- Test Planning
- Development Team
- QA Team
2. Scope of Work
The following identified functional requirements of the proposed <Application Name> are considered to be in scope for testing:
3. Test Strategy
The strategy to be adopted for testing the proposed <Application Name> System is based on a phase-wise
approach as detailed in the following sections.
<Mention the actual project strategy here. A typical test cycle will include the following steps.>
User Interface (UI) testing will verify the user's interaction with the software. In addition, UI testing ensures that the objects within the UI function as expected and conform to corporate or industry standards.
#  Item                    Description
2  Technique               Create tests for each screen, page and report to verify proper
                           navigation.
6  Special Considerations  Bug-fixing and retesting effort will impact the total effort estimated
                           for testing.

Table 1: User Interface Testing Details
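By way of illustration, a minimal sketch of such a screen-navigation check, assuming a web-based UI tested with Selenium WebDriver; the URL and element IDs are hypothetical placeholders, not part of this plan:

```python
# Minimal UI navigation check (pip install selenium). The URL and element
# IDs below are hypothetical placeholders for the screens listed in Table 1.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_navigates_to_dashboard():
    driver = webdriver.Chrome()
    try:
        driver.get("https://app.example.com/login")          # hypothetical URL
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "login-button").click()
        # Proper navigation: the dashboard page is reached.
        assert "Dashboard" in driver.title
        # UI objects function as expected: the reports menu is usable.
        assert driver.find_element(By.ID, "reports-menu").is_enabled()
    finally:
        driver.quit()
```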
The goals of these tests are to verify the business functionality of the application. Also included is the testing of the screens for data, navigation and functionality. The focus will be on testing the dashboard and all the reports for data, report formats and navigation.
#  Item                    Description
1  Test Objective          Ensure proper target-of-test functionality, including navigation,
                           data entry, processing, workflow and retrieval. Execute each use-case
                           flow, or function, using valid and invalid data, to verify the
                           following:
6  Special Considerations  Testing team will verify the business rules and functions specified
                           in the documents provided. Bug-fixing and retesting effort will impact
                           the total effort estimated for testing.

Table 2: Functional Testing Details
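As an illustration of executing one use-case flow with both valid and invalid data, a sketch using pytest; the myapp.orders module, create_order function and its validation rule are hypothetical placeholders for the application under test:

```python
# Sketch of one use-case flow exercised with valid and invalid data using
# pytest. The myapp.orders module, create_order function and ValidationError
# are hypothetical placeholders for the application under test.
import pytest
from myapp.orders import create_order, ValidationError

@pytest.mark.parametrize("quantity, expected_ok", [
    (1, True),     # valid: minimum allowed quantity
    (100, True),   # valid: typical quantity
    (0, False),    # invalid: below minimum
    (-5, False),   # invalid: negative quantity
])
def test_create_order_quantity_rules(quantity, expected_ok):
    if expected_ok:
        order = create_order(item="SKU-123", quantity=quantity)
        assert order.status == "CREATED"       # expected result is achieved
    else:
        with pytest.raises(ValidationError):   # invalid data is rejected
            create_order(item="SKU-123", quantity=quantity)
```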
Integration testing will be carried out at the end. This phase will concentrate on the integration of all the modules. Data flow will be validated end to end to ensure the smooth functioning of the system as a whole.
#  Item                    Description
1  Test Objective          Ensure that the end product works as expected with all the different
                           modules integrated at the end.
2  Technique               Check the interfacing points between each of the completed modules:
4  Entry Criteria          - Completion of the development and system testing activities for all
                             the modules
                           - Integration test points have been identified and test cases are
                             prepared
5  Exit Criteria           All the test cases have been executed and test coverage is 100%.
6  Special Considerations  Sufficient ramp-up time is required for the testers to understand the
                           design and data flow.

Table 3: Integration Testing Details
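For illustration, a minimal sketch of a check for one integration test point: data created by an upstream module must reach a downstream module unchanged. Both module names and functions are hypothetical placeholders:

```python
# Sketch of a check for one integration test point: data created by the
# upstream "orders" module must reach the downstream "billing" module
# unchanged. Module names and functions are hypothetical placeholders.
from myapp.orders import create_order    # hypothetical upstream module
from myapp.billing import fetch_invoice  # hypothetical downstream module

def test_order_flows_through_to_billing():
    order = create_order(item="SKU-123", quantity=2)
    # End-to-end data flow: the downstream module sees the same record.
    invoice = fetch_invoice(order_id=order.id)
    assert invoice.order_id == order.id
    assert invoice.quantity == 2          # data unchanged at the interface
```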
Regression testing will be carried out on the application across the different iterations. A regression suite will be built for each of the modules, and testing will be carried out to ensure that all the functionality implemented earlier remains as it was.
#  Item                    Description
1  Test Objective          Ensure that the functionality implemented in each of the iterations
                           for the different modules remains intact after the implementation of
                           subsequent modules.
2  Technique               Functionality for each of the modules will be implemented in each
                           iteration, as defined earlier, so modules implemented in the later
                           iterations will be tested along with some of the functionality from
                           earlier iterations.
5  Exit Criteria           All the test cases pass with no severe bugs/issues.
6  Special Considerations  None

Table 4: Regression Testing Details
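One way to organise the per-module regression suites described above is with pytest markers, sketched below; the module names (myapp.auth, myapp.reports), helper functions and marker names are hypothetical placeholders:

```python
# Sketch of per-module regression suites using pytest markers. The modules
# and helper functions are hypothetical placeholders; in a real project the
# markers would be registered in pytest.ini.
import pytest
from myapp.auth import login                # hypothetical iteration-1 module
from myapp.reports import generate_report   # hypothetical iteration-2 module

@pytest.mark.regression
@pytest.mark.module_auth
def test_login_still_accepts_valid_user():
    # Functionality delivered in an earlier iteration must remain intact.
    assert login("qa_user", "secret") is True

@pytest.mark.regression
@pytest.mark.module_reports
def test_daily_report_still_generates():
    # Re-checked after each subsequent module is implemented.
    assert generate_report("daily") is not None
```

The whole suite can then be re-run after each iteration with `pytest -m regression`, or narrowed to one module with, for example, `pytest -m module_auth`.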
The standard top-down approach, along with the sequential steps involved in a typical performance/scalability testing engagement, is depicted in the following diagram.

<Insert performance/scalability testing approach diagram>

This approach is based on the business perspective, unlike the bottom-up approach, which takes the technical perspective into consideration. In the top-down approach, the focus is on the response time for any transaction mix and user volume, irrespective of hits/sec or throughput.

<Department name> decides on the approach depending on the client's business goal and objective, such as system integrity, performance benchmarking, performance enhancement or performance problem diagnosis, which drives the perspective of conducting this particular test type.
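For illustration, a minimal sketch of a top-down load test using Locust, where the transaction mix and user volume are specified and response times are observed; the URL paths and the 3:1 mix are hypothetical placeholders:

```python
# Sketch of a top-down load test with Locust (pip install locust). The focus
# is response time for a transaction mix and user volume rather than hits/sec
# or throughput. The paths and 3:1 mix below are hypothetical placeholders.
from locust import HttpUser, task, between

class BusinessUser(HttpUser):
    wait_time = between(1, 5)   # think time between transactions, in seconds

    @task(3)                    # dashboard views dominate the transaction mix
    def view_dashboard(self):
        self.client.get("/dashboard")

    @task(1)
    def run_daily_report(self):
        self.client.get("/reports/daily")
```

Run with, for example, `locust -f perf_test.py --host https://app.example.com -u 100 -r 10` to ramp up to 100 concurrent users; Locust then reports response-time percentiles per transaction.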
User Acceptance Testing (UAT) focuses mainly on the functional requirements of the application and is performed by the real users of the system.
#  Item                    Description
4  Entry Criteria          - System testing, along with all other testing, is complete
                           - Test data specific to users and roles is available to the testing
                             team
                           - Users are identified to perform the testing
5  Exit Criteria           For each known actor type, the appropriate functions and data are
                           available.
6  Special Considerations  - Test data should be available specific to user/group level and role
                           - Bug-fixing and retesting effort will impact the total effort
                             estimated for testing

Table 5: User Acceptance Testing Details
3.3 Tools
The following tools will be employed for the Testing activities involved in the project:
- SVN –

Table 6: List of Tools for Testing Activities
3.4 Defect Management

Defect management will be handled in Team Foundation Server (TFS). All defects and issues found during the testing lifecycle will be logged in TFS. All developers and testers will be given access to the respective project in TFS so that they can work on the defects.
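For illustration, a minimal sketch of logging a defect programmatically through the work item REST API shared by TFS and Azure DevOps; the server URL, project name and personal access token are hypothetical placeholders:

```python
# Sketch of logging a defect via the TFS/Azure DevOps work item REST API.
# The server URL, project name and personal access token are hypothetical
# placeholders; on-premises TFS uses a https://server/tfs/Collection URL.
import requests

TFS_URL = "https://dev.azure.com/your-org"   # hypothetical placeholder
PROJECT = "ApplicationName"                  # hypothetical placeholder
PAT = "your-personal-access-token"           # hypothetical placeholder

def log_defect(title, repro_steps, severity="2 - High"):
    url = f"{TFS_URL}/{PROJECT}/_apis/wit/workitems/$Bug?api-version=7.0"
    patch = [
        {"op": "add", "path": "/fields/System.Title", "value": title},
        {"op": "add", "path": "/fields/Microsoft.VSTS.TCM.ReproSteps",
         "value": repro_steps},
        {"op": "add", "path": "/fields/Microsoft.VSTS.Common.Severity",
         "value": severity},
    ]
    resp = requests.post(
        url,
        json=patch,
        headers={"Content-Type": "application/json-patch+json"},
        auth=("", PAT),          # basic auth with an empty username and a PAT
    )
    resp.raise_for_status()
    return resp.json()["id"]     # the new defect's work item ID
```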
3.5 Environments
The details of the environment where the testing will be conducted are as follows:
P a g e 11 | 16
3.5.1. Environment Details

#  Test Type  Environment

Table 7: Environment Details
4. Resource
The following table describes the roles and responsibilities, as regards the testing activities, for the proposed <Application Name> system:
#  Role                       Description
1  <Role>                     Responsibilities:
                              - Test Execution
                              - Mitigate Risks
2  Test System Administrator  Ensures the test environment and assets are managed and maintained.
                              Responsibilities:
                              - Administer Test Management System

Table 8: Roles/Responsibilities
5. Project Milestones
The Configuration Manager will be responsible for coordinating the migration of each build to each of the environments and for ensuring that all documentation for each build is complete and stored in the appropriate directories. The Configuration Manager will also coordinate with external interfacing systems to reduce the risk of any downtime resulting from code incompatibility introduced as new builds are implemented.
6. Deliverables
Test Strategy
- Strategy document outlining the high-level approach for testing, resources, timeline, environment, etc., along with detailed test plans for functional, unit and end-to-end process testing

Test Cases
- Document containing the set of conditions or variables under which a tester will examine all aspects of a system, including inputs and outputs, along with a detailed description of the steps that should be taken, the results that should be achieved, and other elements that should be identified
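For illustration, one way a single entry in the Test Cases deliverable might be structured if kept in machine-readable form; the field names and values below are illustrative, not a mandated schema:

```python
# Illustrative structure for one test case entry, capturing the elements
# listed above (conditions, inputs/outputs, steps, expected results).
# Field names and values are hypothetical, not a mandated schema.
test_case = {
    "id": "TC-001",
    "title": "Login with valid credentials",
    "preconditions": ["User account exists", "Application is reachable"],
    "steps": [
        "Navigate to the login page",
        "Enter a valid username and password",
        "Click the login button",
    ],
    "inputs": {"username": "qa_user", "password": "<valid password>"},
    "expected_results": [
        "User is redirected to the dashboard",
        "A welcome message shows the user's name",
    ],
}
```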
7. Communication
The following communication protocol is proposed for the <Application Name> system testing:
- Meetings between test participants and the stakeholders associated with functional testing to review test results and discuss next steps.
- All operational issues will be tracked by the functional test team and documented as part of the final test report.
- All functional changes, e.g. a change in a query that affects functionality, will be validated and approved by the application development team.
- The changes that are made will be communicated, verified and used as a checklist for functional regression testing.
- A weekly meeting to determine requirements fulfilment for the tests planned in the following week.
8. Assumptions
The functional testing strategy for the proposed <Application Name> system has been created and documented
based on the following assumptions:
#  Item               Assumptions
1  Application        Each build should be made available to QA at least one day prior to test
                      execution, as scheduled in the plan, for one round of smoke testing
                      before the final run of test execution.
2  Data and Database  - The input data is to be provided by the respective application teams
                      - The database is equivalent to production in terms of data volume,
                        hardware and software configuration
                      - The application team is to help the functional test team write test
                        cases for validating the correct execution of a given scenario and for
                        profiling

Table 9: Assumptions
9. Risk & Mitigation

The following are the typical risks and their mitigation from the proposed <Application Name> system functional testing perspective:

#  Item                    Risks                       Severity  Mitigation
1  Application, Database,  Ambiguous requirements      High      Test team members to discuss with the
   Infrastructure                                                development team to understand the
                                                                 requirements and the application's behaviour
2  Project Risks           Test failures               Medium    Use Friday nights and weekends to make up
                           Test pre-requisites         Medium    Execute other non-dependent test cases
                           not ready
                           System crashes during       Medium    Immediate restore and recovery action
                           test activity
                           Re-test due to reports      High      Some key tests to be repeated to assess the
                           not available                         impact of the reports

Table 10: Risk & Mitigation
10. Glossary of Terms

#   Term                 Description
2   Conversion Testing   Confirms the accuracy of the conversion procedures needed to initially
                         load the data into the system. Also validates the usage of the data
                         during day-to-day production activity. Performed during the System
                         test level.
3   Defect/Bug           The deviation of an actual result from the expected result during
                         application testing. A flaw in the software with the potential to
                         cause a failure, which is raised by a tester and is meant for a
                         developer to fix. If the defect cannot be resolved by developers
                         alone, the item is considered an issue.
                         Defect Severities:
4   End-to-End Testing   Additional interface testing from the beginning to the end of a
                         process, including all upstream and downstream impacted systems that
                         receive data, whether directly or indirectly, from the primary system.
                         Performed during the System test level.
5   Entry Criteria       Metrics specifying the conditions that must be met in order to begin
                         testing at the next stage or level.
6   Environment          The collection of hardware, software, data and personnel that comprise
                         a level of test.
7   Exit Criteria        Metrics specifying the conditions that must be met in order to promote
                         a software product to the next stage or level.
9   Integration Testing  The objective of integration testing is to test the interaction of
                         related data interface components in order to confirm that these
                         components function properly when integrated together. This serves to
                         identify and resolve major interface defects before starting system
                         testing. Typically conducted by the development team.
10  Interfacing System   A downstream or upstream system that may require change due to the
                         primary system.
13  QA                   Quality Assurance

Table 11: Glossary of Terms